Sense-making – moving from quality assurance to quality growth

HE is seemingly being exposed ever more to the use of metrics. This is most obviously the case in the current development of the TEF (Teaching Excellence Framework), where often vague, tangential datasets are to be used as a measure of the quality of teaching. One of the stated purposes of such frameworks is to offer institutions insights into how to improve their processes. However, one of the main problems with the use of any type of summative evaluation is that it might offer insights into patterns, and hence the ‘what’, but has little to offer in terms of the ‘how’ or ‘why’. Evaluations can also begin to pervert the processes they are intended to improve as they become part of the ‘accountability-complex’:

‘The paradox is that the accountability fervor meant to assure performance can have direct and indirect consequences that undermine it.’

(Halachmi, 2014)

A number of different approaches to programme and module evaluation have started to emerge, including the inclusion of student perspectives. Here, I outline a view which takes this as a starting point and considers the potential of a theoretical framework called Normalization Process Theory (which was developed within the health and social care field) to help develop holistic practice. The process I am currently developing starts from Klein’s definition of sense-making, as given by Snowden:

‘Sensemaking is the ability or attempt to make sense of an ambiguous situation. More exactly, sensemaking is the process of creating situational awareness and understanding in situations of high complexity or uncertainty in order to make decisions. It is “a motivated, continuous effort to understand connections (which can be among people, places, and events) in order to anticipate their trajectories and act effectively”.’

This stresses the ongoing nature of sense-making in an attempt to understand the evolving complexity of a context. In the case of a master’s module, sense-making becomes a process of understanding the experiences and perceptions of students as their work develops within a module, rather than waiting until the end of the module to gain retrospective perspectives.

To develop a framework for sense-making, I suggest here the use of Normalization Process Theory (NPT). This theory might not always suit sense-making activities, but where the focus is on embedding a change or process it is well suited. NPT was developed by May and Finch (2009) as a way of understanding and assessing innovational change in health and social care contexts. It distinguishes between implementation (a relatively straightforward process) and normalization, whereby the innovation or change becomes embedded (a very difficult shift to achieve). It is in the gap between implementation and normalization that ‘zombie innovation’ (Wood, 2017) occurs, with senior leaders believing that organizational proclamation leads to embedded day-to-day practice. But often, such proclamations merely lead to initial implementation, followed by ‘compliance under surveillance’ – i.e. the change will be present in official documents and assurances of practice, but not in day-to-day work. NPT is structured into four elements:

1.      Coherence – this is the sense-making work involved in using a new practice: understanding how the new practice differs from what is currently done, and being able to clearly understand and operationalize its aims and objectives.

2.      Cognitive participation – this is the work individuals do to develop a collaborative approach to the change which is being undertaken. Are they able to create a successful community of practice?   

3.      Collective action – this relates to the resourcing and collective work done by a community to embed practice. It includes the development of new knowledge, an understanding of how the facets of a change can be brought together, and the generation of new practices.

4.      Reflexive monitoring – this is the appraisal work a group and individuals do to understand the processes and outputs of a change, as well as considering how localized changes might be developed further to ensure successful embedding of new practice.

By using these elements to sense how learning across a module is developed, it might be possible to understand how learning and skills are becoming embedded as the module is being experienced. This leads to the potential for changes and development in real time.

The example here is a module in an MA International Education programme. Early in the course all students complete a core research methods module. This module allows students, many of whom have never encountered research methods in their prior university experiences, to gain a foundation across approaches in education. The module concludes with an assignment which asks students to create a research project plan, then pilot a single data collection technique and evaluate it.

Whilst the research methods module offers a positive initial experience of research methods, it is a large jump from this to a dissertation study of 20,000 words. As a consequence, those students undertaking an optional pathway in innovation and reform in education are asked to complete a research module (30 credits). They work in pairs to develop and complete a small-scale research project based on an issue relating to innovation and/or reform. Sessions are led as group tutorials, covering and developing issues the students feel they need further help with, as well as reporting back to the group on a regular basis to discuss ideas, plans and execution.

Given the challenging nature of the project for many, whilst it would be possible to evaluate the module at the end of the process, it would be far better to sense-make throughout the module. Therefore, given that the nature of the project is to help students embed new practices as they move towards their dissertation, I am currently beginning to think about the potential for NPT to act as a positive framework. The intention is to use four short questionnaires at points over the course of the module, each followed by a focus group, as a way of understanding the nature of student learning and practice development. The first questionnaire, focused on issues of coherence, is given below as an example.


The intention of this phase is to ensure that the students understand what the purpose and aims of the research project are. If students do not understand this then we are building on a poor foundation from the very start of the process. By investigating this early on I can work with the students to sense the level of confidence, knowledge and conceptual understanding on which they can base their work in the coming weeks.


Once we have started this process, in a couple of weeks’ time, we will move on to consider and develop a sense of participants’ emerging work together, and the degree to which the taught sessions are helping them become part of a wider research community.


Halachmi A (2014) Accountability overloads. In: Bovens M, Goodin RE and Schillemans T (eds) The Oxford Handbook of Public Accountability. Oxford: Oxford University Press.

May C and Finch T (2009) Implementing, embedding, and integrating practices: An outline of Normalization Process Theory. Sociology 43(3): 535–554.

Wood P (2017) Overcoming the problem of embedding change in educational organizations: A perspective from Normalization Process Theory. Management in Education 31(1): 33-38.



Thinking About Action Research and Time

In researching pedagogy, one of the methodological approaches I favour is action research. In some quarters action research is out of favour. It is seen as being too ‘local’, too ‘small scale’, and as having little to say in the rush towards finding pedagogic ‘solutions’, the bread and butter of the ‘what works’ movement. But perhaps one of the main perceived problems with action research is its potential for bias, and a lack of structured evidence. These criticisms may be correct in some instances – but why? I wonder to what degree time, and the perception of time in education, might be responsible for this.

Action research was initially popularised on the back of the work of Kurt Lewin, and as a result the often-summarised ‘process’ of action research became identified with the diagram below:


This diagram can give those wanting a quick fix of research an oversimplified notion of what is involved in action research. It has the potential to be a very powerful methodology, but if taken at face value can lead to a seriously atrophied version of the process. Here, the cycle can be read as one focusing on a teacher, or teachers, reflecting on their practice, deciding what deficit might exist, planning for change and then enacting that change before reflecting on how successful they thought it was. But there is no explicit consideration here of the data capture which might be used, the degree of claim to be made at the far end of a cycle of research, or whether the focus chosen was based on personal bias or wider evidence. If it is taken as a personal or group-based reflective process, reflection, planning, acting and observing can become ‘quick’, giving the impression of moving forward at a rapid rate. This is the illusion of action research as ‘rapid innovation’. It also panders to the current vogue in education for making rapid shifts, showing accelerated change with equally certain proclamations of success. But I would argue that true transformation and change is paradoxically a slow, measured process, and one which is also to a great extent contextualised.

I think action research has a huge potential utility in bringing positive change and in acting as a basis for informed discussion of pedagogic practice and change. However, to act as a useful and nuanced tool it is important that action research is approached in the same way as any other research methodology – with care and time. All too often it is seen as a ‘soft’ and ‘easy’ option, something that can be utilised as long as Lewin’s cycle above is taken at face value. I think one useful step we could take in moving action research forward is to abandon Lewin’s cycle within popular accounts and discussion, and replace it with the cycle developed by Andy Townsend (2010). The diagram below is a summary graphic of my interpretation of his framework:


This cycle begins with a consideration of an area for work, with further discussion to refine that idea into one that can become the focus of a piece of research. Importantly, a reconnaissance stage is included. This stage is intended to explore the chosen issue further: do others see the same problem as the person/people conducting the research? Does some form of baseline data help characterise the issue in the context in which it is being explored? In discussion with some Chinese ELT tutors, we even discussed the idea of looking at larger-scale quantitative data from within and beyond the organisation involved. This stage helps us to begin to gain a more in-depth and critical understanding of the context we wish to explore and intervene in – but it takes time. Having reflected on the initial focus in relation to this reconnaissance data, we can develop a more focused and meaningful action, or, if we find that our initial ideas were misplaced, we might go back to redefining an initial focus, starting the process over. The cycle then involves an intervention, and from this a reflection on and evaluation of the change involved. The evaluation is important as it emphasises the need to include, in the planning for action phase, a coherent and meaningful framework for data collection. If a coherent data collection framework is developed, this in turn requires a coherent data analysis/interpretation framework, from which reflection and evaluation gain more critical meaning and offer better-founded insights for future work.

This alternative way of understanding action research makes the need for a deeper and more critical approach much more explicit. But the essential feature for me is the need for a greater amount of time. It is reflective throughout, critical and considered. It makes explicit the need for a data collection framework which extends well beyond ‘reflecting on practice’, and for a proper consideration of data interrogation. This model of action research is a slower process; in recently submitted projects as part of a PGCert in action research, students only completed one or two cycles of action research over the course of an academic year. However, the insights they gained were based explicitly on the data they had collected, and also recognised the contextual and nuanced messages their research could offer. By making the complexity of action research more explicit, and by repositioning it as a slower, data-based process, the insights we gain may actually lead to more rapid and meaningful change and innovation.


Townsend A (2010) Action research. In: Hartas D (ed) Educational Research and Inquiry: Qualitative and Quantitative Approaches. Continuum, 131–145.

Also, read the following as a great introduction to action research:

Townsend A (2013) Action Research: The Challenges of Understanding and Changing Practice. Maidenhead: Open University Press.

What makes good research? Some reflections

Some ideas concerning the features of good research in education developed through dialogue with colleagues and international masters students:

  • a focus on a definable issue or problem. Research needs to be focused and have a clear area for exploration. If it is too broad it becomes unwieldy, and it is difficult to collect meaningful data. In attempting to develop a coherent focus for research the use of research questions is extremely important;
  • the need for an ethical approach. All research in education should be developed with an explicit understanding that it should be an ethical process. The vast majority of research in this field includes human participants in some way. Our research should always protect the well-being and dignity of both the participants and researchers. This is often the stated purpose of research ethics, the ‘legal’ aspects which are often the focus of review panels. However, we also stress that ethical research should focus on the need for honest and transparent reporting so that the work completed can be read critically and fairly by peers. This includes the reporting of research approaches, any conflicts of interest and the context of the research. It also requires that when we rely on the work of others we reference them fully so that they are given due recognition for their work;
  • give a clear outline of the context of research. The process of education is highly complex. Therefore, when writing about research it is always important to give readers a clear context (albeit anonymised) for the research. If a small-scale study is completed with a class of 12 and 13 year olds, in an inner-city school, composed predominantly of more able students, then it is important that the reader has this information so that they can understand the context of the research data gained. This also allows the reader to consider the degree of relevance of the research to their own situation. It is a central part of honest and transparent educational reporting and debate;
  • making use of research literature to inform the research design. The vast majority of research builds on work already done. It is important to begin to gain an understanding of the research which has been published previously in an area of interest. We need to be good at reading and assessing research so that we can judge the degree of evidence on which we might build our own work;
  • gives a clear outline/discussion of the methodology and methods which have been used to collect data. Ethical research should make the methodology and methods which have been used to collect data transparent. Readers need to know how our research has been carried out as this is crucial to being able to interpret data, and therefore to engage critically with any claims which are made. Decisions concerning preferred methodologies give an insight into the way the research is positioned and the nature of claims made. An account of the data collection tools (methods) used is equally important for the same reasons. If a study has used interviews, are the questions reported so that we can judge the level of neutrality? Where observations are used, is the focus and method of data capture explained? If these issues are not thought through and reported then a considered, critical reading of the research cannot be achieved. Where research occurs at a meta-level, through the use of literature reviews for example, it should include a methodology outlining search criteria, filtering processes and how publications have been analysed. If a literature review merely presents an area of research with no methodology, it needs to be read with caution as we have no way of assessing its validity;
  • uses appropriate methods which clearly link back to the initial issues/problems and research questions. Well-conceived research will make clear where particular methods help in investigating the chosen issues/research questions; this gives the research coherence;
  • analyses collected data in a transparent way. In the same way as it is important to carefully consider the reporting of methodology and methods, so it is the case with analysing the data which has been collected. Analysis is often not considered to the same level of detail as methodology and data collection, but it is crucial in ensuring a reasoned and valid consideration of the data, particularly in trying to minimise biases and selective use of data. To make the process transparent it is again important to report how data has been analysed;
  • develops explanations and discussion derived from the data. Good research develops a clear discussion of the data which has been collected. This is at the centre of reporting research as it is where the interpretation of the project is developed. It is crucial that explanations emerge from the data and are not dissonant with the evidence provided. In addition, the discussion of the data should be related to the literature which has been engaged with and which is the foundation upon which the research study rests;
  • offers measured insights/conclusions. Finally, good research is measured in the claims made. Small-scale research cannot easily make claims which can be scaled up; in other words, an analysis of one cycle of action research focusing on improving questioning practices in one class cannot act as the basis for national policy. However, small-scale research can still provide extremely important insights for further study and for practitioners by providing useful information as to where good practice might be found. Where conclusions include polemic and assertive language, it can often be the first sign that we need to explore the study and its messages further. Many large-scale research projects rely on quantitative analyses. Insights are often based on statistical manipulations and offer a great deal of useful exploration of patterns and trends. However, in-depth explanations are sometimes more problematic as this type of research is often much stronger in providing answers to the ‘what’ rather than the ‘why’. All research has potential shortcomings, as no approach is perfect or has all of the answers to an area of interest. Often, deep insights occur through the long-term application of a number of both qualitative and quantitative approaches, used to augment understanding and giving progressively fuller and more critical perspectives on the issue of interest.

Book Review: An Introduction to Qualitative Research Synthesis

Major, C.H. & Savin-Baden, M. (2010) An Introduction to Qualitative Research Synthesis: Managing the Information Explosion in Social Science Research. Abingdon: Routledge.

Over recent years there has been a valorisation of large-scale, quantitative research from some quarters. In some ways this is no surprise, as there is a ready appeal in seeing ‘generalised’ patterns in data which can then be used for decision-making and policy formation. However, in this scramble for the use of ‘big data’ there has also been some criticism of qualitative research as being ‘anecdotal’, too focused on the particular, and therefore of little use when it comes to decision-making and policy generation; this has unfortunately also led to a shift in educational research funding which often appears to follow this logic. This book instead focuses on qualitative research and provides a very well argued case for the synthesis of qualitative studies as an additional route to providing insights for practitioners and policy-makers.

The book has three parts, the first two of which (‘Arguing for qualitative research synthesis’ and ‘Doing qualitative research synthesis’) outline and discuss the approach, whilst the third offers examples and frameworks for carrying out qualitative research syntheses (QRS). The first section includes a very clear argument for the use of QRS as an approach for combining and interpreting qualitative research studies. A refreshing element of the first chapter is a clear engagement with the possible problems and restrictions of QRS under the title ‘Top ten criticisms of the approach, point and counterpoint’. This helps develop an honest debate about the potential limitations of the approach whilst making transparent its philosophical and methodological foundations. This critical voice is retained throughout the book and, I think, provides an excellent example to those new to educational research of how to build arguments whilst being transparent about both approach and possible restrictions and problems. As the authors state at one point in the book, no methodology is perfect, and honest discussion of the limitations and problems which occur as research is undertaken is important. The second chapter in this first section then goes on to locate QRS within the wider field of research syntheses, discussing how it is linked to, but different from, traditional literature reviews and structured reviews such as those developed by EPPI.

The second section goes on to outline the stages in carrying out a QRS, from the development of a question around which the synthesis is structured, through designing and completing a search, analysing, synthesising and interpreting the data, to presenting the outcomes. The explanations are very clear and, together with the examples provided in section three, give a good overall account of the approach. Importantly, the outline of QRS is not given in the form of a ‘recipe’, as the process involves a lot of reflexivity, and therefore it is only possible to set out principles and general structures rather than a step-by-step ‘how to’ guide. I particularly liked the sections on establishing plausibility, including validity and trustworthiness, which stress the need for qualitative research to be clear in developing contextual information as well as clear explanations of methodology, data collection and approaches to analysis so that the coherence and quality of evidence can be transparently assessed. One table provided to highlight this need, given on page 61 and based on a template from the Joanna Briggs Institute, could act as a more general starting point for qualitative researchers as they develop reports of their research.

A rating schedule to establish rigour in findings:

  • Unequivocal – findings supported with clear and compelling evidence
  • Credible – findings that are plausible given the weight of evidence
  • Unsupported – findings that are suggested but not supported by data

Reading this section, I was struck by the degree to which it would act as a useful starting point for discussion into considering quality research writing not only in QRS, but in qualitative research writing more generally.

Section three goes on to provide some interesting and useful examples of QRS studies which can be considered alongside the earlier sections to see how the process can be understood through the final product.

This book is a very useful introduction to QRS and offers both a critical and clear overview of how qualitative research reports can be synthesised and interpreted to provide broader insights into educational problems and issues. In my opinion, it sits well alongside Pope et al’s (2007) book focusing on the synthesis of qualitative and quantitative evidence in health. The approaches are different, a point which is stressed by Major and Savin-Baden, but together these two books offer critical approaches to bringing together evidence from across the research spectrum to offer new and interesting insights into educational issues.


Pope, C., Mays, N. & Popay, J. (2007) Synthesizing Qualitative and Quantitative Health Evidence: A Guide to Methods. Maidenhead: Open University Press.

Teaching Research methods – Some initial reflections

On March 20th, we finally finished teaching a research methods module which is a core element of our MA International Education (MAIE) course. As I outlined in this blog last autumn (here, here and here), I have been working with a colleague in the School of Education to develop a new approach to our research methods course.

Having finished the course, and the data collection for a parallel research project of our own on the module, it feels like a good time to consider some initial reflections on our work. These are obviously initial perceptions; we need to spend many months analysing the very rich dataset we have collected, so any reflections cannot be taken as a detailed and accurate account. However, several issues seem to have emerged across the module:

1) Thinking about threshold concepts. As we began to develop a curriculum framework we discussed possible threshold concepts in research methods as a basis for instructional design. In a past post, I listed the threshold concepts identified by Kiley and Wisker (2010). They saw the threshold concepts relating to research methods as being:

  • argument
  • theory
  • frameworks
  • knowledge creation
  • analysis
  • paradigm

We started from this point, but through discussion emphasised the following concepts as being both central to understanding research methods and also having the potential to be transformatory. Consequently, our list of threshold concepts became:

  • criticality
  • theory
  • methodology
  • ethics
  • analysis
  • epistemology/ontology

In the event, we spent less time on theory as a concept than we had expected, but all of the other concepts became a major part of the course. In student interviews criticality was seen as central to developing an ability to read research and, from this, to writing well-considered and careful texts. Methodology and analysis were also seen as important for assessing papers, as well as being central to a critical and deep understanding of how to carry out research. One student reflected that previously she had read the ‘start and end’ of papers to engage with the main messages; now she first engages with the ‘middle’ to assess the degree to which the research can be used or trusted. Ontology and epistemology were the most difficult concepts to tackle, and at the end of the module I would argue that some students are still in liminal space in this respect. Some students reflected that at undergraduate level the nature of reality and knowledge, as well as paradigms, were assumed and hence never discussed. As an interdisciplinary pursuit, education needs to engage in these debates, as researchers from many different traditions meet at this particular crossroads, and there is therefore a level of philosophical complexity. Methodology, analysis and ethics were all equally important in aiding students to gain a deeper and holistic understanding on which to base their expanding knowledge and practical experience.

One additional concept which we had not included in our original list, but which I would be minded to include having completed the module, is that of ‘sampling’. Some students struggled with this, and yet a good understanding often acted as a basis for logical, well-considered and critical bridges between methodology and data collection tools. Where sampling was not well understood, this bridge was less secure, if present at all, and logical explanation of research design began to default to general description and a lack of criticality.

2) Importance of language. We have started to see the research methods module more and more as a language course. This is not only the result of developing a course which predominantly attracts international students, although this is obviously important. We have a number of English-speaking students and yet they often commented on the difficulty of engaging with the language. Research methods language is conceptually rich and difficult; we are teaching this language, and regardless of student origin we need to ensure that students understand the language and the concepts underlying it.

3) Research methods as an applied activity. In our planning, we also developed a pedagogic model which sees conceptualisation, knowledge and application as equally important, and intertwined.

(Figure: understanding elements of learning for a master’s RM programme)

At the end of the module I feel this is a very useful framework which has aided in developing a critical approach to the module. Conceptualisation is vital as a basis for constructing and developing knowledge. However, where these began to really make sense for students was when they actually enacted their ideas. The application of research methods started from day one of the course and revolved around two practical exercises. Firstly, students acted in pairs to consider the characteristics of good interviewing before developing a set of group research questions based on a research problem we gave them. From the research questions they discussed and agreed interview questions before splitting into pairs to complete their interviews. Once complete, the pairs then transcribed and coded their data. This process shadowed their work in face-to-face sessions and therefore their emerging understanding of the module. The final exercise focused on comparing codes across the group to identify recurring themes as well as outliers.

Students then moved on to complete a module assignment which asked them to develop an area for research and research questions arising from it, before creating a research design which was then piloted. Subsequent to the pilot, students were asked to reflect on their experience and on how they would change their research design as a prelude to developing their dissertation work.

This proved very challenging but also, according to some of the students, allowed them to consider how far their understanding of research methods had developed.

I am currently discussing the potential for a new master’s degree focusing on praxis-based approaches to education. Having developed our work on research methods I fully intend to embed an emergent element of research methods across all modules of the programme, leading towards a specialist research methods module. Research methods needs to be engaged with over a period of time and within different contexts to give a wide critical and experiential basis for discussion and theoretical understanding.

These are some of the basic reflections from the course, but as I said above, these are only initial and need to be considered in far more detail as we begin to engage with the very large amount of data we have collected from this course. In my next post, I will continue my reflection, by considering the process of researching this module and the utility of considering the learning environment as being a complex adaptive system.

Participatory Lesson Study – Making the capture of data in Lesson Study more explicit

Whilst Learning Study makes explicit use of variation theory (Cheng and Lo, n.d.) as a basis for analysing and understanding the process of learning, Lesson Study can be vague in establishing a link between learning and methods of analysis. Cerbin and Kopp (2006) use an approach called ‘cognitive empathy’, developing approaches to teaching in the research seminar which make student thinking ‘visible’, in part by attempting to plan from a student perspective. Lewis (2002) considers the need to watch eyes and faces and to capture discussion between students. Whilst both of these approaches are important and positive, neither will capture the complexity of the learning process which students experience.

In attempting to base data collection on a more critical foundation regarding the learning process, we have considered the work of Nuthall (2007) and Illeris (2007). Nuthall (2007: 158) emphasises the complex process of learning and its relation to teaching,

‘…how students learn from classroom activities is not simply a result of teacher-managed activities, but also the result of students’ ongoing relationships with other students and of their own self-created activities or use of resources.’

This means that a series of levels interact to make each student’s learning highly individualised:

  1. A visible layer which is that which is public and teacher-led
  2. A semi-visible layer which is the student-led culture, relationships and interaction
  3. An invisible layer of mental processes, such as prior learning and working memory, which is central to individual sense-making.

Because this last layer is not visible, we need to seriously consider our definition of learning as a starting point for developing a meaningful and critical set of methods for data collection.

Here, we have used the learning theory of Illeris (2007) as a basis for our understanding and capture of the learning process experienced by students. He characterises learning as the amalgamation of a cognitive dimension, concerned with content and individual cognitive processes; an emotional dimension, including elements such as motivation, emotion and a will to learn, what Illeris (2007: 24) terms the ‘…mental energy… needed to carry out a learning process’; and a social dimension, which focuses on interaction between the learner and their social and material environment. This means that data capture based on approaches such as observation remains important, as observation is essential for gaining insight into the social aspects of learning. However, observation of individuals and their behaviours cannot reach inside the individual to gain insights into their cognitive (and often emotional) processes. The result of taking this stance is that we must say explicitly that any capture and analysis of the learning process will always be incomplete; research on teaching and learning always works with the partial and the incomplete. Whilst we feel that this approach is appropriate, we believe there needs to be greater explicit discussion within the lesson study research community concerning the processes of learning which inform our understandings of this central issue.

Our alignment with Illeris’ (2007) theory of learning has direct implications for the methods used to gain insights into the process of learning, and it also underpins our desire to develop participative approaches, as shown in Figure 3.

Figure 3: PLS data capture

The inclusion of student focus groups is seen as helping the lecturers gain an explicit understanding of students’ prior learning, and also of which elements of their learning students believe are important to take further at a given point in time. The stimulated recall interviews, using artefacts from research seminars as a basis for discussion, begin to give insight into the ‘invisible’ worlds (Nuthall, 2007) of students as they engage with the teaching and learning in the research seminar, as well as offering extra insights through student afterthoughts. Any discussion which occurs will obviously be incomplete, as not all elements of the learning experience will be recounted or remembered, and some of the experience may well have been subconscious or will only be made sense of more fully over time. However, gaining direct testimony from students, particularly when triangulated against research seminar artefacts, is an important addition to the analysis. These interviews also give the potential to consider the emotional dimension of the learning process, as our experience of this approach to interviewing makes explicit the affective reactions of students to their learning. Meetings, focus groups and interviews are all recorded and sent for transcription. Transcriptions are then considered thematically to begin to analyse and understand the main insights which a project uncovers.
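As a very rough, hypothetical sketch of how an initial pass over transcripts might be organised (the theme names and keywords below are invented, and genuine thematic analysis is interpretive rather than lexical), one could first flag candidate utterances by keyword before reading them in context:

```python
import re
from collections import defaultdict

# Hypothetical keyword lists for each theme (invented for illustration);
# these would only ever seed a human reading, not replace it
themes = {
    "emotion": ["anxious", "confident", "frustrated"],
    "prior_learning": ["before", "previously", "already knew"],
}

def tag_transcript(text):
    """Return, per theme, the transcript fragments matching its keywords."""
    hits = defaultdict(list)
    for sentence in re.split(r"[.?!]\s*", text):
        for theme, keywords in themes.items():
            if any(k in sentence.lower() for k in keywords):
                hits[theme].append(sentence.strip())
    return dict(hits)
```

A researcher would then review each flagged fragment in its surrounding context, refining the theme list iteratively as the analysis develops.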

Other methods used to capture the social dimension of learning include the use of video and audio recording as well as observation itself. We stress that this is a process of augmenting data capture, rather than one approach being lost and replaced by another.


Cerbin, W. & Kopp, B. (2006) ‘Lesson Study as a Model for Building Pedagogical Knowledge and Improving Teaching.’ International Journal of Teaching and Learning in Higher Education, 18(3): 250-257.

Cheng, E.C.K. & Lo, M.L. (n.d.) The Approach of Learning Study: Its origin and implications.

Illeris, K. (2007) How We Learn: Learning and non-learning in school and beyond. Abingdon: Routledge.

Lewis, C. (2002). Lesson Study: A Handbook of Teacher Led Instructional Change. Philadelphia: Research for Better Schools.

Nuthall, G. (2007) The Hidden Lives of Learners. Wellington: NZCER Press.

Designing a complex curriculum – reflections on knowledge, understanding, concepts and skills

If we are to develop an emergentist curriculum, as suggested in the last post, we need to make room for the emergence of meaning within the seminar room. But in defining a process of meaning making as emergent we cannot have ready-made goals, other than perhaps a loose field of interest within which we construct our work (i.e. the link between coherence and freedom in Davis and Sumara’s (2006) conditions for emergence). One critique that might be made of this approach is that it could lead to a form of ‘radical relativism’, with individuals following any direction they feel is warranted and ending up with very little to show for their endeavour. However, this is to fundamentally misunderstand an emergentist agenda. The coherence element, as a foundation for emergence, sets limits to the field of interest, but admits freedom within them. In addition, by questioning the meaning which individuals develop, as suggested by Osberg and Biesta (2008), the teacher is required to use information and knowledge to challenge thinking and understandings through a mixture of appropriate pedagogic strategies. Thus the goals of the curriculum might not be set closely, but this does not mean knowledge is not sought. I see knowledge as central to the emergence of meaning, but how that knowledge is understood, and how it also emerges in the individual, needs consideration.

Knowledge is central to any curriculum. But if this is the case, then knowledge itself needs defining. At one level the definition of knowledge can be very simple: the facts, information, and skills acquired through experience or education. However, this hides a very complex area of debate, as the search for a definition of knowledge is a central strand of philosophical study and has not, over millennia, managed to produce a definitive statement which all can agree on and which stands the test of philosophical scrutiny. The definition of knowledge is made even more complex by the debate as to the degree to which it stands apart from, or acts as an overarching term for, the notions of ‘concepts’, ‘understanding’, and ‘skills’. Each of these terms can be taken as a subset of knowledge (as a concept!). However, how they relate is again a contested area.

Van Camp (2014: 97) sees understanding as a type of knowledge, but nevertheless feels it important to distinguish it as an explicit idea, stating,

‘To a large extent, much of the aversion to giving understanding any philosophical prominence comes from conflating concepts simply because of linguistic poverty.’

There is a debate over whether understanding is a form of knowledge or something different, and definitions of understanding themselves vary. For example, Kvanvig (2003:192) states

‘understanding requires the grasping of explanatory and other coherence-making relationships in a large and comprehensive body of information. One can know many unrelated pieces of information, but understanding is achieved only when informational items are pieced together by the subject in question.’

Likewise, Zagzebski (2001: 241) states that understanding

‘involves seeing the relation of parts to other parts and perhaps even the relation of part to a whole.’

Both of these definitions see understanding as more than basic knowledge. It is characterised by a qualitatively different aspect, the development of a structure within knowledge which is relational. Van Camp (2014) suggests that this view of understanding is, therefore, incremental, and an individual can have more or less understanding depending on the degree to which relational connections have been made. He then goes on to argue that understanding is central to our development of causation,

‘On my account of understanding, information is better understood if it fits into that network of knowledge, and in tension with fundamental causal beliefs if it does not. So, while causation is not necessary for understanding in principle (other types of explanation, such as unification, can make connections in our knowledge), as a fundamental-perhaps native-worldview, phenomena which are not fitted to a causal framework remain conspicuously outside a comprehensive body of information, and thus not fully understood.’

Therefore, whilst understanding might be a form of knowledge, I would argue that it makes sense to retain it as a concept differentiated from knowledge, since it emphasises the explicit purpose of denoting the links and the developing network of knowledge which we gain as we learn.

A simple diagrammatic way of showing this is


Concepts are likewise difficult to define. At a very simple level, concepts can be defined as mental representations of classes of things (Murphy, 2004) inside the head. Mead and Gray (2010) develop this simple definition by considering how concepts might be understood within the wider context of ‘threshold concepts’. They consider the form and role of concepts within disciplines, emphasising the difference between private and public conceptions (or mental representations). They differentiate between the concepts we have inside our own heads, which are prone to change, and those which are shared (disciplinary) and which tend to be much more stable, as change here requires negotiation and debate. They see concepts as providing the ‘underlying logic’ (p. 99) used to develop and structure knowledge. Perkins (2006), in his discussion of troublesome knowledge, uses Foucault’s notion of ‘episteme’ (any historical period’s way of configuring knowledge), referring to ‘a system of ideas or way of understanding that allows us to establish knowledge.’ (pp. 41-2). The concept is therefore positioned as a logical framework or system which allows us to structure knowledge in a way that supports and promotes understanding. Concepts, by this definition, become the foundation on which we structure and make sense of knowledge and understanding. As such, I argue that they should also be the basis for building curricula. To add to the diagrammatic structure given above, concepts can be seen as underpinning knowledge and understanding.


Finally, there is the issue of skills. Skills can again be defined as knowledge – procedural knowledge, the knowledge exercised in the performance of a task. What is important here, regardless of the term used, is the idea of application. Skills, or procedural knowledge, are concerned with the performance of something, be it driving a car (rather than just knowing how a car works) or being able to successfully search for information; procedural knowledge is therefore a different form of knowledge from declarative knowledge (knowledge about something).

In developing and enacting an emergent curriculum, I will define ‘knowledge’ as equating to declarative knowledge, which is made increasingly useful by the relational growth of understanding. How these nodes and relationships are given a structure occurs through the underpinning power of disciplinary concepts which provide the overarching logical framework for disciplinary knowledge and understanding. Finally, given that I have retained the term knowledge to refer to declarative knowledge, I use the term ‘skill’ rather than procedural knowledge to refer to the application of knowledge, understanding and concepts. Therefore, in developing the terms of an emergentist curriculum, the following conceptual diagram becomes a useful structure for thinking about the detail in developing such an approach.


In the next post, I will consider some of the practical ramifications of defining these processes in the way presented here, and how they interact with notions of curriculum and assessment to give a coherent approach to programme development.


Davis, B. & Sumara, D. (2006) Complexity and Education: Inquiries into Learning, Teaching, and Research. New York: Routledge.

Kvanvig, J. (2003) The value of knowledge and the pursuit of understanding. Cambridge: Cambridge University Press.

Mead, J. & Gray, S. (2010) ‘Contexts for Threshold Concepts (1): A Conceptual Structure for Localizing Candidates.’ In J.H.F. Meyer, R. Land and C. Baillie (eds.) Threshold Concepts and Transformational Learning, pp. 97-113. Rotterdam: Sense Publishers.

Murphy, G. (2004) The Big Book of Concepts. London: The MIT Press.

Osberg, D. & Biesta, G. (2008) ‘The emergent curriculum: navigating a complex course between unguided learning and planned enculturation.’ Journal of Curriculum Studies, 40(3): 313-328.

Perkins, D. (2006) ‘Constructivism and troublesome knowledge.’ In J. Meyer and R. Land (eds.) Overcoming Barriers to Student Understanding: Threshold Concepts and Troublesome Knowledge, pp. 33-47. Abingdon: Routledge.

Van Camp, W. (2014) ‘Explaining understanding (or understanding explanation).’ European Journal of Philosophy of Science, 4(1): 95-114.

Zagzebski, L. (2001) ‘Recovering understanding’ in M. Steup (ed.) Knowledge, truth and duty, pp.235-251. Oxford: Oxford University Press.