Lesson Study – thinking through the possibility of a distance learning variant

Over the past two or three years, I have worked as part of the Lesson Study Research Group at the School of Education, University of Leicester. Over this time we have developed the use of lesson study in a number of contexts, one of which has been with post-graduate groups in education. Our use of the approach in this context has given us many useful insights, particularly in relation to the learning of the international students with whom we work.

Over the same time period I have also been increasingly involved in designing, delivering, tutoring and innovating on distance learning (DL) courses at masters level. DL presents a series of new and interesting pedagogic challenges, as the nature of the contact between tutors and students can vary widely within and between courses. Many DL masters courses do include some collaborative elements of learning, be it through collaborative writing tasks, discussion board exercises or the use of Skype and other video-orientated media. These give us some glimpse into the thinking and learning processes of students, but to a great extent the processes students are engaged in remain opaque in DL, particularly in relation to specific areas of the curriculum we believe they might find challenging; after all, the nature of DL tends to lead tutors to focus on summative pieces and outcomes rather than on the day-to-day processes of student learning.

The complexity of capturing student learning processes is increased by the varied professional contexts of our students, who work in settings ranging from primary to special and higher education, and across all time zones. This makes capturing and understanding learning in any systematic way difficult, other than through the submission of assignment drafts and final pieces. However, to develop DL activities and curricula we need to begin to access other activities in a more systematic and critical way.

Lesson study works by identifying ‘learning challenges’, i.e. specific areas of a curriculum students struggle with, and then collaboratively discussing and planning enhanced and/or new lessons with the specific aim of understanding the nature of the challenge and overcoming it to aid students’ learning. In a face-to-face context the process might take the form shown in the diagram below, discussed in an earlier post.


Is it possible to develop a variant of this approach for use with distance learning? We can replicate the identification of the learning challenge based on past experience and past submitted assignments. One example is the continued challenge of helping students understand the concepts of ontology > epistemology > paradigms in research methods/literacy modules. Having identified the learning challenge, it is then possible to collaboratively create a set of activities to be completed online. Hence, the focusing and planning elements of lesson study remain the same for DL as they do for face-to-face applications. Where the main variation would occur is in the observation of learning. In our work on lesson study we advocate the observation of case students during a session, but accept that the insights gained are partial and incomplete. This is why we routinely collect artefacts from students’ learning and carry out stimulated recall interviewing, as these give different, and often deeper, levels of insight into the learning process. For lesson study to work in a DL context, data capture is the area we would need to rethink. The following is suggested as a possible way forward:

  1. The activities developed would require some form of process capture. This might be notes, concept mapping, the development of an artefact, such as a questionnaire, a mixture of these, or any other relevant outcomes.
  2. The students would complete the activities, but would then be asked to capture how they had completed them through some form of self-explanation. The easiest way of achieving this would be to use screen capture software such as Screencast-O-Matic (http://screencast-o-matic.com/home). Students would be given a series of prompts through which they would explain the process they had undertaken to reach the outcomes in their work. We would ask them to send both a copy of the work and their video for us to analyse, and we would then carry out short stimulated recall interviews to supplement our understanding of their experience and learning. Towards the end of these interviews we could also include some evaluative elements so as to consider further task design development in a wider sense.
  3. Having gained all of the evidence, we would then evaluate the activities as we would normally do for a lesson study cycle.

This DL variant would be a relatively simple framework to develop and test, but would potentially give us a huge amount of data on the ways in which different students interact with materials and how this helps or hinders their learning. As a consequence, we would not only begin to develop specific elements of the courses we are involved in but a more global set of ideas, principles and task designs might begin to emerge from insights and data gained.


What do we mean by pedagogy? (Part 4) Thinking about Curriculum 2

In my last post I suggested that the masters framework developed by the QAA for HE in Scotland (2013) offered a very useful basis for developing pedagogy. I also argued that curriculum needs to be more than a list of content, instead seeing the roles of emergence and process as crucial to the work of masters students and therefore in designing curricula. In developing this perspective on curriculum, how might a practical framework look?

Principles of curriculum design

This model is an attempt to capture the complexity and process orientation towards curriculum which is informed by the work of Knight (2001). It starts from a position of seeing knowledge as a central element of any curriculum. Knowledge provides the building blocks on which debate and argumentation are based, and is therefore crucial in constructing any curriculum. However, by itself it is not enough. Of equal importance is the structure which supports these building blocks – the explicit discussion of concepts. Threshold concepts (Meyer, Land and Baillie, 2010) have become a useful basis for developing the overarching framework for a course, and indeed for modules (whilst accepting that in any given module threshold concepts may, for many, remain liminal). At masters level there is every chance that students will move from a core area of knowledge to pursue and specialise in particular spheres within a module. The explicit use of threshold concepts allows this process to occur within a coherent, wider ‘field’ of study; whilst individuals may begin to investigate different subject areas and contexts, the concepts ensure a level of coherence and allow a common point of contact for discussion and engagement with the work of others. The use of an explicit conceptual framework also gives a general scheme for the process of learning to operate within. In this sense, the interplay of a conceptual schema (boundary settings) with individual investigation and growth (freedom) is in keeping with Davis and Sumara’s (2006) identification of factors necessary for emergence in their work on complexity theory.

If knowledge and concepts are engaged with alone however, then there is a deficit in the applied/practical use of the emerging learning. Hence, application is also important as this is where schema, a developing knowledge base and understanding are utilised and ‘tested’.

It is at the intersection of the three dimensions of knowledge, concepts and application that curriculum as process (Knight, 2001; Stenhouse, 1975) can be made real. Together, they give the possibility for emerging understanding (used here in the way I’ve interpreted Van Camp (2014), to emphasise the connection of ideas and knowledge in networks) and application based on engagement with knowledge, concepts and their use. It is in this emerging interpenetration (Byrne and Callaghan, 2014) of these systems that both new insights and new knowledge can emerge. But at this level, this is a personal journey for each student, with different contexts, interests and applications driving learning. Hence, curriculum as product (Stenhouse, 1975) makes little sense, as the possible outcomes are hugely diverse whilst still operating within a loose framework and from common starting points. (For an example of how this model of curriculum has been used in our work so far, see an earlier post and our research methods pedagogy website.)

As suggested in earlier posts, to understand and develop curricula where diversity and process are key, we need to have a clear understanding of the role of assessment in aiding the emergent process model, but as I’ll also reflect upon in future posts, interpenetration has major ramifications for the way we understand learning and teaching; to reiterate, to suggest that pedagogy can be the study of teaching alone makes little sense.


Byrne, D. & Callaghan, G. (2014) Complexity Theory and the Social Sciences: The state of the art. Abingdon: Routledge.

Davis, B. & Sumara, D. (2006) Complexity and Education: Inquiries into Learning, Teaching, and Research. New York: Routledge.

Knight, P.T. (2001) ‘Complexity and Curriculum: A process approach to curriculum-making.’ Teaching in Higher Education, 6(3), 369-381.

Meyer, J.H.F., Land, R. & Baillie, C. (eds) (2010) Threshold Concepts and Transformational Learning. Rotterdam: Sense Publishers.

Quality Assurance Agency for Higher Education (2013) What is mastersness? Discussion Paper. Retrieved from: http://www.enhancementthemes.ac.uk/docs/report/what-is-mastersness.pdf [Last accessed 5/7/15]

Stenhouse, L. (1975) An Introduction to Curriculum Research and Development. London: Heinemann.

Van Camp, W. (2014) ‘Explaining understanding (or understanding explanation).’ European Journal of Philosophy of Science, 4(1): 95-114.

Thinking About Action Research and Time

In researching pedagogy, one of the methodological approaches I favour is action research. In some quarters action research is out of favour. It is seen as being too ‘local’ and too ‘small scale’, and as having little to say in the rush towards finding pedagogic ‘solutions’, the bread and butter of the ‘what works’ movement. But perhaps one of the main perceived problems with action research is its potential for bias and a lack of structured evidence. These criticisms may be correct in some instances, but why? I wonder to what degree time, and the perception of time in education, might be responsible for this.

Action research was initially popularised on the back of the work of Kurt Lewin. As a result of his work, the often summarised ‘process’ of action research became identified by the diagram below:


This diagram can give those wanting a quick fix of research an oversimplified notion of what is involved in action research. It has the potential to be a very powerful methodology, but if taken at face value can lead to a seriously atrophied version of the process. Here, the cycle can be read as one focusing on a teacher, or teachers, reflecting on their practice, deciding what deficit might exist, planning for change and then enacting that change before reflecting on how successful they thought it was. But there is no explicit consideration here of the data capture which might be used, the degree of claim to be made at the far end of a cycle of research, or whether the focus chosen was based on personal bias or wider evidence. If it is taken as a personal or group-based reflective process, reflection, planning, acting and observing become a ‘quick’ sequence, giving the impression of moving forward at a rapid rate. This is the illusion of action research as ‘rapid innovation’. It also panders to the current vogue in education for making rapid shifts, showing accelerated change with equally certain proclamations of success. I would argue, however, that true transformation and change is paradoxically a slow, measured process, and one which is also to a great extent contextualised.

I think action research has a huge potential utility in bringing positive change and for acting as a basis for informed discussion of pedagogic practice and change. However, to act as a useful and nuanced tool it is important that action research is approached in the same way as any other research methodology – with care and time. All too often it is seen as a ‘soft’ and ‘easy’ option, something that can be utilised as long as Lewin’s cycle above is followed at face value. I think one useful step we could take in moving action research forward is to abandon Lewin’s cycle within popular accounts and discussion, and replace it with the cycle developed by Andy Townsend (2010). The diagram below is a summary graphic of my interpretation of his framework:


This cycle begins with a consideration of an area for work, with further discussion to refine that idea into one that can become the focus of a piece of research. Importantly, a reconnaissance stage is included. This stage is intended to explore the chosen issue further: do others see the same problem as the person or people conducting the research? Does some form of baseline data help characterise the issue in the context in which it is being explored? In discussion with some Chinese ELT tutors, we even discussed the idea of looking at larger-scale quantitative data from within and beyond the organisation involved. This stage helps us to begin to gain a more in-depth and critical understanding of the context we wish to explore and intervene in – but it takes time. Having reflected on the initial focus in relation to this reconnaissance data, we can develop a more focused and meaningful action, or, if we find that our initial ideas were misplaced, we might go back to redefining an initial focus, starting the process over. The cycle then involves an intervention, and from this moves to a reflection on and evaluation of the change involved. The evaluation is important as it emphasises the need to include, in the planning for action phase, a coherent and meaningful framework for data collection. A coherent data collection framework in turn requires a coherent data analysis/interpretation framework, from which reflection and evaluation gain more critical meaning and offer better-founded insights for future work.

This alternative way of understanding action research makes the need for a deeper and more critical approach much more explicit. But the essential feature for me is the need for a greater amount of time. It is reflective throughout, critical and considered. It makes explicit the need for a data collection framework which extends well beyond ‘reflecting on practice’, and for a proper consideration of data interrogation. This model of action research is a slower process; in recently submitted projects as part of a PGCert in action research, students only completed one or two cycles of action research over the course of an academic year. However, the insights they gained were based explicitly on the data they had collected, and also recognised the contextual and nuanced messages their research could offer. By making the complexity of action research more explicit, and by repositioning it as a slower, data-based process, the insights we gain may actually lead to more rapid and meaningful change and innovation.


Townsend, A. (2010) ‘Action Research.’ In: Hartas, D. (ed.) Educational Research and Inquiry: Qualitative and Quantitative Approaches. Continuum, 131-145.

Also, read the following as a great introduction to action research:

Townsend, A. (2013) Action Research: The Challenges of Understanding and Changing Practice. Maidenhead: Open University Press.

What do we mean by pedagogy? thinking through some conceptual frameworks. (Part 1)

‘Pedagogy: the methods and practice of teaching, especially as an academic subject or theoretical concept.’

Over the past year I’ve been involved in a research project developing masters level research methods provision with a colleague in the School of Education. Almost by default we started to refer to what we have been researching and evolving as ‘research methods pedagogies’. Informally, this became a convenient way to refer to our work. This week we ran a one day workshop, trying to open up a space-time for interested staff to reflect on their own approach to research methods and consider how they might take their own practice forward.

Towards the end of the day we asked the group to reflect on some of the ideas we had been discussing. One question we offered for consideration was ‘Is there a separate research methods pedagogy?’. This led to a lot of discussion and exposed some of the complexities concerning what we might mean by pedagogy, and how it, as a concept, fits within wider educational discussion. Following the definition at the start of this post, if pedagogy is specifically the method and practice of teaching, it might be argued that it is a largely generic activity; the context or content of the teaching might change, be it research methods, leadership, policy or inclusion, but the underlying pedagogic approach remains overwhelmingly unchanged. This characterises pedagogy as the consequence of a set of universal principles, such as the utilisation of a set of cognitive principles (working memory, etc.). It opens up the potential for a reductive philosophy based on the principle that we can find the ‘most efficient process/pattern’ for embedding information and knowledge into students, one which applies across all aspects and contexts of teaching. From this position the idea of a ‘research methods pedagogy’ is highly problematic, as the approach taken is not unique: the pedagogy is probably similar, if not identical, to that used in any other teaching context by any individual teacher or teachers. Any difference which does occur is due to other elements of a module, such as assessment or curriculum; the pedagogy is a constant.

A slightly different argument is that of Shulman’s (2005) ‘signature pedagogies’. Here, there might be commonalities between clusters of disciplines, but at the same time disciplines have particular teaching approaches which are distinct to them and which are responsible in part for the development of habits of mind within each discipline. This sees pedagogy as a hybrid, part particular to a field, part more generic. But where does research methods pedagogy fit within this scheme? Is it a generic pedagogy, similar across clusters of disciplines, or are research methods frameworks actually particular to disciplines? Does it matter? In reply to the first question, does the fact that education is an interdisciplinary field as opposed to a discipline have an impact on how we define and understand research methods pedagogy?

So pedagogy can be seen as a process focused wholly on the act of teaching, underpinned by a series of ‘universals’ relating to learning, or it can be seen as at least partially contextual at the disciplinary level. However, there is another, wholly different perspective we can develop. If we go back to the initial definition of pedagogy as the method and practice of teaching, what is the true utility of such a definition if we restrict ourselves to it? The idea of pedagogy as teaching method, seen in isolation and taken in the context of a masters module on research methods, becomes ultimately meaningless. If teaching is seen as a system, composed of a series of elements and relationships, it can only become useful when considered as interpenetrating with other systems. Cilliers (2001: 143) states that:

‘The cross-communications between hierarchies are not accidental, but part of the adaptability of the system.’

If pedagogy is synonymous with teaching, what are the systems into which it interpenetrates? Teaching is senseless without interpenetration to learning, curriculum and assessment. These are the main systems which together go to make up the seminar-centred process of helping students to gain new conceptual understanding, knowledge and skills. And in each case, they are themselves complex systems rather than unitary features. But if we see these systems as interpenetrating and accept that they need to be treated as such, what is the overarching term for their interplay? One doesn’t exist, but I suggest that we could use the term ‘pedagogy’ as a way of identifying this interpenetrating series of systems, i.e. pedagogy as composed of:


However, in turn, none of these systems make sense without the agency and expertise of teachers and students who make sense of the interpenetration of the systems. Hence, pedagogy might be shown diagrammatically as:


Pedagogy can be defined as the interpenetration of curriculum, teaching, learning and assessment, whose contextual emergence depends on the agency of teachers and students. This then suggests that a research methods pedagogy does exist, but also that even the notion of signature pedagogies at the disciplinary level is too ‘coarse-grained’ and generic for understanding practice and its change. Instead, characteristics, concepts and perspectives relating to each of the systems – such as curriculum design and aims, the cognitive dimensions of learning, or the formative process in assessment – all exist at a general level. But the character and process of interpenetration which emerges in time-space through the agency of teachers and students leads to many and varied pedagogies which are at the same time based upon more general principles and concepts, resulting in contingent and emergent pedagogic approaches. In a future post I’ll discuss how such a perspective on pedagogy might help in researching and gaining some understanding of it.


Cilliers, P. (2001) ‘Boundaries, Hierarchies and Networks in Complex Systems.’ International Journal of Innovation Management, 5(2), 135-147.

Shulman, L.S. (2005) ‘Signature pedagogies in the professions.’ Daedalus, 134(3), 52-59.

What makes good research? Some reflections

Some ideas concerning the features of good research in education developed through dialogue with colleagues and international masters students:

  • a focus on a definable issue or problem. Research needs to be focused and have a clear area for exploration. If it is too broad it becomes unwieldy, making it difficult to collect meaningful data. In attempting to develop a coherent focus for research, the use of research questions is extremely important;
  • the need for an ethical approach. All research in education should be developed with an explicit understanding that it should be an ethical process. The vast majority of research in this field includes human participants in some way. Our research should always protect the well-being and dignity of both participants and researchers. This is often the stated purpose of research ethics, covering the ‘legal’ aspects which are typically the focus of review panels. However, we also stress that ethical research should focus on the need for honest and transparent reporting so that the work completed can be read critically and fairly by peers. This includes the reporting of research approaches, any conflicts of interest and the context of the research. It also requires that when we rely on the work of others we reference them fully, so that they are given due recognition for their work;
  • gives a clear outline of the context of the research. The process of education is highly complex. Therefore, when writing about research it is always important to give readers a clear context (albeit anonymised) for the work. If a small-scale study is completed with a class of 12 and 13 year olds in an inner-city school composed predominantly of more able students, then it is important that the reader has this information so that they can understand the context of the research data gained. This also allows the reader to consider the degree of relevance of the research to their own situation. It is a central part of honest and transparent educational reporting and debate;
  • making use of research literature to inform the research design. The vast majority of research builds on work already done. It is important to begin to gain an understanding of the research which has been published previously in an area of interest. We need to be good at reading and assessing research so that we can judge the degree of evidence on which we might build our own work;
  • gives a clear outline/discussion of the methodology and methods which have been used to collect data. Ethical research should make the methodology and methods used to collect data transparent. Readers need to know how our research has been carried out, as this is crucial to being able to interpret data, and therefore to engage critically with any claims which are made. Decisions concerning preferred methodologies give an insight into the way the research is positioned and the nature of claims made. An account of the data collection tools (methods) used is equally important for the same reasons. If a study has used interviews, are the questions reported so that we can judge the level of neutrality? Where observations are used, is the focus and method of data capture explained? If these issues are not thought through and reported, then a considered, critical reading of the research cannot be achieved. Where research occurs at a meta-level, through the use of literature reviews for example, it should include a methodology outlining search criteria, filtering processes and how publications have been analysed. If a literature review merely presents an area of research with no methodology, it needs to be read with caution, as we have no way of assessing its validity;
  • uses appropriate methods which clearly link back to the initial issues/problems and research questions. Well-conceived research will make clear where particular methods help in investigating the chosen issues/research questions; this gives the research coherence;
  • analyses collected data in a transparent way. In the same way as it is important to carefully consider the reporting of methodology and methods, so it is the case with analysing the data which has been collected. Analysis is often not considered to the same level of detail as methodology and data collection, but it is crucial in ensuring a reasoned and valid consideration of the data, particularly in trying to minimise biases and selective use of data. To make the process transparent it is again important to report how data has been analysed;
  • develops explanations and discussion derived from the data. Good research develops a clear discussion of the data which has been collected. This is at the centre of reporting research, as it is where the interpretation of the project is developed. It is crucial that explanations emerge from the data provided and are not dissonant with the evidence presented. In addition, the discussion of the data should be related to the literature which has been engaged with and which forms the foundation upon which the research study rests;
  • offers measured insights/conclusions. Finally, good research is measured in the claims it makes. Small-scale research cannot easily make claims which generalise to a larger scale; in other words, an analysis of one cycle of action research focusing on improving questioning practices in one class cannot act as the basis for national policy. However, small-scale research can still provide extremely important insights for further study and for practitioners by indicating where good practice might be found. Where conclusions include polemic and assertive language, this can often be the first sign that we need to explore the study and its messages further. Many large-scale research projects rely on quantitative analyses. Insights are often based on statistical manipulations and offer a great deal of useful exploration of patterns and trends. However, in-depth explanations are sometimes more problematic, as this type of research is often much stronger in answering the ‘what’ rather than the ‘why’. All research has potential shortcomings, as no approach is perfect or has all of the answers to an area of interest. Often, deep insights occur through the long-term application of a number of both qualitative and quantitative approaches, used to augment understanding and giving progressively fuller and more critical perspectives on the issue of interest.

Book Review: An Introduction to Qualitative Research Synthesis

Major, C.H. & Savin-Baden, M. (2010) An Introduction to Qualitative Research Synthesis: Managing the Information Explosion in Social Science Research. Abingdon: Routledge.

Over recent years there has been a valorisation of large-scale, quantitative research in some quarters. In some ways this is no surprise, as there is a ready appeal in seeing ‘generalised’ patterns in data which can then be used for decision-making and policy formation. However, in this scramble for ‘big data’ there has also been criticism of qualitative research as ‘anecdotal’, too focused on the particular, and therefore of little use when it comes to decision-making and policy generation; this has unfortunately also led to a shift in educational research funding which often appears to follow this logic. This book instead focuses on qualitative research and provides a very well-argued case for the synthesis of qualitative studies as an additional route to providing insights for practitioners and policy-makers.

The book has three parts, the first two of which (‘Arguing for qualitative research synthesis’ and ‘Doing qualitative research synthesis’) outline and discuss the approach, whilst the third offers examples and frameworks for carrying out qualitative research syntheses (QRS). The first section includes a very clear argument for the use of QRS as an approach for combining and interpreting qualitative research studies. A refreshing element of the first chapter is a clear engagement with the possible problems and restrictions of QRS under the title ‘Top ten criticisms of the approach, point and counterpoint’. This helps develop an honest debate about the potential limitations of the approach whilst making transparent its philosophical and methodological foundations. This critical voice is retained throughout the book and, I think, provides an excellent example to those new to educational research of how to build arguments whilst being transparent about both approach and possible restrictions and problems. As the authors state at one point in the book, no methodology is perfect, and honest discussion of the limitations and problems which occur as research is undertaken is important. The second chapter in this first section then goes on to locate QRS within the wider field of research syntheses, discussing how it is linked to, but different from, traditional literature reviews and structured reviews such as those developed by EPPI.

The second section goes on to outline the stages in carrying out a QRS, from the development of a question around which the synthesis is structured, through designing and completing a search, and analysing, synthesising and interpreting the data, to presenting the outcomes. The overall explanations are very clear and, with the examples provided in section three, there is a good overall explanation of the approach. Importantly, QRS is not outlined in the form of a ‘recipe’; as the process involves a great deal of reflexivity, it is only possible to outline principles and general structures rather than give a step-by-step ‘how to’ guide. I particularly liked the sections on establishing plausibility, including validity and trustworthiness, which stress the need for qualitative research to be clear in developing contextual information as well as clear explanations of methodology, data collection and approaches to analysis, so that the coherence and quality of evidence can be transparently assessed. One table provided to highlight this need, given on page 61 and based on a template from the Joanna Briggs Institute, could act as a more general starting point for qualitative researchers as they develop reports of their research.

A rating schedule to establish rigour in findings

  • Unequivocal – findings supported with clear and compelling evidence
  • Credible – findings that are plausible given the weight of evidence
  • Unsupported – findings that are suggested but not supported by data

Reading this section, I was struck by the degree to which it would act as a useful starting point for discussion of quality in research writing, not only in QRS but in qualitative research writing more generally.

Section three goes on to provide some interesting and useful examples of QRS studies which can be considered alongside the earlier sections to see how the process can be understood through the final product.

This book is a very useful introduction to QRS and offers both a critical and clear overview of how qualitative research reports can be synthesised and interpreted to provide broader insights into educational problems and issues. In my opinion, it sits well alongside Pope et al.’s (2007) book focusing on the synthesis of qualitative and quantitative evidence in health. The approaches are different, a point which is stressed by Major and Savin-Baden, but together these two books offer critical approaches to bringing together evidence from across the research spectrum to offer new and interesting insights into educational issues.


Pope, C., Mays, N. & Popay, J. (2007) Synthesizing Qualitative and Quantitative Health Evidence: A Guide to Methods. Maidenhead: Open University Press.

Teaching Research Methods – some initial reflections

On March 20th, we finally finished teaching a research methods module which is a core element of our MA International Education (MAIE) course. As I outlined in this blog last autumn (here, here and here), I have been working with a colleague in the School of Education to develop a new approach to our research methods course.

Having finished the course, and having completed the data collection from our own parallel research project on the module, it feels like a good time to offer some initial reflections on our work. These are obviously initial perceptions; we still need to spend many months analysing the very rich dataset we have collected, so these reflections cannot be taken as a detailed and accurate account. However, several issues seem to have emerged across the module:

1) Thinking about threshold concepts. As we began to develop a curriculum framework, we discussed possible threshold concepts in research methods as a basis for instructional design. In a past post, I listed the threshold concepts identified by Kiley and Wisker (2010). They saw the threshold concepts relating to research methods as being:

  • argument
  • theory
  • frameworks
  • knowledge creation
  • analysis
  • paradigm

We started from this point but, through discussion, emphasised the following concepts as being both central to understanding research methods and as having the potential to be transformative. Consequently, our list of threshold concepts became:

  • criticality
  • theory
  • methodology
  • ethics
  • analysis
  • epistemology/ontology

In the event, we spent less time on theory as a concept than we had expected, but all of the other concepts became a major part of the course. In student interviews, criticality was seen as central to developing an ability to read research and, from this, to writing well-considered and careful texts. Methodology and analysis were also seen as important for assessing papers, as well as being central to a critical and deep understanding of how to carry out research. One student reflected that previously she had read the ‘start and end’ of papers to engage with the main messages; now she first engages with the ‘middle’ to assess the degree to which the research can be used or trusted. Ontology and epistemology were the most difficult concepts to tackle, and at the end of the module I would argue that some students are still in liminal space in this respect. Some students reflected that at undergraduate level the nature of reality and knowledge, as well as paradigms, were assumed and hence never discussed. As an interdisciplinary pursuit, education needs to engage in these debates, as researchers from many different traditions meet at this particular crossroads, bringing a level of philosophical complexity. Methodology, analysis and ethics were all equally important in aiding students to gain a deeper and holistic understanding on which to base their expanding knowledge and practical experience.

One additional concept which we had not included in our original list, but which I would be minded to include having completed the module, is that of ‘sampling’. Some students struggled with this concept, and yet a good understanding of it often acted as the basis for logical, well-considered and critical bridges between methodology and data collection tools. Where sampling was not well understood, this bridge was less secure, if present at all, and the logical explanation of research design tended to default to general description and a lack of criticality.

2) Importance of language. We have started to see the research methods module more and more as a language course. This is not only the result of developing a course which predominantly attracts international students, although this is obviously important. We have a number of English-speaking students, and yet they often commented on the difficulty of engaging with the language. Research methods language is conceptually rich and difficult; we are teaching this language and, regardless of student origin, we need to ensure that students understand both the language and the concepts underlying it.

3) Research methods as an applied activity. In our planning, we also developed a pedagogic model which sees conceptualisation, knowledge and application as equally important, and intertwined.

[Figure: understanding elements of learning for a master’s RM programme]

At the end of the module, I feel this is a very useful framework which has aided in developing a critical approach to the module. Conceptualisation is vital as a basis for constructing and developing knowledge. However, these ideas only began to really make sense for students when they actually enacted them. The application of research methods started from day one of the course and revolved around two practical exercises. Firstly, students worked in pairs to consider the characteristics of good interviewing before developing a set of group research questions based on a research problem given to them by us. From the research questions they discussed and agreed interview questions before splitting into pairs to complete their interviews. Once complete, the pairs then transcribed and coded their data. This process shadowed their work in face-to-face sessions, and therefore their emerging understanding of the module. The final exercise focused on comparing codes across the group to identify recurring themes as well as outliers.

Students then moved on to complete a module assignment which asked them to identify an area for research and develop research questions from it, before creating a research design which was then piloted. Following the pilot, students were asked to reflect on their experience and on how they would change their research design, as a prelude to developing their dissertation work.

This proved very challenging but also, according to some of the students, allowed them to consider how far their understanding of research methods had developed.

I am currently discussing the potential for a new master’s degree focusing on praxis-based approaches to education. Having developed our work on research methods, I fully intend to embed an emergent element of research methods across all modules of the programme, leading towards a specialist research methods module. Research methods need to be engaged with over a period of time and within different contexts to give a wide critical and experiential basis for discussion and theoretical understanding.

These are some of the basic reflections from the course but, as I said above, they are only initial and need to be considered in far more detail as we begin to engage with the very large amount of data we have collected from this course. In my next post, I will continue my reflections by considering the process of researching this module and the utility of viewing the learning environment as a complex adaptive system.