Towards a research design
In the last two posts on developing learning within a research methods module (here and here) we set out our view of learning, the assumptions on which the work rests, and the conceptualisation of learning and curriculum as a practical outcome. This post turns to the research methods we intend to use to help us understand how the module, learning, pedagogy and assessment work in reality.
As suggested previously, in developing a research design some conceptual assumptions have to be made. These have been, in part, the focus of the previous two posts. However, one assumption is worth developing a little further here, as it is central to the resultant research design. Given the number of spaces, pedagogic approaches, and the breadth and depth of content and conceptualisation of research methods to be taught/learned, not to mention the diverse cultural, educational and disciplinary backgrounds of the students involved, we argue that this module acts as a complex adaptive system (CAS). Such systems have a number of features, the most important of which are discussed by Cilliers (1998), who identifies such systems as characterised by:
- a large number of elements with many interactions;
- interactions which are non-linear, i.e. large-scale causes can have small-scale impacts and vice versa;
- interactions which lead to feedback loops, both negative and positive;
- an ‘open’ system, having interactions with elements in external environments beyond the immediate system;
- elements which interact with their environment making the identification of boundaries difficult;
- a system which is far from equilibrium and therefore needs a constant energy flow for it to operate;
- the importance of history, past processes playing a role in forming the present, often unpredictably;
- each element only acting on local information rather than information from the whole system.
Cilliers (1998: 13) goes on to argue that such systems are so complex that any total representation of them would have to be as large as the system itself – a practical impossibility:
‘In building representations of open systems, we are forced to leave things out, and since the effects of these omissions are nonlinear, we cannot predict their magnitude.’
Consequently, Richardson et al (2007) refer to CASs as ‘incompressible’, and argue that to understand them, at least in part, we need to use different perspectives to build ever richer, if incomplete, models of the system we are interested in. To some, this might be an excuse not to bother: why research something we cannot understand in its entirety? For others, there is the temptation to use experimental approaches which isolate single variables and assume they operate in the same way in a complex context. In both cases, however, the complexity of the system is lost, and many experimental approaches assume that interactive processes have little, or no, impact. This is a huge assumption to make. Alternatively, Richardson and Tait (2010: 92-93) make the case that,
‘Just because a complex system is incompressible it does not follow that there are (incomplete) representations of the system that cannot be useful – otherwise how would we have knowledge of anything, however limited? Incompressibility is not an excuse for not bothering.’
To research learning, pedagogy, curriculum and assessment in natural settings is to triangulate and extrapolate from a number of ‘glimpses’, which whilst not perfect, are not an excuse for not bothering!
The following methods will be used as a way of capturing different perspectives of the ‘research methods pedagogic system’. Some of them are also summarised in the course diagram which appeared at the end of post 2 and is reproduced here (Figure 1).
Figure 1 Research methods module with data capture tools.
N.B. It is important to state that all data capture will be completed on the basis of ethical approval and student consent. Where individuals do not give active consent, data will not be analysed, or captured beyond normal group activity.
1) Baseline data: This will focus on prior learning about research methods, and on expectations and pre-existing knowledge and conceptualisation. The data here will be captured through three channels:
i) student applications: baseline data concerning degree disciplines and exposure to research methods training
ii) baseline questionnaire: focusing on demographic data and prior learning as well as expectations of the course.
iii) semi-structured interviews: using purposive sampling to gain a range of views on past experience of research methods. We will also consider student views on what they believe a research methods course is likely to include.
2) Lesson Study: Used with four sessions throughout the course (marked LS1, LS2, LS3 and LS4). This will allow us to consider in detail what we are attempting to develop in terms of concepts, knowledge and application. It will also reflect back to us our assumptions and viewpoints in terms of planning, execution and evaluation of learning in the module. We will predict student approaches to learning in areas of the subject matter where they find it difficult to gain proficiency (e.g. epistemology, research design, etc.). Used in conjunction with other data capture techniques, it will also give us useful information on the process of learning itself, particularly in relation to the social learning dimension (Illeris, 2007). To augment and triangulate observation insights we will carry out stimulated recall interviews (Gass and Mackey, 2000) with the students who have been observed, and we will make copies of any notes they have made. Both of these will act as useful, if imperfect, sources of data on the cognitive dimension of learning.
3) Staged capture of experiences: At three points over the course of the module, we will run a questionnaire focusing on the concepts, content and application of research methods within the course and student confidence in each of these areas. This will then be expanded on with the use of further, more general stimulated recall interviews. We will also ask those who are interviewed if they will allow us to copy their course notes (both formal and informal).
4) Participatory focus groups: Given we are interested in developing a better course of study, and given our work within a complex adaptive system, we intend to hold three participatory focus groups at the same time as the ‘staged capture of experiences’. These will be in the form of discussions to give the students an opportunity to help shape the form and detail of the module as it unfolds (in a similar form to that used by Wood and Butt, 2014), so that their insights become part of the ongoing planning process.
5) Documentary evidence: The collection of assignment work, our planning and resources documents and eventually, copies of dissertations will all be useful in comparing student outcomes against other data channels.
6) Self-explanation videos: Self-explanation theory (for example see Renkl, 2002; Kuhn & Katz, 2009; Chamberland et al, 2013) rests on the notion that asking students to explain to themselves how they are undertaking a task as they do it enhances their understanding and learning. The evidence for the utility of this approach to learning is uncertain, with mixed results. In this research design we intend to shift the focus of its use so that it acts as a reflective tool rather than as a process used in the action of learning itself. Towards the end of each area of study, we will ask students to produce some form of short summary, be it a series of images, a list or a concept map. Once they have completed these on their laptops, we will ask them to open Screencast-O-Matic (http://www.screencast-o-matic.com/) and record an audio narrative over screen capture (maximum of 5 minutes), explaining to themselves what they have learned, how it links across to other facets of the module, and what they believe they still need to work on. Once their narratives are complete they will be asked to save them for their own use, and also to send a copy to us. This will give us the opportunity to understand some of their conceptualisation and understanding, as well as any incomplete or emerging ideas and misconceptions. Over the whole module it will also allow us to track individual learning trajectories, particularly if used in conjunction with interviewing and work scrutiny.
The data capture approaches described above will result in a very large volume of data which will take a considerable amount of time and effort to analyse and cross-reference – what we are currently referring to as a ‘thick description mixed methods’ approach. However, it will give us a fine-grained, if incomplete, perspective on the interplay of learning, pedagogy, curriculum and assessment. By capturing a number of different perspectives it will still only yield an incomplete model of the research methods module system, but one which may nevertheless give us a number of useful and important insights.
Chamberland, M. et al (2013) ‘Students’ self-explanations while solving unfamiliar cases: the role of biomedical knowledge.’ Medical Education, 47, 1109-1116.
Cilliers, P. (1998) Complexity and Postmodernism: understanding complex systems. London: Routledge.
Gass, S.M. & Mackey, A. (2000) Stimulated Recall Methodology in Second Language Research. Abingdon: Routledge.
Illeris, K. (2007) How We Learn: Learning and non-learning in school and beyond. Abingdon: Routledge.
Kuhn, D. & Katz, J. (2009) ‘Are self-explanations always beneficial?’ Journal of Experimental Child Psychology, 103, 386-394.
Renkl, A. (2002) ‘Worked-out examples: instructional explanations support learning by self-explanations.’ Learning and Instruction, 12, 529-556.
Richardson, K.A. & Tait, A. (2010) ‘The Death of the Expert?’ E:CO, 12(2), 87-97.
Wood, P. & Butt, G. (2014) ‘Exploring the use of complexity theory and action research as frameworks for curriculum change.’ Journal of Curriculum Studies, http://www.tandfonline.com/doi/full/10.1080/00220272.2014.921840#.U6xU47BwaUk