What do we mean by pedagogy? (Part 2) Conceptualising assessment.

As outlined in part one of this strand of posts, I argue that we should see pedagogy as an overarching concept which positions teaching, curriculum, learning and assessment as interpenetrating systems which create a complex whole, making little sense when treated separately. Here, I want to consider how assessment might relate to other elements of pedagogy in master’s level study, particularly in education courses.

The explicit link between learning and assessment has become an increasingly accepted element of pedagogy at HE level; there is even a journal given over to the role of assessment at this level. Assessment, related to both teaching and learning, should begin at the start of a module, using diagnostic assessment to gauge and understand the complexities of prior learning within a group of students. How this is achieved is not the focus of this post, although there are a number of ways in which it can be done. The diagnosis of prior learning is crucial, as the insights gained should feed into both curriculum and teaching development; this is particularly challenging when working with widely diverse groups of students. By doing this, pedagogy becomes to a degree emergent, developing in response to student need rather than as prescribed by tutor assumption and preference.

Beyond initial diagnostic assessment is formative assessment, a vehicle for helping students take their learning forward, particularly through targeted and well-considered feedback/feed forward. The use of formative assessment is well embedded in higher education practices, but can it be taken further? Dann (2002), working in the primary education phase, takes the view that formative assessment does not go far enough in redefining the link between learning and assessment. She argues that assessment should be fully embedded within learning, describing this as Assessment as Learning. She sees this as a natural development of formative assessment, consisting of:

– assessment whilst teaching, leading to the directing and modification of that teaching;

– assessment by teaching, derived from an interpretation of Vygotsky’s zone of proximal development, where once the task has been set, the teacher gauges the amount and type of help required to ensure success and modifies teaching to give the greatest chance of this occurring.

This outline begins to blur the boundary between what counts as teaching and what counts as assessment. Indeed, this view of assessment in many ways suggests that the main focus is actually the process of teaching, with informal, minute-to-minute reflection and assessment informing the emergent direction and structure of the session itself. Hence, the interface between assessment and learning becomes dynamic, complex and collaborative; student involvement in assessment becomes a feature of learning.

The suggestion from the outline above is that there should be a clear synergy between teaching, the conceptualisation of learning and feedback discourses. Askew and Lodge (2000) link the role and approach of the teacher, the associated conceptualisation of learning, and the resultant tone of feedback discourse. To develop a feedback system in which feedback is largely informal and becomes an embedded element of learning, with a constant, dialogic and iterative exchange to aid development and progress, a movement towards the Co-constructive end of the spectrum is a natural trend. In contrast, at the Receptive-transmission end of the spectrum, staged, written input can become necessary to make the 'gift' of tutor knowledge worthwhile; this may translate into a seminar room dynamic where grades play a major role in feedback with little associated feed forward, and students are told what the correct answers should be, with the degree of success demonstrated mainly through a summative indicator. The Receptive-transmission model therefore becomes more closely aligned with students playing only a passive role in extending their own learning.

Assessment

Master's level study is, in part, characterised by a drive towards increased independence and an ability to play a part in a community of inquiry. Given this intention, a Receptive-transmission approach seems out of place, and with it, more traditional forms of assessment. Instead, a more critical and appropriate approach is a co-constructive one. But for this to be coherent there needs to be a synergy between teaching, views of learning and feedback/feed forward processes in both diagnostic and formative assessment.

References

Askew, S. and Lodge, C. (2000) ‘Gifts, ping-pong and loops – linking feedback and learning’ in S. Askew (ed.) Feedback for Learning, London: Routledge-Falmer.

Dann, R. (2002) Promoting Assessment as Learning: Improving the Learning Process, London: Routledge.

What do we mean by pedagogy? Thinking through some conceptual frameworks. (Part 1)

‘Pedagogy: the methods and practice of teaching, especially as an academic subject or theoretical concept.’

Over the past year I’ve been involved in a research project developing masters level research methods provision with a colleague in the School of Education. Almost by default we started to refer to what we have been researching and evolving as ‘research methods pedagogies’. Informally, this became a convenient way to refer to our work. This week we ran a one day workshop, trying to open up a space-time for interested staff to reflect on their own approach to research methods and consider how they might take their own practice forward.

Towards the end of the day we asked the group to reflect on some of the ideas we had been discussing. One question we offered for consideration was 'Is there a separate research methods pedagogy?'. This led to a lot of discussion and exposed some of the complexities concerning what we might mean by pedagogy, and how it, as a concept, fits within wider educational discussion. Following the definition at the start of this post, if pedagogy is specifically the method and practice of teaching, it might be argued that it is an essentially generic activity; the context or content of the teaching might change, be it research methods, leadership, policy or inclusion, but the underlying pedagogic approach remains overwhelmingly unchanged. This characterises pedagogy as the consequence of a set of universal principles, such as the application of cognitive principles (working memory, for example). It opens up the potential for a reductive philosophy based on the premise that we can find the 'most efficient process/pattern' for embedding information and knowledge in students, one which applies across all aspects and contexts of teaching. From this position the idea of a 'research methods pedagogy' is highly problematic: there is no unique approach being taken, as the pedagogy is probably similar, if not identical, to that used in any other context by any individual teacher or teachers. Any difference which does occur is due to other elements of a module, such as assessment or curriculum; the pedagogy is a constant.

A slightly different argument is that of Shulman's (2005) 'signature pedagogies'. Here, there might be commonalities between clusters of disciplines, but at the same time disciplines have particular teaching approaches which are distinct to them and which are responsible in part for the development of habits of mind within each discipline. This sees pedagogy as a hybrid: part particular to a field, part more generic. But where does research methods pedagogy fit within this scheme? Is it a generic pedagogy, similar across clusters of disciplines, or are research methods frameworks actually particular to disciplines? Does it matter? And in relation to the first question, does the fact that education is an interdisciplinary field, as opposed to a discipline, have an impact on how we define and understand research methods pedagogy?

So pedagogy can be seen as a process focused wholly on the act of teaching, underpinned by a series of 'universals' relating to learning, or it can be seen as at least partially contextual at the disciplinary level. However, there is another, wholly different perspective we can develop. If we go back to the initial definition of pedagogy as the method and practice of teaching, what is the true utility of such a statement if we restrict ourselves to it? If the idea of pedagogy as teaching method is seen in isolation, taken in the context of a master's module on research methods, it becomes ultimately meaningless. If teaching is seen as a system, composed of a series of elements and relationships, it only becomes useful when considered as interpenetrating with other systems. Cilliers (2001: 143) states that:

‘The cross-communications between hierarchies are not accidental, but part of the adaptability of the system.’

If pedagogy is synonymous with teaching, what are the systems with which it interpenetrates? Teaching is senseless without interpenetration with learning, curriculum and assessment. These are the main systems which together make up the seminar-centred process of helping students to gain new conceptual understanding, knowledge and skills. And in each case, they are themselves complex systems rather than unitary features. But if we see these systems as interpenetrating, and accept that they need to be treated as such, what is the overarching term for their interplay? One doesn't exist, but I suggest that we could use the term 'pedagogy' to identify this interpenetrating series of systems, i.e. pedagogy as composed of:

[Figure: pedagogy as the interpenetration of the systems of teaching, curriculum, learning and assessment]

However, in turn, none of these systems make sense without the agency and expertise of teachers and students who make sense of the interpenetration of the systems. Hence, pedagogy might be shown diagrammatically as:

[Figure: the interpenetrating systems of teaching, curriculum, learning and assessment, situated within the agency of teachers and students]

Pedagogy can be defined as the interpenetration of curriculum, teaching, learning and assessment, whose contextual emergence depends on the agency of teachers and students. This suggests that a research methods pedagogy does exist, but also that even the notion of signature pedagogies at the disciplinary level is too 'coarse-grained' and generic if we are trying to understand practice and its change. Instead, characteristics, concepts and perspectives relating to each of the systems, such as curriculum design and aims, the cognitive dimensions of learning, or the formative process in assessment, exist at a general level. But the character and process of interpenetration which emerges in time-space, due to the agency of teachers and students, leads to many and varied pedagogies which are at the same time based upon more general principles and concepts: contingent and emergent pedagogic approaches. In a future post I'll discuss how such a perspective on pedagogy might help in researching and gaining some understanding of it.

References

Cilliers, P. (2001) ‘Boundaries, Hierarchies and Networks in Complex Systems.’ International Journal of Innovation Management, 5(2), 135-147.

Shulman, L.S. (2005) ‘Signature pedagogies in the professions.’ Daedalus, 134(3), 52-59.

Thinking through lesson study for task design and learning insights

Today was one of those days where being able to slow down and reflect can lead to some new insights and ideas for future change. On Friday I'm running a day-long workshop together with Joan, my partner-in-crime on our MA International Education research methods module, considering issues around research methods pedagogy. We've spent a great day putting together what we hope will be a reflective consideration of our research into this area, and as with all such events we'll spend some time discussing possible ways forward.

Over the past year, we've gained a number of ideas and insights from the use of Lesson Study. We've found this approach invaluable as part of a much wider action research approach exploring the module development we've undertaken, and have no doubt that it has been a worthwhile activity in opening up new thinking and understanding concerning some (but nowhere near all) of the pedagogic changes we've made. As we started to think about some of what we've learned this year, and how we might gain further insights next year, we began to consider the difficulty some students have in understanding abstract concepts such as 'ontology', particularly those whose own languages do not have a similar notion/concept. At this point, we started to think about how we can gain a better understanding of student learning of this concept, and how we can develop better approaches to support deeper conceptual understanding. Normally, in Lesson Study the unit of analysis is the lesson, focusing on a learning challenge across that period. But why can't the unit of analysis be much shorter, a single task?

With this in mind we’re beginning to think about the idea of using a modified version of lesson study, possibly even bringing in elements of learning study (as variation theory and phenomenographic perspectives might be useful) to develop a deep planning approach, with observation and stimulated recall interviewing in an attempt to open up the process of thought and learning going on in that single task. The process might look something like this:

1. Begin by interviewing three students from different educational/cultural backgrounds to establish their implicit ‘ontologies’ and also the degree to which they have been exposed to/understand the concept.

2. Work in a team of three including the two of us who run the module and a colleague from the English Language Teaching Unit who is an expert in English for Academic Purposes (EAP). We would develop a single task designed to open up student thinking and activity on ontology, considering initial reflections from students as a starting point and combining this with academic understanding of the concept.

3. Complete observation of the task, with the observers in close proximity to the three students, noting observable activity, recording any dialogue and videoing with VEO to time-stamp evidence. As soon as the activity is finished, the observers would conduct short (10-15 minute) stimulated recall interviews with the students to discuss how they approached the task, what they were thinking and how they now understand the concept of ontology.

4. Two days after the session, we would ask the students to complete a second interview, asking them what they understand by the term 'ontology' and how they would use it in explaining their wider understanding of research methods. This is to see which elements of the change in their schemas, if any, persist over the longer term.

Discussing this with students both immediately and after a gap serves two purposes:

– the immediate discussion helps us understand the student's interaction with the task and how they make sense of it. Does it help them begin to gain a better understanding? How well are they able to navigate the task? Which elements of ontology as a concept are they interacting with? How are they beginning to relate these elements? As a consequence, how well has the task worked in opening up the concept to analysis, interpretation and synthesis? Are there other 'external' factors at play, such as experience or prior learning?

– the longer-term interview allows us to begin to understand which elements have remained as a residual effect, but also what intervening processes might be responsible. Has the student simply remembered some aspects? Have they gone on to read more about the concept? Have they tried to utilise the concept in follow-up work? In other words, what is the complex interplay of factors and experiences the students have (or have not) drawn on in stabilising and/or augmenting their understanding and use of the concept of ontology?

Beyond the second interview the team can then come back to evaluate what they have learned from both the task and the aftermath as a way of beginning to consider wider, longer-term interventions and structures for helping stabilise and augment conceptual understanding further.

If this gives us useful insights, we might then begin to think about task design for relating concepts!!