Fishwick, L. (2008) Giving Students a Voice: Creating Spaces for Assessment Dialogues, HEA Academy Conference, Harrogate.

Challenges of Higher Education and the Assessment Disease

As student numbers increase within Higher Education, new mechanisms have arisen to give students a voice amid claims of the faceless mass production of graduates. The National Student Survey is one such mechanism, and the students have indeed spoken; academics have begun to take notice. The groundswell of student dissatisfaction with assessment in general, and with feedback in particular, has fuelled a growing debate on the effectiveness of feedback. Student dissatisfaction in this area was predictable given that, as far back as 1994, Knight and Brown indicated that assessment defines what students regard as important. It is not unusual to hear colleagues bemoan this tunnel vision among students and hark back to a more “traditional” university education concerned with the process of scholarship rather than a product-oriented focus on assessment. This “blaming the victim” approach is also evident when academics point out that the effectiveness of assessment feedback is dependent on what the learner does with it. The implicit assumption is that students generally ignore feedback because they are only interested in the grade itself. There is further evidence in the literature that the majority of students in the UK do not understand the meaning of the marks and grades they receive, especially in terms of how to improve their work (Weeden et al., 2002; Weaver, 2006).

Some researchers have thrown down the gauntlet over the evidence on the effectiveness of feedback and, instead of blaming the students, have suggested that one of the current challenges in Higher Education is to develop much higher levels of assessment literacy amongst educational professionals (Broadfoot, 2008). Key to this challenge is examining the role of feedback in terms of its impact on student learning. A gap between theory and practice is evident: as Handley et al. (2007) note, the prominence of feedback in pedagogic theory is not matched in practice, as evidenced by the sorry tale told by the literature on student experiences of feedback. This paper reports on innovative practices in a specific degree route over the last two years aimed at creating spaces for dialogue between students and tutors focused on assessment and feedback. Opening up the dialogue about assessment does have inherent risks: tutors may struggle with the subsequent demands from the students. However, the findings of the current evaluation of practice suggest that engaging in assessment dialogues helps clarify many implicit expectations and also helps generate appropriate student learning activities.

Enhancing Student Engagement with Assessment Feedback

Case Study 1: Mock Take-Home Exam


The aim of the Sport on the Cultural Agenda module is for students to learn the skills of cultural analysis and to apply abstract social theory to understand sport and inform practice. The assessment is a take-home exam (handed out in Week 10 and submitted in Week 14). There are 30 students on the Level 6 module and the teaching delivery is a lecture/seminar format.


Students were given Week 7 as independent study, asked whether they wished to work individually or in pairs, and then asked to complete a mock take-home exam based on the Week 8 readings. The question format mirrored the actual exam, with Q1 focused on definitions, Q2 on theoretical analysis, and Q3 on application to practice in the form of preparing a presentation. The tutor set up a discussion board on the eLP over this two-week period.

Assessment Workshop

The feedback session was structured as follows:

  • Q1 was peer-reviewed using given definitions, with sport examples collated on the whiteboard

  • Q2 was self-assessed against a model answer

  • For Q3, four randomly selected groups presented their materials, which were marked by both students and tutor.

Assessment Dialogues

The initial dialogue in Week 6 consisted of the tutor explaining the format of the mock and reviewing the learning outcomes for the whole module in relation to the exam. The students then formed groups to outline guidelines for the usage of the eLP discussion board (anonymous posting, type of questions, level of student-tutor contribution). The discussion board was extremely successful, with 122 threads of discussion, several of which were visited over 50 times.

In the assessment workshop the peer marking of definitions created both subject-specific discussions about concepts and discussions about how to write good answers. The self-grading from a model answer proved a more difficult task, and the dialogue focused on the skill of editing and being objective about one’s own work, as well as on the key points for the answer. The presentations were given oral feedback by both the tutor and students, focusing on how well they had applied theory to practice and on the effectiveness of the presentation.

The workshop finished with an evaluation of the task. The students indicated that the take-home exam was a challenging task that encouraged them to delve more deeply into the set readings. The amount of time spent on the mock varied between 4 hours and 36 hours. All the students indicated that they now had a much clearer idea of the expectations and requirements for the summative assessment. They appreciated the opportunity to practise the style of writing and receive feedback on their performance. There was an excellent level of student engagement in the workshop and about 50% contributed to the discussion board. The assessment dialogue helped clarify learning outcomes and review expectations about academic writing style, and also had “value-added” benefits of encouraging, motivating and reassuring the students about the assessment. The students voiced their concern that they still wanted individual feedback, and an agreement was reached that the tutor would mark one definition from Q1, review the PowerPoint slides from Q3 and hand these back the following week.

Case Study 2: Separating Grades From Feedback


The aim of the Sport Subculture module is to examine the cultural aspects of sport using ethnographic and content analysis research methodologies. The assessment is two research reports worth 50% each (submitted in Week 8 and Week 15). There are 30 students on this Level 5 module and the teaching delivery is a lecture/seminar format.


The students wrote up the ethnographic research report and then attended a two-hour assessment workshop in Week 10.

Assessment Workshop

In the workshop the students were handed back their scripts, which were fully annotated but contained no mark. The students were then asked to read the tutor’s comments, review the marking criteria for the report and then complete their own standard Divisional feedback front sheet. When they had completed this task the students could receive their grade from the tutor; they were asked to comment on the accuracy of their self-reflections and to outline ways to improve for the second research report.

Assessment Dialogues

The students found the task extremely challenging. The initial discussions focused on encouraging them to give the exercise “a go”, with the tutor highlighting the marking criteria for the specific research report and outlining the generic criteria from first-class to failing assignments. As the students were completing the task the tutor had time to talk to each student. The students fell into three categories in terms of their self-reflections on their performance: hypercritical, accurate or overly generous. The next phase of the dialogue focused on self-reflection skills and objectivity in editing one’s own work, as well as the high level of difficulty of the task. Students came up individually to receive their actual mark and engage in a relatively informal conversation about the accuracy of their self-assessment and their explanations for their grade. The remainder of the workshop focused on discussions about what they had learnt during the exercise, and then opened up two-way communication by setting a challenging standard: asking what they would have to do to gain a first-class mark in the next report.

Case Study 3: Content Analysis of Presentation Feedback


The aim of the Sport in the Community module is for students to engage in real-world evaluation research with sports development providers. The assessment is a presentation (20%) and a research report (80%). It is a year-long module, with the presentation in Week 8 of Semester 1 followed by the research report in Week 10 of Semester 2. There are 50 students on this Level 5 module.


The students formally presented a proposal in groups of three in the seminar. The presentation was peer-marked on prepared feedback sheets, with brief verbal comments outlining three things the group had done well and three things the group could improve. All peer feedback sheets were handed in to the tutor. The tutor provided verbal feedback during the seminar and then met with the other seminar tutors to discuss the written feedback.

Assessment Workshop

The week after the presentation, an assessment workshop was conducted in a lecture, consisting of a recap of the key elements of evaluation studies followed by a series of student activities focused on feedback on the presentation of the proposals. The main activity was based on the tutor’s content analysis of both peer feedback and tutor feedback. The students were handed a page of quotes from the feedback comments and asked to devise categories to code them. Next, the presentation groups were asked to indicate the comments they felt captured their presentations and to estimate their marks. The lecture concluded with a brief summary of the literature on student engagement with feedback, and students were asked to evaluate the session in terms of its usefulness for their current assignment.

Assessment Dialogues

The initial dialogue focused on categorising the comments, in terms of subject-specific content on evaluation studies and community sport development, and in terms of a series of categories for presentation skills. Students indicated what worked well in several of the presentations and how some of the groups had been very impressive in terms of their knowledge. The top groups appeared to motivate other groups by setting a clear benchmark for what students can achieve. The categories were then linked to the marking criteria to clarify the standards and expectations of the assessment. In the final part, students were given the tutor feedback sheet and their actual grade. The discussions at this point focused on ways to improve the study and lessons learnt for future presentations. Students with a large gap between their estimated grade and their actual grade were encouraged to sign up for a group tutorial with their seminar leader to clarify any points concerning the grading.

Case Study 4: Seen Essay Question and Mock Multiple-Choice Questions


The aim of the Sport in Society: Issues and Controversies module is to introduce students to the socio-cultural context of sport. The assessment is a two-hour exam (100%) in Week 14. There are 80 students on this Level 4 module.


The preparatory tasks were for students to compose their own multiple-choice questions for the seminar and to prepare a response to a seen essay question (handed out in Week 10). The tutor set up a discussion board from Week 11 to Week 13 and students were asked to post and reply to comments concerning the exam.

Assessment Workshop

The workshop took the form of a lecture summarising the module, prepared entirely as multiple-choice questions. The following seminar was designed as a quiz based on the student-prepared multiple-choice questions.

Assessment Dialogues

The lecture became student-centred as the students provided the answers to the multiple-choice questions that summarised the module content. In posing the questions, the tutor was able to talk through the development of the questions and the nuance of some answers being “more right” than others. This led to discussion that unpacked the complexities of the subject matter as well as highlighting key skills in eliminating wrong responses from multiple-choice options. The students indicated that the seen essay question allowed them to engage with their revision in a more focussed way. They tentatively asked questions about the phrasing of the question and sought clarification over key elements. However, in this instance there was little take-up of the eLP Discussion Board (approximately 10 threads, visited on average about 30 times).

Lessons Learnt From Creating Spaces For Assessment Dialogues

It is evident from the case studies that creating space for assessment dialogues fosters engagement and highlights the importance of the relationship between learners and educational staff. The dialogues helped create safe yet challenging learning opportunities. As the tutor facilitating the dialogue, I used questioning and listening skills such as redirecting, paraphrasing to check understanding, and developing questions to encourage the students to express their concerns. The tone of the dialogue was important in attempting to minimise the power dynamic and encourage the students to participate in the discussions. Reassurance and confidence building were outcomes of the dialogues, and Boud (2000) indicates that fostering learner confidence affects achievement. Many of the questions from the students could be categorised as seeking reassurance, but also as seeking ways forward to improve. The student comments highlighted the complex impact of this on their learning and outlined key issues in the process such as sensitivity, emotional impact, trustworthiness and power relations. There was also the more difficult task of challenging the students to consider peer and self-assessment as equally worthwhile forms of feedback, rather than relying on the default of one-to-one tutor feedback, which is their most preferred option. There was only limited success in this regard, especially at Level 6. Perhaps the positive impact of developing skills of self and peer assessment will be seen over time as activities progress through Levels 4, 5 and 6 in the curricula.

A key process within the assessment dialogues is the negotiated diagnosis of where the learner is and wants to be, and built into this is developing the skills of self-reflection. Separating comments from grades engages the students with feedback and develops learner self-assessment abilities. Boud (2000) and Nicol and Macfarlane-Dick (2006) include similar activities in their lists of resources for sustainable assessment: clarifying what good performance is (goals, criteria, standards) and delivering high-quality feedback that enables students to monitor and self-reflect. It is crucial that the assessment dialogues are student-centred, consisting of tasks that are creative, self-directed and problem-based. Such tasks allow the students to develop reflective capacities and critical self-awareness, which in turn develop their own agency. The case studies capture key dynamics of formative assessment as suggested by Jones (2008) in terms of designing activities that enable learners to identify ways to move forward in relation to their own understandings of standards for assessment. In this respect, another key aspect of the dialogue is the opportunity for the tutor to communicate clear and high expectations. As well as the verbal conversations in class, the Discussion Board is an excellent forum in which to clarify expectations. Additionally, posing questions as well as answers in response to student queries helps students map out a way forward with the substantive material.

One of the main concerns of many academics is that spending time on assessment actually detracts from the substantive content of a course. However, in all of the case studies the activities attended to both the learning process and the substantive content of the module. In general, the findings of the case studies support the tenet that less is more. An effective strategy has been to require less summative assessment but to design more productive feedback sessions into the modules. In three of the case studies the amount of summative assessment was reduced and replaced with formative assessment activities. In the fourth case study the tutors are searching for ways of assessing the process of completing the research as well as the product (i.e. the report), to capture more of the dynamics of the learning process. This supports Knight and Yorke's (2003) plea to use formative assessment to enhance learning rather than allowing summative assessment to drive the learning process. The case studies also reflect Jones’ (2008) definition of formative assessment, developed from a sample of experts, as “deliberately using a pedagogical relationship to enable learners to identify ways to move forward in relation to their own understandings of their world, to the standards of the workplace and the professional field of practice” (p1). The evaluation of the case studies also supports Gibbs and Simpson's (2002) claim that formative assessment is one of the most powerful tools for improving student learning.


Academics such as Broadfoot (2008) have concluded that assessment is not working, or at least not working as it should, because the current obsession with summative assessment has reduced much of today’s educational activity to a mechanistic exercise (p212). She calls for a fundamental reconception of the core focus of the educational process in terms of ‘the development of learning’, and for assessment procedures capable of building both an individual’s enthusiasm for learning and their capacity to do so. The findings from the current case studies provide evidence that a strategy of creating space for assessment dialogues fostered student engagement in the feedback and learning process. The students took on responsibility for evaluating and improving their own performance and acted on a range of feedback. Overall, the findings indicate that education is undergoing a fundamental change, moving towards an assessment culture. The challenge is to devise innovative practices that explicitly link assessment tasks to productive learning. The danger in not doing so was highlighted by Biggs (1996), who stated that educational innovation will fail if there is not a concomitant innovation in assessment. These case studies are presented as one way of fostering assessment dialogue. The challenge is to continually consider different tasks as a focus for reflective activity. Lessons learnt indicate that relevant tasks align the assessment criteria with self and peer assessment to involve the students in assessment dialogues.


Biggs, J. (1996) Assessing learning quality: Reconciling institutional, staff and educational demands. Assessment and Evaluation in Higher Education, 21(1), 5-15.

Boud, D. (2000) Sustainable assessment: re-thinking assessment for the learning society, Studies in Continuing Education, 22(1). Cited in Jones, L. (2008). The Heart of Formative Assessment. Paper presented at HEA, Stirling, May.

Broadfoot, P. (2008) Assessment for learners: Assessment literacy and the development of learning power. In Havnes, A. and McDowell, L. (Eds.) Balancing Dilemmas in Assessment and Learning in Contemporary Education, Routledge.

Gibbs, G. and Simpson, C. (2002) Does your assessment support student learning? FDTL, Centre for Higher Education Practice, The Open University.

Handley et al. (2007) When less is more: Student experiences of assessment feedback. Paper presented at the Higher Education Academy, July, 2007.

Knight, P. and Brown, S. (1994) Assessing Learners in Higher Education, Routledge.

Knight, P. and Yorke, M. (2003) Assessment, Learning and Employability. Open University Press.

Jones, L. (2008). The Heart of Formative Assessment. Paper presented at HEA, Stirling, May.

Nicol, D. and Macfarlane-Dick, D. (2006) Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.

Weeden et al. (2002) Assessment: What’s in it for schools? Routledge.

Weaver, M. (2006) Do students value feedback? Student perceptions of tutors’ written responses. Assessment and Evaluation in Higher Education, 31(3), 379-394.
