Evaluation of the usefulness of self-assessment, peer assessment and academic feedback mechanisms
This exploratory quantitative case study examined students' perceptions of the usefulness of assessment feedback provided by academics using ReMarksPDF, student self-assessment and student peer assessment. ReMarksPDF is an enterprise-level e-marking system designed by the author and integrated with Blackboard 9.1 and Moodle 2.1 – see www.remarkspdf.com. The ReMarksPDF workflow includes e-submission, allocation to markers, marking, moderation, return of annotated PDF assessment submissions to students, and return of marks to the Grade Center. A summary is presented of the positive and negative aspects of different types of feedback annotations and of how feedback may be improved. 78.7% of students found ReMarksPDF feedback better than feedback they had received in the past, and 70.2% of students agreed or strongly agreed that other courses should adopt the ReMarksPDF system. Students found written comments, text boxes, the assessment rubric, underlining, ticks, colour coding, the spider chart (with average), the spider chart and smileys to be significantly valuable feedback, in that order of preference. Students indicated that ReMarksPDF feedback was easy to read and understand, that it was beneficial to have comments appear in a side-column note, and that it was helpful to have a visual breakdown of results according to assessment criteria. Students were ambivalent about the inclusion of either audio or video comments and did not prefer audio to written comments. Students appear to prefer text-based rather than more abstract presentations of feedback. Students who completed a self-assessment rubric or a peer assessment rubric reported that doing so assisted their understanding of the associated marking rubric and the results they received for their assessment. In contrast, students who received a peer assessment and also completed a self-assessment rubric were negative in their assessment of whether these two measures in concert assisted their understanding of the marking rubric and their results. There was only a weak level of accuracy in student self-assessment and peer assessment compared with professional marking. E-marking software such as ReMarksPDF will have a positive effect on student engagement and perceptions of feedback mechanisms by enabling markers to efficiently provide detailed individual feedback, outlining strengths and weaknesses of the student assessment submission and avenues for self-improvement.

Keywords: Assessment feedback, self-assessment, peer assessment, smileys, audio and video feedback, ReMarksPDF.

Introduction
Assessment drives student learning and effort (Kendle & Northcote, 2000) and in turn influences the direction and quality of student learning (Maclellan, 2004). Numerous literature reviews indicate that feedback is critical to improving the standard of student work and learning (Black & Wiliam, 1998a; Hattie, 1999; Heinrich, 2006; Huber & Mowbray, 2011) and that both formative and summative assessment directly affect student engagement. Assessment designs often include formative feedback aimed at self-improvement rather than summative feedback concerned with marks and grades. Feedback, at its best, is individual in focus, outlining strengths and weaknesses and avenues for self-improvement (Linn & Miller, 2005; Heinrich, 2006). Electronic feedback management systems such as ReMarksPDF are promoted on the basis that they offer opportunities for improvement in assessment practice and outcomes for students, including:


  • E-submission, allocation, marking, moderation and assessment return via a learning management system.

  • Extensive annotation and commentary features, including rubrics, stamps, electronic dashboards and charts.

  • Links to electronic portfolios classified by learning outcomes or graduate attributes.

  • Quality management including consistency among markers, reporting of results, and self-reflection amongst markers.

ReMarksPDF, Blackboard 9.1 and Turnitin all contain enterprise-level feedback systems. ReMarksPDF is the most comprehensive of the three in terms of annotation types and sophistication, moderation, and online and offline capabilities.


ReMarksPDF has extensive PDF annotation capabilities including:



  • Automatic insertion of text-based comments, known as Auto Text;

  • Automatic insertion of sound-based comments (enabling marking by voice), known as Sounds;

  • Automatic insertion of video-based comments (enabling links to streamed video);

  • Sharing of text and sound comment libraries with colleagues over the Internet;

  • Association of marks, criteria and comments with student assessment;

  • Automatic addition of marks;

  • Highlight colours with designated meanings, allowing documents to be colour coded;

  • Specialist stamps designed for marking, showing the emotion of the marker for more personalised
    feedback to students;

  • The ability to designate macros for Auto Text, Sounds and Video links;

  • Import and export of .csv database files, linking marking to student documents and uploading to a
    reporting system;

  • A drag-and-drop graph gallery indicating individual and relative student performance;

  • A Style tool designed to rapidly incorporate English style and grammar comments for
    essays, plus the ability to build specialist comment libraries in any discipline;

  • Advanced moderation capabilities enabling statistical and visual comparison of markers, and individual and
    global moderation of student assessment; and

  • Integration with Learning Management Systems – Blackboard 9.1 and Moodle 2.1.


Examples of annotation types provided by the ReMarksPDF system appear in Figure 1.



Figure 1: Spider chart, Spider chart (with average), Smiley scale
A lecturer simply downloads the software, installs it on their desktop or tablet PC, creates an assessment in their Learning Management System for e-submission, allocates submissions to markers, opens a student assessment submission and starts marking. Once marking and moderation are complete, the marked assessment is automatically returned to students and the marks are stored in the Grade Center. The student logs into their Learning Management System and can retrieve and view all the marking annotations made on their PDF assessment submission. The e-marking workflow is then complete. The objective is to provide high-quality, consistent feedback for a reasonable exertion of effort by the academic and a reasonable cost to the academic unit or department. Lecturers and markers can save considerable time by reusing annotations relating to common student errors, and in moderation.
The primary purpose of this exploratory case study was to examine student perceptions of self-assessment, peer assessment and assessment by an academic marker as alternative and cumulative approaches to student feedback using the ReMarksPDF Feedback Management System. The study produced a range of statistics useful for planning a later mixed-method research project exploring assessment feedback, particularly feedback provided through enterprise-level automated systems such as ReMarksPDF.
Lew et al. (2010) found that overall correlations between the scores of self-, peer and tutor assessments suggested weak to moderate accuracy of student self-assessment ability, together with an 'ability effect': students judged as more academically competent were able to self-assess with higher accuracy than their less competent peers. Lew et al. also concluded that there is no significant association between student beliefs about the utility of self-assessment and the accuracy of their self-assessments.
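To make the notion of self-assessment accuracy concrete, agreement between self-, peer and tutor marks can be quantified with simple correlations. The sketch below uses Python with scipy (one common choice; Lew et al. do not prescribe a tool) and hypothetical criterion totals rather than data from this study.

    # Sketch: correlating self-, peer- and tutor-assessment totals.
    # All scores below are hypothetical placeholders, not study data.
    from scipy import stats

    tutor = [32, 28, 35, 24, 30, 27, 33, 26, 29, 31]   # professional marks (out of 40)
    self_ = [34, 30, 35, 28, 33, 29, 34, 30, 31, 33]   # students' own marks
    peer  = [33, 27, 36, 26, 29, 28, 35, 25, 30, 32]   # peers' marks

    r_self, p_self = stats.pearsonr(self_, tutor)
    r_peer, p_peer = stats.pearsonr(peer, tutor)
    print(f"self vs tutor: r = {r_self:.2f} (p = {p_self:.3f})")
    print(f"peer vs tutor: r = {r_peer:.2f} (p = {p_peer:.3f})")

A weak-to-moderate r between self (or peer) and tutor marks would correspond to the level of accuracy Lew et al. describe.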
A subsidiary purpose of this exploratory case study was to examine Lunt and Curran's (2010) findings, which suggest that electronic audio feedback has advantages over written feedback and that students were 10 times more likely to listen to audio commentary than to open written comments. Student opinions were therefore sought on their preference for audio feedback.
Method
Third-year Law students enrolled in LS377 Information Technology Law (n = 60) at the University of New England voluntarily completed a survey on feedback received in relation to an assignment submitted in PDF format and worth 30% of their grade. A single marker marked the submissions using ReMarksPDF without knowledge of each student's self- or peer-assessment marks. All assessment was based on 8 criteria, each marked on a 5-point Likert scale from excellent to very poor. Volunteer students were randomly assigned to one of four groups of 15 participants each (see Table 1).

Table 1: Groups, Feedback type and Response rate

Groups    Feedback type                                    n
Group 1   Self-assessment                                  15
Group 2   Peer assessment of Group 3                       15
Group 3   Self-assessment + Peer assessment from Group 2   15
Group 4   Control group                                    15
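For reference, the random allocation summarised in Table 1 could be carried out as in the following sketch. This is an illustration only; the study does not describe the randomisation mechanism actually used, and the identifiers and seed are hypothetical.

    # Sketch: randomly allocating 60 volunteers to four equal groups of 15.
    # Assumes simple random allocation; the study's actual procedure is not stated.
    import random

    students = [f"student_{i:02d}" for i in range(1, 61)]  # placeholder identifiers
    random.seed(2012)                                      # arbitrary seed for reproducibility
    random.shuffle(students)

    groups = {f"Group {g + 1}": students[g * 15:(g + 1) * 15] for g in range(4)}
    for name, members in groups.items():
        print(name, len(members), "members")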

There is nothing to suggest that the implications of this exploratory study would not be generalizable to students of other disciplines. The assessment task was an essay on a topic of student choice in the field of Information Technology Law, comparable to any essay requiring discipline-based student research. The focus of this research is on feedback, not discipline-specific content.


All students were provided with academic feedback consisting of an assessment rubric (Appendix A), colour coding of their text according to a colour key (Appendix B), pre-prepared comments based on a marking guide, and a final mark. Marking was done electronically using ReMarksPDF <www.remarkspdf.com>.
A survey instrument was prepared and administered online using Qualtrics <www.qualtrics.com>. All four groups were evaluated on their perceptions of the usefulness of the feedback they received.
Results
Students were asked to rate the overall value to them of the types of feedback annotations received. The results appear in Figure 2. A one-sample t test indicated that all ratings were significant at the 5% level (p < 0.001). A one-way ANOVA and post-hoc multiple comparisons did not reveal any association with sex, age, mark received, type of smiley, mode of study (full-time, part-time), equivalent full-time year of study, or group. In total, 76.1% of students across all groups reported that the feedback received was above average or excellent. There was no significant difference between groups.

Figure 2: Overall value of Feedback – All Groups
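For readers wishing to replicate the group comparison, a one-way ANOVA of overall feedback ratings across the four groups can be run as in the sketch below. Python with scipy is assumed here (the study does not name its statistics package), and the ratings shown are hypothetical placeholders rather than study data.

    # Sketch: one-way ANOVA comparing overall feedback ratings across the four groups.
    # Ratings are hypothetical values on the 1-5 scale, not study data.
    from scipy import stats

    group1 = [4, 5, 4, 3, 4, 5, 4, 4, 3, 5, 4, 4]
    group2 = [4, 4, 5, 3, 4, 4, 5, 4, 4, 3, 4, 5]
    group3 = [3, 4, 4, 5, 4, 3, 4, 4, 5, 4, 4, 4]
    group4 = [4, 4, 3, 4, 5, 4, 4, 3, 4, 4, 5, 4]

    f_stat, p_value = stats.f_oneway(group1, group2, group3, group4)
    print(f"F = {f_stat:.3f}, p = {p_value:.3f}")  # a non-significant p mirrors "no difference between groups"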


Students rated the different types of annotations they received on a 5-point Likert scale from 1 (useless) through 3 (neutral) to 5 (very useful). The average ratings are shown in Table 2. The table indicates that all forms of annotation were rated significantly more useful than the neutral value of 3.


Table 2 ranks the annotations from most useful to least useful according to student perceptions. Written comments were rated as most important, with more abstract annotations, such as colour coding and statistical representations, rated as less important. The most abstract annotation, a smiley indicating the emotional rating given by the marker, was rated as least valuable, though still significantly above neutral.
Table 2: Annotation data

Annotation                  n    1          2           3           4           5           Mean   t       p
Written comments            47   0 (0.0%)   0 (0.0%)    4 (8.5%)    16 (34.0%)  27 (57.4%)  4.49   15.585  0.000
Text boxes                  45   0 (0.0%)   0 (0.0%)    4 (8.9%)    19 (42.2%)  22 (48.9%)  4.40   14.368  0.000
Assessment rubric           47   0 (0.0%)   2 (4.3%)    6 (12.8%)   26 (55.3%)  13 (27.7%)  4.06    9.554  0.000
Underlining                 40   1 (2.5%)   1 (2.5%)    7 (17.5%)   21 (52.5%)  10 (25.0%)  3.95    6.862  0.000
Ticks                       43   0 (0.0%)   2 (4.7%)    11 (25.6%)  20 (46.5%)  10 (23.3%)  3.88    7.045  0.000
Colour coding               46   0 (0.0%)   4 (8.7%)    11 (23.9%)  18 (39.1%)  13 (28.3%)  3.87    6.318  0.000
Spider chart with average   46   2 (4.3%)   3 (6.5%)    9 (19.6%)   22 (47.8%)  10 (21.7%)  3.76    5.084  0.000
Spider chart                46   2 (4.3%)   5 (10.9%)   11 (23.9%)  18 (39.1%)  10 (21.7%)  3.63    3.950  0.000
Smiley                      47   3 (6.4%)   4 (8.5%)    19 (40.4%)  15 (31.9%)  6 (12.8%)   3.36    2.406  0.020
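As a cross-check on Table 2, the one-sample t test against the neutral value of 3 can be reproduced directly from a row's frequency counts. The sketch below uses Python with scipy (assumed tooling; the study does not name its statistics package) and expands the Smiley counts (3, 4, 19, 15, 6; n = 47) into individual ratings, recovering the reported mean, t and p values.

    # Sketch: reproducing the Smiley row of Table 2 from its frequency counts.
    from scipy import stats

    counts = {1: 3, 2: 4, 3: 19, 4: 15, 5: 6}  # ratings 1-5 and their frequencies
    ratings = [score for score, n in counts.items() for _ in range(n)]

    mean = sum(ratings) / len(ratings)
    t_stat, p_value = stats.ttest_1samp(ratings, popmean=3)
    print(f"n = {len(ratings)}, mean = {mean:.2f}")   # 47, 3.36
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")     # approximately 2.406 and 0.020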

