Where angels fear to tread: online peer-assessment in a large first-year class
- Authors: Mostert, Markus , Snowball, Jeanette D
- Date: 2012
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/69289 , vital:29480 , https://doi.org/10.1080/02602938.2012.683770
- Description: In the context of widening participation, large classes and increased diversity, assessment of student learning is becoming increasingly problematic: providing formative feedback aimed at developing student writing is particularly laborious. Although the potential value of peer assessment has been well documented in the literature, the associated administrative burden, not least in managing anonymity and intellectual ownership, makes this option less attractive, particularly in large classes. A potential solution involves the use of information and communication technologies to automate the logistics associated with peer assessment in a time-efficient way. However, uptake of such systems in the higher education community is limited, and research in this area is only beginning. This case study reports on the use of the Moodle Workshop module for formative peer assessment of students’ individual work in a first-year introductory macro-economics class of over 800 students. Data were collected through an end-of-course evaluation survey of students. The study found that using the feature-rich Workshop module not only addressed many of the practical challenges associated with paper-based peer assessments, but also provided a range of additional options for enhancing the validity and reliability of peer assessments that would not be possible with paper-based systems.
- Full Text: false
- Date Issued: 2012
Dancing with the devil: formative peer assessment and academic performance
- Authors: Mostert, Markus , Snowball, Jeanette D
- Date: 2013
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/69301 , vital:29483 , https://doi.org/10.1080/07294360.2012.705262
- Description: Peer assessment can be important in developing active and independent learners, as well as providing more and faster feedback in large classes, compared to marking done by tutors. In addition, the evaluative, critical stance required by students in order to assess their peers' work encourages the development of higher-order cognitive skills. Changing roles from being assessed to being an assessor can also improve students' ability to judge and improve on their own work. However, peer assessment does have potential problems and there is some debate as to the appropriate academic level at which to implement it, the kinds of feedback that are given and the ways in which students respond. In addition, there is little evidence that peer assessment has an impact on academic performance. This research reports the results of an online peer assessment exercise for a macroeconomics essay conducted in a large Economics 1 class at Rhodes University. Of the 800 students, about half participated in the peer assessment exercise. Data were collected from students via a formal course evaluation. In addition, a sample of 50 essays was evaluated in terms of the relationship between peer marks and final (tutor) marks received and the impact that peer assessment had on the quality of the final essay submitted. An Ordinary Least Squares regression was used to investigate the impact of peer assessment participation on marks. Results showed that peer marks tended to ‘bunch’ in the 60–68% range, indicating the reluctance of peers to give very high or low marks. In general, peers gave more useful feedback on technical aspects, such as presentation and referencing (which were also the categories in which students most often made improvements), than on content. Regression analysis showed that peer assessment participation was not a significant determinant of final essay mark, but that economics ability and English language proficiency were.
- Full Text: false
- Date Issued: 2013
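As an illustration of the method described in the abstract above, the following is a minimal sketch of an Ordinary Least Squares regression of final essay mark on peer-assessment participation and the two controls the study mentions (economics ability and English language proficiency). The variable names (final_mark, peer_participation, econ_ability, english_proficiency) and the students.csv file are hypothetical stand-ins; the paper's actual data and model specification are not reproduced here.

```python
# Minimal sketch of the kind of OLS regression described above: final essay
# mark regressed on peer-assessment participation, economics ability and
# English language proficiency. Column names and the students.csv file are
# hypothetical; this is not the paper's actual specification or data.
import pandas as pd
import statsmodels.formula.api as smf

# One row per student, with (assumed) columns:
#   final_mark          - final essay mark (%)
#   peer_participation  - 1 if the student took part in peer assessment, else 0
#   econ_ability        - proxy for economics ability (e.g. a prior test mark)
#   english_proficiency - proxy for English language proficiency
students = pd.read_csv("students.csv")

model = smf.ols(
    "final_mark ~ peer_participation + econ_ability + english_proficiency",
    data=students,
).fit()

# Coefficient estimates, standard errors and p-values indicate whether
# participation is a significant determinant of the final essay mark.
print(model.summary())
```

On this reading, a non-significant coefficient on peer_participation alongside significant coefficients on the ability and proficiency measures would correspond to the result reported in the abstract.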
Where angels fear to tread: online peer-assessment in a large first-year class
- Authors: Mostert, Markus , Snowball, Jeanette D
- Date: 2013
- Language: English
- Type: text , article
- Identifier: http://hdl.handle.net/10962/68845 , vital:29330 , https://doi.org/10.1080/02602938.2012.683770
- Description: Publisher version , In the context of widening participation, large classes and increased diversity, assessment of student learning is becoming increasingly problematic: providing formative feedback aimed at developing student writing is particularly laborious. Although the potential value of peer assessment has been well documented in the literature, the associated administrative burden, not least in managing anonymity and intellectual ownership, makes this option less attractive, particularly in large classes. A potential solution involves the use of information and communication technologies to automate the logistics associated with peer assessment in a time-efficient way. However, uptake of such systems in the higher education community is limited, and research in this area is only beginning. This case study reports on the use of the Moodle Workshop module for formative peer assessment of students’ individual work in a first-year introductory macro-economics class of over 800 students. Data were collected through an end-of-course evaluation survey of students. The study found that using the feature-rich Workshop module not only addressed many of the practical challenges associated with paper-based peer assessments, but also provided a range of additional options for enhancing the validity and reliability of peer assessments that would not be possible with paper-based systems.
- Full Text: false
- Date Issued: 2013