This was a single lecture, which included a formative assessment task. The task involved three short-answer questions. In the past I have set the questions, collected the submissions, and marked and fed back on the work. However, I've always questioned the effectiveness of the learning activity and what students actually do with the feedback, especially as this is a one-off piece of work outside their discipline context.
Therefore, this year I changed my approach and made this a peer-assessed piece of work. The intention was that, by marking other people's work, students would (based on Race, 2006):
- practise softer skills, e.g. constructive criticism
- learn from each other and gauge where their own work sits
- compare themselves with their peers
- engage with the marking criteria
- engage in deep learning, e.g. evaluation
In the past I'd worried about the administrative workload of peer assessment. However, with the Self & Peer Assessment Tool in LearnUCS, the administration has been much reduced.
My observations of the process are:
- The technology is very straightforward.
- The system allowed me to enter the questions, publish the marking criteria for those questions, and set the timings for the key stages (students submit their work, students mark their peers' work, students collect their marked work, grades are released through the gradebook).
- At all stages I could check who had been doing what; I could access the submissions, the evaluations and the results.
- Students didn't seem to have a problem completing the process, although a proportion must have been strategic: they observed that it was purely formative and didn't complete it.
The following describes the broad process I followed.
Create the peer assessment submission using the Self and Peer Assessment Tool, which is available within the assessment area. When you create this you need to consider a number of options, for instance the submission period, the evaluation period, the number of scripts each student will need to mark, and whether you'd like the feedback to be anonymous. When you add the individual questions, you can include a model answer as a steer, which will be made available during the evaluation period. For each question you can add a number of marking criteria.
Create time in the lecture plan to allow students to engage with the marking criteria. I released part of my lecture as a video and set this as a pre-requisite to attending. This allowed me to deliver the core knowledge which I would apply in the session. It also created time in the face-to-face session to set a group activity around the marking criteria and marking some sample scripts.
I opened the submission point and students submitted their work. When the evaluation stage started, I announced this on my LearnUCS module and released a video of me marking a number of scripts against the marking criteria.
At the end of the evaluation period, the evaluated copies became available to students. Given this was a formative exercise, I read the submissions and scanned the comments, and based on these I drew together some generic feedback.
I announced this through LearnUCS and suggested they reflect on their assignment and the feedback by answering the prompt: "I have read the feedback provided on my assignment and I have identified areas I can improve. These include ..."
The following video is a walkthrough of the area in the LearnUCS module, which gives you a sense of how I approached it.
If you have any questions about how you might use peer assessment using LearnUCS, please email the Elevate Team.