PeerWise shows benefits in practice


PeerWise provided an effective peer-learning environment for a student-led revision exercise

The use of PeerWise as a tool to promote peer learning has been discussed previously in Education in Chemistry,1 and the system is now used in over 700 educational establishments. In this article, Galloway and Burns present evidence of its effectiveness based on a two-year study at the University of Nottingham.2

PeerWise allows students to write multiple choice questions (MCQs) for their peers, along with feedback in the form of model answers. After answering a question, students are able to rate it on difficulty and quality, with a comment function allowing further discussion and clarification of ideas. The authors suggest that the processes of generating, answering and rating MCQs encourage self-reflection and the development of communication and problem-solving skills, as well as boosting subject knowledge. This study addresses research questions relating to the quality of student-generated content and the nature of student contribution and engagement throughout the process.

PeerWise was implemented as a synoptic revision exercise at the end of a first year module, with the activity representing 5% of the module mark to encourage engagement. After introduction of the assignment, academic involvement was passive, allowing students to take full ownership of the material. Participation exceeded all expectations, with 500 questions posted in each year of the trial (n = 163 and 182 students, respectively) and a total of over 30,000 answers submitted.

To investigate the impact of PeerWise on performance, students were first divided into quartiles based on their scores in a pre-test. After analysis of PeerWise data, the students within each quartile were classified into high or low PeerWise activity groups (HPA and LPA). Analysis of summative assessment data at the end of the module showed that students in HPA groups consistently outperformed those in LPA groups across the entire attainment spectrum.
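The quartile-then-activity split described above can be sketched in a few lines of code. This is a minimal illustration of the analysis design, not the authors' actual method: the student data, the even four-way split and the median-activity cut-off are all invented for the example.

```python
# Illustrative sketch of a quartile / activity-split analysis.
# All data are invented; the study's real dataset and cut-offs differ.
from statistics import mean, median

# (pre_test_score, peerwise_activity, exam_score) per hypothetical student
students = [
    (35, 12, 48), (40, 80, 62), (52, 5, 55), (55, 90, 70),
    (63, 10, 60), (66, 75, 78), (78, 20, 72), (82, 95, 88),
]

# 1. Rank by pre-test score and divide into quartiles.
ranked = sorted(students, key=lambda s: s[0])
q_size = len(ranked) // 4
quartiles = [ranked[i * q_size:(i + 1) * q_size] for i in range(4)]

# 2. Within each quartile, split at the median activity level into
#    high- and low-PeerWise-activity groups (HPA / LPA), then compare
#    mean exam performance between the two groups.
results = []
for q_num, quartile in enumerate(quartiles, start=1):
    cut = median(s[1] for s in quartile)
    hpa = [s for s in quartile if s[1] > cut]
    lpa = [s for s in quartile if s[1] <= cut]
    if hpa and lpa:
        results.append((q_num, mean(s[2] for s in hpa), mean(s[2] for s in lpa)))

for q_num, hpa_mean, lpa_mean in results:
    print(f"Q{q_num}: HPA mean exam {hpa_mean:.1f}, LPA mean exam {lpa_mean:.1f}")
```

With the invented data above, the HPA group outperforms the LPA group in every quartile, mirroring the pattern the authors report across the attainment spectrum.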

Questions were classified in terms of cognitive demand according to Bloom's taxonomy, while model answers were rated on a number of criteria relating to the discussion of relevant chemistry, solution strategy and plausibility of distractors. Student-generated content was found to be of a very high standard, with 86% of questions classified as 'high quality'.

The data clearly support the authors' assertion that PeerWise provided an effective peer-learning environment for this student-led revision exercise, findings that may interest others working with learning technology.