Teachers Christine and Peter Sneddon introduce their novel feedback method for quicker and more effective marking in secondary and higher education
Ask any teacher – whether in primary, secondary, further or higher education – what part of the job they don’t enjoy so much, and marking is likely to be mentioned. Ideally, a teacher should spend less time marking a piece of work than the student spent completing it. But when written comments are provided for an entire class, the two can come uncomfortably close.
We know feedback is important and a written comment conveys more than a mark. Yet how often have we spent time writing comments on every book, only to see them ignored in favour of the mark? Worse, how often have we seen the same mistakes crop up the next time, because the pupil hasn’t fully taken on board the guidance we gave?
Is there a way to make the feedback clearer, to ensure the student engages with it, and to monitor the effect over time? And is there an efficient way to provide such feedback on a large scale? This article explores one such method – the use of hashtags. While our examples are drawn from physics settings, this system is independent of discipline and easily tailored to chemistry too. We also suspect non-physicists may see some familiar issues in the lists below.
Christine: At secondary level
In physics there are several simple errors that students make regularly, including missing units, wrong formula manipulation and incorrect conversion of units – all of which can result in significant loss of marks.
To tackle these issues, when marking my students’ homework I assigned a hashtag number to the most common errors. For my class these were:
#1 Missing units
#2 Wrong conversion of units
#3 Wrong physics
#4 Incorrect rearrangement of formula
#5 Maths error
When I came across one of these mistakes in the pupil’s work, I wrote the corresponding hashtag at the bottom, but didn’t write in the specific error. I gave a mark for the homework as usual.
The next day I went over the homework before returning the books, ensuring pupils were listening and not just checking their grades. When the homework jotters were handed back, I posted the hashtags on the board with their meaning. Each pupil then checked their work, found where the error occurred, and corrected it. Through whole-class discussion and explicit flagging of the common errors, every member of the class was reminded of the importance of checking answers carefully before submission.
As the term progressed, I saw a reduction in the frequency of several of the common mistakes. This was also true in the prelim exam, where the pupils dropped fewer marks than in previous years for these simple errors. This, along with other routines in the classroom such as regular spaced practice, helps prepare the class to achieve their best grades in the exam.
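Monitoring that reduction in frequency is easy to automate. The sketch below is not part of the published method – the legend comes from the list above, but the weekly records, names and numbers are invented for illustration – and simply tallies how often each hashtag was written on the class's work, week by week.

```python
from collections import Counter

# Hashtag legend used in class (from the article's list)
LEGEND = {
    1: "Missing units",
    2: "Wrong conversion of units",
    3: "Wrong physics",
    4: "Incorrect rearrangement of formula",
    5: "Maths error",
}

# Hypothetical record: hashtags written across the class's jotters each week
weekly_tags = {
    "week 1": [1, 1, 4, 2, 1, 5],
    "week 2": [1, 4, 5],
    "week 3": [4],
}

def tally(tags):
    """Count how often each hashtag appeared in one week's marking."""
    return Counter(tags)

for week, tags in weekly_tags.items():
    counts = tally(tags)
    summary = ", ".join(f"#{n} {LEGEND[n]}: {c}" for n, c in sorted(counts.items()))
    print(f"{week}: {summary}")
```

A declining count for a given hashtag across the weeks is exactly the evidence of improvement described above.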
Peter: In higher education
Listening to my wife describe this # approach to providing feedback, I was struck by how it could be extremely useful to me as a teacher in higher education, but with a different focus. I could use it as a mechanism to provide a form of semi-personalised feedback.
Teachers in higher education will know that students want personalised feedback on their work, but there is a problem of scale. A physics or chemistry class in a Scottish school can have no more than 20 students; a first-year physics or chemistry class at university can have over 300. And while there is more than one teacher responsible for delivering that teaching, providing more than just a mark to such a large class, especially repeatedly over an academic year, is a nigh-on impossible task. I saw the # approach as a way to address this, while also helping our students get out of the habit of making simple mistakes.
In response to a request for more opportunities to practise their problem-solving skills, we introduced weekly homework exercises. Over the first three weeks, I built up a list of #s based on the recurring problems. These #s, and a final mark, were all that went on the students’ work. When students collected their sheets, they had to engage actively with them, using the list of #s to work out where they had lost marks.
The final list, which overlapped somewhat with the list used in school, was as follows:
#1 Too many decimal places/significant figures used
#2 The wrong number was used in a calculation
#3 The wrong result was obtained from a calculation
#4 The wrong equation was used
#5 The submitted work was too messy to understand/follow/read
#6 Units were missing from an answer
#7 The question attempt was incomplete
#8 No working was shown, so it was impossible to know where the answer came from
#9 The wrong units were used in the answer
#10 Did not answer the question that was asked (eg calculated the wrong variable)
#11 Provided an incorrect factual answer
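This decoding step – turning the marker's shorthand back into full feedback – is what the students do with the published list. A minimal sketch of that lookup is below; the function name and the shorthand format ("#4 #6" written on a script) are my own illustrative assumptions, and only an abridged legend is included.

```python
# Abridged hashtag legend for the university class (from the article's list)
HE_LEGEND = {
    1: "Too many decimal places/significant figures used",
    4: "The wrong equation was used",
    6: "Units were missing from an answer",
    8: "No working was shown, so impossible to know where the answer came from",
}

def expand(tag_string):
    """Turn marker shorthand like '#4 #6' into full feedback sentences."""
    codes = [int(t.lstrip("#")) for t in tag_string.split()]
    return [HE_LEGEND.get(c, f"Unknown tag #{c}") for c in codes]

# A script annotated '#4 #6' expands to two full feedback lines
for line in expand("#4 #6"):
    print("-", line)
```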
Using this system allowed me to mark 250+ submissions per week on 10 occasions, taking around four hours each time. It was an extremely efficient way to provide feedback to the class – feedback they had requested.
The students’ response to the scheme was positive – the anecdotal feedback I gathered was uniformly so – though, sadly, the Covid-19 shutdown in March 2020 forced me to abandon my original plan to run a full, detailed evaluation.
Ultimately, the # system provides a mechanism for helping students learn to avoid frequent pitfalls and for providing detailed feedback to larger groups efficiently. Why not try it out with your classes?
Hear Christine and Peter talk more about their # feedback method in their VicePhec 2020 talk.