Benchmarking students’ ability after any transition between levels of study allows tailored teaching, but requires careful question design. Katherine Haxton shares her tips for where to begin

Motivation and ability vary widely within first year undergraduate chemistry cohorts. The range of entry qualifications and routes to higher education available in the UK mean assumptions cannot be made about prior knowledge. Most first year programmes assume a low level of prior knowledge of chemistry, electing to cover a wide range of general chemistry topics quickly. However, this can overwhelm students with less experience, and risks alienating higher achievers whose knowledge has survived the summer.


Diagnostic testing is an efficient way to benchmark a class’s understanding at the beginning of a course. It provides a means to tailor teaching to the needs and misconceptions of the cohort. Developing diagnostic tests has been extremely useful in my development as a teacher of new undergraduates. It is also useful after any transition between levels of study, such as primary to secondary education, GCSE to A-level or even first to second year of university.

Targeted questions

I deliver my diagnostic tests through Google Forms. This format also offers an option to mark responses automatically and return scores and feedback to each student.
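If you export the responses rather than relying on the built-in marking, a short script can score them offline. The following is a minimal sketch in Python assuming a standard Forms CSV export; the column headers, file name and answer key are all hypothetical and would need adjusting to your own form.

```python
import csv

# Hypothetical answer key: maps each question's column header in the
# Forms CSV export to its correct option. Adjust to match your form.
ANSWER_KEY = {
    "Q1. What shape is a methane molecule?": "Tetrahedral",
    "Q2. How many d-electrons does Fe3+ have?": "5",
}

def mark_responses(path):
    """Return (student, score) pairs from a Google Forms CSV export."""
    results = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            score = sum(
                row.get(question, "").strip() == answer
                for question, answer in ANSWER_KEY.items()
            )
            results.append((row.get("Email address", "unknown"), score))
    return results

if __name__ == "__main__":
    for student, score in mark_responses("responses.csv"):
        print(f"{student}: {score}/{len(ANSWER_KEY)}")
```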

Finding or writing good multiple choice questions (MCQs) is essential. Each question should evaluate one or two concepts. Concepts or ideas students always struggle with, or common errors noted in class, are obvious ones to test. MCQs should have sufficient distractor answers to accommodate the most common misinterpretations of a concept. It can take several iterations to get a question right, so testing questions before relying on them is worthwhile; keeping track of which distractor probes which misunderstanding helps too, as sketched below.
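One way to keep distractors tied to the misunderstandings they probe is to store each question with its options tagged by misconception, so a student's choice immediately tells you which idea needs attention. This sketch is illustrative only: the class design, the question and the misconception labels are all invented for the example.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Option:
    text: str
    misconception: Optional[str] = None  # None marks the correct answer

@dataclass
class MCQ:
    stem: str
    options: list = field(default_factory=list)

    def diagnose(self, chosen: str) -> Optional[str]:
        """Return the misconception a chosen option points to, if any."""
        for option in self.options:
            if option.text == chosen:
                return option.misconception
        raise ValueError(f"Unknown option: {chosen!r}")

# Invented example: distractors tagged with the alternative conception
# each one is designed to reveal.
question = MCQ(
    stem="Why does sodium chloride have a high melting point?",
    options=[
        Option("Strong electrostatic attraction throughout the lattice"),
        Option("Covalent bonds between Na and Cl",
               misconception="ionic compounds contain covalent bonds"),
        Option("Strong forces between NaCl molecules",
               misconception="ionic lattices are made of discrete molecules"),
    ],
)
print(question.diagnose("Covalent bonds between Na and Cl"))
```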

Experienced teachers with lots of pedagogical content knowledge (PCK) may find writing questions straightforward; for others, the act of generating MCQ ideas and distractor answers may itself build PCK.

A good starting point for MCQ ideas is students’ existing written responses to questions. Analyse and categorise the errors and establish which are most common. Discard any mistake made by a single student, along with vague, self-contradictory or ambiguous answers; these aren’t helpful for writing diagnostic MCQs. Note where students misinterpret questions or omit required parts of answers: you can use these notes to improve students’ exam technique. Categorise procedural issues, such as rearranging equations incorrectly, as systematic errors. Categorise answers that show conceptions inconsistent with accepted scientific concepts and ideas as alternative conceptions. Alternative conceptions and systematic errors are good sources for writing MCQs, provided there is a sufficient range of alternative conceptions around a single idea.
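If you tag each error as you mark, a quick tally makes the most common ones obvious and filters out the one-offs. A minimal sketch, assuming the (category, description) labels have already been assigned by hand; the tags below are invented examples.

```python
from collections import Counter

# Invented tags assigned while marking written answers: one
# (category, description) label per error observed.
tagged_errors = [
    ("alternative conception", "bonds release energy when broken"),
    ("alternative conception", "bonds release energy when broken"),
    ("systematic error", "equation rearranged incorrectly"),
    ("alternative conception", "equilibrium means equal amounts"),  # one-off
]

counts = Counter(tagged_errors)
# Keep only errors made by more than one student; these are the
# candidates worth turning into MCQ distractors.
common = {error: n for error, n in counts.items() if n > 1}
for (category, description), n in sorted(common.items(), key=lambda kv: -kv[1]):
    print(f"{n}x [{category}] {description}")
```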

Another source of inspiration for MCQs is the common alternative conceptions reported in the literature.

Gauging confidence


While lots of information can be gained from answers to carefully selected MCQs, even more can be gleaned from how confident a student is in their answer. You could include a ‘don’t know’ option, but its value for assessing students’ ability is debatable. I prefer to force students to pick an answer and indicate on a confidence scale whether they have guessed or are certain. After each MCQ, I add ‘Please rate how confident you are in your answer’. Students are given the options: ‘I am guessing’; ‘Not confident’; ‘Neither confident nor unconfident’; ‘Fairly confident’; and ‘Very confident’.

Three groups will emerge from the confidence data: students with lots of confidence in their correct answers; students with little confidence in their correct answers; and students with lots of confidence in their incorrect answers. The first group may have the strongest foundation for new learning but may be resistant to, or frustrated with, engaging with known, familiar things. The second group may need more consolidation of the key concepts to boost their confidence and knowledge. The third group may struggle to see the need to address issues in their understanding. Low-performing students overestimating their skill level is known as the Dunning–Kruger effect.
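If you score each test, these groups are easy to pull out programmatically. The sketch below assumes per-student summaries (fraction correct, mean confidence on a 1–5 scale) have already been computed; the names, cut-offs and group labels are illustrative, and a fourth combination (unsure and incorrect) falls out naturally as well.

```python
# Invented per-student summaries: fraction of questions answered
# correctly and mean confidence on a 1 (guessing) to 5 (very confident)
# scale, as described above. Cut-offs are illustrative.
students = {
    "Student A": {"correct": 0.9, "confidence": 4.6},
    "Student B": {"correct": 0.8, "confidence": 1.8},
    "Student C": {"correct": 0.3, "confidence": 4.2},
}

def group(summary, correct_cut=0.6, confident_cut=3.5):
    """Assign a student to one of the groups discussed above."""
    correct = summary["correct"] >= correct_cut
    confident = summary["confidence"] >= confident_cut
    if correct and confident:
        return "confident and correct: ready to be stretched"
    if correct:
        return "correct but unsure: consolidate key concepts"
    if confident:
        return "confidently wrong: target misconceptions first"
    return "unsure and incorrect: revisit foundations"

for name, summary in students.items():
    print(name, "->", group(summary))
```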

The next level

In first year diagnostic tests, I cover topics that should be familiar and will be covered again in first semester teaching. However, in second year tests, I set questions that get students to extrapolate from familiar topics into new territory. For example, I ask them to apply their understanding of proton NMR to phosphorus and fluorine NMR. From this I can judge which students understand the principles of proton NMR interpretation, and which have memorised certain scenarios.

This kind of diagnostic testing has changed my teaching. I now spend time revising how to deduce the shape of molecules with second year students because I know they struggle to apply valence shell electron pair repulsion (VSEPR) theory. Through a similar process, I established that third years struggle to count d-electrons in transition metal ions. Diagnostic tests enable me to teach better, but also make me a better teacher.

Resources for writing diagnostic tests

  • Common misconceptions about bonding and how they often arise: rsc.li/2AthPkk
  • Students’ misconceptions in eleven conceptual areas of chemistry: rsc.li/2PsDOAA
  • The book Chemical misconceptions discusses misconception prevention, diagnosis and cure: rsc.li/2AsHSZ1