The assessment of the new GCSE Science curriculum, introduced in September 2006, is dominated by fixed-response formats, including multiple-choice tests. Of what value is this type of assessment, and does it reflect the spirit and aims of the new curriculum?
- Fixed-response items can be used to test knowledge and the application of knowledge.
- Higher skills of analysis, synthesis and evaluation cannot be assessed adequately by fixed-response tests.
In the 1950s and early 1960s, only a minority of young people took O-levels in science subjects and there was no serious ambition to provide a worthwhile science education for all 14-16-year-olds. Typical O-level chemistry questions emphasised recall (see, for example, Box 1). To encourage more students in grammar schools to take the sciences post-16, a new curriculum was developed with the support of the Nuffield Foundation in the mid-1960s.
Box 1 - Examples of O-level chemistry questions
Q State the law of constant composition. Starting from metallic copper, outline the method by which you could obtain: (a) a solution of copper nitrate; (b) copper hydroxide. State how, from each of these products, it is possible to obtain copper oxide. How would you now proceed to use these two samples of copper oxide to verify the law you have stated?
Q Describe, with a sketch of the apparatus, how you would prepare several gas jars of dry ammonia. Starting from ammonia, how would you show that it contains nitrogen and hydrogen? Give one large-scale use, different in each case, for ammonium chloride, ammonium sulphate and ammonium nitrate.
The Nuffield O-levels: a radical approach
The Nuffield O-level courses in biology, chemistry and physics were characterised by a reliance on practical work carried out by students and a spirit of inquiry that infused the teaching. The aim of these courses was to encourage conceptual understanding and to reduce the emphasis on factual recall. However, if they were to be successful, new approaches to assessment that reflected the aims of the new courses were essential and needed to be developed.
Project teams for the Nuffield-funded O-level programmes, working in partnership with the three leading examining boards (now called awarding bodies), took up the challenge. The Nuffield chemistry team collaborated with John Matthews and other leading members of the London Schools Examination Board (one of the precursors of Edexcel) to develop new models of assessment for the chemistry course.1
Aims and impact of assessment
At the time all exams were taken at the end of the course. Matthews and his team, concerned about the impact of styles of examining on teaching, wanted to do as much as they could to ensure that the new examinations would encourage teachers to adopt the aims and methods of the Nuffield programme. To provide a structure to their thinking, they adopted a simplified version of Bloom's taxonomy of educational objectives.2 They wanted to ensure that the exams not only rewarded knowledge in the form of the recall of facts, procedures, patterns and principles but also tested:
- comprehension - the ability to translate information as well as routine problem-solving in familiar situations;
- application - the skills needed to apply knowledge in unfamiliar situations; and
- higher skills - ie analysis, synthesis and evaluation.
By adding these dimensions to the test grids used to monitor and evaluate the examinations, the team ensured that the test papers were educationally valid and true to the spirit of the Nuffield approach. Their work led to innovations in examining which have stood the test of time. Remarkably, perhaps, the specimen written papers for Edexcel's new 2008 AS-level chemistry course use essentially the same types of assessment that were used for Nuffield O-level chemistry in the 1960s - fixed-response (mainly multiple-choice) items,3 structured questions and free-response items.
Fixed-response items in chemistry examinations
Chemistry examiners have found that fixed-response items of various kinds have a valuable and valid contribution to make to testing.
A collection of multiple-choice items, for example, makes it possible to sample student learning over the full range of a course. The broad coverage can enhance the reliability of a test.
Multiple-choice items of various formats have been tried out. Originally, the examination boards pre-tested such questions to ensure that only those of appropriate difficulty and discrimination were used in test papers. The most common type is the standard item, which requires students to pick the best answer from four or five options. The multiple-completion format is also appropriate in chemistry because there are many instances where a full answer cannot be reduced to a single statement (Box 2).
It is relatively easy to devise fixed-response items to test knowledge, comprehension and the application of learning in unfamiliar contexts. Generally, however, attempts to devise items to assess the higher skills of analysis, synthesis and evaluation have led to contrived formats that have not stood the test of time.
Box 2 Example of a multiple-completion item
Q Isotopes of an element are related in that they have the same:
1. relative atomic mass;
2. position in the Periodic Table;
3. number of neutrons in the nucleus;
4. arrangement of electrons.
- A if 1, 2 and 3 only are correct.
- B if 1 and 3 only are correct.
- C if 2 and 4 only are correct.
- D if 4 only is correct.
Fixed-response items in teaching and learning
Fixed-response items also have a place in teaching and learning. Evidence from the Assessment Reform Group (a group of researchers set up by the British Educational Research Association (BERA) as a Policy Task group on assessment), for example, has highlighted the benefits of assessment to inform teaching 4 and enhance learning.5 A project carried out by the Evidence-based Practice in Science Education (EPSE) Research Network explored the use of fixed-response items for these purposes in science education.6 The team, led by Robin Millar at the University of York, worked with teachers to develop a bank of diagnostic questions (see, for example, Box 3) to monitor and evaluate students' learning in four areas, one of which was chemical change.7 They explored how such questions can be used to support teaching and learning of the central scientific concepts that feature in the Key Stage 3 programme of study.
The EPSE research team identified several effective ways for teachers to use the questions in their classes. The findings suggest that fixed-response items can help to provoke discussion in small groups of learners, who can then be challenged to explain and defend their answers. At the beginning of a lesson, a single item can be used to get students thinking. Alternatively, one or two of the questions can be the focus at the end of a lesson, to check students' understanding of the key ideas introduced. Another approach that some teachers have tried successfully is to use a few diagnostic questions at the start of a new topic, to find out what students can recall of previous learning. Monitoring attainment in this way helps to stress the importance of understanding fundamental ideas rather than the simple recall of facts, rules and conventions that all too often dominates external tests and examinations.
Box 3 Sample EPSE diagnostic question
Below are some statements about the particle model of a gas. All the statements are correct.
In a gas:
- A The particles are far from each other.
- B The particles move around rapidly in all directions.
- C The particles collide with the walls of the container they are in.
- D The particles are too far apart to exert any force on each other.
- E If you heat a gas, the average speed of the particles gets bigger.
Which of the statements A-E help to explain each of the following? Write a letter (or letters) on the line to show your answer.
(a) Gases are fairly easy to compress.
(b) Gases spread out to fill the whole space they are in.
(c) Gases don't settle to the bottom of a container, but fill the whole space.
(d) Gases are less dense than liquids and solids.
2006 GCSE assessment
Since the 1960s, multiple-choice items have never completely disappeared from public examinations, and have been used mainly in short, end-of-module tests. The cost of pre-testing multiple-choice items before they become operational, however, has meant that this practice has largely been dropped. The awarding bodies for GCSE and GCE examinations now rely on data from previous tests when compiling new examinations.
The development of new technologies that incorporate on-screen scoring and the electronic processing of marks means that it is now possible to provide tests with a mixture of fixed-response and other types of test questions relatively easily. From the point of view of the awarding bodies, fixed-response items are attractive because they can be scored by relatively unskilled markers, leaving specialist teachers the task of dealing with responses where judgement is needed to award the marks correctly.
With the large numbers of young people taking GCSE Science, end-of-module tests have become popular. So the pressure on awarding bodies to adopt fixed-response items for such short periodic tests in the new GCSE Science courses is understandable. However, the awarding bodies should not have been allowed to exclude opportunities for students to demonstrate the higher skills in Bloom's taxonomy.
The GCSE Science curriculum reflects the rationale for compulsory school science that was set out in Beyond 2000.8 At the core of the new curriculum is the aim of developing scientific literacy for all secondary school students. The Qualifications and Curriculum Authority's (QCA) programme of study9 states, among many other things, that students should be taught to:
- evaluate methods of data collection and consider their validity and reliability as evidence;
- present information, develop an argument and draw a conclusion, using scientific, technical and mathematical language, conventions and symbols and ICT tools;
- consider how and why decisions about science and technology are made, including those that raise ethical issues, and about the social, economic and environmental effects of such decisions.
Educational aims such as these call for the higher skills of analysis, synthesis and evaluation. These are not skills that can be appropriately tested with fixed-response items. Despite this, some of the awarding bodies have adopted models of assessment for GCSE Science that rely almost exclusively on multiple-choice and other fixed-response formats, such as matching and sequencing exercises. Such forms of assessment do not reflect the true spirit of the new curriculum.
The introduction of the 2006 GCSE Science courses is as radical a change as that intended by the Nuffield O-level courses aimed at the grammar school population in the 1960s. Unfortunately, the thought that has gone into devising the courses and creating new resources for teaching and learning has not been complemented by appropriate investment in new approaches to assessment. This work remains to be done and is urgently needed.
Andrew Hunt is co-director of the Twenty First Century Science project.
1. C. K. Tittle and K. M. Miller, Assessing attainment. London: Independent Assessment and Research Centre, 1976.
2. B. S. Bloom (ed), The taxonomy of educational objectives. London: Longmans Green, 1956.
3. Nuffield Curriculum Centre website (accessed March 2008).
4. P. Black et al, Science inside the black box. London: NFER-Nelson, 2006.
5. Assessment Reform Group, The role of teachers in the assessment of learning. London: Institute of Education, 2006.
6. R. M. Millar et al, Improving subject teaching: lessons from research in science education. London and New York: Routledge, 2006.
7. Website with EPSE teaching resources, including diagnostic questions (accessed June 2008).
8. R. Millar and J. F. Osborne (eds), Beyond 2000: Science education for the future. London: King's College London, 1998.
9. Qualifications and Curriculum Authority, National Curriculum website (accessed March 2008).