Tim Oates looks at what we can learn from international comparisons

Many argue that we can learn a lot from education in other nations, benefitting by borrowing their policies. Others argue we can learn nothing from other systems. For them, other systems are a voyeuristic focus of academic curiosity, not a source of insight into potential domestic action. There is a third group, who use data from international surveys to frighten their home audience into compliance with their own policy recommendations – a new version of the hysterical cold war cry ‘the Russians are coming!’

Fortunately, there are international comparativists who are cutting through this conflict. Jeremy Hodgen, now at the University of Nottingham, conducted a study that highlighted the need for forms of maths qualification other than A-level, to better meet the needs of higher education and the economy. Hilary Steedman at the London School of Economics has for many years dug deeply into vocational systems around the world, and has outlined key mechanisms in initial industrial training that are not exploited in England but could be, to good effect. Paul Morris, UCL Institute of Education, continues as editor of Compare, the journal of comparative studies, and is pushing forward methodology in comparative education.

My view is that we can learn from others’ systems, but we must not naively cherrypick or borrow policies. To avoid these errors, and create what the late and great comparativist David Raffe called ‘policy learning, not policy borrowing’, sophisticated and sensitive analytic approaches have been developed at Cambridge Assessment, drawing on the work of Bill Schmidt in particular.

Using survey data from the Trends in International Mathematics and Science Study, Bill identified differences between high-performing jurisdictions and lower-performing systems. This highlighted the significance of ‘curriculum coherence’. A system is regarded as coherent when the national curriculum content, textbooks, teaching content, pedagogy, assessment, and drivers and incentives are all aligned and reinforce one another. ‘Curricular materials in high-performing nations focus on fewer topics, but also communicate the expectation that those topics will be taught in a deeper, more profound way.’ Schmidt’s analysis of mathematics emphasises that curriculum coherence should also be demonstrated by arranging concepts in an appropriate age-related hierarchy.

Their extended analysis suggests there is no rigid association between a system possessing curriculum coherence and being subject to tight, top-down control.

From this emerge three distinct insights:

  • That curriculum coherence should be a fundamental policy aim
  • That ensuring coherence relies on subtle management of the interacting control factors in a system: all elements should line up, so that contradictions do not arise between them, and professionals are not subject to contradictory incentives and targets
  • That the content of curriculum frameworks, textbooks and so on should be arranged into an evidence-based, age-related hierarchy

Control factors

We developed a framework at Cambridge Assessment to elaborate the concept of curriculum coherence. This is a list of system elements that should sit in coherent relation:

  1. Curriculum content
  2. Assessment and qualifications
  3. National framework for qualifications
  4. Inspection
  5. Pedagogy
  6. Professional development
  7. Institutional development
  8. Institutional structures (eg size of schools)
  9. Allied social measures (linking social care, health care and education)
  10. Funding
  11. Governance
  12. Accountability arrangements (targets, performance measures)
  13. Labour market/professional licensing
  14. Allied labour market regulation (eg health and safety legislation)

Control factors are aspects of a system about which policy action can be formulated. They do not include explanatory factors, which, alongside control factors, also explain the form and performance of an education system: a jurisdiction’s history, economy, society and so on. We highlight control factors because they help inform the development of deliberate policy.

These control factors interact in complex patterns. Because the factors are frequently in different states of development in each national setting, the precise operation of the system, and the actions needed, differ from setting to setting.

This has proved to be a powerful approach to understanding how national educational arrangements operate, and what action carries the greatest potential to effect improvement in each setting. It also gives insights that can inform domestic policy.

The recently revised national curriculum reinstated the detailed list of key concepts and knowledge – conservation of mass, terms around molecular structure and so on – that had been removed in the 2007 version. International comparisons made it clear that our national curriculum had become far too general and vague. But pundits said: ‘The detail isn’t necessary … look at Singapore’s very general national curriculum statement … and they do very well.’ Such a view fails to look at how different control factors are used in different jurisdictions. Singapore did not need a highly specific national curriculum because it has very detailed state-approved textbooks. But England does not have state-approved textbooks; it is only the national curriculum statements that provide the legally binding requirement. So the government has made an evidence-based decision to move to a more specific curriculum. This insight came from the control-factors analysis, which helps us to understand the system we have, and to move towards the system we need.

Tim Oates is group director of assessment research and development at Cambridge Assessment