Should international test results be used with caution?


Yes. It’s important that international education test results are kept in proportion. They test only a limited number of subjects at set age groups every three to five years.

Unfortunately, governments have a tendency to concentrate only on relative rankings (ie league table position) and to use an apparent fall in league table position to justify policies. The Government uses PISA rankings to find “evidence” from top-performing PISA countries to support Government policy (see case study below).

The test results are sometimes contradictory. For example, the performance of English 14/15 year-olds in PISA 2009 was contradicted by TIMSS 2007. In top-performing Finland, 15 year-olds outscored most of the world in PISA 2009, but in TIMSS 2011 (maths) Finnish 14 year-olds were ranked as average performers along with England. The below-average performance of English 16-24 year-olds in literacy/numeracy in the OECD Adult Skills Survey 2013 is contradicted by the performance of UK 15 year-olds in PISA 2012, which showed UK pupils performing at the OECD average for reading and maths (and above average in science).

These international tests provide an important snapshot – but they should not be used as the sole justification for policies. Nor should they be seen as a judgement on a country’s whole education system. For example, top-performing South Korean teenagers are the unhappiest among OECD countries, while Finland’s top-performing system appears less effective in stretching the most able. The Learning Curve found that there is no magic bullet for school improvement, so picking bits and pieces from countries which perform well in international tests and transposing them to very different countries is unlikely to improve performance.

The tests don’t just provide scores and rankings. The organisers research and analyse a range of other factors such as socio-economic background, maternal education, private and state schools and how many books are available in the home. However, this wealth of analysis is often ignored in favour of the raw test results and rankings.

The Sutton Trust (2013) warned that jumping to conclusions about a country’s education system based solely on league table rankings could be misleading.

Case study: How the exam system of just one top performer can be used as “evidence” to support the Government’s proposed changes to England’s exam system.

Singapore is a top performer in international education league tables. The Government argues that Singapore’s “rigorous” exam system (O levels at 16) is the cause of Singapore’s success, and uses this to justify its overhaul of English exams at 16+. However, top-performing Finland doesn’t test pupils until age 18/19, when they take a minimum of four tests. In high-performing South Korea, pupils receive either a High School Certificate or a Vocational High School Certificate at age 18+. Only those South Korean pupils wishing to attend university or junior college take the College Scholastic Aptitude Test (CSAT). Top-performing Hong Kong has just replaced its O and A level type exams with the Hong Kong Diploma of Secondary Education, comprising four core subjects and two to three elective subjects, designed to be taken after six years of secondary school.

Singapore is out of step with other high-performing countries, but because it is in step with the Government’s ideas about examinations it is used to promote the Government’s policies. (For more information about the exam systems in several other countries, see the more detailed FAQ.)

Updated 6 December 2013