“We used to be fourth in the world for our science education, now we are 16th. We used to be seventh…” Blah! Blah! But we now know that Mr Gove misrepresented international test data* - the UK Statistics Authority says so. It’s odd that politicians who are so keen on rigour should be so casual about quoting statistics.
But there’s more to international test results than national pride or shame.
First, they assess performance in only a limited number of subjects, so they shouldn’t be taken as a reflection of a country’s whole education system. A relative drop down the PISA rankings, for example, shouldn’t provoke a knee-jerk reaction about “broken” education systems.
Second, results are sometimes contradictory. For example, the performance of English 14/15-year-olds in PISA 2009 was contradicted by TIMSS 2007. And analysis by the Economist Intelligence Unit found that the UK was sixth in the world and second in the Western world when international test results were combined with literacy and graduation rates. This is a long way from “plummeting” down league tables.
Unfortunately, governments have a tendency to concentrate on relative rankings and to use an apparent fall in league table position to justify policies. The Government, for example, uses the 2009 PISA rankings to underpin its academy conversion/free schools policy and its overhaul of the primary curriculum. It also uses elements of education systems in top-performing PISA countries to support its policies – but only those elements which it wants to impose (see faq above, Is the UK tumbling down the international league tables?, for a case study of how the exam system of just one top-performing country, Singapore, is used to support the Government’s proposed exam changes).
Third, the tests don’t just provide scores and rankings. The organisers investigate a range of other factors such as socio-economic background, teacher pay and how many books are available in the home. However, this wealth of analysis is often overlooked in favour of raw test results and rankings.
Fourth, questions should be asked about the costs as well as the benefits of heading international league tables. The top-performing Pacific Rim countries, for example, have a culture which reveres success and regards failure as shameful. This attitude may push countries to the top, but South Korean teenagers, for example, are the unhappiest in OECD** countries. Research by Prof. Desiree Qin from Michigan University found that when “parents try to push their children to succeed at all cost … [it] can lead to higher levels of depression and anxiety in children.”
Fifth, concentrating disproportionately on the narrow range of subjects tested internationally can distort what is taught. While Shanghai produces the best performers in PISA tests, the government there is promoting a greater emphasis on creativity. It recognises that China is good at imitating but not at creating, while the UK, on the other hand, punches above its weight in creativity. The EBacc threatens this.
International education league tables provide useful data which can inform education systems but should not drive them.
*The Programme for International Student Assessment (PISA), the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) are international education tests taken by a sample of children in participating countries and jurisdictions every three to five years (see faq above for more information).
**Organisation for Economic Cooperation and Development. The OECD administers the three-yearly PISA tests.