Stories + Views


Posted on 16/10/12


Statistics watchdog expresses concern about DfE use of the PISA 2000 figures

After a campaign lasting nearly two years, FullFact has seen a letter from Andrew Dilnot, Chair of the UK Statistics Authority, which raises concerns about the Department for Education’s (DfE) use of the flawed figures for the UK from the 2000 round of the Programme for International Student Assessment (PISA) tests.

In December 2010, FullFact published its misgivings about a DfE press release timed to coincide with the publication of the PISA 2009 test results. This press release was widely churned in the media, including by the BBC, the Mail and in a Telegraph blog. FullFact pointed out at the time that the Organisation for Economic Co-operation and Development (OECD), which administers the triennial PISA tests, had warned that the 2000 PISA results for the UK were flawed and should not be used for comparison.

Despite this warning, the DfE defended its use of the figures, and FullFact asked the UK Statistics Authority to intervene. This request was followed by a letter from David Miliband, Schools Minister from June 2002 to December 2004, expressing his concern about the use of the flawed figures in September by Sir Michael Wilshaw, Chief Inspector of Schools.

Andrew Dilnot, Chair of the UK Statistics Authority, in his reply to Miliband (downloadable here), expresses concern that OECD warnings were not highlighted when trend comparisons using the 2000 data were published. He singled out the DfE press release of December 2010 for detailed criticism:

“I was concerned to review the Department for Education’s press release of 7 December 2010 in which headline results for England from the PISA study, alongside relative international rankings, were not accompanied by detailed advice or caveats to help the reader in making comparisons over time, nor were the statistical implications of an increase in the number of reporting countries in later PISA studies noted.”

Dilnot said that readers might misunderstand the trend comparisons if they were presented without these warnings. He noted that the PISA data was contradicted by other evidence, including the Trends in International Mathematics and Science Study (TIMSS), and concluded that ‘it may be difficult to treat an apparent decline in secondary school pupils’ performance as “a statistically robust result”’.

FullFact points out that it isn’t ‘necessarily wrong’ to compare the 2000 PISA results with those of 2009, but any comparison should be accompanied by the OECD caveat. Dilnot wrote:

“These uncertainties and weaknesses are not just a technical footnote; they are themselves an important part of the evidence, and affect interpretation and meaning. League tables and the presentation of international rankings can be statistically problematic, and require clear and careful commentary alongside them.”

“Clear and careful commentary” has been missing in the trend comparisons which have been used repeatedly by Secretary of State for Education, Michael Gove, Parliamentary Under-Secretary of State for Schools, Lord Hill, and ex-schools minister, Nick Gibb, to justify the Government’s education policies. At the same time, contradictory evidence, such as TIMSS, has been ignored.

“Clear and careful commentary” has been missing from innumerable media articles which used these trend comparisons despite many commentators, including FullFact and this site, warning that these comparisons are based on flawed data. Instead, these comparisons have been used to paint a bleak picture of UK state education – “plummeting down international league tables” is now accepted as “truth”.

FullFact noticed that the DfE has not used these trend comparisons since the Statistics Authority began looking into the case. However, this does not mean that the comparisons will not be made again in future without the necessary warnings. FullFact says it will be vigilant in spotting any “slips”.

 


Comments, replies and queries

  1. Ricky Tarr says:

    The comparison over time may be problematic, but the main point the DfE makes in relation to PISA requires no comparison over time and does not rely on any claim that we are “plummeting down the league tables”.

    That point is simply that a country like ours (with all its cultural and financial advantages) should be at or near the TOP of the table. Certainly we should be way above countries like Slovakia on every measure.

    It’s the plain fact that we aren’t clear leaders that’s the issue.

    The rest is rhetoric.

    • Ricky – are you accusing Andrew Dilnot, Chair of the UK Statistics Authority – who was awarded the CBE in 2000 for services to economics and economic policy, and who was a founding presenter of BBC Radio 4’s series on numbers and statistics, “More or Less” – of mere “rhetoric” because he has expressed concerns about the DfE’s use of the PISA 2000 figures and concluded that ‘it may be difficult to treat an apparent decline in secondary school pupils’ performance as “a statistically robust result”’?

      If you have any evidence that Andrew Dilnot is a windbag, please present it.

      http://www.statisticsauthority.gov.uk/about-the-authority/meet-the-board/andrew-dilnot-cbe/index.html

    • Ricky – there is no country called Slovakia in the 2009 PISA table. Are you referring to Slovenia? Or could it be the Slovak Republic? Both of these countries do appear in the tables, but the UK was ranked above them on the reading scale. In Maths, Slovenia was above the OECD average – the UK and the Slovak Republic were at the OECD average. In Science, both Slovenia and the United Kingdom were above the OECD average.

      http://www.sefi.be/wp-content/uploads/oecd%20pisa%202009%20exec%20summary.pdf

      You say that the UK should be “at or near the TOP of the table” – but the Guardian article to which you refer below lists UK as one of the top countries (albeit at the bottom of the top 25). And Michael Gove attended a summit of the top-performing countries (based on PISA 2009 results) in March 2011.

      http://www.localschoolsnetwork.org.uk/2011/12/uk-at-education-summit-for-high-performing-countries-%e2%80%93-why-no-publicity/

      • Ricky Tarr says:

        Ricky – there is no country called Slovakia … could it be the Slovak Republic?

        Dearie me, Janet. That’s like saying “there’s no country called France, do you mean the French Republic?”

        As ever, you declare yourself content with second (or 25th, even 27th) best. Being “above the OECD average” is hardly a big pull for investors, is it?

        • Dearie me, Ricky – it doesn’t follow that I am “content” with the UK ranking. However, as Andrew Dilnot said – the rank needs putting in context. He pointed out that the significance of more countries entering the PISA tests had not been noted by the DfE. Nor did the DfE pay sufficient attention to the Trends in International Mathematics and Science Study (TIMSS) results, which contradicted those of PISA.

          Some people seem very anxious to rubbish the English state system – even to the extent of sneering at being “above the OECD average”.

  2. Thanks for posting this Janet. The point has always been why Michael Gove and the DfE felt it necessary to misrepresent data in order to hoodwink a gullible electorate and pander to parental fears about declining standards. This suggests of course that the school system, although obviously in need of improvement, was nowhere near as desperate or broken as he – or his supporters who perpetrated the untruth – liked to make out.

    It is condescending to Slovakia to remark that “we” should be superior to them on every measure, not least because “we” have long ceased to have an Empire to lord it over the rest of the world. What is noteworthy about the 2009 PISA tables is that the UK is positioned more or less alongside other developed Western nations, with perhaps comparable cultural, sociological and economic challenges.

    It is worth noting that, without Gov-ian manipulation, PISA doesn’t show the United States to be racing ahead of the UK, despite its Charter schools, on which Academies and Free Schools are modelled, so the question has to be why Gove has gambled with our children’s future by adopting a policy that has done little to push the US anywhere near the top of international league tables.

    Two other points:-

    Firstly – two top-scoring participants – Shanghai and Finland – both have strong public school systems. Neither is deregulating their schools and handing control over to private organizations. Different as they are, they achieved academic success by strengthening the public sector, not by deregulation and privatization.

    Secondly – just like the US, we have an exceptional and shameful rate of child poverty, further increased since the coalition came to power and with no signs of being decreased never mind eradicated.

    There is plenty of evidence that poverty affects students’ readiness to learn. It affects their health, their nutrition, their attendance, and their motivation. Yet this government ignores this by implementing policies which promote inequality and further impoverish the already poor. It is insane to ignore the dire consequences of persistent poverty and it is even more insane to assume that “we” can reach the top of the international achievement tables by closing schools, firing teachers, and hastening privatization.

  3. Ricky Tarr says:

    Why point the accusing finger at Michael Gove?

    For all you know he could have taken his line from the Guardian.


    UK schools slip down world rankings

    OECD study shows that despite comparatively high levels of per-pupil spending, the UK is behind Poland and Norway

    The Guardian also noted that:

    The UK spends £54,000 per student, while Germany and Hungary achieve a similar performance for £40,000 and £28,000, the study found.

    http://www.guardian.co.uk/education/2010/dec/07/uk-schools-slip-world-rankings

    Wow. Annual boarding fees at Eton are currently running at c. £32,000 per annum. Are you thinking what I’m thinking?

    • leonard james says:

      No, I’m thinking that secondary schools get about £4–5,000 per pupil per year and you’ve a) got your figures wrong or b) need to ask Gove where the best part of £45k is going each year cos it doesn’t seem to be going to schools.

      • Ricky Tarr says:

        They’re PISA/OECD figures, Leonard; not mine. At least, that’s what the Guardian says; not the Mail or the Telegraph.

        If the figure is accurate, where indeed does the money go?

        I suppose someone has to pay the army of SIPs, education consultants and whathaveyou.

        • leonard james says:

          Come off it! The Guardian isn’t clear, but it seems far more plausible to me that £54,000 is what it costs to put one child through 11 years of state education.
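The arithmetic behind this reading is easy to check. The £54,000 figure is the one the Guardian reports; the 11-year span (roughly ages 5 to 16) is an assumption taken from the comment above, not from the OECD report itself:

```python
# Sanity check: read the Guardian's £54,000 "per student" figure as the
# cumulative cost of a child's schooling, not an annual one.
# ASSUMPTION: an 11-year school career, as suggested in the comment above.
total_per_student = 54_000      # GBP, Guardian-reported OECD figure
years_of_schooling = 11

annual = total_per_student / years_of_schooling
print(f"Implied annual spend per pupil: ~£{annual:,.0f}")  # ~£4,909
```

That lands inside the £4–5,000 a year quoted for secondary schools, which is at least consistent with the cumulative reading.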

          • Sam Bickerstaff says:

            ‘In Barnsley, there is an independent Christian school, Hope House. Last year 100 per cent of its pupils achieved at least five good GCSE passes.

            This is a town where the council-run comprehensives have some of the worst results in the country.

            Barnsley Council spends an average of £5,912 per pupil a year.

            Hope House fees vary by age but average around £4,000 a year.

            For London, the costs are somewhat higher but the general point is the same.

            In Tottenham, the Wisdom School is applying to become a free school so it can expand to two-form entry. It particularly caters for Turkish-speaking children who were falling behind at the schools provided by Haringey Council.

            Children starting at the school are behind for their age but improve at an incredible rate; last year 100 per cent got between A* and C grades in English and maths.

            The Haringey average is 48 per cent.

            The fees at Wisdom are £6,000 a year.

            For the state secondary schools in Haringey the cost to the taxpayer per pupil is over £8,000.’

            http://cdn.spectator.co.uk/wp-content/uploads/2012/09/schools_sept_12.pdf

    • The Government was using the “falling down international league tables” argument to justify Coalition education policies before December 2010 when the PISA 2009 results were published. The foreword to the Education White Paper, November 2010, compared the 2006 PISA results with the 2000 ones to say that “our country” (meaning England) had fallen down international league tables and, therefore, radical reform was needed.

      Of course, it’s possible that the Prime Minister and Deputy Prime Minister, co-signatories to the foreword, may not have known in November that the 2000 PISA results were flawed. But this is unlikely. Concerns were expressed as early as 2003 in the Oxford Review of Education, which listed problems with PISA 2000 including the response rate in England. The Oxford Review paper concluded:

      “These reservations, taken together, are sufficiently weighty for it to be unlikely that anything of value for educational policy in the UK can be learnt from the PISA [2000] survey.”

      http://www.oecd.org/education/preschoolandschool/programmeforinternationalstudentassessmentpisa/33680693.pdf

      This paper shouldn’t be dismissed as the musings of a lone academic. It was used by the University of Vienna (2007) in its review of “core issues” surrounding PISA tests. The core issue in this case was “serious flaws in the response rates and sampling [in England], which necessarily lead to biased results.” This was the reason the OECD gave for concluding that the 2000 PISA results for the UK were flawed.

      There is, therefore, no excuse for politicians or commentators to use the 2000 PISA UK results without explaining the warnings and reservations.

      http://www.univie.ac.at/pisaaccordingtopisa/introduction_pisaaccordingtopisa.pdf

      Education White Paper available here:

      https://www.education.gov.uk/publications/eOrderingDownload/CM-7980.pdf

  4. I don’t think either you or Michael Gove can blame the Guardian for his decision to deliberately misrepresent PISA’s findings. Surely he used some “rigour” to draw his own conclusion yet decided to cherry-pick and distort? I don’t think this ignorance was his very own Magna Carta moment (I bet Eton are very proud of the rigour of their teaching of a fundamental aspect of English history to Mr. Cameron).

    Since when has Gove taken a lead from The Guardian? Like the rest of the cabinet, he was too busy trying to do favours for Murdoch and News International and briefing the Daily Mail. And when he wasn’t plotting with the corrupt rightwing press, he was busy with the rest of his colleagues dismantling the public sector and further impoverishing the very children whose lives he claims he wants to improve! The hypocrisy is breathtaking.

    What rational reason is there for importing the Charter School model when it has done nothing to advance the US up the rankings? The real reason of course is privatization.

  5. Adrian Elliott says:

    ‘In Barnsley, there is an independent Christian school, Hope House. Last year 100 per cent of its pupils achieved at least five good GCSE passes.’

    According to the DfE website, which publishes no results for the school, it has 75 children – that is, spread across 12 school years.

    • Sam Bickerstaff says:

      Over in Edgware there is the non-denominational Holland House School, for pupils aged four to 11. The headmistress, Irina Tyk, is the Queen of Phonics, being the author of The Butterfly Book … the fees are a maximum of £3,945 a year. This is compared with state spending of more than £5,000 a year for Barnet Council’s primary schools.

      Among the independent schools in that borough for the same age range are Gower House, which has no particular religious affiliation and charges £6,155. There are also Muslim and Jewish schools, which tend to charge a bit less.

      Peter Meyer is the chief finance officer of the New Model School Company. This has a not-for-profit model, but relies on fees to cover its costs. ‘We have a mix of income groups among our parents,’ he told me. ‘There is a high proportion of self-employed people, which reflects the risk-taking, pioneering nature of the school …’

      So a good number of private schools charge fees that are comparable with, sometimes much lower than, the equivalent state school spending.

      ‘The teachers at the Christian schools are willing to work for lower salaries than they could get elsewhere,’ says Morgan. ‘Often premises are pretty modest. I inspected one school which was a couple of houses knocked together. But the classrooms met the regulations.’

      http://cdn.spectator.co.uk/wp-content/uploads/2012/09/schools_sept_12.pdf

  6. “In Barnsley, there is an independent Christian school, Hope House. Last year 100% of its pupils achieved at least five good GCSE passes.”

    This may appear to have nothing to do with this thread. However, this statement reveals the importance of putting statistics into context. As Adrian pointed out above, there are only 75 pupils spread across 12 years in this school, so we would need to know how many pupils took GCSEs in order to make sense of the 100% pass rate. The numbers are likely to be too small to make any sort of judgement about the quality of the school.

    The Ofsted report for the school said that the education provided was “satisfactory” which in Michael Wilshaw’s eyes would mean it needs improvement. And the report noted that “the school recognises that some older pupils are not gaining as much as they could from their education and that some of their parents have concerns.”

    However, the Ofsted report wasn’t actually done by Ofsted but by the Bridge Schools Inspectorate which inspects Christian and Muslim schools. In 2009 it rated Darul Uloom school in Birmingham as an “improving school with a number of positive features”. But this school was closed after Channel 4 “Dispatches” revealed racist teaching at the school.

    This raises further questions about the reliability of inspections, the out-sourcing of Ofsted inspections and whether independent schools should be inspected by different inspectorates. If schools have to be inspected, then surely they should all be inspected using the same rules?

    Which brings us back to the reliability of data and the importance of putting it in context.

    http://www.barnsleychristianschool.org.uk/OFSTED.pdf

    http://www.cypnow.co.uk/cyp/news/1044979/islamic-school-close-weeks-accusations-extremist-teaching

  7. This may not be best place to ask this but since there is a general discussion on school statistics going, I thought I’d put it here. Mods, feel free to move if you wish.

    My wife has recently become a governor at her primary school and has been landed with pages of bar charts / summary stats for her school’s performance. Many of them compare her school to the others in the LEA (e.g. % who achieve KS2 in Reading across the borough). They then give the %FSM, %EAL, %EM and %SEN for all the schools in the borough and split out some of the results by these factors (although only against the borough as a whole). They further place the %FSM on many of the intra-borough comparative bar charts.

    The only explanation I can think of for this is that the LEA / DfE / Ofsted is trying to give some sort of justification as to why some of the schools in the borough have lower attainment than others, or at least put the various results in some sort of context. Assuming this is the case, I presume that someone(s) at Ofsted / DfE, IOE etc. has some sort of model (logistic, I guess) that attempts to predict the various attainment % based on these factors? If so, does anyone have any idea where I can find the latest version? It would help me give my wife some idea of how the school is really doing against the other schools in her borough. If such a thing is not freely available, does anyone know if Ofsted or similar would use such a model when rating the school? If not, has anyone any idea why they are focussing so much on these four factors?

    Oh, as a highly qualified and experienced mathematician / statistician, I’d like to add that most of the data my wife received is utterly meaningless. So I would have grave doubts about the validity of any model proposed. Moreover, the fact that they are forcing non-experts to draw conclusions from such summary statistics is very dangerous. But that’s where we are. If she is going to be forced to pore over this, it would be really helpful if anyone knew of any “proper” modelling of this kind of data for me to look at.
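For what it’s worth, the kind of contextual model being asked about can be sketched as a plain regression of a school’s attainment percentage on factors like %FSM, %EAL and %SEN. This is only an illustrative sketch on invented numbers – the six schools and their figures below are made up, and nothing here claims to be the model (if any) that the DfE or Ofsted actually uses:

```python
import numpy as np

# Illustrative only: invented school-level data, not real DfE/Ofsted figures.
# Columns: %FSM, %EAL, %SEN for six hypothetical schools in one borough.
X = np.array([
    [10,  5,  8],
    [35, 20, 15],
    [22, 40, 10],
    [ 5,  2,  6],
    [48, 30, 20],
    [15, 10, 12],
], dtype=float)
# % of pupils reaching the expected KS2 reading level at each school (invented).
y = np.array([82, 55, 68, 90, 45, 76], dtype=float)

# Ordinary least squares with an intercept: y ~ b0 + b1*FSM + b2*EAL + b3*SEN
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

predicted = A @ coef
residuals = y - predicted   # positive residual = school beats its "expected" score
print("coefficients:", coef.round(2))
print("residuals:", residuals.round(1))
```

The residuals are the interesting part for a governor: a school with a large positive residual is doing better than its intake would predict, which is a fairer comparison than a raw league-table position.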

  8. Thank you, Janet.
    The Bristol paper is quite good at first glance, although they are concerned with GCSE results. They fit a model which has some of the factors I mentioned plus some others. I’ll print it off and read it on the tube. However, I think we’ll both agree with this from their conclusion:
    “These results strongly suggest that attempting to summarise school effectiveness in a single overall measure will lead to misleading inferences about schools”.
    Maybe we could send that to the various consultancies / think-tanks who write on educational matters.
    Oh, I know Will Moy from Full Fact quite well now.

  9. […] Chair of UK Statistics Authority tells Gove to stop lying about Education  (here) […]

  10. […] for 2000. But the “plummeting” down league tables propaganda was relentlessly plugged until the UK Statistics Authority intervened last […]


  12. […] down (here) 96.  Chair of UK Statistics Authority tells Gove to stop lying about Education  (here) 97.  Chair of UK Statistics Authority tells Cameron to stop telling people he cut the debt […]

