After a campaign lasting nearly two years,
FullFact has seen a letter from Andrew Dilnot, Chair of the UK Statistics Authority, which raises concerns about the Department for Education's (DfE) use of flawed figures from the Programme for International Student Assessment (PISA) tests for the UK in 2000.
In December 2010, FullFact published its
misgivings about a DfE press release timed to coincide with the publication of the PISA 2009 test results. This press release was widely churned in the media including by the
BBC, the
Mail and in a
Telegraph blog. FullFact pointed out at the time that the Organisation for Economic Cooperation and Development (OECD), which administers the triennial PISA tests, had warned that the 2000 PISA results for the UK were flawed and should not be used for comparison.
Despite this warning, the DfE defended its use of the figures, and FullFact asked the UK Statistics Authority to intervene. This request was followed by a letter from David Miliband, Schools Minister from June 2002 to December 2004, expressing his concern that Sir Michael Wilshaw, Chief Inspector of Schools, had used the flawed figures in September.
In his reply to Miliband (downloadable
here), Dilnot expresses concern that the OECD's warnings were not highlighted when trend comparisons using the 2000 data were published. He singled out the DfE press release of December 2010 for detailed criticism:
“I was concerned to review the Department for Education’s press release of 7 December 2010 in which headline results for England from the PISA study, alongside relative international rankings, were not accompanied by detailed advice or caveats to help the reader in making comparisons over time, nor were the statistical implications of an increase in the number of reporting countries in later PISA studies noted.”
Dilnot said that readers might misunderstand the trend comparisons if they were presented without these warnings. He noted that the PISA data was contradicted by other evidence, including the Trends in International Mathematics and Science Study (TIMSS), and concluded that ‘it may be difficult to treat an apparent decline in secondary school pupils’ performance as “a statistically robust result”’.
FullFact points out that it isn’t ‘necessarily wrong’ to compare the 2000 PISA results with those of 2009, but any comparison should be accompanied by the OECD caveat. Dilnot wrote:
"These uncertainties and weaknesses are not just a technical footnote; they are themselves an important part of the evidence, and affect interpretation and meaning. League tables and the presentation of international rankings can be statistically problematic, and require clear and careful commentary alongside them."
“Clear and careful commentary” has been missing in the trend comparisons which have been used repeatedly by Secretary of State for Education, Michael Gove, Parliamentary Under-Secretary of State for Schools, Lord Hill, and ex-schools minister, Nick Gibb, to justify the Government’s education policies. At the same time, contradictory evidence, such as TIMSS, has been ignored.
“Clear and careful commentary” has been missing from innumerable media articles which used these trend comparisons despite many commentators, including FullFact and this site, warning that these comparisons are based on flawed data. Instead, these comparisons have been used to paint a bleak picture of UK state education – “plummeting down international league tables” is now accepted as “truth”.
FullFact noticed that the DfE has not used these trend comparisons since the Statistics Authority began looking into the case. However, this does not mean that the comparisons will not be made again in future without the necessary warnings. FullFact says it will be vigilant in spotting any “slips”.
Comments
That point is simply that a country like ours (with all its cultural and financial advantages) should be at or near the TOP of the table. Certainly we should be way above countries like Slovakia on every measure.
It's the plain fact that we aren't clear leaders that's the issue.
The rest is rhetoric.
If you have any evidence that Andrew Dilnot is a windbag, please present it.
http://www.statisticsauthority.gov.uk/about-the-authority/meet-the-board...
http://www.sefi.be/wp-content/uploads/oecd%20pisa%202009%20exec%20summar...
You say that the UK should be "at or near the TOP of the table" - but the Guardian article to which you refer below lists UK as one of the top countries (albeit at the bottom of the top 25). And Michael Gove attended a summit of the top-performing countries (based on PISA 2009 results) in March 2011.
http://www.localschoolsnetwork.org.uk/2011/12/uk-at-education-summit-for...
Dearie me, Janet. That's like saying "there's no country called France, do you mean the French Republic?"
As ever, you declare yourself content with second (or 25th, even 27th) best. Being "above the OECD average" is hardly a big pull for investors, is it?
Some people seem very anxious to rubbish the English state system - even to the extent of sneering at being "above the OECD average".
It is condescending to Slovakia to remark that "we" should be superior to them on every measure, not least because "we" have long ceased to have an Empire to lord it over the rest of the world. What is noteworthy about the 2009 PISA tables is that the UK is positioned more or less alongside other developed Western nations, with perhaps comparable cultural, sociological and economic challenges.
It is worth noting that, without Gove-ian manipulation, PISA doesn't show the United States to be racing ahead of the UK, despite its Charter schools, on which Academies and Free Schools are modelled. So the question has to be why Gove has gambled with our children's future by adopting a policy that has done little to push the US anywhere near the top of international league tables.
Two other points:-
Firstly - two top-scoring participants - Shanghai and Finland - both have strong public school systems. Neither is deregulating its schools and handing control over to private organizations. Different as they are, both achieved academic success by strengthening the public sector, not by deregulation and privatization.
Secondly - just like the US, we have an exceptional and shameful rate of child poverty, which has increased further since the coalition came to power and shows no sign of being reduced, never mind eradicated.
There is plenty of evidence that poverty affects students' readiness to learn. It affects their health, their nutrition, their attendance, and their motivation. Yet this government ignores this by implementing policies which promote inequality and further impoverish the already poor. It is insane to ignore the dire consequences of persistent poverty and it is even more insane to assume that "we" can reach the top of the international achievement tables by closing schools, firing teachers, and hastening privatization.
For all you know he could have taken his line from the Guardian.
UK schools slip down world rankings
OECD study shows that despite comparatively high levels of per-pupil spending, the UK is behind Poland and Norway
The Guardian also noted that:
The UK spends £54,000 per student, while Germany and Hungary achieve a similar performance for £40,000 and £28,000, the study found.
http://www.guardian.co.uk/education/2010/dec/07/uk-schools-slip-world-ra...
Wow. Annual boarding fees at Eton are currently running at c. £32,000 per annum. Are you thinking what I'm thinking?
No, I'm thinking that secondary schools get about £4-5,000 per pupil per year, and that you've a) got your figures wrong or b) need to ask Gove where the best part of £45k is going each year, because it doesn't seem to be going to schools.
Of course, it’s possible that the Prime Minister and Deputy Prime Minister, co-signatories to the foreword, may not have known in November that the 2000 PISA results were flawed. But this is unlikely. Concerns were expressed as early as 2003 in the Oxford Review of Education, which listed problems with PISA 2000 including the response rate in England. The Oxford Review paper concluded:
“These reservations, taken together, are sufficiently weighty for it to be unlikely that anything of value for educational policy in the UK can be learnt from the PISA [2000] survey.”
http://www.oecd.org/education/preschoolandschool/programmeforinternation...
This paper shouldn’t be dismissed as the musings of a lone academic. It was used by the University of Vienna (2007) in its review of “core issues” surrounding PISA tests. The core issue in this case was “serious flaws in the response rates and sampling [in England], which necessarily lead to biased results.” This was the reason the OECD concluded that the 2000 PISA results for the UK were flawed.
There is, therefore, no excuse for politicians or commentators to use the 2000 PISA UK results without explaining the warnings and reservations.
http://www.univie.ac.at/pisaaccordingtopisa/introduction_pisaaccordingto...
Education White Paper available here:
https://www.education.gov.uk/publications/eOrderingDownload/CM-7980.pdf
If the figure is accurate, where indeed does the money go?
I suppose someone has to pay the army of SIPs, education consultants and whathaveyou.
Come off it! The Guardian isn't clear, but it seems far more plausible to me that £54,000 is what it costs to put one child through 11 years of state education.
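The cumulative reading is easy to sanity-check with simple arithmetic. A minimal sketch (the 11-year span is taken from the comment above; the figures are illustrative, not official DfE numbers):

```python
# Sketch: is the Guardian's £54,000 per-student figure annual or cumulative?
# Assumption: it covers roughly 11 years of state schooling, as suggested
# in the comment above. If cumulative, the implied annual spend should be
# close to the £4-5,000 per pupil per year cited earlier in the thread.
total_spend = 54_000   # Guardian's per-student figure, GBP
years = 11             # assumed span of state education

annual = total_spend / years
print(f"Implied annual spend: £{annual:,.0f} per pupil")  # ≈ £4,909
```

That lands squarely in the £4-5,000-a-year range quoted for secondary school funding, which supports reading the Guardian figure as a whole-schooling total rather than an annual one.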
This is a town where the council-run comprehensives have some of the worst results in the country.
Barnsley Council spends an average of £5,912 per pupil a year.
Hope House fees vary by age but average around £4,000 a year.
For London, the costs are somewhat higher but the general point is the same.
In Tottenham, the Wisdom School is applying to become a free school so it can expand to two-form entry. It particularly caters for Turkish-speaking children who were falling behind at the schools provided by Haringey Council.
Children starting at the school are behind for their age but improve at an incredible rate; last year 100 per cent got between A* and C grades in English and maths.
The Haringey average is 48 per cent.
The fees at Wisdom are £6,000 a year.
For the state secondary schools in Haringey the cost to the taxpayer per pupil is over £8,000.
http://cdn.spectator.co.uk/wp-content/uploads/2012/09/schools_sept_12.pdf
Since when has Gove taken a lead from The Guardian? Like the rest of the cabinet, he was too busy trying to do favours for Murdoch and News International and briefing the Daily Mail. And when he wasn't plotting with the corrupt rightwing press, he was busy with the rest of his colleagues dismantling the public sector and further impoverishing the very children whose lives he claims he wants to improve! The hypocrisy is breathtaking.
What rational reason is there for importing the Charter School model when it has done nothing to advance the US up the rankings? The real reason of course is privatization.
According to the DfE website, which publishes no results for the school, it has 75 children spread across 12 school years, four to 11. The headmistress, Irina Tyk, is the Queen of Phonics, being the author of The Butterfly Book... The fees are a maximum of £3,945 a year. This is compared with state spending of more than £5,000 a year for Barnet Council’s primary schools.
Among the independent schools in that borough for the same age range are Gower House, which has no particular religious affiliation and charges £6,155. There are also Muslim and Jewish schools, which tend to charge a bit less.
Peter Meyer is the chief finance officer of the New Model School Company. This has a not-for-profit model, but relies on fees to cover its costs. ‘We have a mix of income groups among our parents,’ he told me. ‘There is a high proportion of self-employed people, which reflects the risk-taking, pioneering nature of the school......
So a good number of private schools charge fees that are comparable with, sometimes much lower than, the equivalent state school spending.
‘The teachers at the Christian schools are willing to work for lower salaries than they could get elsewhere,’ says Morgan. ‘Often premises are pretty modest. I inspected one school which was a couple of houses knocked together. But the classrooms met the regulations.’
http://cdn.spectator.co.uk/wp-content/uploads/2012/09/schools_sept_12.pdf
This may appear to have nothing to do with this thread. However, this statement reveals the importance of putting statistics into context. As Adrian pointed out above, there are only 75 pupils spread across 12 years in this school, therefore we would need to know how many pupils took GCSE in order to make sense of the 100% pass rate. The numbers are likely to be too small to make any sort of judgement about the quality of the school.
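The point about small numbers can be made concrete with a standard binomial confidence interval. A minimal sketch, assuming a purely hypothetical cohort of 10 GCSE entrants (the school's actual entry numbers are not published):

```python
import math

# Sketch: how much does a 100% pass rate from a small cohort tell us?
# The cohort size of 10 below is hypothetical -- the school's real
# number of GCSE entrants is not published.
def wilson_lower(successes: int, n: int, z: float = 1.96) -> float:
    """Lower bound of the 95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = p + z**2 / (2 * n)
    spread = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (centre - spread) / denom

# 10 passes out of 10: the underlying pass rate could plausibly be ~72%
print(f"95% lower bound for 10/10 passes: {wilson_lower(10, 10):.0%}")
```

In other words, a perfect pass rate from ten pupils is statistically consistent with a "true" rate anywhere above roughly 72%, which is why a headline 100% figure from a tiny cohort says little about school quality.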
The Ofsted report for the school said that the education provided was "satisfactory" which in Michael Wilshaw's eyes would mean it needs improvement. And the report noted that "the school recognises that some older pupils are not gaining as much as they could from their education and that some of their parents have concerns."
However, the Ofsted report wasn't actually done by Ofsted but by the Bridge Schools Inspectorate which inspects Christian and Muslim schools. In 2009 it rated Darul Uloom school in Birmingham as an "improving school with a number of positive features". But this school was closed after Channel 4 "Dispatches" revealed racist teaching at the school.
This raises further questions about the reliability of inspections, the out-sourcing of Ofsted inspections and whether independent schools should be inspected by different inspectorates. If schools have to be inspected, then surely they should all be inspected using the same rules?
Which brings us back to the reliability of data and the importance of putting it in context.
http://www.barnsleychristianschool.org.uk/OFSTED.pdf
http://www.cypnow.co.uk/cyp/news/1044979/islamic-school-close-weeks-accu...
My wife has recently become a governor at her primary school and has been landed with pages of bar charts / summary stats for her school's performance. Many of them compare her school to the others in the LEA (e.g. % who achieve KS2 in Reading across the borough). They then give the %FSM, %EAL, %EM and %SEN for all the schools in the borough and split out some of the results by these factors (although only against the borough as a whole). They further place the %FSM on many of the intra-borough comparative bar charts. The only explanation I can think of for this is that the LEA / DfE / Ofsted is trying to give some sort of justification as to why some of the schools in the borough have lower attainment than others, or at least to put the various results in some sort of context. Assuming this is the case, I presume that someone at Ofsted / DfE / IOE etc. has some sort of model (logistic, I guess) that attempts to predict the various attainment percentages based on these factors? If so, does anyone have any idea where I can find the latest version? It would help me give my wife some idea of how the school is really doing against the other schools in her borough. If such a thing is not freely available, does anyone know whether Ofsted or similar would use such a model when rating the school? If not, has anyone any idea why they are focussing so much on these four factors?
Oh, as a highly qualified and experienced mathematician / statistician, I’d like to add that most of the data my wife received is utterly meaningless. So I would have grave doubts about the validity of any model proposed. Moreover, the fact that they are forcing non-experts to draw conclusions from such summary statistics is very dangerous. But that’s where we are. If she is going to be forced to pore over this, it would be really helpful if anyone knew of any “proper” modelling of this kind of data for me to look at.
http://fullfact.org/articles/people_engaged_data_truth_Andrew_Dilnot_sta...
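For what it's worth, a toy sketch of the simplest version of the model described above, an ordinary least-squares fit of attainment on %FSM, might look like this. Every number below is invented for illustration; a real model would use several contextual factors and genuine school-level data:

```python
import numpy as np

# Sketch of the kind of model asked about above: predicting a school's
# attainment % from a contextual factor such as %FSM (free school meals).
# All data points are synthetic illustrations, not real school figures.
fsm = np.array([5.0, 12.0, 20.0, 35.0, 50.0])          # % pupils on FSM
attainment = np.array([88.0, 84.0, 80.0, 72.0, 65.0])  # % reaching KS2 level

# Design matrix with an intercept column; ordinary least-squares fit.
X = np.column_stack([np.ones_like(fsm), fsm])
(intercept, slope), *_ = np.linalg.lstsq(X, attainment, rcond=None)

print(f"attainment ≈ {intercept:.1f} {slope:+.3f} × FSM%")
```

A negative fitted slope is exactly the kind of "context" the borough charts seem to be gesturing at, though, as the commenter says, drawing conclusions from such summary statistics without a properly validated model is hazardous.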
I know of no modelling which predicts outcomes in the way you suggest. The only one I could find is a working paper which discusses pupil mobility and contains some terrifying equations.
http://www.bris.ac.uk/Depts/CMPO/workingpapers/wp189.pdf
And there's this seminar paper with the purpose of advancing understanding of factors associated with low pupil achievement. However, I'm not sure that this is what you're looking for.
http://cep.lse.ac.uk/seminarpapers/17-11-06-KIN.pdf
The Bristol paper is quite good at first glance, although they are concerned with GCSE results. They fit a model which has some of the factors I mentioned, plus some others. I'll print it off and read it on the tube. However, I think we'll both agree with this from their conclusion:
"These results strongly suggest that attempting to summarise school effectiveness in a single overall measure will lead to misleading inferences about schools".
Maybe we could send that to the various consultancies / think-tanks who write on educational matters.
Oh, I know Will Moy from Full Fact quite well now.