“Our major reforms, which are improving the lives of children and young people, are underpinned by a substantial programme of research and evaluation.”
Anonymous Department for Education (DfE) spokeswoman, 13 September 2013
This statement was in response to a lecture by Professor Robert Coe, Durham University, who said the Department for Education “misunderstands [and] misuses evidence”.
Here are 10 examples showing how the DfE has misunderstood, misrepresented or misused evidence:
1 Evidence cited to show that academies raise standards does not do so*.
2 Evidence about floor standards raising results doesn’t show what the DfE press release claims**.
3 Evidence about the UK “plummeting” down league tables was based on flawed figures and shouldn’t have been used for comparison.
4 Evidence that converter academies drive up standards didn’t exist: Ofsted found good schools shared several characteristics, and academy status wasn’t one of them.
5 Evidence to support Education Secretary Michael Gove’s assertion that the first free schools were outperforming all other schools was based on too small a sample (small sample size was one of the criticisms Professor Coe levelled at the DfE) and was contradicted by the data in any case.
6 Evidence used to support Gove’s claim that the number of academies in the capital caused the success of the London Challenge actually said academy status had little to do with it.
7 Evidence which showed teenagers’ lack of knowledge of historical facts turned out to be based on dodgy surveys.
8 Evidence from Sweden that their free schools raised results was accompanied by a warning from the authors that the Swedish free school programme couldn’t be transferred to other countries’ school systems.
9 Evidence about the success of US charter schools shows only modest improvement and is contradicted by the latest test results from New York.
10 Evidence showing a relatively positive picture of Labour’s sponsored academies, which were mainly previously poor-performing schools, can’t be used to justify converting other types of schools to academies, said the authors. But their warning was ignored.
So, in these ten oft-repeated claims, the “substantial programme of research and evaluation” turns out to be rather insubstantial or to have been misrepresented.
*See the FAQ above: “Do academies get better results, or improve more quickly, than other state schools?”
The Academies Commission (2013) found that many previously underperforming non-academy schools in poor areas did just as well as similar academies (see here).
**I have issued a second Freedom of Information request asking for the evidence about floor standards and raised results. It is now overdue.