DfE fails to refute LSN analysis: DfE data shows academies do not perform better
This week the Department for Education issued an attempted rebuttal of our data comparing results from academies and other state schools, as highlighted in last week’s Observer. It is an odd and rather shallow response. The critique bears so little relation to either the Observer article or the extensive analysis on this site that I wonder if the author actually read them.
DfE: “Much of their analysis was based on a simplistic comparison between all schools and Academies – nearly all of which were previously failing local authority maintained schools. As Academies are having to recover from such a low base such a comparison is nonsensical.”
Not true. In fact the Observer article compares academies and non-academies starting from a similar low base. For schools achieving less than 35% in 2008 (measured as the % of students getting 5 GCSE A-Cs including English & Maths), academies improved by 18 percentage points (from 24% to 42%) and non-academies by 19 percentage points (from 24% to 43%). These figures are given in the Observer article. It is hard to understand how the DfE author missed them.
DfE: “The Observer also claim that Academies with poor results in 2008 have improved no faster than maintained schools with poor results over the same period. But this analysis excludes the most successful academies that opened between 2001-2007 and which had already seen huge improvements.”
Let’s examine that. First, the DfE complains that the analysis doesn’t compare academies with other state schools with low results. Then, in the next paragraph, it complains that we do make that comparison. It is true that this comparison excludes academies that had previously improved, but these are covered in our analysis of well-established academies. There, for a fair comparison, we group the schools into three categories according to level of deprivation (as measured by the % of students on free school meals). Non-academies did 4 percentage points better in two of the three categories, and academies 2 percentage points better in the remaining one.
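The banding approach described above is straightforward to reproduce. Here is a minimal sketch: the school records and the band cut-offs (20% and 35% FSM) are invented for illustration, not taken from the DfE release.

```python
# Illustrative sketch of the FSM banding comparison: group schools into
# three deprivation bands by FSM intake, then compare average results
# within each band. All records and cut-offs here are invented.
from collections import defaultdict
from statistics import mean

schools = [
    # (name, is_academy, fsm_pct, gcse_5acem_pct)
    ("School A", True,  10.0, 60.0),
    ("School B", False, 12.0, 62.0),
    ("School C", True,  25.0, 48.0),
    ("School D", False, 28.0, 50.0),
    ("School E", True,  40.0, 40.0),
    ("School F", False, 45.0, 38.0),
]

def fsm_band(fsm_pct):
    """Deprivation band by % of pupils on free school meals (cut-offs invented)."""
    if fsm_pct < 20:
        return "low FSM"
    if fsm_pct < 35:
        return "mid FSM"
    return "high FSM"

# Collect results per band, split into academies vs other schools.
results = defaultdict(lambda: {"academy": [], "other": []})
for name, is_academy, fsm, gcse in schools:
    results[fsm_band(fsm)]["academy" if is_academy else "other"].append(gcse)

for band in ("low FSM", "mid FSM", "high FSM"):
    print(band,
          "- academies:", mean(results[band]["academy"]),
          "| others:", mean(results[band]["other"]))
```

With the real data, it is the within-band comparison (academies vs others at the same deprivation level) that matters, not the overall averages.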
DfE: “The analysis of progress measures is limited because the league tables consider progress of pupils over the full five years in secondary school. All pupils in Academies will have spent the majority of their time in the underperforming predecessor school not the Academy.”
This is odd phrasing. For it to be true that “all pupils in Academies” spent the majority of their time in the predecessor school, it would have to be the case that no academy was more than two years old. In fact 46 academies have been in existence for five years or more and so the pupils there have spent their entire secondary school period in the Academy. These are the schools that were used in the comparison above, showing long-established academies giving, overall, no better results than similar non-academies.
DfE: “They have used a narrow way of comparing schools, looking purely at the Free School Meal intake. We look at FSM rates, previous results and prior attainment levels of pupils. This means we are genuinely comparing like for like.”
This statement is so untrue it is laughable. Our analysis has looked at comparing schools by FSM intake (here and here), by previous results (here and here), and by prior attainment. None of these comparisons with similar schools show any advantage to academies, and some show them doing worse. In contrast the DfE, despite the claim in the statement, has not used such comparisons. Its main claim for academies, repeated in this statement, is that they grew at twice the rate of non-academies in 2011. It is only able to reach this conclusion because it doesn’t compare academies with similar schools. Indeed of the six sets of facts given in this statement, only one uses any breakdown by comparable schools – and that is by FSM. No mention at all is made of comparison by previous results or prior attainment.
Ok, moving on, let’s take some of the facts quoted in the statement:
DfE: “Academies’ GCSE results improved by nearly twice the level seen across all maintained schools”
We have already shown that this result disappears when academies are compared to non-academies with similar previous results. To explain, the House of Commons analysis (graph on p9) shows the clear relationship between 2010 GCSE levels and the increase from 2010 to 2011. Sadly I’ve not managed to insert a copy of the graph so here’s a summary:
GCSEs below 30% in 2010: growth in 2011 of 8–11 percentage points
30%–50% in 2010: growth of 4–7 percentage points
50%–65% in 2010: growth of 1–3 percentage points
65%–90% in 2010: growth of -1 to 0 percentage points
(Figures here refer to the % achieving 5 GCSE A-Cs including English and Maths)
The difference is dramatic. Take any selection of schools in which a high proportion were in the lower bands in 2010: it will almost certainly show higher growth than a selection that includes schools at 50% and above. This is straightforward selection bias. Academies tended to have exactly such lower results in 2010. If you don’t compare them with similar schools (as the DfE claims to do, but doesn’t), it will look as if academies are performing better when in fact they performed exactly in line with similar schools.
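The selection-bias effect is easy to demonstrate. In this sketch, every school's 2011 growth depends only on its 2010 band (using rough midpoints of the House of Commons figures above) – no school gets any extra boost. The two lists of hypothetical 2010 results are invented: one group skews low, the other does not.

```python
# Sketch of selection bias: identical growth rules for every school,
# yet the group that skews low in 2010 appears to "grow faster".
# The 2010 result lists are invented for illustration.

def growth_for(gcse_2010):
    """Percentage-point growth in 2011 as a function of the 2010 band
    (midpoints of the banded figures quoted above)."""
    if gcse_2010 < 30:
        return 9.5   # midpoint of 8-11
    if gcse_2010 < 50:
        return 5.5   # midpoint of 4-7
    if gcse_2010 < 65:
        return 2.0   # midpoint of 1-3
    return -0.5      # midpoint of -1 to 0

low_base_group = [22, 26, 34, 41, 55]                     # skews low, academy-like
all_schools    = [22, 26, 34, 41, 55, 58, 62, 70, 75, 80]  # full spread

def avg_growth(group):
    return sum(growth_for(g) for g in group) / len(group)

print("low-base group:", avg_growth(low_base_group))  # 6.4
print("all schools:   ", avg_growth(all_schools))     # 3.45
```

With these invented numbers the low-base group appears to grow nearly twice as fast (6.4 vs 3.45 percentage points) even though every school followed exactly the same rule – the same shape as the DfE's headline claim.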
The graph and this data are actually very revealing. First, it means the increase in results in 2011 is very unlikely to be down to easier exams. (If it were, all schools would be expected to improve their results.) Second, it goes against the popular perception that schools with low results continue to have low results. It is this perception that allows the DfE to claim academies are the only way to shift schools from low results. In fact we now know, from these figures, that the schools performing badly on 2010 GCSE results are exactly the schools most likely to have improved quickly.
DfE: “The attainment rate for FSM pupils in Academies improved by 8.0 percentage points between 2009 and 2010. This [is] more than double the improvement rate recorded in comparable schools.”
This is an intriguing selection. The DfE has not taken the latest figures (2010 to 2011) or the longer-term trend (e.g. 2008 to 2011), but one specific year, 2009 to 2010. Is this cherry-picking the year whose data best serves them? Further, they don’t state what they’ve used to find comparable schools. And data on the achievement of FSM students at individual-school level is only available for 2011, so it is not possible to check whether this claim is true. We will put in an FOI request to check it, but my guess is that, as with the growth claim for academies overall, the difference will disappear once the comparison is made with genuinely comparable schools.
DfE: “Higher proportion [of academies are] rated outstanding by Ofsted.”
This is actually true, just. Of academies inspected last year, 18% were found to be Outstanding, compared to just 15% of schools overall. But it is a carefully selected statistic. If we look at the % rated Good or Outstanding, the situation is reversed: 53% of academies achieved it last year, compared to 57% of all state schools inspected that year (Ofsted annual report). Overall 70% of state schools were rated Good or Outstanding at their last inspection. (The figure for last year is lower because Good and Outstanding schools are inspected less often.)
The DfE statement goes on to make a series of claims for the achievement of schools in sponsor-led chains. It notes that Harris schools’ results increased by 13 percentage points between 2010 and 2011, Ark’s by 11, and so on.
We have already addressed the under-performance of chains in this post. It is difficult to assess whether the figures the DfE quotes show genuine improvement, as the chains are heavily dependent on GCSE equivalents – qualifications generally ridiculed by the DfE itself (as in this recent press release), most of which will no longer be allowed to count as GCSEs from 2015. The Harris figure for 5 ACEM falls by 24 percentage points after taking out equivalents, and the Ark figure by 21. (For comparison, the figure for state schools as a whole falls by just 6.)
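The adjustment involved is simple subtraction of the equivalents contribution from the headline rate. A minimal sketch – the headline percentages here are invented for illustration, while the percentage-point drops are those quoted above:

```python
# Sketch of stripping GCSE equivalents out of headline 5 ACEM rates.
# Headline figures are invented; the drops (24, 21 and 6 percentage
# points) are the ones quoted in the text.
headline_with_equivalents = {"Harris": 65.0, "Ark": 60.0, "All state schools": 58.0}
drop_when_equivalents_removed = {"Harris": 24.0, "Ark": 21.0, "All state schools": 6.0}

for name, pct in headline_with_equivalents.items():
    gcse_only = pct - drop_when_equivalents_removed[name]
    print(f"{name}: {pct}% with equivalents -> {gcse_only}% GCSE-only")
```

The point is the relative size of the drops: the chains lose three to four times as much as the state-school average once equivalents are excluded.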
Admittedly I haven’t covered every claim in the DfE statement. It does refer to the LSE and NAO research, but both relate to earlier exam results. Our claim, yet to be refuted or even seriously challenged by the DfE or any of the academy supporters, is that the 2011 detailed DfE data on individual school performance gives no support to the claim that the 249 sponsor-led academies performed better than a comparable set of state schools. Indeed on many criteria they performed worse.
It is clear that over the past few years different local authorities have taken different routes to deal with schools they see as underperforming. Some have chosen to convert schools to academies, and have secured substantial amounts of government funding to help them. Others have chosen to improve them without converting to academies, despite this meaning less funding. What the 2011 DfE data reveals is that those who chose the non-academy route saw their schools improve as much as, and sometimes more than, those who chose the academy route – despite the academy route’s higher levels of funding.
The DfE statement also claims that academies have an effect on other schools, and academy supporters may try to claim this is why other schools improve as much. It’s very difficult to test this claim in the DfE data. Anecdotally, however, it’s easy to find counter-examples. Arguably the best-performing local authority in the country is Tower Hamlets (where 22% of low prior-attainment students achieved 5 ACEM, the highest of any authority). In this borough none of the schools had become academies by the 2011 exams, so it clearly isn’t academies driving the improvements. It could just be that most local authorities are rather good at supporting schools and helping them improve – an element that may be hugely missed in the new academy-dominated educational landscape.
Data sources: The DfE data release can be obtained here:
Some people have found it difficult to download this file. If you have difficulty, feel free to email me on firstname.lastname@example.org and I will send you a copy of the file. The above analysis was generally done in Excel with Pivot tables.
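For anyone who prefers code to Excel, the same pivot-table approach can be sketched in pandas. The column names and values here are invented for illustration; the DfE release uses its own field names.

```python
# A pandas equivalent of the Excel pivot tables mentioned above.
# Column names and values are invented, not the DfE release's fields.
import pandas as pd

df = pd.DataFrame({
    "school_type": ["academy", "academy", "maintained", "maintained"],
    "fsm_band":    ["high", "low", "high", "low"],
    "gcse_5acem":  [42.0, 58.0, 43.0, 60.0],
})

# Average 5 A-C (EM) rate by FSM band, academies vs maintained schools.
pivot = pd.pivot_table(df, values="gcse_5acem",
                       index="fsm_band", columns="school_type",
                       aggfunc="mean")
print(pivot)
```

As in the Excel version, the comparison to read off is across each row: academies vs maintained schools within the same FSM band.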