Posted on 04/03/12

DfE fails to refute LSN analysis: DfE data shows academies do not perform better

This week the Department for Education issued an attempted rebuttal of our data comparing results from academies and other state schools, as highlighted in last week’s Observer. It is an odd and rather shallow response. The critique bears so little relation to either the Observer article or the extensive analysis on this site that I wonder if the author actually read them.

DfE: “Much of their analysis was based on a simplistic comparison between all schools and Academies – nearly all of which were previously failing local authority maintained schools. As Academies are having to recover from such a low base such a comparison is nonsensical.”

Not true. In fact, the Observer article compares academies and non-academies starting from a similar low base. For schools achieving less than 35% in 2008 (measured as the % of students getting 5 GCSE A-Cs including English & Maths), academies improved by 18 percentage points (from 24% to 42%) and non-academies by 19 percentage points (from 24% to 43%). These figures are given in the Observer article; it is hard to understand how the DfE author missed them.
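The like-for-like arithmetic is worth spelling out, because the DfE's own framing conflates relative growth with percentage-point change. A minimal sketch, using only the figures quoted in the Observer article:

```python
# Percentage-point change for schools starting from a similar low base
# (below 35% in 2008). Figures are those quoted in the Observer article:
# the share of pupils achieving 5 GCSE A-Cs including English & Maths.

def pp_change(before, after):
    """Change in percentage points (a simple difference, not relative growth)."""
    return after - before

academies = pp_change(24, 42)       # academies: 24% in 2008 -> 42% in 2011
non_academies = pp_change(24, 43)   # non-academies: 24% -> 43%

print(academies, non_academies)     # 18 19
```

On a matched starting point, the two groups are effectively indistinguishable, with non-academies a point ahead.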

DfE: “The Observer also claim that Academies with poor results in 2008 have improved no faster than maintained schools with poor results over the same period. But this analysis excludes the most successful academies that opened between 2001-2007 and which had already seen huge improvements.”

Let’s examine that. First, the DfE complains that the analysis doesn’t compare academies with other state schools starting from low results. Then, in the next paragraph, it complains that we do make exactly that comparison. It is true that this comparison excludes academies that had already improved, but those are covered in our analysis of well-established academies. There, for a fair comparison, we group the schools into three categories according to level of deprivation (measured by the % of students on free school meals). Non-academies did 4 percentage points better in two of the categories, and academies did 2 percentage points better in the third.

DfE: “The analysis of progress measures is limited because the league tables consider progress of pupils over the full five years in secondary school. All pupils in Academies will have spent the majority of their time in the underperforming predecessor school not the Academy.”

This is odd phrasing. For it to be true that “all pupils in Academies” spent the majority of their time in the predecessor school, no academy could be more than two years old. In fact, 46 academies have been open for five years or more, so their pupils have spent their entire secondary schooling in the academy. These are the schools used in the comparison above, which shows long-established academies producing, overall, no better results than similar non-academies.

DfE: “They have used a narrow way of comparing schools, looking purely at the Free School Meal intake. We look at FSM rates, previous results and prior attainment levels of pupils. This means we are genuinely comparing like for like.”

This statement is so untrue it is laughable. Our analysis has compared schools by FSM intake (here and here), by previous results (here and here), and by prior attainment. None of these comparisons with similar schools shows any advantage to academies, and some show them doing worse. In contrast, despite the claim in this statement, the DfE has not used such comparisons. Its main claim for academies, repeated here, is that their results grew at twice the rate of non-academies in 2011. It can only reach this conclusion because it does not compare academies with similar schools. Indeed, of the six sets of facts given in the statement, only one uses any breakdown by comparable schools, and that is by FSM. No mention at all is made of comparison by previous results or prior attainment.

OK, moving on, let’s take some of the facts quoted in the statement:

DfE: “Academies’ GCSE results improved by nearly twice the level seen across all maintained schools”

We have already shown that this result disappears when academies are compared to non-academies with similar previous results. To explain: the House of Commons analysis (graph on p9) shows a clear relationship between 2010 GCSE levels and the increase from 2010 to 2011. Sadly, I’ve not managed to insert a copy of the graph, so here’s a summary:

GCSEs below 30% in 2010: growth in 2011 of 8-11 percentage points
30%-50% in 2010: growth of 4-7 percentage points
50%-65% in 2010: growth of 1-3 percentage points
65%-90% in 2010: growth of -1 to 0 percentage points

(Figures here refer to the % achieving 5 GCSE A-Cs including English and Maths)

The difference is dramatic. Any selection of schools in which a high proportion sat in the lower bands in 2010 is likely to show faster growth than a selection that also includes schools at 50% and above. It is straightforward selection bias. Academies tended to have exactly such lower results in 2010. If you don’t compare them with similar schools (as the DfE claims to do, but doesn’t), it will look as though academies are performing better, when in fact they performed precisely in line with similar schools.

The graph and this data are actually very revealing. First, they mean the increase in results in 2011 is very unlikely to be down to easier exams (if it were, all schools would be expected to improve their results). They also go against the popular perception that schools with low results continue to have low results. It is this perception that allows the DfE to claim academies are the only way to shift schools away from low results. In fact, we now know from these figures that the schools performing badly on 2010 GCSE results are exactly the schools most likely to have improved quickly.
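The selection-bias effect is easy to demonstrate. A toy sketch, where every school grows exactly in line with its 2010 band (band mid-points taken from the summary above) and the school lists themselves are invented for illustration:

```python
# Toy illustration of the selection bias described above: every school here
# improves exactly in line with its 2010 band, yet a group weighted towards
# low-2010 schools (like the academies) shows much faster average growth.

def band_growth(gcse_2010):
    """Typical 2011 growth in percentage points (mid-point of each band)."""
    if gcse_2010 < 30:
        return 9.5    # 8-11 band
    if gcse_2010 < 50:
        return 5.5    # 4-7 band
    if gcse_2010 < 65:
        return 2.0    # 1-3 band
    return -0.5       # -1 to 0 band

def mean_growth(results_2010):
    return sum(band_growth(r) for r in results_2010) / len(results_2010)

# Invented 2010 results: academies clustered at the low end,
# "all schools" spread across the full range.
academies = [25, 28, 35, 40]
all_schools = [25, 35, 55, 60, 70, 80]

print(mean_growth(academies))    # 7.5
print(mean_growth(all_schools))  # 3.0
```

No school in either list outperforms its band, yet the academy group appears to grow more than twice as fast, which is exactly the headline comparison the DfE relies on.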

DfE: “The attainment rate for FSM pupils in Academies improved by 8.0 percentage points between 2009 and 2010. This [is] more than double the improvement rate recorded in comparable schools.”

This is an intriguing selection. The DfE has not taken the latest figures (2010 to 2011) or the longer-term trend (e.g. 2008 to 2011), but one specific year, 2009 to 2010. Is this cherry-picking the year whose data best serves them? Further, they do not state what they used to find comparable schools. And data on the achievement of FSM students at individual-school level is only available for 2011, so it is not possible to check whether this claim is true. We will put in an FOI request to check it, but my guess is that, as with the growth claim for academies overall, the difference will disappear once the comparison is made with genuinely comparable schools.

DfE: “Higher proportion [of academies are] rated outstanding by Ofsted.”

This is actually true, just. Of academies inspected last year, 18% were found to be Outstanding, compared to just 15% of schools overall. But it is a carefully selected statistic. If we look at the % rated Good or Outstanding, the situation is reversed: 53% of academies achieved it last year, compared to 57% of all state schools inspected that year (Ofsted annual report). Overall, 70% of state schools were rated Good or Outstanding at their last inspection. (The figure for last year is lower because Good and Outstanding schools are inspected less often.)

The DfE statement goes on to make a series of claims for the achievement of schools in sponsor-led chains. It notes that Harris schools’ results increased by 13 percentage points between 2010 and 2011, Ark’s by 11, and so on.

We have already addressed the under-performance of chains in this post. It is difficult to assess whether the figures the DfE quotes show genuine improvement, as the chains are so dependent on GCSE equivalents, which are generally ridiculed by the DfE (as in this recent press release) and most of which will no longer be allowed to count as GCSEs from 2015. The Harris figure for 5 ACEM falls by 24 percentage points after taking out equivalents, and the Ark figure falls by 21 percentage points. (For comparison, the figure for state schools as a whole falls by just 6 percentage points.)

Admittedly, I haven’t covered every claim in the DfE statement. It does cite the LSE and NAO research, but both concern earlier exam results. Our claim, yet to be refuted or even seriously challenged by the DfE or any academy supporter, is that the detailed 2011 DfE data on individual school performance gives no support to the claim that the 249 sponsor-led academies performed better than a comparable set of state schools. Indeed, on many criteria they performed worse.

It is clear that over the past few years different local authorities have taken different routes with schools they see as underperforming. Some have chosen to convert schools to academies, securing substantial government funding to help them. Others have chosen to improve schools without converting them, despite this meaning less funding. What the 2011 DfE data reveals is that those who went down the non-academy route saw their schools improve as much as, and sometimes more than, those who chose the academy route, despite the academies’ higher levels of funding.

The DfE statement also claims that academies have an effect on other schools, and academy supporters may claim this is why other schools rise as much. It’s very difficult to test this claim in the DfE data. However, anecdotally, it’s easy to find counter-examples. Arguably the best-performing local authority in the country is Tower Hamlets, where 22% of low-prior-attainment students achieved 5 ACEM, the highest of any authority in the country. None of the borough’s schools had become academies by the 2011 exams, so it clearly isn’t academies driving the improvements there. It may simply be that most local authorities are rather good at supporting schools and helping them to improve, an element that may be hugely missed in the new academy-dominated educational landscape.

Data sources: The DfE data release can be obtained here:

http://www.guardian.co.uk/news/datablog/2012/jan/26/secondary-school-league-tables-data?INTCMP=SRCH#data

Some people have found it difficult to download this file. If you have difficulty, feel free to email me at henry@happy.co.uk and I will send you a copy. The above analysis was mostly done in Excel with pivot tables.
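For readers who prefer to work outside Excel, the same pivot-table analysis can be sketched in pandas. The column names below are hypothetical (the real DfE release uses its own field names) and the four rows are invented purely to show the shape of the calculation:

```python
# A rough pandas equivalent of the Excel pivot-table analysis: mean
# percentage-point change in GCSE results, grouped by school type and
# FSM band. Column names and figures are illustrative placeholders,
# not the DfE's actual field names or data.
import pandas as pd

df = pd.DataFrame({
    "school_type": ["academy", "academy", "maintained", "maintained"],
    "fsm_band":    ["high", "low", "high", "low"],
    "gcse_2010":   [30, 45, 31, 44],
    "gcse_2011":   [38, 50, 39, 50],
})

# Improvement in percentage points per school.
df["change_pp"] = df["gcse_2011"] - df["gcse_2010"]

# Like-for-like comparison: average change within each FSM band.
pivot = df.pivot_table(values="change_pp",
                       index="school_type",
                       columns="fsm_band",
                       aggfunc="mean")
print(pivot)
```

Grouping by FSM band (and, ideally, by previous results and prior attainment as well) before averaging is what makes the comparison like-for-like; averaging the `change_pp` column without grouping reproduces the DfE's misleading headline figure.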

Comments, replies and queries

  1. The DfE says that academies established between 2001 and 2007 had experienced “huge improvements”. Yet PricewaterhouseCoopers (2008) found “considerable diversity across individual Academies in the levels and improvements achieved”. PwC concluded that “there is insufficient evidence to make a definitive judgement about the Academies as a model for school improvement.”

    The University of Birmingham endorsed the PwC conclusion in its analysis of the results of academies from 2002 to 2006. It concluded that “…some schools are gaining higher scores since Academisation, but others are gaining lower scores. Using the most recent results available there is no clear evidence that Academies produce better results than local authority schools with equivalent intakes. The academies programme therefore presents an opportunity cost for no apparent gain.”

    http://www.localschoolsnetwork.org.uk/2012/02/minister-cites-report-which-didn%e2%80%99t-unconditionally-endorse-academies-%e2%80%93-has-he-actually-read-it/

    http://eprints.bham.ac.uk/598/1/Academies_paper4,_JEP.pdf

  2. “The critics can’t ignore overwhelming international evidence which shows that giving schools independence drives up standards and LSE’s research that Academies have a knock on effect on results in neighbouring schools,” says the DfE.

    That same “international evidence” (i.e. the OECD) found that in 2009 UK secondary heads already enjoyed more autonomy than their counterparts in most other countries. But academies in chains actually risk losing much of this autonomy, as John Burn, OBE, warned in his evidence to the Education Bill Committee. This view was confirmed by Toby Young in his book “How To Set Up a Free School” when discussing sponsorship of free schools: “…the parent group might have a role on the marketing side, trying to drum up custom and so forth, but beyond that no input.”

    Channel 4 FactCheck looked at the LSE report and found it to be a solid piece of research. However, it raised doubts about the use of equivalent exams in judging GCSE performance and also wondered, like PwC 2008, whether the improvements found by LSE could be attributed solely to academy conversion. PwC had found that when schools improved they used similar methods and these were found in all types of improving schools, not just improving academies.

    Stephen Machin, one of the authors of the LSE report, told BBC Radio 4 (“The Report” 16 January 2012) that the results of the LSE research could not be applied to either converter academies or primary schools.

    http://www.localschoolsnetwork.org.uk/2011/12/%e2%80%9cthey-create-a-prison-and-call-it-freedom-%e2%80%9d-schools-education-providers-and-autonomy/

    http://blogs.channel4.com/factcheck/is-the-academy-programme-the-answer-for-failing-schools/6933

  3. Guest says:

    Where’s the data for grammar school academies? Mr Gove is removing vocational courses so that schools can’t game league tables, but surely cream-skimming easy-to-teach pupils is cheating the stats.

  4. Emma R says:

    Has any similar analysis been done on the performance of primary academies?

  5. [...] the rest of the article over at Local Schools Network. This entry was posted in Academy News [...]
