The government has now been asked, in both the Commons and the Lords, a simple question: how does the performance of sponsored academies compare with that of similar schools in the LA maintained sector? Given the government's determination to force all struggling schools to become sponsored academies, this is a crucial question. Indeed, I would argue it is the crucial question for judging whether this government policy makes sense. Each time, the government minister has dodged the question, possibly because the data actually show - for both primary and secondary - that sponsored academies improve at a slower rate than maintained schools that start from a similar level.
To summarise the exchange that took place at the Committee stage of the Education and Adoption Bill on 7th July:

Gibb: Sponsored academies' grades rose, over 4 years, by 6.4% pts compared with 1% for maintained schools.

Brennan: Is he comparing that improvement with figures for schools in similar circumstances?

Gibb: Sponsored academies have improved their grades by about 6.4% compared with all LA schools.

Nick Gibb surely understands the question he was asked but chooses not to answer it, instead falling back on the comparison with all schools.
Lord Hunt tried a different approach, seeking to specify in detail what was required:

Written Question: To ask Her Majesty’s Government how the performance of sponsored academies compares to that of maintained schools when grouped by prior results at achieving five A*-C grade GCSEs, including English and Maths, broken down per decile, over (1) the last year, (2) the last two years, and (3) the last three years.

Lord Nash's response talked of "transforming some of the worst underperforming schools" and "giving strong leaders the freedom to make decisions that will drive up standards balanced with tough accountability". However, he was again unable to answer the question of how sponsored academies perform compared with maintained schools starting from a similar point. Lord Nash had previously claimed, in a letter to peers on 21st October, that “primary sponsored academies are improving faster than all state-funded schools”. As usual, the comparison was with all schools, not with schools starting from the same point.
Imagine you were to ask about the performance of a hospital and were told "our hospital has improved the health of our patients by more than the health of the overall population". Well, yes, that is to be expected. If the patients start off sick, you would expect the improvement in their health to be greater than that of the mainly healthy overall population. The key question is whether the health improvement of the patients is as good as that in other hospitals, or any other institution dealing with similarly sick people. Likewise, the results of struggling schools increase at a far faster rate than those of schools overall, as is clear in the graphs below. To test whether sponsored academies are a vehicle for superior school improvement, the issue is not whether they improve results more than all schools but whether they do better than maintained schools starting from a similar level. That is the question that the government consistently refuses to answer.
The government has not always been so hesitant to compare sponsored academies to similar schools. In the early days of the coalition government, ministers were fond of quoting from the research of Stephen Machin. Machin's research was on the early Labour academies, and his most recent data are from 2008. Machin himself made clear that it was “hard to justify” the use of his research by the government for its very different academies. Indeed, he called it a “step too far”. After Local Schools Network published our original analysis of the 2011 school-by-school GCSE data, the DfE responded with its own research on how the performance of sponsored academies compared to similar maintained schools. On that basis it claimed in the High Court in June 2013 that the results of sponsored academies were "marginally higher". That 2011 analysis was based on benchmarks that included GCSE equivalents such as Btecs. At that point I was claiming that sponsored academies did no better than similar maintained schools. Since the exclusion of those equivalents, the data has clearly shown that sponsored secondary academies do worse than similar maintained schools. And all the evidence from primary results indicates the same.
The latest DfE data on primary school results, released this month, show that maintained schools improve their results (in terms of the new Level 4b benchmark) faster than similar sponsored academies. On average, maintained school results increased by 6.4% more than those of similar sponsored academies. A similar pattern is present for level 4 and level 5 improvement in 2014-15, and for level 4, level 4b and level 5 improvement over 2013-15. The better performance of maintained schools is remarkably consistent. Full details here. (I have used the 2014-15 analysis here because the number of sponsored academies is larger, over 400.) The DfE has claimed that schools are converted to sponsored academies because they are in a worse state. That is another reason it is important to compare schools at similar prior levels, so that the comparison is between two sets of schools at similar starting positions.
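The like-for-like comparison described above can be sketched in a few lines of code: group schools into bands by their prior result, then compare the average improvement of each school type within each band. The school records below are invented for illustration only, not real DfE figures.

```python
# Sketch of a like-for-like comparison: group schools by prior-attainment
# band, then average improvement per (band, school type). All data here
# are hypothetical, for illustration only.
from statistics import mean

# (school_type, prior_result_pct, later_result_pct) -- invented values
schools = [
    ("sponsored",  40, 46), ("sponsored",  42, 47), ("sponsored",  60, 63),
    ("maintained", 41, 49), ("maintained", 43, 50), ("maintained", 61, 66),
]

def band(prior, width=10):
    """Assign a school to a prior-attainment band, e.g. 40 means 40-49%."""
    return (prior // width) * width

def improvement_by_band(records):
    """Mean improvement (later - prior) for each (band, school_type) group."""
    groups = {}
    for school_type, prior, later in records:
        groups.setdefault((band(prior), school_type), []).append(later - prior)
    return {key: mean(vals) for key, vals in groups.items()}

for (b, t), imp in sorted(improvement_by_band(schools).items()):
    print(f"band {b}-{b + 9}%, {t}: +{imp:.1f} pts")
```

Comparing within bands in this way removes the misleading "all schools" baseline: each school type is judged only against schools that started from the same point.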
For secondary schools it is possible to use a longer period and the comparison below is based on the last three years. Again for each group of similar schools (grouped by 2012 results) the maintained schools improved their GCSE benchmark results at a faster rate.
I can understand the government ignoring the analysis of the Local Schools Network (although I know that they do read these posts). However, the government has now been asked the question in both the House of Commons and the House of Lords. The basis of the forced academisation provision of the Education and Adoption Bill is that becoming a sponsored academy is the only vehicle for school improvement. It is surely time to produce evidence that sponsored academies even do as well as maintained schools, when they are compared with schools starting from a similar level.