The second release of detailed school-level GCSE data is due next week, on 24th January. As the government has made clear that its primary programme for school improvement is academy conversion, this information is important: it enables analysis of whether schools that become academies do better, and therefore whether there is any basis for the £1 billion-plus that Gove has spent on converting schools.
The government and its supporters claim that last year's data showed academies' results growing twice as fast as other schools'. This is only true if you compare academies (which generally started with low GCSE percentages) with all schools, including those whose percentages were already high and therefore less likely to grow.
To test whether becoming an academy provides benefits, LSN compared academies with similar schools. Comparing both groups of schools previously below the 35% floor for 5ACEM (five A*-C GCSEs including English and Maths), we found:
* On average, academies below 35% in 2010 grew from 29% to 37% - a rise of 8 percentage points. Impressive, but the same turns out to be true for maintained schools, which rose on average from 30% to 38% - again a rise of 8 points.
* The data allowed comparison back to 2008. Taking academies below 35% in 2008, we find an average increase of 18.6 percentage points by 2011. Again, very impressive. But, again, non-academies more than match the figure: their average rise was 19.1 points.
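The floor comparison above can be sketched in a few lines. The rows below are made-up illustrative figures, not the real DfE performance tables, and `average_rise` is a name of my own:

```python
# Hypothetical rows: (is_academy, 5ACEM % in base year, 5ACEM % in 2011).
schools = [
    (True, 29, 37), (True, 31, 38), (True, 27, 36),    # illustrative academies
    (False, 30, 38), (False, 32, 39), (False, 28, 37), # illustrative maintained
]

FLOOR = 35  # only schools below the 35% floor in the base year are compared

def average_rise(rows, academy):
    """Mean percentage-point rise for one group of below-floor schools."""
    rises = [after - before for is_ac, before, after in rows
             if is_ac == academy and before < FLOOR]
    return sum(rises) / len(rises)

print(average_rise(schools, academy=True))   # academies' average rise
print(average_rise(schools, academy=False))  # maintained schools' average rise
```

The point of the like-for-like comparison is that both calls restrict to the same below-floor population; comparing academies against all schools instead would mix in high-attaining schools with little room to rise.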
We found that, contrary to popular perception, there were very large increases in achievement across all previously low-achieving schools. However, these improvements were at least as great in maintained schools as in academies - despite the massive investment of funds in academies.
These figures include GCSE equivalents such as BTECs. Remove these and the 2011 academies figures fall by 13 percentage points, while non-academies fall by just 10. A direct comparison over time is not possible (GCSE-only data was not published for earlier years), but since the government regards heavy use of BTECs as 'gaming' the system, this suggests that on a GCSE-only measure the lead of non-academies is likely to be greater still in both periods.
The DfE Response
The Department for Education responded to this analysis in June 2012, with the publication of Attainment at Key Stage 4 by pupils in Academies 2011. The principal research was based on analysis of growth between 2006 and 2011. This is very difficult to check as the data is not publicly available. It has taken six months of FOI requests to get even the list of schools used, and I am still awaiting the response to my request for the underlying data. But these conclusions can now be drawn:
1) The original Local Schools Network analysis, showing academies did no better than similar non-academies, has never been challenged by the DfE - not in this report or elsewhere.
2) The comparison from 2006 to 2011 is based on just 33 academies and 33 non-academies, out of a population of over 2,000 schools. With such a small sample, any difference is unlikely to be statistically significant.
3) The 33 non-academies were chosen, according to the report, for their similar characteristics. However, there will have been several hundred schools with similar characteristics, so it would be easy to choose schools which performed below average and thereby achieve a positive comparison.
The performance of this list of 33 schools can be checked against the full set of schools, based on their GCSE percentages. The 2006 data is not publicly available, so I have had to use 2008 data. The 2008-2011 rise for each school has been compared to the average for all schools in its 5-percentage-point band: so a school on 33% GCSEs is compared to the average for all schools on 30-35%.
Of the 33 schools, 10 were above the average for their band but 23 were below. Whether deliberately or accidentally, it seems that the small comparison set of similar non-academies was made up of schools that, on average, performed less well than the full set of similar non-academies. Over the period 2008-2011 these 33 schools, on average, grew by 2.5 percentage points less than if they had been fully representative of all similar schools. Scaled pro rata from three years to five, this is equivalent to 4.2 points over the full 2006-2011 period - which explains the bulk of the 6.4-point difference found between academies and non-academies.
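The banding check described above can be sketched as follows. All school figures here are hypothetical, and `band` and `band_avg` are names I have introduced for illustration:

```python
def band(pct):
    """5-percentage-point band for a school's base-year GCSE %: 33 -> (30, 35)."""
    lo = (pct // 5) * 5
    return (lo, lo + 5)

# (2008 %, 2008-2011 rise in points) for the full set of similar schools
# - illustrative numbers only, not real performance-table data.
all_schools = [(31, 20), (33, 18), (34, 22), (27, 16), (28, 19), (26, 17)]

# Average rise per band across the full set.
by_band = {}
for pct, rise in all_schools:
    by_band.setdefault(band(pct), []).append(rise)
band_avg = {b: sum(rises) / len(rises) for b, rises in by_band.items()}

# The small comparison set: count how many fall below their band's average.
comparison_set = [(33, 17), (27, 15), (31, 21)]
below = sum(1 for pct, rise in comparison_set if rise < band_avg[band(pct)])
print(below)
```

Applied to the real data, this is the test that found 23 of the 33 comparison schools below their band average - the sign of an unrepresentative comparison set.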
Academies do no better than Non Academies
The publicly available DfE data shows, comparing results between 2010 and 2011, and between 2008 and 2011, that academies do at best as well as non-academies - and probably worse.
The DfE's attempt to show that academies did better over 2006-2011 is based on a very small sample and most of the difference can be explained by the use of a comparison set of schools with below-average increases.
All of this analysis is based on 'sponsored academies', chosen for conversion because of their previous low performance. There is no evidence at all on whether 'converter academies' - those previously rated Good or Outstanding - will benefit from becoming academies, as only 25 had converted by the time of the 2011 results. Yet the best part of £1 billion has been spent on this conversion programme, on what is no more than a hunch that it will improve performance. In fact, the first indications are that the results for converter academies have fallen.
There is also no evidence at all of what effect academy conversion would have on primary schools, but hundreds are being forced into becoming academies.
"The government took a very conscious decision that its major school improvement programme was the academies programme", according to the DfE's head civil servant, Chris Wormald, speaking at the Public Accounts Committee in December. It has spent over £1 billion on this programme, and yet there is little evidence that academy conversion brings any benefit to sponsored academies - and none at all for converter academies.
The tragedy is that there is real evidence of approaches that do bring real improvements to schools, such as the partnership work of the London Challenge. If only we had a government that based its approach on evidence rather than ideology...