This morning the DfE issued its school-by-school GCSE data, claiming that sponsored academies were improving at five times the rate of all state-funded schools. This claim is in line with the prediction I made two days ago of how the DfE would distort the results.
As the graph below shows, a school's GCSE increase is closely related to its previous results. Those with previously low results tend to see large increases, while those with previously high results tend to see only small increases, or falls. The best way to judge how one set of schools performs is therefore to compare like-with-like.
How Did Similar Academies & Non-Academies Compare?
For those schools whose GCSE benchmark was in the 20-40% range in 2011, academies increased by 7.8% and maintained schools by 7.7%. Both are impressive results and these schools should be praised for their improvement. However, it does seem that the structure of the school makes little difference, and it would be good to see a bit more praise from the Department for Education for the success of non-academies.
In the other bands, non-academies did slightly better in the 40-60% and 80-100% bands and academies did slightly better in the 60-80% band.
This backs up our research on the 2011 data, which showed that, when compared to similar schools, academies did no better (and sometimes did worse).
Again we find that previously under-performing schools which chose to stay with their local authorities did as well as those which found a sponsor and became academies. Perhaps more attention should be paid to what these schools and successful local authorities are doing, without the cost and time of academy conversion.
The DfE Claim: A Gross Distortion
It is technically true that the benchmark GCSE results for academies grew, on average, by 3.1% compared to 0.6% for all state schools. However, this is not comparing like-with-like: it simply reflects the tendency of results at more successful schools to grow more slowly (or fall). It is a gross distortion to present this figure as a conclusion drawn from the data.
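To make the distortion concrete, here is a minimal sketch with invented per-band figures (only the 20-40% changes of 7.8 and 7.7, and the school counts, come from the data in this post; the other band changes are assumptions for illustration). Two groups of schools with near-identical improvement in every band can still show very different overall averages when one group is concentrated in the low-scoring, fast-improving bands.

```python
# Each tuple: (prior-attainment band, mean change in benchmark, number of schools).
# The 20-40% changes and all counts are from the post; other changes are invented.
academies = [("20-40%", 7.8, 73), ("40-60%", 2.0, 134),
             ("60-80%", -1.0, 27), ("80-100%", -2.0, 12)]
maintained = [("20-40%", 7.7, 175), ("40-60%", 2.1, 1012),
              ("60-80%", -0.9, 706), ("80-100%", -1.9, 134)]

def overall_mean(groups):
    """Weighted average change across bands: sum(change * n) / sum(n)."""
    total = sum(change * n for _, change, n in groups)
    count = sum(n for _, change, n in groups)
    return total / count

# Despite per-band differences of only 0.1, the headline averages diverge
# because most academies sit in the fast-improving low bands.
print(round(overall_mean(academies), 2))   # ≈ 3.2
print(round(overall_mean(maintained), 2))  # ≈ 1.27
```

This is the same composition effect (a Simpson's-paradox-style aggregation) that makes the DfE's headline 3.1% vs 0.6% comparison misleading.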
This analysis does not yet include the effect of GCSE equivalents (such as BTECs), the use of which Michael Gove regards as 'gaming'. Once these are excluded, the benchmark GCSE figure falls by 12% for sponsored academies but by only 6.6% for non-academies. It can be expected, therefore, that the comparison would be still less flattering to academies if GCSE-only figures were used.
One caveat must be added to any analysis of 2012 GCSE results: the summer English grading debacle may have been somewhat random in which schools it affected, and this could be a factor in the fall in results at the top end.
I have not included the 0-20% band, as there were only 3 schools in this range in 2011. All were academies and all improved, by a very impressive average of 22%. However, the sample is very small and there are no non-academies to compare them with. Numbers of schools in each band:
20%-40%: 73 sponsored academies, 175 non-academies
40%-60%: 134 sponsored academies, 1,012 non-academies
60%-80%: 27 sponsored academies, 706 non-academies
80%-100%: 12 sponsored academies, 134 non-academies
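The like-with-like banding used above can be sketched in a few lines of Python. The school records here are invented for illustration; the point is the method: assign each school to a 20-point band by its 2011 result, then compare the mean change for each school type within each band.

```python
from collections import defaultdict

# Invented example records: (school type, 2011 benchmark %, 2012 benchmark %).
schools = [
    ("academy", 35, 44), ("maintained", 32, 40),
    ("academy", 55, 57), ("maintained", 52, 54),
    ("academy", 72, 71), ("maintained", 75, 74),
    ("academy", 88, 85), ("maintained", 90, 89),
]

def band(result_2011):
    """Assign a school to a 20-point prior-attainment band by its 2011 result."""
    lower = min(int(result_2011 // 20) * 20, 80)  # 80-100% is the top band
    return f"{lower}-{lower + 20}%"

# Collect year-on-year changes per (band, school type) cell.
changes = defaultdict(list)
for school_type, r2011, r2012 in schools:
    changes[(band(r2011), school_type)].append(r2012 - r2011)

# Mean change in each cell: the like-with-like comparison.
for key in sorted(changes):
    mean_change = sum(changes[key]) / len(changes[key])
    print(key, round(mean_change, 1))
```

With real data each cell would hold dozens or hundreds of schools, but the structure of the comparison is the same.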
Converter academies, which were not included in the DfE press release and whose results fell overall, have not been included in this comparison.
Sponsored academies are generally previously under-performing schools which are supported by a sponsor. Converter academies are previously Good or Outstanding schools which were offered the chance to convert on their own.