Earlier this year we published a series of posts, summarised here, analysing the DfE data release and showing that academies performed no better than similar secondary schools that had not become academies. (And, on some measures, performed worse.) While the DfE claimed faster growth rates for academy GCSE results, we found this was due to starting from a lower base: when we compared academies to similar state schools, the difference disappeared.
For instance, while academies starting with less than 35% of pupils achieving five GCSEs (including English and Maths) grew their results by 8% in 2011, so did non-academies starting below 35%. When the DfE argued that most academies had not existed long enough for a fair comparison, we focused on long-established academies. Comparing them with similar schools, by level of deprivation, we again found no clear advantage to academies.
Today the DfE published its own analysis. It followed a similar methodology, establishing a group of 'similar schools' (based on levels of prior attainment, deprivation and previous outcomes), and came to very similar conclusions. To quote the key headings from the comparison (pp 9-10):
- "Having started from a slightly lower base, results for pupils in Sponsored Academies were broadly the same as in a group of similar schools in 2011."
- "If equivalent qualifications are excluded, results in Sponsored Academies were lower than in a group of similar schools."
- "Results for FSM and non-FSM pupils were broadly the same in Sponsored Academies and the group of similar schools."
- "Results for pupils with and without SEN were broadly the same in Sponsored Academies and the group of similar schools."
- "In both Sponsored Academies and the group of similar schools, White pupils were the lowest performing ethnic group."
- "Results for pupils whose first language was English or other than English were broadly the same in Sponsored Academies and the group of similar schools."
- "In both Sponsored Academies and the group of similar schools, pupils with first language other than English outperformed pupils whose first language was English."
This graph, taken from p11, shows the comparison. In recent years, local authorities faced with underperforming schools have had a choice: they could convert them to academies, or seek improvements within the state sector. We found that, despite the extra funding available to those choosing the academy route, academies did no better than similar state schools.
The DfE appears now to have confirmed that academies do no better than non-academies. And note the point that academies did slightly worse when GCSE equivalents were removed. As the Daily Telegraph pointed out some months ago, removing equivalents cuts the overall GCSE pass rate by 6 percentage points across all schools, but by 12 points for academies. The DfE report confirms this, with a slightly larger gap: a 5.8% drop in all state-funded schools (including academies) against 12.5% in academies alone (fig 3.2 in the report).
This difference matters when looking at Section 2, which seeks to argue that academy results are growing faster. Michael Gove quoted from this section in his Spectator speech today, claiming far higher growth in results for academies. The speech was curious. Gove berates those schools that use BTECs, diplomas and other GCSE equivalents: "The students were told these qualifications would equal up to 4 GCSEs - but employers regarded them as worth much less than a single GCSE." He then goes on to quote results from academies that rest heavily on the very qualifications he so disdains. (For example, he claims that in Harris South Norwood Academy, 100% of students achieved five GCSEs. Once equivalents are removed, however, just 46% of students achieved five GCSEs including English and Maths.)
Indeed, while Section 1 did quote the figures without GCSE equivalents (and showed academies then performing less well than similar schools), Section 2 of the report at no point gives any figures without equivalents. Given that Gove sees these equivalents as worthless, and that from 2015 most will no longer count as GCSE equivalents, it is curious that the DfE bases its claim of greater growth purely on figures that include them. Given their far greater use by academies, it is likely that most or all of any growth gap would disappear once equivalents were removed.
And Section 2 is the one place where the DfE disagrees with our analysis. It claims academies' GCSE results grew by 5.7%, while similar schools grew by only 3.4%. Again, this difference could be explained by the use of GCSE equivalents. I would also question the figures themselves: we found that the group of non-academies with results below 35% in 2010 showed an increase of no less than 8% in 2011. I would ask the DfE to release the data it is using, so the detail can be checked.
So the DfE acknowledges that academies are generally doing no better than similar non-academies, but claims their results are growing faster. Being heavily based on the use of GCSE equivalents, however, these claims are highly suspect. The data still offers no ringing endorsement of academies, and leaves open the question of why they have not performed better, given the vast amounts of funding poured into those schools.