Did academy results grow more in 2011? Not when compared to similar schools

Henry Stewart
Using the latest DfE data, my previous posts have shown that results from academies are no better than those from non-academies. In fact, when GCSE equivalents are taken out, they are often worse. But there remains one claim which the DfE continues to make, and which was repeated in the Observer article yesterday:

"Final GCSE results for 2011 show that, of the 166 academies with results in both 2010 and 2011, the percentage of pupils achieving five or more good GCSEs including English and Maths rose from 40.6% to 46.3%. This means that academies' GCSE results improved by nearly twice the level of state-funded schools, which increased by 3.1% to 58.2%."

There are a couple of caveats that the DfE normally adds: this sample of 166 includes only the academies created by 2009, and excludes those that converted from City Technology Colleges and independent schools. It seems fair to include only academies that have had a chance to have an impact, and to exclude the CTCs and private schools, as they will already have strong results.

But that is the clue. The comparison set of non-academies includes schools that were previously performing well. It seems unlikely that a school already achieving 70% or 80% (in terms of the % achieving 5 GCSEs including English and Maths) can grow its results as much as one under 35%. So what happens if we compare academies to non-academies with similar 2010 results?

Growth compared between similar schools



If we take all schools with 2010 results under 35%, we find that academies' results grew from 29% in 2010 to 37% in 2011, an increase of 8%. That is very impressive, and those schools should be congratulated on the improvement. But the comparison group of non-academies grew from 30% in 2010 to 38% in 2011, again a growth of 8%. (There were 58 academies and 161 non-academies in this range.)

If we take all those schools with results between 35% and 50% in 2010, a similar pattern occurs. Academies in this band grew from 42% to 46%, up 4%. Non-academies actually grew slightly more, from 43% to 48% - a rise of 5%. (Based on 84 academies and 794 non-academies.) In the 50% to 65% category, both academies and non-academies grew by 2%, and in the top category, those with over 65% in 2010, there was on average no growth in results. It turns out that it is indeed those schools with the lower 2010 results that were most likely to grow.
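The banding comparison above is straightforward to reproduce. Here is a minimal sketch in Python: the band boundaries match those used in the post, but the school records and field names are illustrative placeholders, not the actual DfE data.

```python
# Sketch of the like-for-like banding comparison described above.
# The school records below are toy data; field names are assumptions.

BANDS = [(0, 35), (35, 50), (50, 65), (65, 101)]  # % achieving 5 A-Cs incl. English & Maths in 2010

def band_for(score_2010):
    """Return the band a school falls into based on its 2010 result."""
    for low, high in BANDS:
        if low <= score_2010 < high:
            return (low, high)
    return None

def mean_growth(schools):
    """Average change in % achieving the benchmark from 2010 to 2011."""
    changes = [s["y2011"] - s["y2010"] for s in schools]
    return sum(changes) / len(changes) if changes else 0.0

def compare_by_band(schools):
    """Group schools by 2010 band, then compare academy vs non-academy growth."""
    results = {}
    for band in BANDS:
        in_band = [s for s in schools if band_for(s["y2010"]) == band]
        academies = [s for s in in_band if s["type"] == "academy"]
        others = [s for s in in_band if s["type"] != "academy"]
        results[band] = (mean_growth(academies), mean_growth(others))
    return results

# Toy example: two academies and two non-academies in the under-35% band
schools = [
    {"type": "academy", "y2010": 29, "y2011": 37},
    {"type": "academy", "y2010": 30, "y2011": 38},
    {"type": "community", "y2010": 30, "y2011": 38},
    {"type": "community", "y2010": 31, "y2011": 39},
]
print(compare_by_band(schools)[(0, 35)])  # → (8.0, 8.0): identical growth
```

Comparing within bands like this removes the distortion that comes from pooling low-starting academies with high-starting non-academies.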

The DfE likes to paint a picture of under-performing schools that can only be rescued by becoming academies. There is no doubt that some schools have been under-performing in terms of their results. In some areas the local authority chose to tackle this by turning them into academies, with the financial incentives that this brought into the schools. Other local authorities chose to tackle this without turning them into academies, and without those financial incentives. What the DfE data reveals is that those choosing the non-academy route secured as big an improvement as those taking the academy route.

A success story for all schools



This is good news. The DfE should be celebrating the improvement across the schools it is responsible for, whether academies or not. There is remarkable improvement taking place, and the teachers and students responsible deserve recognition for this. It is sad that the Department for Education only seems to want to give that recognition to the schools that are academies.

The growth for academies is level with non-academies even though this is based on the GCSE figures including equivalents (as GCSE-only data is not available for 2010). The 2011 figures fall more for academies once the equivalents are taken out. In the under-35% range, average academy results fall from 37% to 24%, a drop of 13%. Non-academy results fall from 38% to 28%, a drop of 10%. The same is true in the 35% to 50% range, where academy results fall 14% while non-academy results fall just 9%.
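The "drop when equivalents are removed" comparison is simple arithmetic, shown here using the under-35% band figures quoted above:

```python
# Under-35% band, 2011 results: with equivalents vs GCSE-only (figures from the post)
academy_with_equiv, academy_gcse_only = 37, 24
non_academy_with_equiv, non_academy_gcse_only = 38, 28

academy_drop = academy_with_equiv - academy_gcse_only            # 13-point drop
non_academy_drop = non_academy_with_equiv - non_academy_gcse_only  # 10-point drop
print(academy_drop, non_academy_drop)  # → 13 10
```

The larger academy drop is the point: academies lean more heavily on GCSE equivalents, so their headline figures overstate their GCSE-only performance by more.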

A puzzle: More funding but academies don't perform better



The DfE's definition, for its growth claim, of schools that were academies from 2009 means that all the academies included here are the Labour sponsor-led academies. Given the extra funding they received and their greater use of GCSE equivalents, it would not be surprising if their increase from 2010 to 2011 was greater than that of the non-academies. But it wasn't. On the DfE's preferred key measure, growth in 5 A-Cs at GCSE (including English and Maths) from 2010 to 2011, the academies did not achieve better results when compared to similar schools. This casts doubt on whether the academy model brings any benefits at all.

Data notes

All this analysis is based on the DfE data released in January. It can be downloaded from this Guardian page: "GCSEs, all schools, KS4 (CSV)". I saved this as a CSV file and then imported it into Excel.

All the % achievement figures quoted here are for the % of students achieving 5 A-Cs at GCSE including English and Maths, except where GCSE-only results are specifically referred to. Academies included are only those created in 2009 or before. Non-academies include community schools, foundation schools and voluntary aided schools.
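For anyone who prefers a script to Excel, the same steps can be sketched with Python's standard csv module. The column headers below are assumptions for illustration; the real DfE download uses its own header names, which you would substitute.

```python
# Minimal sketch of reading the results CSV and computing 2010-to-2011 growth.
# The inline sample stands in for the downloaded file; headers are hypothetical.
import csv
import io

sample = """school,type,pct_5ac_em_2010,pct_5ac_em_2011
School A,academy,29,37
School B,community,30,38
"""

rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows:
    growth = int(row["pct_5ac_em_2011"]) - int(row["pct_5ac_em_2010"])
    print(row["school"], row["type"], f"+{growth}")
# → School A academy +8
# → School B community +8
```

With the real file, replace `io.StringIO(sample)` with `open("gcse_ks4.csv")` (or whatever the download is named) and filter rows by school type before banding.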


Comments

Alasdair Smith
Mon, 27/02/2012 - 11:36

Excellent work Henry! Very helpful data & analysis.


Janet Downs
Mon, 27/02/2012 - 15:21

The rate of performance of sponsored academies is measured from a lower base. Channel 4 FactCheck pointed this out in their analysis of academy achievement in January when it investigated Mr Gove's remarks. FactCheck concluded that any ministerial statement about academies should be treated with a healthy dose of scepticism.

http://blogs.channel4.com/factcheck/factcheck/8994

Janet Downs
Mon, 27/02/2012 - 15:25

One academy which is not likely to have appeared in the "rate of improvement" figures is St Aldhelm's Academy, Poole, where only 3% of pupils reached the benchmark 5+ GCSEs and above in 2011. To give the school its due, it only had one high-attaining pupil so it's really a secondary modern, not a comprehensive. However, if 6% of St Aldhelm's pupils reach the benchmark in 2012, then the DfE can chortle that it's a rate of improvement of 100%!

http://www.education.gov.uk/cgi-bin/schools/performance/school.pl?urn=13...

St Aldhelm's predecessor school was Rossmore Community College where 21% achieved the benchmark in 2010. In April 2008 Ofsted gave it notice to improve but by May 2009 it was judged satisfactory. It was in the process of becoming an academy before the election but didn't open until September 2010 under the sponsorship of the Diocese of Salisbury and Bournemouth University.

So between 2010 and 2011 the proportion of St Aldhelm's pupils reaching the benchmark fell from 21% to 3%. How fortunate that the school changed its name - it gives the DfE an excuse not to include St Aldhelm's in their rate of improvement calculations.

http://www.education.gov.uk/cgi-bin/performancetables/school_10.pl?No=83...

http://www.educationadviser.co.uk/ofsted-report/rossmore-community-college

csc501
Thu, 07/02/2013 - 10:42

Whilst I have no personally constructed data, it seems to me that your own data is guilty of exactly the kind of manipulation you're accusing DfE of. You've selected your own data bands etc. so why should I assume your data isn't distorting the picture, and believe DfE's is? Both different perspectives on the same results.

The key point is that the comparison is essentially meaningless. Each school is unique, and those converted to academies were selected to benefit from such a scheme - some perhaps should not have been, but the greater number that did convert have benefited. The results are pretty unambiguous on that. How this compares to those that used other means is really not important. Overall improvement is good across the wider school system, and the academy system appears to be a positive factor in that.
