Latest Government Data Shows Academies Underperform

Henry Stewart
As the government has rushed to convert all English schools to academies, some have questioned the move and asked for the evidence that academies are more successful. The response has seemed to be along the lines of: "Look at Mossbourne's success. Mossbourne is an Academy. Therefore academies are the way forward."

The very detailed figures released today on school performance allow us to examine the evidence. And the evidence is clear. Academies are less successful than schools in any of the other three categories: community schools, foundation schools and voluntary aided schools. First, the % achieving 5 GCSEs including English and Maths:

Academies (249): 47%
Community Schools (1,306): 56%
Foundation Schools (875): 61%
Voluntary Aided Schools (497): 67%

The numbers in brackets are the number of each type of school. Now most of these 249 will be the original Labour academies, generally situated in deprived areas. It could be argued that this explains the lower %. However, if academies are as successful as other schools (never mind more so), we would expect students to make as much progress in them. The latest data shows what % make the expected level of progress in English and Maths:

% of students making expected progress in English

Academies: 65%
Community Schools: 71%
Foundation Schools: 73%
Voluntary Aided Schools: 79%

% of students making expected progress in Maths:

Academies: 56%
Community Schools: 63%
Foundation Schools: 67%
Voluntary Aided Schools: 73%

The data is very clear. Academies underperform. Fewer students make expected progress in both English and Maths than in any of the three other categories. It turns out that Mossbourne (which scores 82%, 92% and 95% on these measures) is the exception, not the rule. This is despite the fact that many of these academies received vast investment as they were established.

Does the government believe in evidence-based policy or not? If it does, surely this clear and disturbing data should lead to an immediate review of the move to academies while the reason for this disparity is investigated.

Data Notes

These figures are a straight average of all schools in these categories in the DfE figures you can reach from here:


Added note: The academies data is based on the AC category of original academies (termed Sponsor-led academies by DfE). It does not include the 25 schools coded ACC for converter academies.

(The schools data is in the GCSEs CSV click-through. I found it wouldn't open directly in Excel; I had to save it as a CSV file and then open it from Excel. The three measures quoted here are headed AC5EM11, PT24EngPrg and PT24MathPrg. The categories are explained in the Metadata document, from the above link. The calculation was done with a simple pivot table analysis.)
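For anyone wanting to reproduce the pivot-table step without Excel, here is a minimal sketch in Python. The column names (NFTYPE, AC5EM11) follow the DfE CSV headings quoted above, but the rows are illustrative stand-ins, not real schools:

```python
from collections import defaultdict

# Illustrative rows in the shape of the DfE CSV: NFTYPE is the school type
# code (AC = sponsor-led academy, CY = community school) and AC5EM11 is the
# % achieving 5 GCSEs A*-C including English and Maths.
rows = [
    {"NFTYPE": "AC", "AC5EM11": 40},
    {"NFTYPE": "AC", "AC5EM11": 54},
    {"NFTYPE": "CY", "AC5EM11": 50},
    {"NFTYPE": "CY", "AC5EM11": 62},
]

def straight_average(rows, measure):
    """Unweighted mean of `measure` per school type, as a pivot table gives."""
    by_type = defaultdict(list)
    for row in rows:
        by_type[row["NFTYPE"]].append(row[measure])
    return {t: sum(vals) / len(vals) for t, vals in by_type.items()}

print(straight_average(rows, "AC5EM11"))  # {'AC': 47.0, 'CY': 56.0}
```

The same group-and-average applies to PT24EngPrg and PT24MathPrg; note this is a straight (unweighted) average across schools, not weighted by pupil numbers.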


Leonard James
Fri, 27/01/2012 - 06:51

Hmm, the LSN has used the 'disadvantaged intake' excuse on numerous occasions to justify some state school results. Given your admission that many academies here are rebranded inner-city schools, you seem relatively reluctant to make the same excuses for the academies. Does this mean you have abandoned this line of argument altogether and will encourage your members to stop using it?

Henry Stewart
Fri, 27/01/2012 - 09:26

Leonard, thanks for the question. I think this is an explanation for the raw figures on % getting 5 A*-Cs EM. If one school has an intake with 40% entering on Level 4 or above, it clearly is going to have to overperform to get the same GCSE results as one with 75% entering on Level 4 or above. That is the problem with using raw % pass rates for comparison.

However value added is a different matter and I have certainly never seen 'disadvantaged intake' as an explanation of low VA. Every school should be ensuring their students make (indeed exceed) expected progress, given the level they entered the school with.

Janet Downs
Fri, 27/01/2012 - 11:12

Looking at the context in which a school operates is important when judging the effectiveness of a school. In its latest Economic Survey of the UK (2011) the OECD raised concerns about the usefulness of the league tables published in England because they reflected levels of achievement rather than the value added. OECD noted that the Contextual Value Added (CVA) scores used since 2006 tried to “reflect students’ progress by relating output measures to inputs” but suggested that even this “more advanced approach does not give good estimates of school efficiency as it to a large extent reflects pupil specific factors.”

The Government has removed this “advanced” measure from the 2011 league tables because it believes in a “no excuses” approach. On its own measure, then, any school which does not reach its benchmark is to be “named and shamed” and judged as "failing" regardless of its intake or the circumstances in which the school operates. The worst-performing secondary school in England in 2011 was St Aldhelm’s Academy in Poole, Dorset, where just 3% of pupils got five GCSEs A*-C or equivalent including Maths and English. The head teacher explained on Radio 4 news last night that her school served two large council estates and a settled traveller community. In the cohort that had just taken GCSEs only one pupil had entered the school with a Key Stage 2 level 5.

But the Government says, “No excuses!” which is an attitude I deplore. It also says that academy conversion is the key to raising standards but it’s clear that academy conversion is not a magic bullet. The question now is: what is the government proposing to do about academies which on its own measure are “failing”?

Latest government data show that academies overperform!

The evidence is more mixed than is made out and one shouldn’t be too quick to dismiss the effects of intake.

For example:

The Value Added measure based on the best 8 GCSE and equivalent results gives the following results:

Academies 1002.08
Community Schools 997.46
Foundation Schools 1000.28
Voluntary Aided Schools 1004.75

Now, none of these values falls outside the confidence intervals of the other values, so you can’t draw statistically significant conclusions, but it hints at another story.

In terms of the intakes the following is the case:

% of pupils starting KS4 in each prior attainment band (low / middle / high):

Academies: 27 / 53 / 20
Community Schools: 19 / 51 / 30
Foundation Schools: 17 / 47 / 36
Voluntary Aided Schools: 13 / 47 / 40

As Henry says, this could be used to explain GCSE results, but the progress results can be broken down a little more.

With regards to progress of previous low attaining pupils academies (48% Eng, 29% Ma) do roughly as well as community schools (49% Eng, 27% Ma) but much better than foundation schools (46% Eng, 26% Ma). All lag behind voluntary aided schools.

However, academies (En 78%, Ma 74%) fall behind when it comes to the progress of high previously attaining pupils compared with community schools (En 85% , Ma 82%), foundation schools (Eng 85%, Ma 83%) and voluntary aided schools (Eng 88%, Ma 86%).

These results are also replicated when you look at the average progress each type of school makes with pupils who are looked after or on FSM compared with those who are not, again with academies (26%) having a much greater proportion of these pupils than community schools (15%), foundation schools (13%) and voluntary aided schools (13%).

So the schools have very different intakes on average and seem to be doing different things with them.

In the OECD report “Viewing the United Kingdom School System through the prism of Pisa” they said the following:

“In the United Kingdom there are 27% of students in schools with a socio-economically disadvantaged intake, of which 48% are students who are socio-economically disadvantaged themselves (i.e. they are overrepresented), while 23% of students are in socio-economically privileged schools of which only 6% are socio-economically disadvantaged themselves. Disadvantaged students tend to do worse than expected in disadvantaged schools, but by about the same margin as in many other OECD countries, and advantaged students tend to do much worse than expected, in this case by a larger margin than average. In schools with a mixed socio-economic intake, disadvantaged students tend to do better than expected and advantaged students tend to do worse than expected by about the same margin as in the OECD in general. In schools with a privileged socio-economic intake, disadvantaged students tend to do better than expected (but by a margin less than in the OECD) and advantaged students tend to do better than expected”

I stress - "Disadvantaged students tend to do worse than expected in disadvantaged schools... and advantaged students tend to do much worse than expected"

So in fact, based on what you would expect to occur in schools with such intakes, we can see that, by doing as well for the disadvantaged and previously low attaining students as they would do in other schools, LATEST GOVERNMENT DATA SHOW ACADEMIES OVERPERFORM!

Fiona Millar
Fri, 27/01/2012 - 14:06

Do you know what curriculum and qualifications the academies are using to improve their value added? The EBacc figures, and last year's tables showing GCSE results without equivalent qualifications (to be released for 2011 shortly, I hope), seem to suggest that they are not following the sort of curriculum you approve of and may be using "easier" qualifications that are worth more than 1 GCSE to improve their VA scores.

To answer your question, I have no idea what qualifications they are on average taking, but, as I said, the Value Added figure is not significant in any case and I only mentioned it to show that the data is not straightforward.

But what do you think of my substantive point about progress in English and Mathematics which was at the heart of Henry's article? It seems that across Europe disadvantaged students are doing worse than expected, apart from in our academies.

Henry Stewart
Fri, 27/01/2012 - 14:44

Charlie, I presume you also noticed the value added figures on English and Maths. Both are 999 for academies (against an average of 1000). The fact that they have 1002 on the Best 8 figure does indeed indicate the VA is coming from the BTEC use that Mr Gove criticises so heavily.

This is backed up by the data showing that on average an academy student took 11.9 GCSEs when you count equivalents but only 6.6 without equivalents (compared to 7.7 at community schools and 8 at foundation schools). And only 5% achieved the EBacc at academies, compared to 13% at community, 18% at foundation and 22% at VA schools.

I had also noticed the differing progress at academies, with academies getting similar progress to non-academies for students with low prior achievement but doing less well for those with medium and high prior achievement.

Your argument seems to be that, yes, overall academies underperform, but it's to be expected given their disadvantaged intake, and because on one limited criterion they do as well as other schools - they are overperforming?!?

Let's be clear. To justify the massive shift to academies that Gove is forcing on English education, there should be a clear case that they produce better education. This case simply doesn't exist and the data we have is clear that students at academies overall did worse - in terms of not making expected progress - than at non-academies.

I can understand that some schools struggle in disadvantaged areas but, like Gove, I would say we can't accept this as inevitable. We need to learn from the success stories. One of those is indeed Mossbourne. But just across the border in Tower Hamlets are a range of comprehensives enabling students from the most disadvantaged backgrounds to succeed. And, as of last summer, not one was an academy.

Henry, ignore that value added data, as I said it doesn't prove anything, but I realise it is a distraction from my main point. I think we can agree that in general absolute performance levels have been higher at non-academies.

Henry Stewart
Fri, 27/01/2012 - 14:51

Charlie, you are arguing that disadvantaged students do better in academies than other schools. So let's look at that statistic. The data gives the % of those with FSM achieving 5 GCSEs including English and Maths. The figures are:

Academies: 33%
Community: 33%
Foundation: 35%
Voluntary Aided: 42%

So they do match community schools on this measure, but are behind the other categories, so academies are clearly under-performing compared to non-academies as a whole.

No, I'm not arguing that. I am arguing that disadvantaged students do better than you would expect them to do given the average entry profile of academies. The OECD report tells us to expect that disadvantaged students will do much better in advantaged schools. Also, I was looking at pupil progress which you were also looking at.

Henry Stewart
Fri, 27/01/2012 - 15:02

But, Charlie, your deeper question intrigued me. Your claim, based on your interpretation of PISA, is that disadvantaged students in disadvantaged areas do worse. So if academies are doing almost as well you argue they are over-achieving.

So I've extracted only those schools where more than 40% of students are on free school meals. Ignoring special schools, there are 156 of these and 40 are academies. So if we compare the % of FSM students getting 5 A*-Cs (incl. English and Maths) in this subset of disadvantaged schools we get:

Academies (40): 38%
Community (63): 44%
Foundation (33): 41%
Voluntary (16): 49%

Very interesting. First, it is clear that, when compared to the same disadvantaged cohort, academies clearly under-perform. But we also have an interesting refutation of your PISA argument: Disadvantaged students achieve more in disadvantaged schools than they do in more advantaged ones. Intriguing.
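The subset calculation described above can be sketched as a simple filter and group-by. The field names and rows below are illustrative assumptions, not the DfE's actual column headings:

```python
# Illustrative rows: fsm_pct is the % of pupils on free school meals and
# fsm_5acem the % of FSM pupils reaching the GCSE benchmark. Field names
# here are assumptions, not the DfE's actual headings.
schools = [
    {"nftype": "AC", "fsm_pct": 45, "fsm_5acem": 38},
    {"nftype": "AC", "fsm_pct": 12, "fsm_5acem": 30},  # dropped: under 40% FSM
    {"nftype": "CY", "fsm_pct": 52, "fsm_5acem": 44},
    {"nftype": "VA", "fsm_pct": 41, "fsm_5acem": 49},
]

# Keep only the disadvantaged cohort: schools with more than 40% on FSM.
disadvantaged = [s for s in schools if s["fsm_pct"] > 40]

# Average the FSM benchmark figure within each school type.
by_type = {}
for s in disadvantaged:
    by_type.setdefault(s["nftype"], []).append(s["fsm_5acem"])
averages = {t: sum(v) / len(v) for t, v in by_type.items()}

print(averages)  # {'AC': 38.0, 'CY': 44.0, 'VA': 49.0}
```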

Firstly, it is the OECD's claim. I don't see how anyone could interpret their very straightforward text in a different manner.

Your data is interesting. What is the data for this cohort of schools on progress in English and Maths, and on English and Maths for those starting in a low prior attainment band?

If your figures are correct then that is a startling result. You should write to the OECD and tell them they are wrong. Cripes, if we can't trust them on this then what can we trust them on? :-)

Janet Downs
Fri, 27/01/2012 - 17:33

Charlie - you are correct in interpreting the OECD findings in the way that you have. Globally, all children tend to perform worse in schools which have a high proportion of disadvantaged pupils. This finding was upheld by the Education Endowment Foundation's research into under-performing English schools. Disadvantaged children tend to do better, while advantaged children tend not to perform as well, in schools which have a roughly equal mix of the two. And all children tend to perform better when in a school with a majority of advantaged pupils. The OECD also found that there were strategies which could be taken to help disadvantaged pupils overcome their disadvantage (OECD called these pupils "resilient"). This has been discussed on this site before:

What you seem to be suggesting is that the context of the school should be taken into account when judging its effectiveness. I would agree with you on that point. It's unfortunate that the government doesn't.

Henry Stewart
Fri, 27/01/2012 - 21:34

Charlie, happy to provide that info. From the cohort of schools with 40%+ FSM, the results are:

Academies: English progression 66%, Maths 57%
Community: English 70%, Maths 62%
Foundation: English 69%, Maths 61%
Voluntary: English 72%, Maths 65%

Taking the same schools, the figures for students on low prior attainment are:

Academies: English progression 52%, Maths 36%
Community: English 58%, Maths 40%
Foundation: English 58%, Maths 38%
Voluntary: English 59%, Maths 39%

It does seem that, whichever way you cut it, the academies under-perform.

Henry Stewart
Fri, 27/01/2012 - 21:53

Ok, here's another question. Some of the academies are new and some took over an existing (and probably under-performing) school. Could the figures be low for academies because of the effect of those having to take over schools in difficulty?

The data gives the figures for % getting 5 A*-Cs EM for 2008, 2009, 2010 and 2011. Those without 2008 results can be assumed to be new schools, like Mossbourne. There are 175 of these, and the figures for the % making expected progress are:

English: 63% (v 73% across all state non-special schools)
Maths: 54% (v. 66%)

These figures are slightly lower than for academies as a whole (see above) and lower than community, foundation or voluntary aided schools.
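The "new school" filter used here can be sketched as follows; the field names are illustrative stand-ins for the DfE columns, with a missing 2008 result represented as None:

```python
# Illustrative rows: a school with no 2008 benchmark figure (None) is
# treated as newly opened. Field names are stand-ins for the DfE columns.
schools = [
    {"name": "School A", "gcse_2008": None, "gcse_2011": 63},  # new school
    {"name": "School B", "gcse_2008": 48, "gcse_2011": 55},
    {"name": "School C", "gcse_2008": None, "gcse_2011": 45},  # new school
]

# Keep only the schools with no 2008 result, then average their 2011 figure.
new_schools = [s for s in schools if s["gcse_2008"] is None]
avg_2011 = sum(s["gcse_2011"] for s in new_schools) / len(new_schools)

print(len(new_schools), avg_2011)  # 2 54.0
```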

These are presumably new schools that have had the advantages of vast amounts invested in new buildings, a fresh start, a specially chosen head and new teachers. All of these can be assumed to improve results. But these new academies have, on average, under-performed.

What can explain this? I'm a data geek and the only conclusion I can draw from this data is that there is something about being an academy that leads not to better achievement but to worse. Could it be that traditional community schools and the support of a local authority - rather than a sponsor - are actually quite important?

Janet Lallysmith
Sat, 28/01/2012 - 17:20

Those are really interesting figures, Henry. I agree that the support of a local community is vital.

A particular trend I picked up looking at schools converting to a sponsor is that sponsors, including Harris, take on improving schools and then claim their success as their own. On its website, Harris takes credit for the achievement of 'our children' in the May 2011 SATs, when in fact it took over the school in September 2011.

Ben Taylor
Sun, 29/01/2012 - 02:24


Have you adjusted these figures for state grammar schools?

At a cursory glance I find that these are variably listed

For example

Wirral Grammar School for Girls - NFTYPE is CY - "community"
Clitheroe Royal Grammar School - NFTYPE is FD - "foundation"
St Michael's Catholic Grammar School - NFTYPE is VA - "voluntary aided"

AC5EM11 for all the above ranges from 98 to 100%

So once you adjust for this you should find the %s dropping for community schools, foundation schools and voluntary aided schools - unless you want us to infer an argument for grammar schools versus academies.

Janet Downs
Sun, 29/01/2012 - 08:56

Ben - grammar schools select their pupils. They cream off the top 25%. Academies are supposed to be fully comprehensive, although academies in areas which still practise selection, such as Lincolnshire and Kent, are really secondary moderns. Some academies, like Mossbourne, have a strict banding system which ensures that each ability range is fully represented.

If you compare selective grammar schools with other, non-selective schools you are not comparing like with like. One way round this is to compare the results for the high attainers - these figures are now available in the DfE school performance tables.

The results can be illuminating. Here in Lincolnshire 99% of high attainers at one grammar school achieved the benchmark of 5+ GCSEs A*-C including Maths and English. At a fully-comprehensive school (rare in Lincolnshire) a few miles away, 100% of high attainers achieved the benchmark.

It would be interesting to compare the results of high attainers in schools known to be fully-comprehensive with those of grammar schools whose very purpose is supposed to be to "stretch" the high attainers. Perhaps the results above might be replicated nationally. In which case, the growing rhetoric of "grammars good, comps bad" might be shown to be nothing more than hot air.

Janet Lallysmith
Sun, 29/01/2012 - 09:11

In addition to Janet's excellent points, in what way should Henry 'adjust' the figures for grammar schools?

He's discussing evidence for the debate about academies = good, community = bad - the DfE's argument is no more sophisticated than that.

Ben Taylor
Sun, 29/01/2012 - 11:12

Marigold, because the category of community schools includes some of the state grammar schools (not all - some are foundation or voluntary aided, for example), Henry's argument therefore also implies that the success of grammar schools versus academies is a good thing.

Or, to put it another way, these figures should compare comprehensive community schools with comprehensive academy schools if you want a fair comparison. Otherwise, thanks for agreeing that grammar schools are a good thing, since they raise the performance of the non-academy categories.

Janet Lallysmith
Sun, 29/01/2012 - 11:57

Ben, sorry, but I have no idea what you're talking about.

Ben Taylor
Sun, 29/01/2012 - 16:08

It's not surprising that academies with comprehensive, or near-comprehensive, intakes perform below the average for the other state sector schools, especially since selective schools are included in the state sector average and many academies were at the tail end of poor performance. What academies need to do is copy Mossbourne or, if they are maintained, copy Bethnal Green.

Janet Downs
Sun, 29/01/2012 - 16:58

Ben - as I said above Mossbourne has a banded intake. It has 25% high attaining pupils, 52% middle attainers and 23% low attainers. Bethnal Green's intake is similarly spread: 28% high attainers, 59% middle attainers and 13% low attainers. Both these schools, then, are fully comprehensive.

Results for both schools were impressive - the figures below show the percentage of pupils in each band that gained the benchmark 5+ GCSEs (or equivalent) A*-C including Maths and English:

Mossbourne: Low attainers 61%, Middle attainers 88%, High attainers 97%
Bethnal Green: Low attainers 62%; Middle attainers 89%; High attainers 100%

Two schools - both with a similar intake range, both with similar results (although Bethnal Green has the edge). However, it is Mossbourne that is constantly praised while the other doesn't seem to get a mention, even though its GCSE results in 2008 were poor, so it, like the much-praised academies, has improved results rapidly.

One of these successful schools is an academy; one isn't. This shows that it isn't necessary to be an academy in order to get good results although the Government would have us believe otherwise.

But both are fully comprehensive and their results for middle and high attainers equal those for grammar schools. This gives the lie to the message that it is only by expanding grammar schools that results can be improved.

But what I really want to know is how are London schools getting such good results?

Henry Stewart
Sun, 29/01/2012 - 17:24

No, Ben, I shouldn't remove the grammar schools. Selective areas do not have better results. The higher % of the grammars is inevitably balanced by the lower % of the 'comprehensives' (actually secondary moderns) in the area. So to compare academies to the whole set of non-academies is absolutely valid.

I did want to do some calculations on comparative performance between grammars and comprehensives - given the data on students with prior high attainment - but the DfE data does not identify which schools are selective.

Ben Taylor
Sun, 29/01/2012 - 23:57


If academies are all supposed to have a comprehensive intake, then I think it's more valid to compare them with comprehensive maintained schools: you should knock out the secondary moderns and grammars. Don't you think you need to see what the historical performance of academies was, given many converted from the poorly performing maintained sector? Otherwise what you have done is essentially boost the performance of the non-academy schools, by knocking out a section of their schools which are then recategorised as academies.

You don't need the DfE for a list of academic selection: there is a list of English grammar schools on Wikipedia which links to the statute law designating grammar schools, so you can go to the original source. There are only 164 grammar schools, so you can cross-reference the list against the data fairly easily. Secondary moderns, I guess, you can get reasonably accurately by eliminating all grammars and academies from each county; what is left should be categorisable as secondary modern, although I guess it's potentially complicated by non-academic selection like faith.
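The cross-referencing Ben suggests amounts to an exclusion filter. A minimal sketch, using only the two grammar schools already quoted in this thread as the list (the real list would hold all 164):

```python
# Cross-reference an external grammar school list against the results data.
# The list below holds just the two examples quoted above; the suggestion
# is to build the full list of 164 from the statutory designations.
grammar_names = {
    "Wirral Grammar School for Girls",
    "Clitheroe Royal Grammar School",
}

# Illustrative rows in the shape of the DfE data.
schools = [
    {"name": "Wirral Grammar School for Girls", "nftype": "CY", "ac5em11": 99},
    {"name": "Some Community School", "nftype": "CY", "ac5em11": 52},
]

# Drop the grammars before comparing categories.
comprehensives = [s for s in schools if s["name"] not in grammar_names]

print([s["name"] for s in comprehensives])  # ['Some Community School']
```

In practice name matching between two sources is rarely exact, so some manual reconciliation of school names would be needed.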

Janet Downs
Mon, 30/01/2012 - 10:16

Ben - as you think that cross-referencing the data for grammar schools could be done "fairly easily", it would be helpful if you would do this and then publish the findings here. However, you would also need to do this for secondary moderns, and this is where it becomes a problem.

Few schools categorise themselves as "modern" - they call themselves "high schools", "community schools" and so on. The only way to find out whether schools have a full spread of ability (like Mossbourne and Bethnal Green) is to look at the figures on the DfE website which give the breakdown of how many low, medium, and high attainers there are in a school. One school that does identify itself as "modern" is Gleed Boys' School in Spalding, Lincolnshire (a selective area), which has 25% low attainers, 64% middle attainers and only 11% high attainers (a fully-comprehensive school would be 25% low, 50% middle, 25% high).
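Janet's 25/50/25 benchmark could be sketched as a simple check. The 10-point tolerance on the high-attainer share is an illustrative assumption, not anything from the data:

```python
# Rough check of the benchmark above: a fully comprehensive intake is about
# 25% low, 50% middle, 25% high attainers. The 10-point tolerance on the
# high-attainer share is an illustrative assumption.
def looks_comprehensive(low, middle, high, tolerance=10):
    """True if the high-attainer share is within `tolerance` points of 25%."""
    return abs(high - 25) <= tolerance

print(looks_comprehensive(23, 52, 25))  # Mossbourne-like intake: True
print(looks_comprehensive(25, 64, 11))  # Gleed-like "modern" intake: False
```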

Academies are supposed to be fully comprehensive but those in selective areas are not necessarily so. The Marlowe Academy in Kent (another selective area) has the following intake: 42% low attainers, 52% middle attainers, 6% high attainers. St Aldhelm's Academy in Poole (the worst performing secondary school in the UK when judged on raw results) has 38% low attainers, 61% middle attainers and only 1% high attainers.

I've given you a start, Ben. I discussed two London schools above, plus one "modern" school and two academies which are described as "comprehensive" but clearly are not. There are over 3,000 secondary schools in England so that's about 3,000 records you need to consult. But, as you say, it can be done "fairly easily".

Good luck - I look forward to an analysis of your results with the evidence.

Lynne Mendoza
Mon, 30/01/2012 - 11:44

It's not that simple though - there are many partially selective secondary schools in the country so you would need to account for those too.

Adrian Elliott
Mon, 30/01/2012 - 11:00

If we are comparing the results of academies and local authority maintained schools, shouldn't we simply be combining the results of community, foundation and VA schools? Academies include all three types of ex-LA schools. Why bother with the three categories for existing LA schools? Of course, I know the reason is that that is how the results are reported. But if you compare all three types of LA schools together with academies you get a simpler comparison.
Whichever way you look at the figures, it seems there is no evidence for the huge superiority of academies claimed by ministers.

Henry Stewart
Mon, 30/01/2012 - 22:55

Good point, Adrian. Check out my more recent post for analysis which simply compares academies to non-academies:
