As the government has rushed to convert all English schools to academies, some have questioned the move and asked for the evidence that academies are more successful. The response has seemed to be along the lines "Look at Mossbourne's success. Mossbourne is an Academy. Therefore academies are the way forward."
The very detailed figures released today on school performance allow us to examine the evidence. And the evidence is clear. Academies are less successful than any of the other three categories: community schools, foundation schools or voluntary aided schools. First, the percentage achieving 5 GCSEs including English and Maths:
Academies (249): 47%
Community Schools (1,306): 56%
Foundation Schools (875): 61%
Voluntary Aided Schools (497): 67%
The numbers in brackets are the number of each type of school. Now most of these 249 will be the original Labour academies, generally situated in deprived areas. It could be argued that this explains the lower percentage. However, if academies are as successful as other schools (never mind more so), we would expect students to make as much progress in them. The latest data shows what percentage make the expected level of progress in English and Maths:
% of students making expected progress in English
Academies: 65%
Community Schools: 71%
Foundation Schools: 73%
Voluntary Aided Schools: 79%
% of students making expected progress in Maths:
Academies: 56%
Community Schools: 63%
Foundation Schools: 67%
Voluntary Aided Schools: 73%
The data is very clear. Academies underperform. Fewer students make expected progress in both English and Maths than in any of the other three categories. It turns out that Mossbourne (which scores 82%, 92% and 95% on these measures) is the exception, not the rule. This is despite the fact that many of these academies had vast investments as they were established.
Does the government believe in evidence-based policy or not? If it does, surely this clear and disturbing data should lead to an immediate review of the move to academies while the reason for this disparity is investigated.
Data Notes
These figures are a straight average of all schools in these categories in the DfE figures you can reach from here:
http://www.guardian.co.uk/news/datablog/2012/jan/26/secondary-school-league-tables-data?INTCMP=SRCH#data
Added note: The academies data is based on the AC category of original academies (termed Sponsor-led academies by DfE). It does not include the 25 schools coded ACC for converter academies.
(The schools data is in the GCSEs CSV click-through. I found it wouldn't open directly in Excel; I had to save it as a CSV file and then open it from Excel. The three categories quoted here are headed AC5EM11, PT24EngPrg and PT24MathPrg. The categories are explained in the Metadata document, from the above link. The calculation was done with a simple pivot table analysis.)
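The pivot-table averaging step can be sketched in a few lines of Python. This is a minimal sketch: the column names NFTYPE and AC5EM11 come from the DfE metadata mentioned above, but the sample rows here are invented purely for illustration; with the real file you would feed it rows from csv.DictReader.

```python
from collections import defaultdict

def mean_by_type(rows, measure):
    """Average a measure (e.g. 'AC5EM11') across schools, grouped by NFTYPE."""
    totals, counts = defaultdict(float), defaultdict(int)
    for row in rows:
        value = row.get(measure)
        if value in (None, "", "NA"):  # skip suppressed or missing entries
            continue
        totals[row["NFTYPE"]] += float(value)
        counts[row["NFTYPE"]] += 1
    return {t: totals[t] / counts[t] for t in totals}

# Illustrative rows only -- the real data comes from the DfE CSV.
rows = [
    {"NFTYPE": "AC", "AC5EM11": "40"},
    {"NFTYPE": "AC", "AC5EM11": "54"},
    {"NFTYPE": "CY", "AC5EM11": "56"},
]
print(mean_by_type(rows, "AC5EM11"))  # {'AC': 47.0, 'CY': 56.0}
```

Note this is a straight (unweighted) average of school-level percentages, as the post describes, not a pupil-weighted average.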
Comments
Hmm the LSN has used the 'disadvantaged intake' excuse on numerous occasions to justify some state school results. Given your admission that many academies here are rebranded inner city schools you seem relatively reluctant to make the same excuses for the academies. Does this mean you have abandoned this line of argument altogether and will encourage your members to stop using it?
However value added is a different matter and I have certainly never seen 'disadvantaged intake' as an explanation of low VA. Every school should be ensuring their students make (indeed exceed) expected progress, given the level they entered the school with.
The Government has removed this “advanced” measure from the 2011 league tables because it believes in a “no excuses” approach. On its own measure, then, any school which does not reach its benchmark is to be “named and shamed” and judged as "failing" regardless of its intake or the circumstances in which the school operates. The worst-performing secondary school in England in 2011 was St Aldhelm’s Academy in Poole, Dorset, where just 3% of pupils got five GCSEs A*-C or equivalent including Maths and English. The head teacher explained on Radio 4 news last night that her school served two large council estates and a settled traveller community. In the cohort that had just taken GCSEs, only one pupil had entered the school with a Key Stage 2 level 5.
But the Government says, “No excuses!” which is an attitude I deplore. It also says that academy conversion is the key to raising standards but it’s clear that academy conversion is not a magic bullet. The question now is: what is the government proposing to do about academies which on its own measure are “failing”?
The evidence is more mixed than is made out and one shouldn’t be too quick to dismiss the effects of intake.
For example:
The Value Added measure based on the best 8 GCSE and equivalent results gives the following results:
Academies 1002.08
Community Schools 997.46
Foundation Schools 1000.28
Voluntary Aided Schools 1004.75
Now none of these values with their confidence intervals fall outside the range of the confidence intervals of the other values, so you can’t draw statistically significant conclusions, but it shows another story.
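The overlap check described here can be sketched as follows. The helper is hypothetical and the interval values are made up for illustration; the DfE tables publish the actual confidence intervals alongside the VA scores.

```python
def intervals_overlap(a, b):
    """True if two (low, high) confidence intervals overlap."""
    return a[0] <= b[1] and b[0] <= a[1]

# Hypothetical intervals around two of the VA scores above.
academies_ci = (999.1, 1005.1)   # around 1002.08
community_ci = (995.5, 999.4)    # around 997.46
print(intervals_overlap(academies_ci, community_ci))  # True
```

If the intervals overlap, as in this made-up example, the difference between the two VA scores cannot be called statistically significant.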
In terms of the intakes the following is the case:
% of pupils starting KS4 in each prior attainment band (Low / Middle / High):
Academies: 27 / 53 / 20
Community Schools: 19 / 51 / 30
Foundation Schools: 17 / 47 / 36
Voluntary Aided Schools: 13 / 47 / 40
As Henry says, this could be used to explain GCSE results, but the progress results can be broken down a little more.
With regard to the progress of previously low attaining pupils, academies (48% Eng, 29% Ma) do roughly as well as community schools (49% Eng, 27% Ma) and slightly better than foundation schools (46% Eng, 26% Ma). All lag behind voluntary aided schools.
However, academies (En 78%, Ma 74%) fall behind when it comes to the progress of high previously attaining pupils compared with community schools (En 85% , Ma 82%), foundation schools (Eng 85%, Ma 83%) and voluntary aided schools (Eng 88%, Ma 86%).
These results are also replicated when you look at the progress each type of school makes, on average, with pupils who are looked after or on FSM compared with those who are not, again with academies (26%) having a much greater proportion of these pupils than community schools (15%), foundation schools (13%) and voluntary aided schools (13%).
So the schools have very different intakes on average and seem to be doing different things with them.
In the OECD report “Viewing the United Kingdom School System through the prism of Pisa” they said the following:
“In the United Kingdom there are 27% of students in schools with a socio-economically disadvantaged intake, of which 48% are students who are socio-economically disadvantaged themselves (i.e. they are overrepresented), while 23% of students are in socio-economically privileged schools of which only 6% are socio-economically disadvantaged themselves. Disadvantaged students tend to do worse than expected in disadvantaged schools, but by about the same margin as in many other OECD countries, and advantaged students tend to do much worse than expected, in this case by a larger margin than average. In schools with a mixed socio-economic intake, disadvantaged students tend to do better than expected and advantaged students tend to do worse than expected by about the same margin as in the OECD in general. In schools with a privileged socio-economic intake, disadvantaged students tend to do better than expected (but by a margin less than in the OECD) and advantaged students tend to do better than expected”
I stress - "Disadvantaged students tend to do worse than expected in disadvantaged schools... and advantaged students tend to do much worse than expected"
So in fact, based on what you would expect to occur in schools with such intakes, and given that academies do as well for disadvantaged and previously low attaining students as other schools would: LATEST GOVERNMENT DATA SHOW ACADEMIES OVERPERFORM!
http://www.pisa.oecd.org/dataoecd/33/8/46624007.pdf
Do you know what curriculum and qualifications the academies are using to improve their value added? The EBacc figures, and last year's tables showing GCSE results without equivalent qualifications (to be released for 2011 shortly, I hope), seem to suggest that they are not following the sort of curriculum you approve of and may be using "easier" qualifications that are worth more than one GCSE to improve their VA scores.
But what do you think of my substantive point about progress in English and Mathematics which was at the heart of Henry's article? It seems that across Europe disadvantaged students are doing worse than expected, apart from in our academies.
This is backed up by the data that shows on average an academy student did 11.9 GCSEs when you count equivalents but only 6.6 without equivalents. (Compared to 7.7 at community schools and 8 at foundations.) And only 5% achieved the ebacc at academies, compared to 13% at community, 18% at foundation and 22% at VA schools.
I had also noticed the differing progress at academies, with academies getting similar progress to non-academies for students with low prior achievement but doing less well for those with medium and high prior achievement.
Your argument seems to be that, yes, overall academies underperform, but it's to be expected given their disadvantaged intake, and because on one limited criterion they do as well as other schools - they are overperforming?!?
Let's be clear. To justify the massive shift to academies that Gove is forcing on English education, there should be a clear case that they produce better education. This case simply doesn't exist and the data we have is clear that students at academies overall did worse - in terms of not making expected progress - than at non-academies.
I can understand that some schools struggle in disadvantaged areas but, like Gove, I would say we can't accept this as inevitable. We need to learn from the success stories. One of those is indeed Mossbourne. But just across the border in Tower Hamlets are a range of comprehensives enabling students from the most disadvantaged backgrounds to succeed. And, as at last summer, not one was an academy.
Henry, ignore that value added data - as I said, it doesn't prove anything - and I realise it is a distraction from my main point. I think we can agree that in general absolute performance levels have been higher at non-academies.
Academies: 33%
Community: 33%
Foundation: 35%
Voluntary Aided: 42%
So they do match community schools on this measure, but are behind the other categories, and so academies are clearly under-performing compared with non-academies as a whole.
No, I'm not arguing that. I am arguing that disadvantaged students do better than you would expect them to do given the average entry profile of academies. The OECD report tells us to expect that disadvantaged students will do much better in advantaged schools. Also, I was looking at pupil progress which you were also looking at.
So I've extracted only those schools where more than 40% of students are on free school meals. Ignoring special schools, there are 156 of these, and 40 are academies. So if we compare the % of FSM students getting 5 A*-Cs (incl English and Maths) in this subset of disadvantaged schools we get:
Academies (40): 38%
Community (63): 44%
Foundation (33): 41%
Voluntary (16): 49%
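The extraction step described above can be sketched as follows. The FSM column name PFSM is an assumption for illustration (check the Metadata document for the real heading), and the sample rows are invented.

```python
def disadvantaged_subset(rows, fsm_col="PFSM", threshold=40.0):
    """Keep schools where the share of FSM pupils exceeds the threshold."""
    return [r for r in rows if float(r[fsm_col]) > threshold]

# Illustrative rows only -- the real data comes from the DfE CSV.
rows = [
    {"NFTYPE": "AC", "PFSM": "47"},
    {"NFTYPE": "CY", "PFSM": "12"},
    {"NFTYPE": "VA", "PFSM": "44"},
]
print([r["NFTYPE"] for r in disadvantaged_subset(rows)])  # ['AC', 'VA']
```

The resulting subset can then be averaged by school type in exactly the same way as the headline figures.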
Very interesting. First, it is clear that, when compared to the same disadvantaged cohort, academies clearly under-perform. But we also have an interesting refutation of your PISA argument: Disadvantaged students achieve more in disadvantaged schools than they do in more advantaged ones. Intriguing.
Your data is interesting, what is the data for this cohort of schools on progress in English and Maths, and for English and Maths for those starting in a low prior attainment band?
If your figures are correct then that is a startling result. You should write to the OECD and tell them they are wrong. Cripes, if we can't trust them on this then what can we trust them on? :-)
http://www.localschoolsnetwork.org.uk/2011/07/disadvantaged-pupils-do-wo...
http://www.localschoolsnetwork.org.uk/2011/07/socio-economic-disadvantag...
What you seem to be suggesting is that the context of the school should be taken into account when judging its effectiveness. I would agree with you on that point. It's unfortunate that the government doesn't.
Academies: English progression 66%, Maths 57%
Community: English 70%, Maths 62%
Foundation: English 69%, Maths 61%
Voluntary: English 72%, Maths 65%
Taking the same schools, the figures for students on low prior attainment are:
Academies: English progression 52%, Maths 36%
Community: English 58%, Maths 40%
Foundation: English 58%, Maths 38%
Voluntary: English 59%, Maths 39%
It does seem that, whichever way you cut it, the academies under-perform.
The data gives the figures for the % getting 5 A*-Cs including English and Maths for 2008, 2009, 2010 and 2011. Those without 2008 results can be assumed to be new schools, like Mossbourne. There are 175 of these, and the figures for the % making expected progress are:
English: 63% (v 73% across all state non-special schools)
Maths: 54% (v. 66%)
These figures are slightly lower than for academies as a whole (see above) and lower than community, foundation or voluntary aided schools.
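The "no 2008 result" filter used here can be sketched as follows. The 2008 column name PTAC5EM08 is a guess for illustration only, not necessarily the DfE's actual heading, and the sample rows are invented.

```python
def assumed_new_schools(rows, col_2008="PTAC5EM08"):
    """Schools with no 2008 figure are assumed to be newly opened."""
    return [r for r in rows if r.get(col_2008) in (None, "", "NA")]

# Illustrative rows only.
rows = [
    {"SCHOOL": "A", "PTAC5EM08": "45"},
    {"SCHOOL": "B", "PTAC5EM08": ""},  # no 2008 result: assumed new
]
print([r["SCHOOL"] for r in assumed_new_schools(rows)])  # ['B']
```

The assumption that a missing 2008 figure means a new school is the post's own heuristic; a school could also be missing data for other reasons.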
These are presumably new schools that have the advantages of vast amounts invested in new buildings, have had a fresh start, a specially chosen head and new teachers. All can be assumed to improve results. But these new academies have, on average, under-performed.
What can explain this? I'm a data geek and the only conclusion I can draw from this data is that there is something about being an academy that leads not to better achievement but to worse. Could it be that traditional community schools and the support of a local authority - rather than a sponsor - are actually quite important?
A particular trend I picked up when looking at schools converting to a sponsor is sponsors, including Harris, taking on improving schools and then claiming their success as their own. On its website, Harris takes credit for the achievement of 'our children' in the May 2011 SATS, when in fact it took over the school in Sept 2011.
Have you adjusted these figures for state grammar schools?
At a cursory glance I find that these are variably listed.
For example
Wirral Grammar School for Girls - NFTYPE is CY which is "community"
Clitheroe Royal Grammar School - NFTYPE is FD - "foundation"
St Michael's Catholic Grammar School - NFTYPE is VA - "voluntary aided"
AC5EM11 for all the above ranges from 98 to 100%
So once you adjust for this you should find the percentages dropping for community schools, foundation schools and voluntary aided schools - unless you want us to infer an argument for grammar schools versus academies.
If you compare selective grammar schools with other, non-selective schools you are not comparing like with like. One way round this is to compare the results for the high attainers - these figures are now available in the DfE school performance tables.
The results can be illuminating. Here in Lincolnshire 99% of high attainers at one grammar school achieved the benchmark of 5+ GCSEs A*-C including Maths and English. At a fully-comprehensive school (rare in Lincolnshire) a few miles away, 100% of high attainers achieved the benchmark.
It would be interesting to compare the results of high attainers in schools known to be fully-comprehensive with those of grammar schools whose very purpose is supposed to be to "stretch" the high attainers. Perhaps the results above might be replicated nationally. In which case, the growing rhetoric of "grammars good, comps bad" might be shown to be nothing more than hot air.
He's discussing evidence for the debate about academies = good, community = bad - the DfE's argument is no more sophisticated than that.
Or to put it another way, these figures should compare comprehensive community schools with comprehensive academy schools if you want a fair comparison. Otherwise, thanks for agreeing that grammar schools are a good thing, since they raise the performance of the non-academy categories.
Ben, sorry, but I have no idea what you're talking about.
It's not surprising that academies with comprehensive, or near comprehensive, entries are below the average performance levels of other state sector schools, especially since selective schools are included in the state sector average and many academies started from the tail end of poor performance. What academies need to do is copy Mossbourne, or if they are maintained, copy Bethnal Green.
Results for both schools were impressive - the figures below show the percentage of pupils in each band that gained the benchmark 5+ GCSEs (or equivalent) A*-C including Maths and English:
Mossbourne: Low attainers 61%, Middle attainers 88%, High attainers 97%
Bethnal Green: Low attainers 62%; Middle attainers 89%; High attainers 100%
Two schools - both with a similar intake range, both with similar results (although Bethnal Green has the edge). However, it is Mossbourne that is constantly praised while the other doesn't seem to get a mention, even though its GCSE results in 2008 were poor and it, like the much praised academies, has improved results rapidly.
One of these successful schools is an academy; one isn't. This shows that it isn't necessary to be an academy in order to get good results although the Government would have us believe otherwise.
But both are fully comprehensive and their results for middle and high attainers equal those for grammar schools. This gives the lie to the message that it is only by expanding grammar schools that results can be improved.
But what I really want to know is how are London schools getting such good results?
http://www.education.gov.uk/cgi-bin/schools/performance/school.pl?urn=13...
http://www.education.gov.uk/cgi-bin/schools/performance/school.pl?urn=10...
I did want to do some calculations on comparative performance between grammars and comprehensives - given the data on students with prior high attainment - but the DfE data does not identify which schools are selective.
If academies are supposed to have an all-comprehensive intake then I think it's more valid to compare them with comprehensive maintained schools; you should knock out the secondary moderns and grammars. Don't you think you need to see what the historical performance of academies was, if they converted from the poorly performing maintained sector? Otherwise what you have done is essentially boost the performance of the non-academy schools by knocking out a section of their schools which are then recategorised as academies.
You don't need the DfE for a list of academic selection: there is a list of English grammar schools on Wikipedia which has links to the statute law designating grammar schools, so you can go to the original source. There are only 164 grammar schools, so you can cross-reference them against the data fairly easily. Secondary moderns you can get reasonably accurately by eliminating all grammars and academies from each county; what is left should be categorisable as secondary modern, although I guess it's potentially complicated by non-academic selection such as faith.
http://en.wikipedia.org/wiki/List_of_grammar_schools_in_England
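The cross-referencing suggested above could be sketched like this. SCHNAME as the school-name column is an assumption, and the names here are just two examples standing in for the Wikipedia list and the DfE rows.

```python
def flag_grammars(rows, grammar_names, name_col="SCHNAME"):
    """Mark each row as a grammar school by normalised name match."""
    names = {n.strip().lower() for n in grammar_names}
    for r in rows:
        r["is_grammar"] = r[name_col].strip().lower() in names
    return rows

# Illustrative inputs only.
grammars = ["Clitheroe Royal Grammar School"]
rows = [
    {"SCHNAME": "Clitheroe Royal Grammar School"},
    {"SCHNAME": "Mossbourne Community Academy"},
]
print([r["is_grammar"] for r in flag_grammars(rows, grammars)])  # [True, False]
```

Exact name matching is fragile (renamings, punctuation, "The ..."); matching on a unique school identifier such as the URN, where both sources carry it, would be more robust.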
Few schools categorise themselves as "modern" - they call themselves "high schools", "community schools" and so on. The only way to find out whether schools have a full spread of ability (like Mossbourne and Bethnal Green) is to look at the figures on the DfE website which give the breakdown of how many low, medium, and high attainers there are in a school. One school that does identify itself as "modern" is Gleed Boys' School in Spalding, Lincolnshire (a selective area), which has 25% low attainers, 64% middle attainers and only 11% high attainers (a fully-comprehensive school would be 25% low, 50% middle, 25% high).
Academies are supposed to be fully comprehensive but those in selective areas are not necessarily so. The Marlowe Academy in Kent (another selective area) has the following intake: 42% low attainers, 52% middle attainers, 6% high attainers. St Aldhelm's Academy in Poole (the worst performing secondary school in the UK when judged on raw results) has 38% low attainers, 61% middle attainers and only 1% high attainers.
I've given you a start, Ben. Above I discussed two London schools, one "modern" school, and two academies which are described as "comprehensive" but clearly are not. There are over 3,000 secondary schools in England, so that's about 3,000 records you need to consult. But as you say, it can be done "fairly easily".
Good luck - I look forward to an analysis of your results with the evidence.
http://www.education.gov.uk/cgi-bin/schools/performance/school.pl?urn=12...
http://www.education.gov.uk/cgi-bin/schools/performance/school.pl?urn=12...
http://www.education.gov.uk/cgi-bin/schools/performance/school.pl?urn=13...
It's not that simple though - there are many partially selective secondary schools in the country so you would need to account for those too.
Whichever way you look at the figures it seems there is no evidence for the huge superiority of academies claimed by ministers.
Good point, Adrian. Check out my more recent post for analysis which simply compares academies to non-academies: http://bit.ly/zDOoh7