Time to Fact Check Our Debate with DfE

Henry Stewart
Over the last few weeks we have published a series of posts analysing the DfE's 2011 GCSE data. We believe it shows that academies perform no better than comparable state schools, and on some indicators perform worse. We have refuted the DfE rebuttal. Gove advisor Sam Freedman (@samfr) has engaged with us on Twitter but has declined to discuss it further. Two broadsheet education correspondents have offered either to have the rival views independently assessed or to publish an email debate. @samfr has declined to take part and no other DfE spokesperson has volunteered.

It is possible that DfE spokespeople and the supporters of academies are simply very shy and like to avoid publicity. But it is hard to avoid the more likely conclusion: that they are unable to argue against what the data clearly shows.

Time to Fact Check

Independent fact checkers Full Fact have kindly offered to give their assessment of the viewpoints. So here I will set out the facts to be checked and invite the supporters of academies to put any rival views. Our claim:

The 2011 DfE data on GCSE results provides no evidence, despite DfE claims, that academies perform better than comparable non-academies. Indeed on some criteria they perform worse. This is despite the extra funding that these initial sponsor-led academies received.

This is the statement we would like to be checked. We are referring solely to the 2011 data, as this is the only data that has been released in sufficient detail to enable comparisons of comparable academies and non-academies. The DfE, in response, have claimed that the 2011 data paints a different picture and that academies grew twice as fast as non-academies between 2010 and 2011.

The key points of our case were put in this Observer article and in a subsequent post showing that the DfE claim that academies grew faster did not stand up when they were compared to similar schools (that post links to most of the detailed analysis, also available here). The DfE set out its arguments in a response to the Observer article published last week. We in turn published a post to refute that argument. There was also an interesting piece of analysis of the 2011 data by Leonard James which came to the same conclusion.

The key explanation of the different interpretations is contained in the House of Commons briefing on the performance of academies, specifically the graph on p9. This shows that schools with low 2010 results showed strong growth (over 8% on average for those starting from under 35%), while those with high 2010 results (over 65%) showed, on average, no growth at all. Therefore any sample would show above-average growth if it contained a high proportion of schools with low 2010 figures, as the academies sample did.

The key is to ensure you compare similar schools. When we compared schools whose GCSE results were under 35% in 2010, we found both academies and non-academies showed an average 8% growth in results from 2010 to 2011. This is great news: it shows that, overall, schools that were getting low results are improving fast. But it makes no significant difference whether they are academies or non-academies.
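The like-for-like method described above can be sketched in a few lines of code. This is an illustrative sketch only, not the authors' actual analysis: the school names and percentages below are hypothetical, and the real work would run over the full DfE dataset.

```python
# Sketch of a like-for-like comparison: group schools by their 2010 baseline
# band first, then compare average 2010->2011 growth within each band.
# All school records below are hypothetical, not DfE figures.
from statistics import mean

schools = [
    # (name, is_academy, pct_5_A_to_C_2010, pct_5_A_to_C_2011)
    ("School A", True,  30.0, 38.5),
    ("School B", False, 28.0, 36.0),
    ("School C", True,  33.0, 41.0),
    ("School D", False, 31.0, 39.5),
    ("School E", True,  70.0, 70.5),
    ("School F", False, 68.0, 68.0),
]

def mean_growth(records, is_academy, band):
    """Average 2010->2011 change for one school type within a 2010 band."""
    lo, hi = band
    changes = [r[3] - r[2] for r in records
               if r[1] == is_academy and lo <= r[2] < hi]
    return mean(changes) if changes else None

low_band = (0.0, 35.0)  # schools starting under 35% in 2010
print("Academies under 35% in 2010:", mean_growth(schools, True, low_band))
print("Non-academies under 35% in 2010:", mean_growth(schools, False, low_band))
```

With comparable baselines, the two groups show comparable growth; compare the raw averages without banding and the group with more low-baseline schools would appear to "grow faster".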

That is our evidence and I will tweet @fullfact asking them to check our claim. I will also tweet @samfr inviting him to put any contrary evidence and I also invite any supporters of academies to post their comments below.

Comments

Allan Beavis
Sat, 17/03/2012 - 17:23

What strikes me as very peculiar is this: when it comes to smearing local authorities, bullying schools, advancing the Academy and Free School policy and dismissing maintained schools, this government is obsessed with stats, data, charts, performance tables and league tables. Yet it shows a reluctance to publicise that data when, once collated and assessed, it works against them and reveals not only that the Academy policy is still questionable but that Academies do not perform better than non-Academies.

British school children are among the most tested in the world. The excuse is that this relentless probing will raise standards. Better exam results are apparently behind the imposition of rigid and authoritarian discipline in many Academies: appropriate haircuts, no contravention whatsoever of the school uniform, absolutely no touching, and no talking during lunch. In one Hackney Academy, a boy was made to remove his gloves and then given a detention because they were the wrong colour. He spent the day with frozen hands, so I suppose the school thought it more important to impose its will than to keep a child warm. Children with Mediterranean parents find it unnatural that you aren’t allowed social touching or talking when sharing food. Perhaps British children do too.

Finland, a nation consistently at or near the top of the PISA tables, hardly tests at all. Education in the tiger “nations” – Shanghai, Singapore, South Korea – has traditionally placed a far greater emphasis on rote learning, but even in China there is a move towards a more creative system of teaching and learning. In none of these top-performing nations are a school’s results published in order to punish it, either by showing it does less well than a neighbouring school or by putting it on a hit list for closure. Test results are used not to punish or reward teachers, as they are in the US, but to help the school assess how best to teach the children in its care.

When school success is measured solely by statistics, data and tests, we run a very high risk of dubious practices being introduced into the system, and we have already seen some of these in action: the handing over of Downhills Primary School in Tottenham to the Harris Federation, whose founder is a Tory donor; less able children being “nudged out” of or excluded from Academies (bad “behaviour” being the official reason); covert or even overt selection, such as Free Schools’ right to amend the Admissions Code; and even cheating, as illustrated by the case at the end of last year when two examiners were secretly filmed briefing teachers at paid-for seminars, giving advice on exam questions and the exact wording pupils should use to obtain higher marks.

Those of us who do not believe that results and tests are the only measures by which to judge a school’s performance might therefore be uncomfortable with Henry Stewart’s brilliant analysis of figures published by the DfE showing that Academies do not outperform non-Academies. But why should we be uncomfortable with this comparison, when all it shows is that the schools favoured and coerced into greater numbers by the government are not outperforming the type of schools the government is trying to dismantle?

It is Gove’s policies which have encouraged unhealthy competition. The virtually unfettered Academy programme has been both the flagship and the symbol of the supposed superiority of the Academy and the failure of the Comprehensive. Henry’s analysis is not a “play-them-at-their-own-game” shaming of Academies but it is a challenge to the government to explain why they have consistently and unfairly misrepresented the superiority of Academies and ignored the achievements of non-Academies.

The government is keen on using results, statistics and charts to manipulate public opinion when it suits them, but it seems that when their own data calls into question their policies or their rhetoric, they are less than willing to rise to the challenge.
