
Posted on 22/01/13


Schools GCSE Data: How the Government Will Present It

This Thursday the DfE will release detailed data on how each secondary school in England performed at GCSE in 2012, including comparisons with previous years, figures with and without GCSE equivalents, and breakdowns by free school meal status and by pupils' prior attainment at age 11 (low, medium or high). It is a remarkably useful and comprehensive set of information, and the Department is to be congratulated for publishing it.

Misleading Statements

However, judging by its past record, the government is likely to spin the information in support of its academies policy in a way that is at best misleading and at worst dishonest. Two statements we are likely to hear, since government ministers have been using them repeatedly over the past few months, are:

“Converter academies achieved above other schools, with 68% achieving 5ACEM (five GCSEs at A*-C including English and Maths), compared to 57% in schools overall.”
“Performance in sponsored academies grew at a rate twice as fast as schools overall.”

Sponsored Academies: What to Look For

“Sponsored academies” include the original Labour academies, converted because of low performance or built in areas of deprivation, plus the continuing conversions, with a sponsor, of under-performing schools. All had low GCSE results and were generally below the floor of 35% achieving the 5ACEM benchmark.
Between 2010 and 2011, previously low-achieving schools grew their results much more than the average, whether academies or maintained schools. Schools with over 60% on the GCSE benchmark did not, on average, increase that figure at all. For those below 35%, the average growth in 2011 was above 8%. Sponsored academies have far more schools in the under-35% category (this being the reason they were converted) and so, even if they do no better than similar schools, will always look impressive when compared to the overall average.
The key questions are:
• How did the growth in GCSE results in academies compare to non-academies with similar results?
• How did that growth compare when GCSE equivalents (such as BTECs) are removed?
The “twice as fast” claim, endlessly repeated by government supporters, was based on the 2010 to 2011 growth. Take the academies on less than 35% in 2010 and they did indeed grow their results by 8%. But take the non-academies on less than 35% and they also grew their results by 8%. Whether or not local authorities chose the academy route to improve their under-performing schools, the results were, overall, the same.
For the first time in this year’s data we will be able to compare two years (2011 and 2012) where the figures for 5ACEM are available without equivalents. We know that academies make more use of GCSE equivalents (an activity described by Gove as ‘gaming’ the system). Comparing their GCSE-only growth with that of similar schools will reveal whether they even perform as well as comparable non-academies.
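As a concrete illustration of the like-for-like check described above, the sketch below shows one way the comparison could be run on Thursday's download. It is a rough outline only: the file name and column names (the 2011 and 2012 without-equivalents 5ACEM figures and a school-type field) are placeholders, not the DfE's actual field names.

```python
# Illustrative sketch only: compare 2011->2012 growth in 5ACEM (without equivalents)
# for sponsored academies against other schools starting from similar 2011 results.
# Column names and the file name are hypothetical placeholders, not real DfE fields.
import pandas as pd

df = pd.read_csv("performance_tables_2012.csv")  # hypothetical file from the DfE release

# Band schools by their 2011 GCSE-only result so like is compared with like
band = pd.cut(df["pct_5acem_noequiv_2011"], bins=[0, 35, 60, 100],
              labels=["below 35%", "35-60%", "above 60%"])
df["growth"] = df["pct_5acem_noequiv_2012"] - df["pct_5acem_noequiv_2011"]
df["sponsored_academy"] = df["school_type"] == "Sponsored academy"

# Average growth in each band, sponsored academies vs all other schools
print(df.groupby([band, "sponsored_academy"])["growth"].mean().unstack())
```

If sponsored academies really are improving faster for reasons other than their starting points, their average growth should beat that of other schools within each band, not just against the overall average.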

Converter Academies: What to Look For

“Converter academies” are those Good and Outstanding schools encouraged by Gove to convert in the last two years. These were, by definition, the better-performing schools. They therefore had better GCSE results and will, unless disaster has struck, continue to have better GCSE results. To claim their higher results are due to academy status is about as sensible as selecting a group of people based on being above average height and then boasting that they are taller than the average. Key questions to ask include:

• Have GCSE results in Converter Academies risen or fallen since becoming academies?
• How have they fared compared to Good or Outstanding schools that did not convert? (Ofsted ratings are unlikely to be included, so the appropriate comparison will be with other schools on similar levels of GCSE achievement.)

As I’ve already noted, the first indications are that results in Converter Academies actually fell. If this is confirmed by Thursday’s data, then serious questions should be asked about the £1 billion overspend on academy conversion.

The Key Question

The government has made clear that its main vehicle of school improvement is the academy programme (and the linked free school initiative, but few of these will have any results yet). The National Audit Office questioned the £1 billion overspend in pursuit of this strategy. So the key question is:

Has this £1 billion been a good use of public money? Does the data show that it has actually resulted in significant school improvement?

And if the data is presented in this misleading way, it raises the question: why? If academies were really performing as well as the government claims, then surely no distortion of the statistics would be necessary.

 


Comments, replies and queries

  1. Excellent, as always. This is approximately a halfway stage for the academies, and the instigation of very detailed report cards at six-week intervals for some academies shows that the DfE not only can exert more central control than before (hence weakening the claimed academy freedoms concept), but that it is almost certainly doing so because it is so concerned that its pet project might be seen to be struggling. Note: a concern of self-interest, not of pupils within the schools.

    Running out of steam so far from the finishing post may explain the ultra-rapid, forced academy conversions. But I strongly feel that it is already failing as an initiative, with the film about teachers and ‘carpet baggers’ to be shown in London on the 31st being a sign of the intense anger that is growing over Roke school and others.

    The DfE seems to be able to operate extremely opaquely, with no recourse for redress. If the claims of cronyism are true, and become exposed, it will become extremely messy. But the DfE has a very poor track record on Freedom of Information responses, so it is very unlikely that it will leak any underhand practice.

    Standing right back, we must keep remembering that they are elected to act on our behalf. A school that repeatedly fails is put into special measures. The DfE falls into this category.

    • Neil – Russell Hobby, NAHT, told TES he thought the DfE issued the scorecards because it wanted to avoid being embarrassed by criticism of its flagship academy programme due to underperformance.

      The scorecards request detailed data including attendance, attainment, quality of lessons and self-rating of performance in such things as finance.

      Brian Lightman, ASCL general secretary, described the intervention as a “very bureaucratic, heavy-handed approach”. He told TES “It is asking for far more information than anything from the local authority or an inspection for a school in special measures.”

      The academies were chosen, apparently, because they risked their results falling below the benchmark. But Lightman said that in one case a school had just been rated “good” by Ofsted.

      The TES article is here: http://www.tes.co.uk/article.aspx?storycode=6314557

      • Thanks Janet. I read the TES article. If you collect together all the strands, it points at DfE desperate to fulfil a denied privatisation agenda in double quick time before they are caught out. I think the forced appointment of Harris Federation instead of the promised local sponsor was one step too far. Deeply cronyistic, if I can conjure up such a word.

  2. Patrick Hadley says:

    Thank you for this pre-results briefing. If all the education correspondents read it we would have much better reporting.

    • Patrick – you are correct to draw attention to the woeful reporting of education matters by most of the media. They’ve just churned DfE press releases without doing what you would expect responsible journalists to do – examine the evidence. Check, check and recheck.

      Instead, they’ve published Gove’s propaganda and he, in turn, has poured obsequious praise on particular editors or newspaper proprietors who’ve supported the Gove line.

      In the worst cases certain sections of the media have relished repeating the “plummeting down league tables” mantra, the propaganda about a “broken” education system, and Gove’s and Cameron’s tough-sounding talk about smashing complacency, enemies of promise etc. Above all, these papers and their so-called education editors have colluded with the deception about academies.

      For one example of biased reporting see my critique here:

      http://www.localschoolsnetwork.org.uk/2010/12/state-education-suffers-from-biased-media-coverage/

      • Patrick Hadley says:

        Does the Local Schools Network publish Press Releases? Even if only one educational correspondent read them and began to understand the issues and stopped simply churning out DfE propaganda it would be worth it.

        • Press releases from education specialists like LSN would be a great idea. Counter-spin I would call it.

        • Patrick – I don’t think the education editors on some papers are really interested in the truth. They’ve continued spitting out Gove’s gobbets since he became Secretary of State.

          But there are signs this might be changing. The media ignored the recently-released DfE spin about the number of open academies.

          It will be interesting to see how the media handles the GCSE performance figures when they’re published this week. Will they churn the DfE press release? Or will they be more critical about academy spin?

          • Adrian Elliott says:

            Janet, in fairness, when I was an active member of ASCL (SHA then) we often heard from education correspondents and editors that it was actually the editors of the papers who would prevent anything but the usual party line on education and schools being published.

            I did notice recently that the Times seemed to be taking quite an independent line on education, over international tests not backing up Gove’s views for example.

            But then guess what? Murdoch sacked the editor of the Times (not just over education, of course, but the result will still be that the approach will change).

  3. The Academies Commission has expressed concern that the public isn’t being told the truth about academies.

    The Commission found that 40 sponsored academies had been “red-rated” in 2011 and 8 “pre-warning notices” had been issued. The Office for the Schools Commissioner (OSC) was monitoring 166 academies – 30 had “causes for concern”. 8% of sponsored academies had been judged “inadequate”. These figures, the Commission said, undermine claims that the number of academies at high risk regarding performance is “very small”.

    The Commission noted that “the public remains uninformed about this”. Presumably this is because it weakens Government spin.

    Academies Commission report available from:
    https://dl.dropbox.com/u/6933673/130109%20-%20Academies%20Commission/Academies_commission_report%20FINAL%20web%20version.pdf

  4. The spin pushed out by the DfE about academy conversion being essential for “improvement” is unravelling.

    It’s obvious that the DfE’s main criterion for “improvement” is raw exam results. This is confirmed by the issue of scorecards to those academies where results are likely to fall below the benchmark even though it appears that one of these has just been judged “good” by Ofsted.

    The Education Endowment Foundation discovered that many below-floor schools were nevertheless “outstanding” and were doing a good job in difficult circumstances. The OECD has warned that the excessive emphasis on test results in England risks having damaging consequences. The Institute for Fiscal Studies found that the quality of a school’s intake governs its results.

    But the DfE continues to insist that the main measure of performance is league table position. No excuses.

    http://www.localschoolsnetwork.org.uk/2011/07/disadvantaged-pupils-do-worse-in-schools-containing-a-large-number-of-disadvantaged-children-new-research-reveals-mark-2/

    http://www.localschoolsnetwork.org.uk/2011/06/too-much-emphasis-on-grades-is-cause-of-concern-say-oecd/

    http://www.localschoolsnetwork.org.uk/2011/08/school-intake-governs-academic-achievement-says-ifs-report/

  5. Whatever the results of converter academies this year, it can’t be said that any “improvement” or otherwise is the result of conversion. Results would have more to do with what took place in the school before conversion, because that’s where most of the GCSE candidates would have spent their education.

    In any case, it’s not necessary to become an academy to raise results, as Henry’s research has found. And PricewaterhouseCoopers (2008) and Ofsted have both found that when schools improve they use similar methods, which have nothing to do with academy status.

    There’s also the London Challenge and the City Challenge which were more successful than the sponsored academies programme.

    • This is the moot point about the whole academies fiasco: the emphasis on matters other than enabling and trusting teachers to teach effectively is the key, not the school management appointees. So academy status was always a red herring. I created a diagram to try to illustrate visually the factors used in school accountability (http://www.educationreform.co.uk/Live/index.php?Id=194) to show that, as you say, enormous factors such as the previous teaching and schooling of pupils are too easily forgotten in political rhetoric.

      • Thanks, Neil, for the link to the succinct chart which shows the inadequacy and unfairness of judging schools simply on exam results.

  6. Roger Titcombe says:

    Henry – Excellent summary and advice. The emphasis on school improvement is not just misleading but educationally damaging. The magic formula for school improvement is still (until 2014) as follows.

    1. Prioritise the English and maths C grade at the expense of E, D, B and A, using all the behaviourist methods possible: threats, rewards, revision, cramming, booster classes, teaching to the test, memorising the exam board revision guides, and leaving out the most difficult parts of the syllabus (needed for A Level) to concentrate on repeated drilling of enough of the easiest parts to get a C. Exam boards have provided guidance on this.

    2. Ensure ALL pupils get at least 3/4 additional 100 percent pass rate vocational equivalents. The most effective way to do this is to substitute BTEC science (4 x Cs) for GCSE science for all pupils unlikely to get a C at GCSE sciences or even (as happens in many schools) make all pupils take BTEC science.

    This has worked for hundreds of ‘improved’ schools for many years, especially sponsored academies.

    The consequences are educationally disastrous for all the reasons set out in my previous posts.

  7. Patrick Hadley says:

    It will be a surprise if the Converter Academies do not show a drop in exam performance. This is explained by the statistical principle of “reversion to the mean”. A group of schools that have performed above their long-term average in one particular year are likely on the whole to do less well than that in the following year. While some of the schools chosen to be CAs will be genuinely good schools that will continue to improve irrespective of status, there are always going to be some which fall into the category of schools that happen to have had a good year, or a rather generous Ofsted grade.
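
    A toy simulation makes the point concrete (the numbers below are invented for illustration and are not real school data):

    ```python
    # Purely illustrative: schools selected for one unusually good year tend,
    # on average, to score lower the following year, with no other cause needed.
    import numpy as np

    rng = np.random.default_rng(0)
    underlying = rng.normal(55, 10, 3000)        # each school's long-run 5ACEM level
    year1 = underlying + rng.normal(0, 5, 3000)  # one year's result = level + luck
    year2 = underlying + rng.normal(0, 5, 3000)

    chosen = year1 > 65                          # pick the schools that look best in year 1
    print(round(year1[chosen].mean(), 1))        # high, partly through good luck
    print(round(year2[chosen].mean(), 1))        # lower on average: reversion to the mean
    ```

    The schools selected for a strong first year score lower, on average, in the second year even though nothing about them has changed.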

  8. The same principle is observed in sports such as golf. It is yet another danger with statistics, a most misused and misunderstood subject.

    But it also raises the more basic question: if the government wants to drive up results by measuring just the results and cracking the whip for ‘under-performers’, rather than understanding deeper issues, then it will be wasting its time, and indeed undermining hope for improvement, not least because education does not succumb to carrot-and-stick measures.

    • Pat Glass, who sits on the Education Select Committee, said in the EBacc debate on 16 January:

      ‘One insider in the system recently told me confidentially, “When the blood bath happens, I expect this Secretary of State will be long gone.”’

  9. Adrian – reply to comment above about education editors (no reply button). Point taken. I should revise my comment to say that newspaper editors put pressure on their education editors to push a particular line (the alternative being that the education editor would lose their job).

    And, of course, newspaper editors are under pressure from the owners (or else the editor joins the dole queue).

    So much for a free and fearless press.

  10. Patrick Hadley says:

    While it is true that much of the media generally supports the Conservatives, it is not true that they are all enthusiastic supporters of everything that this coalition government has done. Some departments are quite regularly criticised even in papers such as the Daily Telegraph and the Daily Mail. There is, however, a myth that the one really successful minister in the government is Michael Gove. I suspect that this is because of a lack of proper understanding of the issues by most of the education correspondents. These correspondents often have little or no specialist knowledge of education, nor do they have any great awareness of how to interpret research in relevant areas. They are journalists, not educationalists.

    • Patrick – and we must remember that Gove also was a journalist not an educationalist.

      But criticism of departments by the media doesn’t necessarily imply criticism of the coalition. For papers like the Mail, government departments are filled with time-serving “mandarins” who deserve to be cleared out (hence all the glee expressed about cutting the number of civil service jobs even when it means there aren’t enough left to do the job properly). Government departments are part of the “bloated public sector”.

    • And of course, Michael Gove was himself just that – a journalist.
