
Posted on 24/01/13

2012 GCSE Data: Non Academies Do As Well as Academies

This morning the DfE issued its school-by-school GCSE data, claiming that sponsored academies were improving at five times the rate of all state-funded schools. This claim is in line with the prediction I made 2 days ago of how the DfE would distort the results.

As you can see from the graph below, the size of a school’s GCSE increase is closely related to its previous results. Those with previously low results tend to see large increases, while those with previously high results tend to see only small increases or falls. The best way to judge how one set of schools performs is therefore to compare like-with-like.
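For anyone who wants to reproduce this kind of like-with-like comparison from the published school-level data, a minimal sketch is below. It assumes a CSV export of the data; the file name and column names are placeholders, not the actual DfE field names.

```python
# Minimal sketch of the banding comparison described above.
# The file name and column names are placeholders, not the actual DfE headers.
import pandas as pd

df = pd.read_csv("gcse_school_level_2012.csv")

# Keep only schools with a benchmark figure in both years
df = df.dropna(subset=["benchmark_2011", "benchmark_2012"])
df["change"] = df["benchmark_2012"] - df["benchmark_2011"]

# Band schools by their 2011 benchmark, then compare the average change
# for sponsored academies and maintained schools within each band
df["band_2011"] = pd.cut(
    df["benchmark_2011"],
    bins=[0, 20, 40, 60, 80, 100],
    labels=["0-20%", "20-40%", "40-60%", "60-80%", "80-100%"],
)

print(df.groupby(["band_2011", "school_type"])["change"].agg(["mean", "count"]))
```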

How Did Similar Academies & Non-Academies Compare?

For those schools whose GCSE benchmark was in the 20-40% range in 2011, academies improved by an average of 7.8% and maintained schools by 7.7%. Both are impressive results, and these schools should be praised for their improvement. However, it does seem that the structure of the school makes little difference, and it would be good to see a bit more praise of the success of non-academies from the Department for Education.

In the other bands, non-academies did slightly better in the 40-60% and 80-100% bands and academies did slightly better in the 60-80% band.

This backs up our research on the 2011 data, which showed that – when compared to similar schools – academies did no better (and sometimes worse).

Again we find that previously under-performing schools which chose to stay with their local authorities did as well as those which found a sponsor and became academies. Perhaps more attention should be paid to what these schools and successful local authorities are doing – without the cost and time of academy conversion.

The DfE Claim: A Gross Distortion

It is technically true that the benchmark GCSE results for academies grew, on average, by 3.1% compared with 0.6% for all state schools. However this is not comparing like-with-like: it simply reflects the tendency of results at more successful schools to grow more slowly (or fall). It is a gross distortion of the data to present this as a meaningful conclusion.

This analysis does not yet take account of GCSE equivalents (such as BTECs), the use of which Michael Gove regards as ‘gaming’. Once these are excluded, the benchmark GCSE figure falls by 12% for sponsored academies but by only 6.6% for non-academies. The comparison can therefore be expected to be even less flattering for academies if GCSE-only figures are used.

One caveat must be added to any analysis of 2012 GCSE results: the effects of the summer English grading debacle may have fallen somewhat randomly across schools. This could be a factor in the fall in results at the top end.

Data Note

I have not included the 0-20% band as there were only 3 schools in this range in 2011. All were academies and all did improve, by a very impressive average of 22%. However, the sample is very small and there are no non-academies to compare them to. Numbers of schools in each band:

20%-40%: 73 sponsored academies, 175 non-academies
40%-60%: 134 academies, 1,012 non-academies
60%-80%: 27 academies, 706 non-academies
80%-100%: 12 academies, 134 non-academies

Converter academies, which were not included in the DfE press release and whose results fell overall, have not been included in this comparison.

Sponsored academies are generally previously under-performing schools that are supported by a sponsor. Converter academies are previously Good or Outstanding schools that were offered the chance to convert on their own.


Comments, replies and queries

  1. Paul Reeve says:

    This school opened as an academy in January 2012 but the results for KS4 are still accredited to the school before it converted.
    http://tinyurl.com/bdg6mw7

    Is that legitimate?

    • Paul – this doesn’t seem unusual. In my area one school that converted on 1 September 2011 had its results accredited to the academy. But three academies which didn’t convert until after that date haven’t. Their 2012 GCSE results have been accredited to the pre-converter school.

      It appears that any school which converted on or before 1 September 2011 will have its results accredited to the academy. But schools which didn’t convert until after 1 September have their results accredited to their old schools.

  2. Simon Longley says:

    Thanks for this analysis – which I wish I could see reflected at (e.g.) the Guardian – rather than the uncritical parroting of DfE press releases. Your graph looks like a very good example of regression to the mean, combined with some successful pressure for improvement at the lower end! Not surprising to those of us who have long noted central government’s preoccupation with structure/organisation over teaching/learning.

  3. Thanks, Henry, for this analysis. The DfE is trying very, very hard to show that sponsored academies are doing spectacularly well: their latest press release (which for some odd reason is not on the DfE news home page) uses such hyperbole as “great success”, “amazing”, “brilliant sponsors”.

    Trying a bit too hard, I think. Especially as the Academies Commission, while praising some sponsored academies, like Mossbourne, for their “stunning success”, found that most sponsored academies don’t perform any better than similar non-academy schools.

    http://www.education.gov.uk/inthenews

    http://www.education.gov.uk/inthenews/inthenews/a00220534/secschoolperf

  4. In Leicester, all the schools maintained by Leicester City achieved above the new benchmark target. This is an improvement on last year. Two city schools which are not under local authority control, Samworth Enterprise Academy, and Darul Uloom, an independent school, did not reach the benchmark. A Councillor said the city council would be happy to work with those schools.

    http://www.thisisleicestershire.co.uk/level-GCSE-league-tables-Leicester-city-schools/story-17940329-detail/story.html

  5. Patrick Hadley says:

    I downloaded the data into a spreadsheet, filtered for “Type contains AC”, removed any schools without data in both 2011 and 2012, and found that the average for A-C inc Eng & Maths for the academy schools fell from 63.06% to 62.79%.

    If my method is correct then it contradicts the DfE claim that converting to academy status means that schools improve “five times as fast”. Surely the Office for National Statistics should comment on this misuse of official statistics by a government department.
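
    For anyone who wants to repeat that check programmatically, a rough pandas equivalent of those spreadsheet steps is sketched below; the file name and column names are assumptions, since the actual headers in the DfE download may differ.

    ```python
    # Rough equivalent of the spreadsheet check described above.
    # File name and column names are assumed, not the actual DfE headers.
    import pandas as pd

    df = pd.read_csv("ks4_performance_tables_2012.csv")

    # "Type contains AC": keep academies only
    academies = df[df["school_type"].str.contains("AC", na=False)]

    # Drop schools without a benchmark figure in both years
    academies = academies.dropna(subset=["benchmark_2011", "benchmark_2012"])

    print(academies["benchmark_2011"].mean(), academies["benchmark_2012"].mean())
    ```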

  6. Thank you for drawing attention to this, as no doubt this bold misinformation would otherwise (and may still) be regarded as true.

    It strikes me that the main error here is inferring a causal effect from an observed association between a factor and an outcome. To even begin to make such an assertion, we have to be comparing like with like, which from your analysis is clearly not true. (We may use modelling to try to adjust for unbalanced allocation, but that may not capture all the impacts. Ideally we would use RCTs to get a fair comparison.)

    On a more basic but related point, it is poor practice to quote a change without giving the context. Had they reported each change of x percentage points alongside its starting point of p%, we would probably all have been drawn to the fact that the starting conditions were not comparable.

  7. Another thing about causality is being clear about its direction. Even if the other conditions were met, we would still need to consider whether becoming an academy improves performance or whether the potential for improvement makes becoming an academy more likely.

  8. Somebody on facebook has asked –

    Why did the chappy that does the blog not break this down further into Maintained Schools, Forced Academies and Elective Academies? Why have all academies been viewed as one tranche?

    Can anybody answer that? (The facebook user declined my invitation to ask himself…)

    • Colin – the answer is this:

      It would be difficult to find which secondary academies were forced. One all-through academy admitted it “jumped” before it was pushed but it would be difficult to find an accurate number of those secondary academies that had done the same. The Academies Commission said that some academies had indeed “jumped” before they were forced to change but these might have included primary academies.

      In the case of Lincolnshire, the County Council has “advised” all its schools to become academies preferably with the CfBT trust. Many, particularly secondary schools, have done so. Whether this could be classed as “elective”, “forced” or “jumping before being shoved” is debatable.

      http://www.localschoolsnetwork.org.uk/2012/04/the-school-%e2%80%9cjumped-before-it-was-pushed%e2%80%9d-admits-governor-of-community-school-faced-with-academy-conversion/

    • An interesting question Colin. The reason was that I was responding to the DfE press release, which compared only sponsored academies and maintained schools. It made no mention at all of converter academies, possibly because their results had, on average, fallen. I will be doing further analysis over the weekend and hope to include something on converters.

      • Converters haven’t been around long enough to come to any sort of conclusion. And the DfE has accredited some converter academy 2012 results to predecessor non-academy schools.

        In any case, correlation isn’t causation. Just because converter academies’ results go up or down doesn’t mean that the cause is conversion. As PwC 2008 and Ofsted found – when schools improve they use similar methods which have nothing to do with academy status. And the Academies Commission 2013 said that anything an academy can do, a maintained school can also do. One of its recommendations was that schools concentrate on teaching and learning not systems and timetables.

        • Leonard James says:

          I’d be interested in that as well – does anyone even know how many forced conversions there have been, and how many of the elective academies jumped before they were pushed?

          • I can tell you that the data shows 321 sponsored academies (of which 248 have GCSE results) and 680 converter academies. The figure for sponsored is up only 6 on last year’s 315. These are the ones that may have been forced or encouraged to change status (due to poor results).

            This doesn’t include any that have become academies since the GCSE exams, and there will have been a lot in Sept 2012. However these are figures for secondaries and I think most of the forced conversions are taking place in primaries.

  9. Henry – a further press release has been issued by the DfE about the Statistical First Release. It says:

    “The revised results published in this Statistical First Release (SFR) are based on data checked by schools. This shows that the percentage of pupils in all schools achieving five or more GCSEs at grade A* to C or equivalent, including English and mathematics GCSEs, has increased slightly from 2010/11. However, the provisional results published in SFR 25/2012 reported a small decrease in this measure, driven by a significant drop in the percentage of pupils in independent schools achieving this standard. A small fall in the figure for independent schools is still evident in the revised results.”

    I don’t remember seeing any media comments about “a small fall” in the percentage of independent school pupils reaching the benchmark, never mind a “significant drop”. If the same thing had been said about state schools I’m sure it would have been splashed across front pages in 4″ high font.

    The press release also says:

    “Pupils in selective schools continue to outperform pupils in other schools in all main Key Stage 4 (KS4) indicators.”

    Now there’s a surprise. Couldn’t be anything to do with the fact that they cream off high-performing pupils, could it?

    http://www.education.gov.uk/researchandstatistics/datasets/a00219173/gcse-and-equivalent-results

  10. Roger Titcombe says:

    Why, despite all my efforts to the contrary, do all these posts assume that school improvement as defined by C grade thresholds at GCSE is a good thing? Which part of degraded curriculum and degraded teaching methods are all you good folk failing to understand?

    • Leonard James says:

      Isn’t the point that, by their own standards, one of the government’s flagship policies is having no effect?

    • Roger, it’s a fair point but, as Leonard says, the GCSE levels are the standard by which the government asks to be judged – and the question we are trying to answer is whether the academy policy is helping to meet this target.

      And the achievement of a C grade is an important one, as whether a student has 5 GCSEs including English and Maths will be crucial in determining what they can progress to. Though we should pay due attention to your points on ways to game this benchmark.

      • Roger Titcombe says:

        Henry – The C grade threshold is not as important to progression as is commonly believed. A C grade is not usually enough to progress to A Levels, and a C obtained by cramming in a school that doesn’t teach the full syllabus certainly isn’t. 30 percent of students are dropping out of academic A levels – see previous post. A whole host of other careers that may specify C grade entry requirements in practice admit students without them. It is the economics of bums on seats that pervades the FE sector. You might be surprised to learn that this includes nursing and midwifery, where minimum entry requirements were abolished a few years ago. Few 11-18 schools would deny a flunked exam student access to a particular 16+ course if the GCSE teacher made a strong enough case. GCSEs are often retaken in Y12 anyway. There are many other examples. The C grade threshold has become high stakes primarily because of the need for a simple performance indicator to drive league tables and provide a wholly invalid basis for the parental choice that drives competition.

  11. Roger Titcombe says:

    Thank you – I digress. This is a really important article that lifts the lid on what market competition is doing to our education system. This is behaviourism in extremis. Can anyone seriously doubt that this culture damages long-term, deep learning – by both the poor students and the new teachers, both of whom know no better? Is it so hard to believe that this is resulting in a real decline in understanding on the part of English pupils, captured by PISA and other tests? Stop arguing about which schools are improving fastest by these methods; it’s not the main issue. Nothing short of abolishing the competition-based system will deliver our education system from this ‘Alice in Wonderland’ madhouse. Gove is the ‘Mad Hatter’ at the tea party.

    • Leonard James says:

      Why should discussion be limited to what you consider to be the main issue?

    • Roger – I agree with you that league tables are having a distorting effect on what is being taught in schools. I’ve said so often enough. I haven’t “failed to understand” this because I discuss league tables.

      However, as Leonard says, by the Government’s much-publicised and preferred method of measuring schools, their flagship academy programme is NOT having the effect that Gove and Adonis before him claim it has.

      I’m as angry as you about the “Alice in Wonderland” madhouse which judges schools on raw results only. But league tables are not going to go away – even Labour is quiet about this, but you’d expect that considering they upped the emphasis on test results.

      • Roger Titcombe says:

        Janet – Of course it is right to draw attention to abuse of data by the government in support of the academies programme. Very well done Henry and everyone else. But this is still a symptom of the malaise, not the cause. Your last paragraph pinpoints the issue. Labour is now heavily involved in preparing its policies for the next General Election. If nothing changes then the core problems of the education system will remain. I happen to think the public is growing as dissatisfied with league tables, competition and fragmentation as we are. But leadership is needed. It is vital to wake up the serious media to the failings in the system AND their root cause. The Telegraph has in my view often been the most critical. The Guardian, Independent, BBC and Ch4 need to catch up. The media must be targeted, but the message must go deeper than the failings of Academies and the iniquities of Free Schools. Gove is riding on the train created and set in motion by Blair and Adonis, and that must be discredited and derailed if we are to transform hand wringing into effective opposition. I think that the problem with the media is more ignorance than malign intent; however the Guardian in particular has a lot to answer for, as it has been banging the drum for academies and the mirage of school improvement for a very long time.

        Leonard – Of course there should be no closing down of any debate, but there is always a danger that forums can become places where lots of time and energy is taken up arguing for the sake of argument, producing plenty of heat but little light.

        • Roger – the discrediting of Adonis’s and Blair’s sponsored academy programme which built the foundations of Gove’s policies has been featured here:

          http://www.localschoolsnetwork.org.uk/2012/03/deception-about-academies-has-been-going-on-since-they-first-opened/

          It’s part of an effective opposition to point out the failings in Government propaganda, whether this be the “success” of the sponsored academy programme as measured by results when the opposite is the case (Henry’s research), or schools “plummeting” down league tables, or applying new measures retrospectively (EBacc, Ofsted’s “requires improvement”) to show a large number of schools as “failing”, as well as pointing out the futility of league tables.

          That’s not a waste of time and energy.

  12. Roger – I and many other people may realise that league table position is not a reliable indicator of the quality of education offered by a school. But the Government and the media treat it as if it were. And parents are guided by what they read in the media.

    Like it or not, parents view league table position as a reliable measure of the quality of education. “High performing” schools attract a large number of applications while “low” performers are viewed as the last option especially if such schools have been “named and shamed” in the media just because headline results are low.

    And Ofsted seem to be unduly influenced by these raw results, meaning a school which fails to meet the benchmark is unlikely to be judged “good” even if it does a good job in difficult circumstances, i.e. a transient population, a large number of SEN pupils, pupils struggling with English, a small number of high-attainers, serving an area of deprivation and low aspiration, and so on.

  13. Tubby Isaacs says:

    Henry, I noticed something interesting in the Guardian article:

    “The bottom of the table could be misleading, however, with Pate’s grammar school in Cheltenham recording no pupils at all getting five good GCSEs. This was because the school used a new English exam that was not included in this measure.”

    Presumably this would give a fall of not far off 100% for Pate’s. Do you know if the DfE included this school (and any others like it) in the overall average for non-sponsored academies? It would certainly bring the average down.

    Regards
