Academy Trusts still performing worse than local authorities


Analysis by PWC finds that the majority of academy trusts continue to underperform. At the same time, between 2014 and 2015, there has been significant improvement in the local authorities with previously low value added.

Last year, based on 2014 GCSE results, the Department for Education revealed data showing that only 3 of the top 20 academy trusts were above average in terms of value added.

This year, based on 2015 GCSE results, PWC has confirmed the same pattern. Of the 16 academy trusts in the PWC analysis, again only 3 are above average for value added. This was covered by The Times under the headline "huge gulf in academy standards revealed". 

Top Multi Academy Trusts

Harris and ARK were the only trusts to have above average value added in both years, though both saw their figures fall from 2014 to 2015:

Trust           | Schools | Yr 11 pupils | 2015 VA
Outwood Grange  | 7       | 1,536        | 1023
United Learning | 25      | 3,962        | 997
David Rose      | 5       | 708          | 996

Top Local Authorities

PWC did not analyse local authority results, so the table below is based on our analysis. To mirror the DfE report of the 2014 results, this table includes only local authorities that still have at least 5 maintained schools and only the results for those schools. It does not include any academies in those areas. 

Local authority | Schools | Yr 11 pupils | 2015 VA
Waltham Forest  | 11      | 1,592        | 1023
Tower Hamlets   | 14      | 2,283        | 1019

Note: Bromley, Harrow, Hounslow, Southwark, Thurrock and Westminster also have average value added above 1023 (the figure for the best-performing MAT) but are not included as they have fewer than 5 maintained schools. 

Although local authorities are no longer responsible for the 503 (presumably "underperforming") schools that have become sponsored academies, they have also lost more than twice as many (presumably "overperforming") schools, 1,272 in total, to converter academy status. 

Worst Performers: Multi-academy trusts

The local authority with the lowest average value added is Oldham, with a score of 974 (just over 4 GCSE grades below the average). However, there are four MATs with figures below this, one quarter of the trusts in the study. SPTA is the lowest at 955 (7.5 GCSE grades below the average).

Indeed the data shows a significant improvement in local authority performance at the lower end. The 2014 GCSE data showed seven local authorities with value added figures below that Oldham figure. Four of those LAs still had five or more maintained schools in 2015, and all four had increased their scores: two with the same schools as before, and two with one school fewer. 

So we have just 3 of 16 trusts performing above average. Their schools constitute just 16%, or fewer than one in six, of the schools included in the survey. In contrast, of the 94 local authorities that still have five or more maintained secondary schools, 42 have value added above 1000. This represents 45% of the total.

There are no local authorities in this group that had a value added score below 974, but one quarter of the multi-academy trusts included have a value added score below that figure.

Why is the DfE not repeating their analysis?

Many education researchers found that the DfE analysis last year provided an informative insight into the relative performance of local authorities and MATs. So I submitted an FoI request to ask if they had discussed updating it for the 2015 results.

The DfE response revealed that such discussions had been held but they were not prepared to reveal details. The key reason given was that "it relates to the formulation and development of government policy." 

I am puzzled by this response. Are they saying that it is a matter of government policy to decide which data to publish, perhaps meaning only data that supports their arguments? 

It is, though, not surprising that the DfE has not repeated the publication of comparisons between local authorities and multi-academy trusts. The PWC study makes clear that, while a small number of MATs perform well, academy trusts overall perform far worse than local authorities. And fully one in four trusts have value added results below any local authority in England, and clearly require serious attention.


Data Notes

The figures for MATs are taken directly from the PWC analysis. Local authority figures have been calculated from the DfE school-by-school data on 2015 GCSE results, released in January 2016. The calculation is not based on the average school figure for each LA but is weighted according to size of school to give the average pupil figure across the borough. (This is the method used by PWC in their analysis.)
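The pupil-weighted calculation described here can be sketched as follows. The school names and figures below are invented for illustration; the point is that the LA figure is the average across pupils, not the average across schools, so larger schools count for more:

```python
# Illustrative sketch: pupil-weighted value added for a local authority,
# contrasted with a simple (unweighted) average of school scores.
schools = [
    # (name, number of Yr 11 pupils, school value added score)
    ("School A", 120, 1010),
    ("School B", 240, 995),
    ("School C", 90, 1020),
]

total_pupils = sum(pupils for _, pupils, _ in schools)
weighted_va = sum(pupils * va for _, pupils, va in schools) / total_pupils

simple_mean = sum(va for _, _, va in schools) / len(schools)

print(round(weighted_va, 1))  # pupil-weighted figure, pulled towards the big school
print(round(simple_mean, 1))  # unweighted mean of the three school scores
```

With these invented numbers the weighted figure (1004.0) sits below the unweighted school average (1008.3), because the largest school has the lowest score.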

The DfE data can be found here.

To allow for results that could be due to performance before becoming part of the trust, PWC "take account of how long a school has been an Academy with that Trust".

Notes on Value Added

Value added figures are based around an average national figure of 1000, representing the change from a pupil's Key Stage 2 SATs results at age 11 to their GCSE results at age 16. If a pupil achieves above the level expected from their SAT grades, their value added figure will be above 1000. If they achieve below that expectation, their value added figure will be below 1000.

Each 6 points represents one GCSE grade. Therefore, if a pupil achieved BBBBBBBB in a school with a value added of 1000 then, on average, a pupil with the same SATs results would have achieved ABBBBBBB in a school with a value added of 1006. For a school with a value added of 1024, those grades would, on average, be AAAABBBB (or A*A*BBBBBB, or any similar combination).
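The points-to-grades arithmetic above can be expressed as a small helper function. This is just a sketch of the rule of thumb described in the text (6 points per grade around a baseline of 1000), not any official DfE formula:

```python
# 6 value-added points correspond to one GCSE grade across a pupil's results.
POINTS_PER_GRADE = 6
BASELINE = 1000

def grades_above_average(value_added):
    """Total GCSE grades above (positive) or below (negative) expectation."""
    return (value_added - BASELINE) / POINTS_PER_GRADE

print(grades_above_average(1006))  # 1.0  -> e.g. BBBBBBBB becomes ABBBBBBB
print(grades_above_average(1024))  # 4.0  -> e.g. BBBBBBBB becomes AAAABBBB
print(grades_above_average(955))   # -7.5 (the lowest MAT figure in the study)
```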




Roger Titcombe, Mon, 23/05/2016 - 11:17

It now seems such a long time ago that a PWC report 'soundbite', claiming that 'Academies are improving twice as fast as LA schools', was trotted out by every Labour MP and Minister in support of Academies. The mechanism for this illusion of school improvement was Blair's 'Vocational Equivalent' scam. Academies, being more 'entrepreneurial', were faster out of the blocks than LA schools in exploiting the scam, just as car parking companies were quicker than Local Authorities to realise the vast amounts of money to be made from 'free parking' places with three-figure 'penalties' for infringing deliberately complicated and unclear rules.

All this (the schools, not the car parks) is documented in detail in Part 3 of 'Learning Matters', entitled 'Spectacular School Improvement', and in my Forum article that shows how it was done. Another Forum article, about 'Perry Beeches - The Most Improved School - Ever', describes how LA schools began to catch up.

I mention this because of Henry's startling further reservations, which I am still waiting to see featured in a Guardian article or a TV news feature. The 'value added' parameter is significant here. We are not talking about the poorer GCSE results that schools with lower-ability intakes (based on CATs, not SATs) should be getting in any proper education system, with no implication of 'failure' on the part of the school or the pupils, but about value added: what improvements in general attainment, and hence life chances, are conferred by the school.

It is recognised that a balanced intake in terms of ability helps schools to improve the value added for all pupils. This is the foundation for the Hackney schools, LA and Academies alike. It is explained in detail in Part 4 of 'Learning Matters' in respect of Mossbourne Academy.

The following extract is from Section 4.7

"League tables make it inevitable that LA schools geographically located at the centre of areas of high social deprivation with proximity based admission policies would have eventually failed to meet ‘floor targets’, and under the ‘zero tolerance of failure’ policy of New Labour, become candidates for closure and replacement by new banded Academies that could avoid admitting the problem pupils.

It is unbanded academies (unlike Mossbourne, which is banded) in poor areas, but whose sponsors and managers believed that the invigorating effect of a commercial sponsor applying the purgative rigour of the free market would be sufficient to secure transformations, that have proved to be the least successful, especially with the demise of the ‘vocational equivalent scam’ (3.3, 3.4, 3.5) that appeared to provide protection, albeit to the ultimate disadvantage of their pupils.

Sir Michael Wilshaw and his co-founders of Mossbourne were therefore very wise to take the banding route to success. There is nothing unreasonable or educationally undesirable about this decision."

I note that the few MATs that are doing well in value added also, like Mossbourne, have banded admissions systems. I am not sure about the others that are performing less well, but I suspect that many are not.

As far as I am aware, Hackney is the only LA where LA schools also have banded admissions in an LA-wide system.

If Academies and LA schools had the same powers over admissions I have no doubt that LA schools would be showing the failure of Academies to be even more disastrous.

There is a reason for this. The pedagogy of marketised education is less effective. See this article.

James Coombs, Tue, 24/05/2016 - 08:50

Some comments on your information request

Section 35 is a qualified exemption. The anonymous DfE responder says (s)he considered the arguments and came to a decision, but I can’t see these arguments actually presented anywhere! An annotation points to, “A statistical working paper on how DfE might assess the performance of schools within academy chains and local authorities at key stage 4 … seeking views on the 2 measures proposed in the paper …” The document is 62 pages long and life is short, but I think it’s proposing the radical new idea that schools should be measured on value added rather than on how selective the school can be about those it admits through manipulating the admissions process (entrance tests, banding, religion, catchment etc.). Better still, the DfE appeared to be suggesting value added should only be compared between pupils with similar prior attainment, avoiding the issues which Schools Datalab found with high prior attainment cohorts making greater progress. I say ‘radical new idea’, although anyone who has thought about this for more than 90 seconds would just regard it as common sense. 

So I think you’re asking for the outcome/conclusions from a public consultation on how we should measure schools. How could that not be entirely in the public interest if the arguments are properly considered? Section 40(2) is irrelevant because they can just redact the names of junior staff. Perhaps they included it in the hope some of its 'unqualified-ness' would rub off on s35! 

Whilst it could be argued that the information relates to the formulation of government policy, it’s hard to think what sort of statistical information doesn’t inform policy making (unless of course they just make it up as they go along). The DfE have been providing the public with performance data for schools since the 1990s(?), and the way that factual information is interpreted should be completely open and transparent, not subject to ‘government policy’ … unless they feel the need to censor the part where the politicians said, ‘we need to find some way of ensuring that the MATs come out of this looking good. You’ve got the spreadsheets. Fix it!!’ 

Request an internal review. After they’ve refused this again, refer it to the ICO, although they’ll then take another six months with no guarantee of a positive outcome.
