A growing consensus: Multi Academy Trusts are not the answer

Henry Stewart

Last week's Education Policy Institute report is an addition to the growing body of evidence that Multi-Academy Trusts (MATs) do not result in improved school performance:

"The analysis we have produced casts doubt on the Government's previous policy of academising all schools. It is not clear what the gains from this would be in terms of school performance, not least for schools in high-performing local authorities."

The report includes a foreword from David Laws, who was part of the coalition that promoted academisation so strongly. It is written by Jon Andrews. According to his bio, Jon worked as a statistician at the Department for Education from 2003 to 2016, including having been the "lead statistician on academies" and having played a role in the recent white paper. 

The report makes clear that, overall, MATs underperform compared to local authorities (p27):

  • At secondary level, 53% of MATs performed "significantly below average" for current performance, with only 26% performing "significantly above average"
  • At primary level, 40% of MATs performed "significantly below average" for current performance, with only 18% performing "significantly above average"

The report also analyses improvement in performance. Here secondary MATs also underperform, though it finds that improvement in primary MATs matches that of local authorities.

Given the data, the conclusions are inevitably opposed to mass academisation: "moving from a school in a high-performing local authority to a school in a low-performing multi-academy trust would appear to risk a significant decline in progress and attainment". 

It is tempting to comment on the change in analysis by the author since leaving the DfE. However, this would be inaccurate. While ministers have cherry-picked nuggets of data that appear to show academies in a good light, the actual analysis by DfE statisticians is completely in line with this report.

The DfE analysis of academy chains in March 2015, produced while Jon was still at the DfE, found that only 3 of the top 20 chains created value added above the national average in their secondary schools. And the follow-up released at the same time as the EPI report came to remarkably similar conclusions: it found 54% of MATs had performance "significantly below average" for secondary schools, with only 24% being "significantly above average".

When we started producing analysis of the relative performance of academies, there was considerable debate about whether academies or local authority maintained schools performed better. That debate appears to be over, at least among those who analyse the data. Neither the DfE's own statisticians nor independent bodies like the Education Policy Institute claim that, overall, Multi-Academy Trusts perform better. Indeed, at secondary level at least, it seems clear that MATs underperform. Both the DfE and the EPI find that the majority of secondary MATs are "significantly below average" in their performance.

The analysis effectively confirms that the focus on structure has been a huge diversion. Is it too much to hope that the new Secretary of State will study the data and conclude that a change of course is needed? It is time to start focusing on the things that actually make a difference, like leadership, quality of teaching and a positive learning culture in schools.




Roger Titcombe
Sun, 17/07/2016 - 18:29

This is more important stuff from Henry, but it still understates the failures of Academy Trusts. The reason can be found in a very careful reading of this article by Warwick Mansell.


The following extract is from that article.

Problems: teaching to test and inclusion

However, there are two main problems. The first is well-known. It is simply that focusing on exam results as the sole arbiter of success may tell us how effective the institution is at concentrating on performance metrics, but not much about other aspects of education. It may encourage narrow teaching to tests.

Despite the multiple measures used, both of these reports seem to encourage one-dimensional verdicts on which are the ‘best’ academy trusts: the ones which manage to see the pupils who are included in the indicators which the research uses – in the case of the Sutton Trust research, disadvantaged pupils, and in the EPI study, pupils as a whole – achieving the best results.

Yet the reality, it seems to me, is much more complex. A prominent academy chain, which runs schools near where I live, has been known to do well in statistical assessments of its results. Yet some parents I speak to seem not to want to go near it, because of a hard-line approach to pupil discipline and a reportedly test-obsessed outlook. This may generate the results prized in studies such as these, but are these schools unequivocally better than others? I think researchers should at least acknowledge that their results may not be the final word on what counts as quality. My hunch is that these studies may be picking up on academy trusts which are more successful in managing the process of getting good results for their institutions. But is that the same as providing a generally good, all-round education for all those they might educate? The reports offer no answers because they are purely statistical exercises which do not investigate what might be driving changes in results. So we need at least to be cautious with interpretation.

This is especially the case when we move on to perhaps the less obvious concern about these studies. It is that both investigations focus entirely on results at institutional level, counting the success of schools in getting good results out of those pupils who are on their books at the time the statistical indicators are compiled. However, this ignores a potentially serious perverse incentive of England’s results-based, increasingly deregulated education system.

The studies seem entirely uncurious about what is often put to me, by observers of its effects on the ground, as a very serious risk inherent in the academies scheme as currently understood. This is that in deregulating such that each academy trust is given a degree of autonomy, coupled with the pressure on each trust to improve its results, a perverse incentive is created for trusts to become less inclusive.

In other words, they either use admissions to take on more pupils who are likely to help their results, or they try to push out students who are already on their books but less likely to help their results. This concern is referenced in the research review I carried out for CPRT. This quotes a finding from the Pearson/RSA 2013 review of academies which said: ‘Numerous submissions to the Commission suggest some academies are finding methods to select covertly’. The commission’s director was Professor Becky Francis, who is a co-author of the Sutton Trust study, so it is surprising that the latter paper did not look at changing student composition in MATs.

A statistical approach that sums up the effectiveness of academy chains entirely through their results, without any way of checking whether they are becoming more selective, does not address this issue.

I admit, here, that I have more reasons to be concerned at the secondary, rather than at the primary, level. Since 2014, I have carried out simple statistical research showing how a small minority of secondary schools have seen the number of pupils in particular year groups dropping sharply between the time they arrive in year seven and when they complete their GCSEs, in year 11.

Indeed, one of the top-performing chains in both these reports – the Harris Federation – has recently seen secondary cohort numbers dropping markedly. Harris's 2013 GCSE year group was 12 per cent smaller than the same cohort in year 8. The 2015 Harris GCSE cohort was 8 per cent smaller than when the same cohort was in year 7. This data is publicly available, yet neither report investigates shrinking cohort size. That is not to say anything untoward has gone on – Harris is also very successful in Ofsted inspections, and has said in the past that pupils have left to go to new schools, to leave the UK or to be home-educated – but it certainly would seem worth looking into.
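The cohort comparison above is straightforward arithmetic: the percentage fall between an early-year cohort and the same cohort at GCSE. As a minimal sketch (the pupil counts below are hypothetical, chosen only to illustrate the 12 per cent figure, and are not Harris's actual numbers):

```python
def cohort_shrinkage(start_count: int, exam_count: int) -> float:
    """Percentage fall from an early-year cohort to the exam-year cohort."""
    return 100 * (start_count - exam_count) / start_count

# Hypothetical illustration: a year-7/8 cohort of 1,000 pupils
# that has fallen to 880 by the time GCSEs are taken.
print(cohort_shrinkage(1000, 880))  # → 12.0
```

Any such figure would of course need checking against the published DfE school census data before drawing conclusions about a particular chain.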

When the Sutton Trust study mentions ‘[academy] chains that are providing transformational outcomes to their disadvantaged pupils’, its figures are based only on those actually in the chains in the immediate run-up to taking exams. Would the analysis change if it included all those who started out at the schools? We don’t know. The fact that DfE data is available suggesting major changes in pupil cohorts but it seems not to have been looked at is remarkable.

In addition, the fact that high-profile research studies purporting to show the success of organisations do not consider alternative readings of their statistics may give academy trusts an incentive not to think about students whom they may consider harder to educate. Results measures currently provide an incentive to push such students out.

The lack of curiosity is extra surprising, given that the issue of ‘attrition rates’ – schools losing students – has been live in the debate over the success of one of the largest charter school operators in the US, KIPP schools.

Janet Downs
Mon, 18/07/2016 - 10:34

Academization is not the answer. It never was. Deception about academies has been going on since they began under Labour. Bad as this deception was under Labour, it reached stratospheric heights under the Coalition. Voices which contradicted the official line (Academies goooood; LA schools baaaaad) were at best ignored (Academies Commission Report 2013) or vilified ('enemies of promise', 'Marxists' etc). Let's hope Justine Greening is influenced by evidence and not by rhetoric.


Janet Downs
Mon, 18/07/2016 - 10:37

Voices pointing out that removing schools from LA control doesn't automatically raise performance (and may actually do the opposite) are getting louder. Yet the Government is still committed to a fully-academized system in England in the face of this opposition. Why? The first clue is in an article by James Croft, Adam Smith Institute, in May 2015. He said there weren't enough 'high quality sponsors' so the time had come to consider letting for-profit education providers run schools in England. Academy failure actually feeds this conclusion. A cynic might say it was meant to.

The second clue is a statement by Michael Gove when he was shadow education secretary shortly before the 2010 election. He told Policy Exchange he would let groups like Serco run schools in England.
