“So half the local authority schools which had their first inspection under the new framework were rated good or outstanding, compared to three quarters of the first batch of free schools.”
Lord Nash, November 2013
Schools Minister Lord Nash identified 16 local authority (LA) schools, known as community schools, which opened in September 2011 at the same time as the first 24 free schools.
A Freedom of Information response showed that 20 mainstream* community schools opened then. 19 were the result of amalgamation; only one was new provision.
Ofsted** has inspected these 20 schools: 10 were Good, 9 Required Improvement and 1 was Inadequate. The Inadequate school was in the process of becoming a sponsored academy at the time of its inspection and is no longer a community school.
So, Lord Nash was right: “new” local authority schools performed worse than first-wave free schools judged on the same Ofsted framework.
There were also 26 further non-academies (Foundation, Voluntary Aided, etc.) opened in September 2011. Five were new provision, two were former independent schools and the rest were the result of amalgamation.
How did Ofsted rate these 26 schools? 5 were Inadequate, 7 Required Improvement, 13 were Good and 1 was Outstanding. If we add these 26 to the 20 community schools, the figures still support Lord Nash’s assessment: the 46 new non-academies as a group performed worse than the 24 first-wave free schools.
First-wave free schools weren’t amalgamations, however; they were either new schools or previously independent ones. There were 8 non-academies in this category, and 62.5% of them (5 of the 8) were Good. Again, the figures support Lord Nash: the 24 free schools as a group performed better. But if just one more of those 8 non-academies had been judged Good then they, as a group, would have performed as well as the free schools.
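The arithmetic behind that last claim can be checked in a few lines (an illustrative sketch, not part of the original analysis; the figures are those quoted above, and “three quarters” of the 24 free schools is taken to mean 18):

```python
# Illustrative check of the proportions quoted in this post.
def good_or_better_rate(good, total):
    """Proportion of schools rated Good or Outstanding."""
    return good / total

free_schools = good_or_better_rate(18, 24)   # three quarters of 24 first-wave free schools
comparable_la = good_or_better_rate(5, 8)    # 5 of the 8 comparable non-academies

print(f"Free schools: {free_schools:.1%}")             # 75.0%
print(f"Comparable non-academies: {comparable_la:.1%}")  # 62.5%

# A single extra Good judgement turns the comparison into a dead heat:
print(f"With one more Good: {good_or_better_rate(6, 8):.1%}")  # 75.0%
```

With a sample of 8, each school moves the headline percentage by 12.5 points, which is why one judgement either way changes the story.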
There are other ways to sort the data. How did Foundation schools fare? Did Voluntary Aided (VA) schools outperform free schools? What about Voluntary Controlled (VC) ones?
At the risk of testing the reader’s patience, I’ll look at the data again. One group of schools outperformed all others. 100% of Voluntary Controlled schools opened in 2011 were good or better.
How many were there? Just one.
And that’s the flaw with these figures. The samples are too small to come to any conclusion.
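None of this appears in the original figures, but a standard Wilson score interval makes the point concrete: with samples this small, the range of “true” proportions consistent with the data is enormous. A minimal sketch, using only the counts quoted above:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - margin, centre + margin

# 5 of the 8 comparable non-academies were Good:
print(wilson_interval(5, 8))   # roughly (0.31, 0.86)
# 1 of 1 Voluntary Controlled schools was Good or better:
print(wilson_interval(1, 1))   # roughly (0.21, 1.0)
```

An interval running from about 31% to 86% is consistent with almost any conclusion, and a sample of one tells us next to nothing at all.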
Good schools are good not because of how they’re structured but because they share common features (Ofsted). Good and improving schools use similar methods which have nothing to do with whether or not they’re academies (PwC, 2008).
Free schools are academies so these conclusions apply to them. A free school which shows the qualities identified by Ofsted is likely to be good. A free school that doesn’t will require improvement or worse.
But we are constantly fed the line that academies and free schools are superior as a model. It smacks of desperation when ministers attempt to justify the superiority of one type of school over another with such small samples.
And when faced with such nonsense, remember, 100% of VC schools which opened in September 2011 are good.
*Primary and secondary; not including Pupil Referral Units or Special schools
**Citing Ofsted judgements does not imply agreement
I’ve tried my best to ensure accuracy in my interpretation of the figures, but one missed school would skew the results. One more VC school, judged Requires Improvement or worse, would make nonsense of my analysis. My conclusions should therefore be viewed with caution. But the point stands: the samples are too small to come to any conclusion.