Big Improvement in Primary Results - and it's nothing to do with Academies

Henry Stewart
The primary school data released today showed a remarkable improvement in results. The percentage achieving Level 4 in both Maths and English rose by an average of 5.2% pts in community schools, but by just 3.7% pts in those primaries that are academies. A big congratulations to the hard-working students, teachers and other staff in primaries across the country.

The DfE noted that only 476 primary schools are now below the "floor targets", compared to 1,310 last year. This transformation has nothing to do with the academies policy that the DfE promotes as its main route to school improvement. The schools that had low results in 2011 showed the largest increases. The figures for community primary schools are:

* 21 schools on < 20% in 2011: Average 8% pts increase
* 54 schools on 20-30%: Average 30% pts increase
* 157 schools on 30-40%: Average 27% pts increase
* 451 schools on 40-50%: Average 23% pts increase
* 1,212 schools on 50-60%: Average 17% pts increase

Academies did almost as well, achieving -5%, +23%, +27%, +12% and +17% pts in these categories. However, according to this data, those figures come from very few academies - just one or two in each of the first four categories and just 14 in the 50-60% category. So of the schools under the 60% floor target in 2011, 20 were academies and 1,895 were community schools. The massive drop in the number of schools under the floor target therefore has virtually nothing to do with academies.
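For anyone who wants to check these band-by-band figures against the underlying tables, here is a minimal sketch of the calculation, assuming the DfE performance tables have been saved as a CSV. The file name and column names (pct_l4_both_2011, pct_l4_both_2012, school_type) are placeholders, not the DfE's actual headings.

```python
import pandas as pd

# Sketch only: file name and column names are assumptions, not the DfE's own headings.
df = pd.read_csv("ks2_performance_2012.csv")

# Keep schools with results in both years and work out the change in the
# % achieving Level 4 in both English and maths.
df = df.dropna(subset=["pct_l4_both_2011", "pct_l4_both_2012"])
df["change"] = df["pct_l4_both_2012"] - df["pct_l4_both_2011"]

# Band schools by their 2011 result, as in the list above (<20%, 20-30%, ... 50-60%).
bands = [0, 20, 30, 40, 50, 60]
below_floor = df[df["pct_l4_both_2011"] < 60].copy()
below_floor["band_2011"] = pd.cut(below_floor["pct_l4_both_2011"], bins=bands, right=False)

# Average percentage-point change per band, split by school type.
summary = (below_floor
           .groupby(["school_type", "band_2011"])["change"]
           .agg(["count", "mean"]))
print(summary.round(1))
```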

Analysis of the schools making the greatest progress backs up that message. Less than 1% of these strongly improving primaries were academies:


* 156 primary schools increased their results by at least 40% pts. None of these were academies.
* 464 primary schools increased their results by at least 30% pts. Just 3 of these were academies.
* 1,540 primary schools increased their results by at least 20% pts. Just 14 of these were academies.



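The improvement counts above can be reproduced from the same (assumed) dataframe with simple threshold filters - again, the column names and the "Academy" label are assumptions rather than the DfE's own coding:

```python
# Continuing the sketch above: count strong improvers and how many are academies.
for threshold in (40, 30, 20):
    improved = df[df["change"] >= threshold]
    academies = (improved["school_type"] == "Academy").sum()
    print(f"+{threshold} pts or more: {len(improved)} schools, of which {academies} academies")
```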
Now, the lower overall increase in academies can be explained by the fact that most are already high-performing schools. But what is clear from the data is that the big improvement in "underperforming" schools has nothing to do with academies. Academies have no track record of improving such schools. Local authority primaries, on the other hand, have a very successful record of doing so.

 

An evidence-based Department for Education would step away from its obsession with academies, look at what caused this big turnaround, and seek to replicate it more widely.

Data Note
These results depend on the accuracy of the DfE data. The DfE press release states that there are 970 primary academies, yet the data table categorises only 412 as academies, and of these only 270 have results listed for both 2011 and 2012. Some of the figures also seem unlikely: one primary school in Bolton, for instance, is listed as having its results fall from 97% to 0%.
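A quick way to surface implausible entries like that one would be to flag schools whose reported year-on-year change exceeds some sanity threshold. This reuses the assumed dataframe from the earlier sketch; the 50-point cut-off is arbitrary and school_name is a placeholder column:

```python
# Flag implausible year-on-year swings (e.g. a school dropping from 97% to 0%).
suspect = df[df["change"].abs() > 50]
print(suspect[["school_name", "pct_l4_both_2011", "pct_l4_both_2012"]])
```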

 
Comments

Janet Downs
Fri, 14/12/2012 - 09:48

I groan when these league tables appear. They're seized upon to "prove" that (take your pick) academy converters, sponsored academies, voluntary aided schools, voluntary controlled schools, faith schools, community schools, all schools have either (a) pushed more children through Key Stage 2 Sats, or (b) improved faster than other types of schools, or (c) both.

According to the stats, a child stands the best chance of gaining Level 4 in their Sats if they attend a Sikh converter academy in the City of London or the Isles of Scilly.

Janet Downs
Fri, 14/12/2012 - 10:28

The performance of all schools has improved since 1995, with a steep rise in attainment in the late 90s and early 2000s before it started to level off. Even schools judged "failing" by Ofsted (such as Downhills, where 67% of pupils achieved Level 4 in English and Maths in 2012) are performing at a much higher level than in 1995, when the figure for England was 45% reaching Level 4 in Maths and 49% in English.

This improvement obviously has nothing to do with academy status. So what drives it? Have pupils become more intelligent? Has teaching improved? If so, what has driven this - greater collaboration, shared expertise, better training, initiatives such as the London Challenge? Or are schools simply more experienced at getting pupils through the tests? Are they teaching-to-the-test?

How does this rise in raw test results reflect the education the children receive? Are other subjects and skills being neglected? Is there undue concentration on Sats, especially as they are so high-stakes? Do children have superficial knowledge (enough to get them through the tests) and not enough deep understanding (see link below)?

http://www.bbc.co.uk/news/education-20678353

And is it fair to lump all schools in one local authority together in one category? It's clearly not the case that all schools in the "top" LAs perform at the same high level. Conversely, not all schools in one of the "bottom" LAs are poor. And as the Education Endowment Fund found in 2011, many below-floor schools were nevertheless doing a good job in difficult circumstances.

http://www.localschoolsnetwork.org.uk/2011/07/disadvantaged-pupils-do-wo...

Michael Dix
Fri, 14/12/2012 - 22:06

You pose many questions about this year's rise Janet. Here's another one: why, after many years of plateauing with the odd percentage rise and fall, has there been such a large increase across all subjects and levels? Strange, especially as there has been little change in schools other than a lot of fear for the future regarding Ofsted, academies and the new curriculum.

I suspect that the answer to all of your questions is probably yes because, despite the insistence by DfE/Ofsted/LAs that everything a school does has to show impact, any improvements are usually down to a complex mixture of causes.

As for the ridiculous DfE pronouncements, they do beggar belief. Surely there must be intelligent people working there who must cringe at having to speak this nonsense.

Leonard James
Sat, 15/12/2012 - 04:42

Pretty sure I heard on Radio 4 that the levelling for English involved no writing test and was based on teacher assessment.


Janet Downs
Sat, 15/12/2012 - 10:25

Leonard - according to the BBC there was no externally marked test in writing, and the subject's score was based solely on teacher assessment.

http://www.bbc.co.uk/news/education-19660112

Fiona Millar
Sat, 15/12/2012 - 10:25

I think there has been a change to the KS2 tests. I wonder how long it will be until we are told that this is down to grade inflation, although it will be interesting to see whether the Coalition is willing to resort to this when success happens on their watch.


Michael Dix
Sat, 15/12/2012 - 10:45

This is correct - a result of the Bew Report, and one of three small concessions to the concerns of headteachers to come out of it. The overall English level awarded was made up of the reading score (externally marked) and the writing score (teacher assessed).

One irritating by-product of this is that a level 4 in writing was counted as a 4b when working out the English score. As a result, a child who scores 5c in reading and 4a in writing gets a level 4 in English overall, whereas at KS1 a 3c and a 2a would result in a level 3 overall. It is therefore possible to make two levels of progress in both reading and writing and yet be credited with only one level of progress in English. Mathematically impossible and plain stupid.

Of course teachers can't be trusted to assess all the elements of writing so the government is introducing the Spelling, Punctuation and Grammar test next summer. This will be reported as a separate score rather than contributing to either the reading or writing levels. Sample materials were published this week on the DfE website.

Leonard James
Sat, 15/12/2012 - 11:53

Given the high-stakes nature of Gove's benchmarking and the ease with which this system of assessment can be exploited, I'd be astounded if there wasn't widespread gaming going on here.


Michael Dix
Sat, 15/12/2012 - 17:10

I expected to see a rise in writing due to the end of the much-criticised external marking, but the rise was across reading, writing and maths at both level 4 and level 5. If it is so easy to fix the high-stakes results, then why haven't schools done it before? Results have stagnated for years and then suddenly, up they all jump. It's very strange.


Ian Taylor
Sat, 15/12/2012 - 20:41

Does anyone know how the SATs results are standardised from year to year? Is there an Ofqual at work behind the scenes? Ofqual managed to bring about Mr Gove's desired end to GCSE grade inflation in English. Surely "they" can bring about Mr Gove's desired improvement in KS2 SATs results.
I would imagine that someone like Warwick Mansell would be able to calculate the chances of such large improvements happening this year against the background of progress having halted over the last few years. I am not a statistician, but I would say that these results are "unbelievable", in all senses of that word.

Michael Dix
Sun, 16/12/2012 - 18:03

I know that whoever is in charge of the tests takes a largish sample early in the marking process and uses it to try to ensure consistency with previous years when setting pass levels. The pass marks this year were roughly equivalent to previous years, and the papers were no more or less demanding. Of course, the only way to ensure a fully accurate comparison from one year to the next is to give pupils the same test.

Obviously the content of the test can have a major impact on results. Last year my year 6 teachers came out of the reading SAT complaining that there were far more words to read than in previous years, resulting in many children not finishing the paper. Because they ran out of time, many of the level 5-type questions went unanswered. Nationally there was no change at level 4 but a significant drop in those achieving level 5. Government and media, at their most ludicrously unintelligent, immediately blamed this on some mass subconscious decision in schools across the country to abandon better readers, although not better writers or mathematicians. Yeh, that's what we did. Results this year were more in line with previous years, as the test was far fairer.
