DfE 2015 data: Maintained primary schools improve faster than sponsored academies

By Henry Stewart

The improvement in Key Stage 2 results is significantly greater in primary schools that are not academies than in similar sponsored academies.

That is the clear conclusion from analysis of the 2015 primary schools results data, released last week by the Department for Education. Overall there was an increase in the percentage of pupils achieving the new benchmark of Level 4b in Reading, Writing and Maths: 1.9% over the last year and 6% over the last two years.

However, when similar schools are compared, the increase for maintained schools was larger. For instance, for those primaries where less than 41% of pupils achieved the 4b benchmark in 2014, sponsored academies increased their figure by 15% on average, while non academies increased theirs by 21%. Indeed, in each of the five comparison groups it was the non academies that increased most:


Growth in Level 4b KS2 benchmark, 2014-15

2014 quintiles | Sp Academies | Non academies
Below 41%      | 15.0%        | 21.0%
41% to 49%     | 6.3%         | 13.4%
50% to 58%     | 3.4%         | 8.5%
59% to 67%     | -1.1%        | 4.2%
68% and above  | -3.6%        | -3.4%


The schools have been grouped according to their 2014 results. With the Education Bill due for its third reading in the Lords this week, this is an important finding. The Bill will force underperforming schools to become sponsored academies. Yet the data indicates that this will slow their recovery.

The key question under the Education Bill is what should happen to an "inadequate" school. The key is therefore not to compare sponsored academies with schools overall, as the DfE likes to do, but with schools starting from a similar point. We already know that a school is more likely to remain inadequate if it becomes a sponsored academy. We now know it is also likely to take longer to improve its results.

Note: The schools have been grouped into quintiles. This means the percentage ranges have been set to ensure each group has roughly the same number of sponsored academies. Overall this data covers 416 sponsored academies and 12,203 primaries that are not academies. The sponsored academies included are those with results, as an academy, in both 2014 and 2015, as the DfE data does not include results from before academies converted.
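The grouping described in this note can be sketched in a few lines of Python. This is an illustrative reconstruction of the method, not the author's actual analysis; the function names and the toy figures are invented for the example.

```python
from statistics import mean

def quintile_thresholds(scores_2014):
    """Cut points that split the 2014 scores into five roughly equal groups."""
    s = sorted(scores_2014)
    n = len(s)
    return [s[(i * n) // 5] for i in range(1, 5)]

def mean_growth_by_quintile(schools, thresholds):
    """Average 2014->2015 change per quintile.

    `schools` is a list of (score_2014, score_2015) pairs, in percent.
    """
    groups = [[] for _ in range(5)]
    for s14, s15 in schools:
        idx = sum(s14 >= t for t in thresholds)  # which quintile: 0..4
        groups[idx].append(s15 - s14)
    return [round(mean(g), 1) if g else None for g in groups]
```

In the article the cut points are chosen so that each band holds roughly the same number of sponsored academies, and the same bands are then applied to the 12,203 non academies before comparing average growth.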

Change from 2014 to 2015: Level 4 and Level 5


The DfE data also allows comparison of the % achieving the old benchmark of Level 4 and also the % achieving Level 5. In each case, if similar schools are grouped together by prior year results, the non academies perform better in nearly all categories.

The only exception is that of the best performing schools for Level 5 achievement.


Over Two Years: Non academies do consistently better


The same picture emerges for the change in Level 4, Level 4b and Level 5 results from 2013 to 2015.

Again, in 14 out of the 15 groups, the primary schools that were not academies increased their results more than those that were sponsored academies. (The 2013 data includes 144 sponsored academies and 12,091 non academies.)


What is the problem with sponsored academies?


The evidence, based on the DfE's own data, is clear. In 28 out of 30 comparisons, maintained schools improved their results more than similar sponsored academies.

There appears to be something about the nature of sponsored academies that restricts their ability to improve. The same is not true of converter academies, which do not do worse than similar maintained schools.

Converter academies, which are rated "Good" or "Outstanding" when they convert, are normally stand-alone schools. Sponsored academies, which are converted because they are seen as struggling, have to join Multi-Academy Trusts - also known as academy chains.

The DfE produced data earlier this year indicating that the value-added in established chains of secondary academies was generally below average. Adding this new evidence to that data suggests that academy chains are in fact a poor vehicle for school improvement.

The evidence indicates that, overall, a primary school will improve its results (whether Level 4, Level 4b or Level 5) more if it remains within the local authority sector as a maintained school.

Data Notes


Source: DfE school and performance tables 2015, http://www.education.gov.uk/schools/performance/download_data.html

Analysis has not been extended to 2012 as only 29 sponsored academies had results for that year. I have previously carried out analysis of 2012 to 2013 improvements in results, by linking sponsored academies to their previous school. This also showed that maintained schools performed better.

The move from a Level 4 benchmark to a Level 4b benchmark means that those pupils who achieved only a Level 4c are not included.

Detailed Data

These are the figures underlying the above graphs:

Growth in Level 4 KS2 benchmark, 2014-15

2014 quintiles | Sp Academies | Non academies
Below 59%      | 12.6%        | 19.0%
59% to 67%     | 6.1%         | 9.8%
68% to 75%     | 1.7%         | 5.2%
75% to 81%     | 0.0%         | 2.2%
82% and above  | -5.7%        | -3.3%


 

Growth in Level 5 KS2 benchmark, 2014-15

2014 quintiles | Sp Academies | Non academies
Below 6%       | 5.5%         | 10.1%
6% to 8%       | 2.3%         | 7.6%
9% to 13%      | 2.1%         | 6.0%
14% to 19%     | 1.3%         | 3.5%
20% and above  | -2.0%        | -3.1%


 

Growth in Level 4b KS2 benchmark, 2013-15

2013 quintiles | Sp Academies | Non academies
Below 38%      | 24.0%        | 26.9%
38% to 45%     | 10.8%        | 18.5%
46% to 50%     | 8.5%         | 14.5%
51% to 61%     | -0.9%        | 10.3%
62% and above  | 0.4%         | 0.2%


 

Growth in Level 4 KS2 benchmark, 2013-15

2013 quintiles | Sp Academies | Non academies
Below 55%      | 20.2%        | 24.0%
55% to 61%     | 13.0%        | 15.3%
62% to 67%     | 11.2%        | 11.5%
68% to 77%     | -0.2%        | 6.8%
78% and above  | -3.9%        | -1.2%


 

Growth in Level 5 KS2 benchmark, 2013-15

2013 quintiles | Sp Academies | Non academies
Below 4%       | 8.9%         | 15.0%
4% to 7%       | 4.2%         | 9.7%
8% to 10%      | 7.0%         | 8.3%
11% to 17%     | -2.6%        | 5.7%
18% and above  | -1.4%        | -0.5%

Comments

rogertitcombe (Tue, 15/12/2015 - 17:59)

Warwick Mansell has drawn my attention to this article.

http://www.theguardian.com/education/2015/jul/14/department-for-educatio...

Barry Wise (Wed, 16/12/2015 - 09:38)

On the other hand, if you click the scale the other way, you'll find 9 out of the 10 worst performing schools and 14 out of the worst 20 had small year groups too (<20).

Janet Downs (Wed, 16/12/2015 - 10:08)

Henry and Patrick - another thing to consider is the prior attainment level of the pupils. I looked at large primaries which returned results of 100% and had 59 or more pupils taking SATs.

South Farnham School, Surrey: 135
Mudeford Junior School, Dorset: 66
St Anselm's Catholic Primary, Harrow: 62
The Greetland Academy, Calderdale: 61
Cunningham Hill Junior, St Albans: 61
Wykeham Primary, Brent: 61
Akiva School, Barnet: 60
Curwen Primary and Nursery, Newham: 60
King David Primary, Manchester: 60
Kingsgate Primary, Camden: 60
Middleton Primary, Milton Keynes: 60
Broadoak Primary, Manchester: 59
Wood End School, Harpenden: 59

Only two, Wykeham Primary and Middleton Primary, had a sufficient number of previous low attainers to be included in results. The rest had so few that the number was labelled 'SUPP'. It's misleading to claim these are 'top' primary schools when the intake has so few previously low attaining pupils. It's obvious the intake has a bearing on results.

Janet Downs (Wed, 16/12/2015 - 10:42)

Roger - the next PISA round has actually passed. Pupils took the tests this year. Most of the countries which took the tests (including Wales and Scotland) took them earlier in 2015 during our late Spring or early Summer terms.

England and Northern Ireland were given permission to move the tests to the Autumn term 2015 because of clashes with GCSEs. This has been the case since 2006.

This means the cohort taking PISA tests in England and Northern Ireland is not the same as in other countries. The pupils taking PISA tests in England and Northern Ireland would have been in Year 10 when other countries took the test. They took the tests in the Autumn term when they were in Year 11. This means these Year 11s had had about five months less schooling than the previous Year 11s would have had if they had taken the test in late Spring/early Summer.

This is another reason why PISA results for the UK as a whole and England as a separate country should be approached with caution. The other reasons are that the supposed relative decline must be seen in the context of more countries taking part, and the fact that the absolute score hardly moved ('stagnated' or 'consistent', depending on the spin) from 2006 to 2012. Then there's the margin of error... (See Full Fact for discussion of this and other issues).

And then there is the increasing criticism of PISA methodology (start here).

Janet Downs (Wed, 16/12/2015 - 10:47)

Roger - there has been no 'spectacular growth' in GCSE results in the last three years. There's been a decline (probably as a result of the removal of high equivalence values). In 2012, 59.4% of pupils in England reached the benchmark of 5+ GCSEs A*-C including Maths and English. That dropped to 53.4% in 2014.

Barry Wise (Wed, 16/12/2015 - 11:08)

Roger

How much does cramming for exams really hold development back? The children who go on to produce the very best results at GCSE and A Level are those from independent or grammar schools who, between the ages of 10 and 12, were most intensively tutored and prepared for the 11+ or Common Entrance. Compared with that regime, Y6 panic over SATs is a doddle.

Maybe it's the richer curriculum, rather than worrying overmuch about 'behaviourist' methods, that's key.

Janet Downs (Wed, 16/12/2015 - 11:35)

Barry - for 'intake' read 'cohort'. That said, you're right that each year's intake was not always above average as Ofsted noted. In some cases, the intake was below average and these schools should be applauded.

South Farnham: 'Children begin the Reception year with language and number skills which vary, but which are generally above average.'

Mudeford Junior: 'The attainment of pupils when they first enter Year 3 is broadly average'

St Anselm's: From skills that are broadly average on entry to the Early Years Foundation Stage, standards are well above average by the end of Key Stage 2.

The Greetland Academy: not inspected as an academy, Ofsted report for predecessor school not on Ofsted's website.

Cunningham Hill: 'Pupils enter the school with generally above-average levels of attainment, although this fluctuates a little from year-to-year'

Wykeham Primary School: 'Children join the early years with very low levels of development, not typical for their age.'

Akiva School: 'Pupils start school in Reception with skills that are similar to those expected for their age.'

Curwen Primary: '...below average'

King David Primary: 'Pupils achieve well from their skill level on entering the school, which is generally typical.'

Kingsgate Primary: 'Pupils enter school with levels of skill and understanding that vary. However, the majority start school with skills well below national averages for their ages'

Middleton Primary: 'Children enter the Reception classes with skills that are broadly similar to those typically found.'

Broadoak: not inspected as an academy, predecessor school report not on Ofsted's website

Wood End: 'The attainment of most children starting in the Early Years Foundation Stage (EYFS) is above that expected for their age.'

Janet Downs (Wed, 16/12/2015 - 11:51)

Barry - much research suggests state school pupils outperform equally qualified pupils from private schools at university. Sutton Trust went further - pupils from comps outperformed their equally qualified peers from independent and state grammars at uni.

It's sometimes been argued (eg by the Mail) that the higher performance of state school pupils at uni is because state school pupils tend to go to lesser unis which give out degrees like confetti. But research from Cambridge and Oxford disputes this.

Oxford found 'those [state pupils] who do get in [to Oxford], private school students perform about as well as state school students.’

Oxford researchers asked whether those state pupils who did NOT gain a place at Oxford would have achieved the same degree as those private pupils who DID get in. They concluded the greater likelihood of private school pupils actually gaining a place was due to 'short-term teaching effects upon the secondary school grades of private school students'. Note the description 'short-term'. In other words, the effects didn't last.

Janet Downs (Wed, 16/12/2015 - 13:20)

Barry - '...what this and other studies have confirmed is that independently schooled students are outperformed by state schooled students who have the same level of prior attainment.'

That's what I said.

It's not surprising that the proportion of state pupils getting top A level grades is smaller than the proportion in independent schools. The latter, like grammars, choose their pupils on the basis of their ability to pass these exams at a high level. The former, on the other hand, educate children of all abilities.

I'm not sure that all 11+ pupils are 'intensively prepared'. The Lincolnshire Consortium of Grammar Schools, for example, says no special preparation is needed beyond getting candidates familiar with the type of questions asked by using practice papers available commercially. This isn't the same as 'intensive preparation'.

And just because pupils have been through 'SAT terror', doesn't mean it's right. As I've said many times before, there's already too much emphasis on exam results in England and this risks negative effects (OECD 2011).

rogertitcombe (Wed, 16/12/2015 - 14:15)

You are right Janet, and this is almost certainly for the reason you give. However, the 'improvements' have continued, if not accelerated, at KS2. But the 'improvements' at GCSE since its introduction in 1988 are truly astronomic, regardless of any levelling out in the last three years. See Section 1.10 of 'Learning Matters'. It is important to note that this levelling out is seen by the DfE as a problem and a sure indicator of secondary school under-performance, so my general point remains strong.

In the case of PISA, unless there are systematic rather than random errors and unreliabilities, which disproportionately affect English pupils, it is still a fair question to ask why all this school improvement in the English system is not reflected in the PISA outcomes.

Perhaps it will be. We shall see. When will the latest results be published?

rogertitcombe (Wed, 16/12/2015 - 14:40)

11+ and Common Entrance exams have more in common with CATs than they do with SATs. The 'richer' the curriculum the poorer the outcomes of behaviourist teaching.

That is why such methods were largely abandoned in the comprehensive school era until the introduction of high stakes floor targets.

Where you find 100 percent pass rates that is a sure indicator of a 'tick box' curriculum and assessment method. The classic examples were the 'vocational alternative' scams at GCSE that Gove was right to abolish. These caused the rebirth of behaviourism in secondary schools.

The best independent schools do not do behaviourism. See the 'slow education' movement that had its origins at Eton College.

See

http://sloweducation.co.uk/

Where you find 100 percent pass rates that is a sure indicator of degraded curriculum and degraded teaching methods to match.

My argument is that the expectation of 100 per cent pass rates at KS2 (and previously in GNVQ/BTEC) has the effect of impoverishing the curriculum, incentivising behaviourist teaching methods and disadvantaging all the pupils (most of them in poor areas) force-fed this educational diet.

Janet Downs (Mon, 21/12/2015 - 11:31)

Roger - the Independent reports Anthony Seldon has warned against a 'Gradgrindian' approach to schooling. Then he goes and spoils it by praising Nicky Morgan for being behind Gove's reforms and for highlighting 'character building' and children's wellbeing.

Barry Wise (Wed, 16/12/2015 - 12:39)

Janet


It's not quite as straightforward as it seems:

A report that said state school graduates get better degrees than independent school graduates was wrong, its authors have said.

The Higher Education Funding Council for England (HEFCE) said in September that 82% of graduates getting first class or upper-second class degrees in 2013-14 came from state schools compared with 73% from independent schools.

But HEFCE has admitted to a "transposition error" - the numbers were the wrong way round.

In fact, a greater proportion of independent school graduates were awarded a top degree than state school graduates.

.... In 2013-14, 73% of state school graduates gained a first or upper second class degree compared with 82% of independent school graduates.


That said, what this and other studies have confirmed is that independently schooled students are outperformed by state schooled students who have the same level of prior attainment.

(The problem is that at the top end, not many state educated students have the same levels of prior attainment).

In any case, it doesn't really have anything to do with what I was saying to Roger. The chances are that a high proportion of the state students going on to win firsts at uni came from grammar schools and were intensively prepared for the 11+. Even those from comprehensives will probably have gone through the SATS terror, which Roger says can stunt development.

http://m.oxfordmail.co.uk/news/13931614.Higher_education_report__got_its...

Barry Wise (Wed, 16/12/2015 - 15:49)

I agree broadly with all that.

Janet Downs (Thu, 17/12/2015 - 08:53)

Roger - the results of PISA 2015 will be published towards the end of 2016, probably December if the same timetable as previous years is followed. I predict:

1 If UK's relative position rises, the Government will claim it vindicates its reforms.

2 If UK's relative position falls, opposition parties will claim it shows Government reforms have failed.

3 If UK's relative position stays more-or-less the same, the Government will claim it will take time for its reforms to come to fruition (the same argument can be made for point 2 above). Opposition parties will shriek that billions have been spent on changing school structures with nothing to show for it internationally.

rogertitcombe (Fri, 18/12/2015 - 09:24)

And if 2 or 3, I will make the point that if schools have been 'improving' at 10%+ per year for the last three years, and many decades previously, and the schools in other countries haven't, then why isn't this reflected in PISA outcomes? I will make the same point if there is only a small improvement in PISA - you could put this down to immigration raising IQs.

I suspect that the upper reaches of the PISA national table will be similar to past results. If so it really would be interesting to see data for school improvement, in terms of increases in national exam results, in those countries. My prediction is that there won't be any such improvement. Section 3.8 of 'Learning Matters' asks, 'Is school improvement a good thing?'

My argument is that it is not, because high stakes pressure on schools to keep up with escalating floor targets and minimum standards for every child degrades curriculum and learning methods and depresses real educational standards.

This is how I put it on p154 of 'Learning Matters'.

" My hypothesis, an invitation for others to argue about, is that degraded and corrupted curriculum involving the large scale abandonment of pupil practical activity in science lessons and the increased substitution of crude behaviourism for developmentalism as the ruling pedagogy in English schools, combined with successive perverse outcomes arising from the operation of the imposed market are combining to produce an ever tightening spiral of real educational decline that continues to manifest itself in new and often surprising ways."

rogertitcombe (Tue, 15/12/2015 - 14:09)

Henry - This is excellent work and vital information. The key question is why your rock-solid findings continue to be ignored. No one can be surprised at the 'blind eye' of the DfE, however disgraceful, but what about the BBC, not to mention the Guardian?

The BBC still frames its education coverage in the form of leading questions and false assumptions that could have been written by the spin merchants employed by the Secretary of State for Education.

But where is HM Opposition? Come on Jeremy, this is a barn door waiting to be kicked open. We are yet to hear much at all from Lucy Powell. OK it is better than the drivel spouted by Tristram Hunt and his predecessor, but it is not good enough. She has no excuses. I sent her a free copy of 'Learning Matters'.

I urge all readers of this post to tweet and retweet links to this post as well as drawing the attention of the media to their abject failure to do their job.

However, not to take anything away from the importance of your work, I ask everybody to think about the percentage improvements that are being achieved. A 10 percent year on year growth in attainment results in a doubling every 7 years. This is on top of several decades of similar growth in both floor target thresholds and the percentages of pupils exceeding them.
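Roger's arithmetic is roughly right: compound growth of 10% a year doubles in a little over seven years, as a quick logarithm check shows (the helper name here is mine, added for illustration):

```python
import math

def doubling_time(annual_growth):
    """Years for a quantity to double under compound growth, e.g. 0.10 for 10%."""
    return math.log(2) / math.log(1 + annual_growth)

# 10% a year doubles in about 7.3 years; the familiar 'rule of 72'
# gives the same ballpark estimate, 72 / 10 = 7.2.
```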

Is this really credible in terms of real increases in the cognitive ability of pupils and/or their increased levels of understanding? If the answer to this is 'yes' then what was I doing in my science teaching in the 1970s, 80s and 90s? It must have been complete rubbish. Yet it resulted in progression to top A Level grades and entries to top universities. Did the Nuffield Science era completely fail in its ambition to raise the dire level of science understanding in our secondary school population? That is not how I see it.

So what is really going on? Part 3 of 'Learning Matters', entitled 'Spectacular School Improvement', documents a number of scams and educational illusions, but the worst of these have been ended. Yet results are still soaring.

At the risk of triggering the usual 'health warnings' from Janet, why have these apparently soaring achievements by our schools not been reflected in PISA rankings? The fact is that other countries' schools have not been 'improving' (based on their own exam and assessment systems) at anything like the rate of English schools (if at all), yet the English system is still failing to make much impact in the international PISA ratings.

The next PISA round is 2016 (am I right?). Surely after three years of this sort of spectacular growth in English schools exam results we should now expect nothing less than the top spot.

We shall see.

Patrick Hadley (Tue, 15/12/2015 - 15:45)

I agree with Roger that this is an excellent piece of work. However all Conservative MPs (and unfortunately the majority of "New" Labour MPs) are so committed to privatisation that they simply cannot digest any information showing that it does not work. For them it is a fundamental article of faith that transferring assets from the public to the private sector is always going to be beneficial.

Roger's point about the unbelievable rate of improvement in the test results is important. One of the odd things about the education debate is that all improvement in exam results at 16 and 18 is always caused by grade inflation and therefore a sign of the problems that exist in our schools; whereas all improvement in test results at 11 is a sign of the success of government policies.

The improvement in results in KS2 English and Maths has been going on since SATs were introduced. In 1995 the percentage of pupils who scored level 4 or above in English and Maths was 49% and 45% respectively. http://www.theguardian.com/education/table/2008/aug/05/sats.schooltables
The standard of attainment needed to gain Level 4 was described in 1995 as being the "expected level" which means the level attained by the average pupil. When 51% of pupils did not reach this level in English, and 55% were below this level in Maths in 1995, there was no great outcry - since they were not all that far from the expected result that 50% would be below, and 50% would be above.

In the twenty years since 1995 this "expected" level has come to mean the minimum acceptable level - all children are now "expected" to be above average. We see articles in the newspapers saying that 19% of pupils leave primary school with below the expected level in Maths and English; and read that schools where 65% of pupils are above the expected level are under-performing.

As a result of the pressure on schools to improve test results there is now more time given to English and Maths teaching in primary schools, which could go some way to explaining a little of the improvement in test results. But I wonder if experienced secondary school teachers believe that there has been a big improvement over the last 20 years in the standards in English and Maths of the pupils who arrive in Year 7. If the higher test results are more than just an illusion, then this should be noticed by teachers. I would really like to know if this is the case.

Barry Wise (Tue, 15/12/2015 - 16:00)

Patrick

One of the odd things about the education debate is that all improvement in exam results at 16 and 18 is always caused by grade inflation and therefore a sign of the problems that exist in our schools; whereas all improvement in test results at 11 is a sign of the success of government policies.

True. But another thing that is never acknowledged is that lots of primary schools are now getting 100% to L4 in reading, writing and maths (some in really deprived areas). This shows that L4 is within the reach of pretty well any student.

By contrast, at KS4 no school that is not a selective school (so far as I know) boasts a 100% record of 5 x A*-C inc M&E. This target is simply not universally attainable.

At primary, if students are not reaching the expected level, it is almost certainly the fault of the school. At secondary, there's a good chance it's down to the intake.

Patrick Hadley (Tue, 15/12/2015 - 17:22)

An interesting reply Barry. What do you think of the fact that in 1995, after 16 years of Tory education policy, most of which were when Margaret Thatcher, a former Education Secretary, was PM, only 45% and 49% of primary pupils achieved Level 4 in Maths and English respectively?

During the last Labour government the GCSE "5 A*-C pass" rate went from 45% in 1997 to 75% in 2010. Tory politicians said that this was all the result of grade inflation.

The KS2 "level 4 or above" rate in English went from 63% to 83%, in Maths the rates went from 62% to 79%. Rather than say this was all grade inflation, or praise Labour for the big improvement in the numbers reaching level 4 compared to when the Tories were in power, Nicky Morgan makes speeches suggesting that any pupil below Level 4 cannot read, write or do basic maths. She does not mention that this was presumably also the situation for most pupils who left primary school during the previous Conservative government.

My point is that there is far too much emphasis on the test results and that it is possible to drill pupils so that they can pass performance tests at 11 and get grade C or above at GCSE at 16, without actually giving them a good education.

rogertitcombe (Tue, 15/12/2015 - 17:26)

Barry - The only scenario in which 100 per cent of those tested reach the required threshold is in vocational training, not education. I am not knocking training. This is what I write in Section 3.3 of 'Learning Matters'

" The aim of vocational education is to bring as many trainees as possible, regardless of ability, up to a threshold level of competence. This is achieved by requiring trainees to demonstrate familiarity and competence with a limited number of closely specified scenarios. It is therefore training in how to respond to the circumstances required to be met in a specified job application. This criterion-referenced approach is entirely appropriate to job training where uniform standards are required. Such training is structured to make minimum possible cognitive demands and is unconcerned with general intellectual development.

Intellectual development however has in the past always been regarded as what schools are for. Subjects are studied not just for their own sake but also for their value in developing the wider cultural, scientific or artistic understanding of the individual.

These fundamental educational assumptions are rooted in the rational values of the European enlightenment, and the comprehensive school movement was about ensuring that the advantages of such an education were made available to all. The levels achieved as a consequence of such schooling obviously depend on the prior cognitive ability of pupils as well as on the quality of teaching so a wide range of performance is to be expected."

In the context of KS2 SATs it is possible to argue that "It is therefore training in how to respond to the circumstances required to be met in a specified job application" means 'training in how to respond to the circumstances required to be met in a specified KS2 SATs test'. Just as a train driver can be trained to respond correctly to the large but finite number of circumstances (signals, breakdowns, specified emergencies etc) that can be met in driving a train, a Y6 pupil can be trained to respond correctly to the large but finite number of different sorts of questions to be expected in a KS2 SATs test.

The key point is that the teaching method that is most appropriate in both cases is rote learning by memory, repetition, practising past tests, etc. This is tedious so to get the subject to persist, rewards and punishments need to be applied. For a train driver these are passing/failing to get to be a train driver. For a Y6 pupil they are passing/failing to get L4 in SATs (even though the result is of no value whatever to the child as a qualification). The KS2 SATs example is even more 'high stakes' as it puts the pay/jobs of the teachers/headteacher at risk rather than just the sense of failure of the pupils and parents.

However, "Intellectual development has in the past always been regarded as what schools are for. Subjects are studied not just for their own sake but also for their value in developing the wider cultural, scientific or artistic understanding of the individual."

While I wrote this with secondary schools in mind, I believe very strongly that it also applies to KS2. In fact it is even more important if you believe that a key purpose of KS2 is to develop the cognitive strategies of pupils to enable them to pass from the Piagetian 'concrete' stage to the 'formal' one necessary for coping with the more abstract demands of a rich secondary curriculum.

The implication of my argument is that by insisting that all pupils reach a specified attainment level at KS2, pupils are condemned to behaviourist teaching methods (see Section 1.8 of 'Learning Matters'), which do not develop cognitive ability. To put it crudely, the cost of ensuring 100 percent attainment of L4 in SATs is 'dimmer kids'.

Children in KS2 will vary greatly in their individual cognitive development. CATs tests are available standardised for all primary ages of pupils. They always result in a bell curve normal distribution of results because they test normally distributed cognitive ability levels.

A primary school that takes predominantly pupils from less affluent home backgrounds will have a pupil cognitive ability distribution weighted towards the lower end. To get all these pupils up to the high stakes L4 threshold will therefore require a more behaviourist and less developmental approach than in a school from a posh area with more cognitively able pupils. The former school is forced to prescribe a more degraded, less developmental curriculum than the latter. So kids in schools that serve poor postcodes suffer a triple whammy. Not only do they get the most tedious teaching methods, they are subject to the most pressure and they fail to be cognitively developed as a consequence.

There is an obvious knock-on to the secondary school that takes these children. They have the KS2 levels that allow Wilshaw to claim that they have attended good primary schools, but their cognitive ability has not been developed sufficiently to make the 'expected' progress at secondary school, resulting in the claim that such secondaries are failing!

All this explains why ever improving KS2 test scores do not result in ever better GCSE and A Level results.

This is the thread that runs throughout my book, 'Learning Matters'.

No-one else is making this argument.

Henry Stewart (Tue, 15/12/2015 - 23:04)

Roger

Why it hasn't received more attention is a good question. Before publishing it here, this analysis was offered, as an exclusive, to Schools Week, TES and the Guardian CiF. None showed interest.

Though Warwick Mansell understands this issue very well, and that was an excellent piece by him.

Henry Stewart (Tue, 15/12/2015 - 23:09)

Barry - that is true on 100% getting Level 4. There were 771 primaries this year which achieved 100%, of which 130 had a percentage of disadvantaged pupils above the average of 32%.

For the new Level 4b benchmark, those figures are 281 primaries overall getting 100%, and 42 with disadvantaged pupils over 32%.

Patrick Hadley (Tue, 15/12/2015 - 23:51)

Thank you again for your analysis.

Looking at the results of the "best 1000 primary schools" in the Daily Telegraph http://www.telegraph.co.uk/education/leaguetables/12041096/Primary-leagu... it seems to me that most of the schools with 100% have rather small numbers of pupils in Year 6. Obviously it is easier to get 100% if you have only 15 children in the year group than if you have 60: not just because it can be easier to concentrate on a small group, but because in a smaller cohort you are less likely to have an "outlier" pupil who is simply unable to achieve Level 4.
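Patrick's size effect is easy to illustrate with a toy model. Suppose, purely for the sake of argument, that each pupil independently has a 95% chance of reaching Level 4; the chance of a clean sweep then falls sharply with cohort size (the probability and the function name are invented for this sketch, not drawn from the DfE data):

```python
def prob_all_reach_level(p, n):
    """Chance that all n pupils reach the level, if each does so
    independently with probability p (a deliberately crude model)."""
    return p ** n

# With p = 0.95: roughly a 46% chance of a 100% result in a year
# group of 15, but only about a 5% chance in a year group of 60.
```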

Barry Wise (Wed, 16/12/2015 - 10:39)

Janet

I'm not sure 'intake' is the right word here as their prior attainment would be KS1 assessments and their performance there would be in substantial part a measure of the school's own effectiveness at EYFS.

rogertitcombe (Sun, 20/12/2015 - 13:18)

While we are again discussing testing and school standards perhaps it is worth reminding ourselves of the very serious downsides of the culture that has captured and entrapped the English education system. See:

https://rogertitcombelearningmatters.wordpress.com/2015/12/20/the-uninte...

Henry Stewart (Wed, 23/12/2015 - 13:12)

I have now got a colleague to run a regression analysis on this relationship. This shows that the improvement in maintained schools is greater than the improvement in sponsored academies, once their starting point is taken into account, by 6.4%.

In a one form entry primary that means, on average, two extra pupils in each school achieving the new 4b benchmark. In a two form entry, it means four extra pupils.

So this conclusion can be drawn: taking the 2013 score into account and ignoring other factors, sponsored academy status is associated with lower improvement - and this is statistically significant at the 99% level.
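Henry doesn't publish the regression itself, but the setup he describes (improvement regressed on the prior score plus a sponsored-academy dummy) can be sketched with ordinary least squares via the normal equations. This is a generic illustration on made-up numbers, not his colleague's analysis:

```python
def ols(X, y):
    """Ordinary least squares: solve the normal equations (X'X) b = X'y."""
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(XtX[r][col]))
        XtX[col], XtX[piv] = XtX[piv], XtX[col]
        Xty[col], Xty[piv] = Xty[piv], Xty[col]
        for r in range(col + 1, k):
            f = XtX[r][col] / XtX[col][col]
            for c in range(col, k):
                XtX[r][c] -= f * XtX[col][c]
            Xty[r] -= f * Xty[col]
    # Back substitution.
    b = [0.0] * k
    for r in range(k - 1, -1, -1):
        b[r] = (Xty[r] - sum(XtX[r][c] * b[c] for c in range(r + 1, k))) / XtX[r][r]
    return b

# Rows are [1, prior_score, sponsored_dummy]; the coefficient on the
# dummy is the academy effect after the starting point is accounted for.
```

In practice one would reach for a statistics package to get the standard errors behind the "significant at the 99% level" claim; the point here is only the structure of the comparison.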

As my statistical colleague put it: "Government claim [that sponsored academies leads to improved results], is unfounded, defo".

(See the article above for more detail.)
