Is School Improvement a Good Thing?

rogertitcombe
Henry Stewart is to be thanked and congratulated on his work showing the dishonesty of DfE claims that Academies outperform LA schools.

However, towards the end of his post he includes a section congratulating all improved schools and lists the six 'most improved' for special praise.

I have no intention of detracting from the hard work of pupils and teachers in these schools, but I do think that the notion that such school improvement is always of unqualified benefit to pupils is open to question. This is where my interest in educational research began. The start of my personal journey was the work carried out with Roger Davies, a professional statistician, and Warwick Mansell, then on the TES staff, into claimed school improvement. This was published on the TES website and featured in the paper in January 2006.

We took the 2004 ‘100 most improved schools’ list published by the DfES and attempted to analyse the 2005 GCSE and equivalent results of all the schools on a subject by subject, grade by grade basis. A further control group of 60 ‘unimproved’ schools was also investigated in the same way. We found a direct relationship between the degree of school improvement and decreased opportunity for pupils to take high cognitive demand academic subjects. In some of the most improved schools there was no opportunity to take GCSE science (rather than a ‘vocational equivalent’) at all. The curriculum areas that were being degraded in the most improved schools were the very ones later identified by Michael Gove as being so important as to require a special place in league tables. These constitute the English Baccalaureate (Ebacc).

We found that it was the ‘unimproved’, ‘coasting’ schools that were providing the best and most enabling curriculum opportunities for their pupils.

This research is described in the first part of my paper, Titcombe R, How Academies Threaten the Comprehensive Curriculum, Forum, Vol 50, No 1, 2008, which can be accessed from the Forum website here.

This was the beginning of the eventual recognition of what has since become accepted as the 'vocational equivalent driven inflation' that became the essential facilitator of 'school improvement' in the New Labour era.

Henry Stewart's post prompted me to have a look at some of the characteristics of the six 'improved' schools he mentions. This is a much easier task now because of the breadth of school-based data now available on the DfE website that Henry has been using to such good effect. However, the 2013 Performance Tables now provide data of even greater relevance and interest. I have focussed on the section that provides the 'average capped score per pupil' for GCSEs only and for GCSEs and equivalents. Also provided are the 'average grades per GCSE' for low, middle and high attaining pupils, presumably based on KS2 scores.
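For readers unfamiliar with the capped score measure, here is a minimal sketch of the arithmetic, assuming the QCA point scale used in performance tables of this period (grade G worth 16 points, rising in steps of 6 to 58 for an A*, with only a pupil's best eight results counted):

```python
# Sketch of the DfE 'capped points score', assuming the QCA point scale
# of this era: G = 16 points, each grade above worth 6 points more,
# up to A* = 58, with only the best eight results counted.
POINTS = {"A*": 58, "A": 52, "B": 46, "C": 40,
          "D": 34, "E": 28, "F": 22, "G": 16, "U": 0}

def capped_score(grades, cap=8):
    """Sum the points for a pupil's best `cap` grades."""
    pts = sorted((POINTS[g] for g in grades), reverse=True)
    return sum(pts[:cap])

# A hypothetical pupil with nine passes: only the best eight count,
# so the F (22 points) is dropped.
pupil = ["B", "C", "C", "C", "C", "D", "D", "E", "F"]
print(capped_score(pupil))  # 46 + 40*4 + 34*2 + 28 = 302
```

On this scale, Dyke House's GCSE-only figure of 276.0 corresponds to an average of about 34.5 points per subject across a pupil's best eight, i.e. between a D (34 points) and a C (40 points).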

These are the data for the six improved schools in Henry's post. LSN mangles tables so I have given the name of the school followed by the data in the following order.

2013 5+A*-C including English and maths (with equivalents),
The percentage point increase,
The average capped score per pupil GCSEs only,
The average capped score per pupil including equivalents,
The percentage contribution of equivalents,
The average grade per GCSE (not including equivalents) of low/middle/high attainers

Dyke House School, Hartlepool, 75%, +38%, 276.0, 357.6, 23%, E/D+/B
Rushden Com College, Rushden, 40%, +34%, 217.9, 333.2, 35%, F/D-/C+
Sir John Talbot's Sch, Whitchurch, 58%, +32%, 273.0, 314.2, 13%, F+/C-/B
North Shore Academy, Stockton, 53%, +31%, 202.5, 339.7, 40%, G+/E+/C-
Ilkeston Academy, Ilkeston, 44%, +31%, 220.3, 310.8, 29%, F/D/C+
St Thomas the Apostle, London, 73%, +31%, 297.9, 334.2, 11%, D-/C/B

For comparison I include two of my local schools, one which has improved and one which has 'got worse'.

Furness Academy, Barrow, 46%, +11%, 206.9, 307.2, 33%, F/D/C+
Ulverston Victoria HS, Ulverston, 62%, -3%, 324.1, 357.4, 9%, E+/C-/A-

Given the vast amount of data now provided in the Performance Tables, which data provide the best guide to a school's effectiveness in enhancing, through schooling, the life chances of pupils of all abilities?

It seems to me that the most significant measure is the pattern of the average grade per GCSE for low, middle and high attaining pupils. In a good school this should to a considerable degree be independent of mean intake attainment. There appears to be a negative relationship between the contribution of 'equivalents' in the curriculum and the average GCSE grade per pupil, and this appears to be true for the whole attainment range. I can't see an obvious explanation for this.
There does not appear to be much validity in the huge emphasis the DfE and Ofsted place on school improvement based on 5+A*-C including English and maths.

Comments

rogertitcombe
Wed, 29/01/2014 - 11:17

In my penultimate paragraph, "In a good school this (average grade per GCSE for low, middle and high attaining pupils) should to a considerable degree be independent of mean intake attainment" is not correct. This is because, in my ignorance, I assumed that low, middle and high attainers referred to the KS2 national SATs distribution of scores. This is not the case. Middle means pupils with Level 4, low means below Level 4 and high means above Level 4. SATs levels are not designed to produce equal numbers of pupils in each tercile (third) of the population, so low, middle and high attainers are no such thing. As KS2 SATs levels continue to rise in accordance with government expectations, more and more bottom tercile pupils will find themselves classed as middle attainers.

Secondary schools in Hackney can work out how many low, middle and high cognitive ability pupils they admit from their Y6 CATs score distribution. Bottom tercile pupils are those with a CATs score of 94 or less; top tercile pupils have a CATs score of 106 or more. There is a high degree of correlation between SATs and CATs, but this does not mean that the national CATs distribution (a bell curve) will be congruent with the national SATs Levels distribution. If I have got this wrong then one of the LSN statistician posters please correct me.

It is interesting to look at the figures for Mossbourne Academy, which has a CATs driven admissions policy that ensures its cognitive ability profile matches the national bell curve. The 2013 performance tables show 28% high attainers compared to 19% low attainers. For Ulverston Victoria HS in my example the figures are 46% high attainers compared to 8% low attainers.
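To illustrate where the 94 and 106 cut-offs come from: assuming CATs standardised scores are designed to be approximately normal with mean 100 and standard deviation 15 (an assumption of this sketch, not a figure from the post), the tercile boundaries fall out directly:

```python
from statistics import NormalDist

# Assumption: CATs standardised scores are approximately normal
# with mean 100 and standard deviation 15.
cats = NormalDist(mu=100, sigma=15)

lower = cats.inv_cdf(1 / 3)  # a third of pupils score below this
upper = cats.inv_cdf(2 / 3)  # a third of pupils score above this

print(round(lower), round(upper))  # roughly 94 and 106
```

These match the bottom tercile (94 or less) and top tercile (106 or more) cut-offs quoted above.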

The DfE low, middle and high attainers profile critically depends on the ability of individual junior schools to maximise their high-stakes SATs scores; however, it is cognitive ability that counts.

Having said all that, the average GCSE grades for DfE 'middle attainers' in the 'most improved' schools surely raise questions. A gain of three levels gives GCSE grade C; four levels gives B. The 5+A*-C EM figure shows that the three levels are gained in English and maths but not in other GCSE subjects, by a significant margin. The greater the degree of school improvement, the greater the margin.

This has many implications.

Janet Downs
Wed, 29/01/2014 - 11:54

Roger - it would be helpful if the DfE had used phrases like "previously low attainers" instead of just "low attainers". The latter could lead readers to think that low attainers might be those in the bottom 25% of SATs scores when, as you point out, they are pupils who didn't reach Level 4.

The ability spread in a school's intake should, of course, be taken into account when judging schools. It's obvious schools with an intake skewed towards the top end will have a higher proportion of pupils reaching targets. Schools with an intake skewed towards the bottom end may struggle.

This seems to be recognised in Scotland. This article says:

"Many schools that rank at the 'bottom' for exam performance are rated by school inspectors as among the best in Scotland."

The EEF wrote a couple of years ago that many "below-floor" schools were nevertheless doing a good job in difficult circumstances. But when it comes to enforced academisation, the "below-floor" schools are at risk.


rogertitcombe
Thu, 30/01/2014 - 15:56

What I am trying to do is to get under the general concept of 'improvement' in marketised public services. This applies across all such services, where the unifying issue is unforeseen perverse incentives that distort management priorities and allocation of resources. In education we have the 5+A*-C inc EM figure that drives not just league tables, but also the data sets drawn up by Ofsted before they inspect schools (through 'triangulation' these even distort lesson observation judgements) and DfE propaganda about Academies and Free Schools.

In the NHS we do not have hospital league tables (yet) so the DoH resorts to 'National Levers' to incentivise Foundation Trusts through penalties and bonuses (Commissioning for Quality and Innovation - CQUIN). There were CQUIN bonuses for putting terminally ill patients onto the Liverpool Care Pathway, to shorten dying times. Despite blaming Labour for its 'target culture', the system remains riddled with penalties. For example, each A & E waiting time incident that exceeds four hours incurs a £200 penalty to the Trust. The result is that when A & E admission demand peaks, less urgent patients may be kept in ambulances parked outside for long periods. Last week there was a case on the national news of a person dying from a heart attack while the ambulance that could have saved his life was parked outside A & E for an unnecessarily extended period.

In the case of 5+A*-C inc EM, I am suggesting that the very high stakes pursuit of this measure results in perverse consequences that can be more damaging to educational standards than any genuine 'improvement' outcome.

Henry Stewart is doing a brilliant job exposing this in respect of Academy and Free School chains as well as individual Academies and Free Schools. From their very creation, government propaganda 'marketed' Academies not on the basis of the quality and cost of the education provided, which the government conspired (and still conspires) to obscure as much as possible, but on the 'rate of improvement' of the single 5+A*-C measure. This has been massively successful in misleading the entire media, and consequently the general public for more than a decade.

This post is also about how the pursuit of a perverse 'improvement' culture has leaked from Academies into the entire English school system. The figures for the featured schools in my post suggest that the degree of improvement is still linked to the use of equivalents. However, they also suggest that a consequence may not just be degradation through replacing quality curriculum with low quality multiple-GCSE-generating courses, but also the lowering of the general attainment of pupils in GCSE subjects across the curriculum. In my post I admit to being puzzled about this, but there is a possible explanation. If most pupils in KS4 are denied access to EBacc subjects because 'equivalents' produce much more GCSE value for the school, then there will be staffing consequences. There will be fewer subject specialists employed, so more such lessons in KS3 and KS4 have to be taught by non-specialists, resulting in lower GCSE grades.

There is also potentially a much more serious issue. Why does a school in which at least 75% of pupils achieve a C or better in GCSE maths only manage an average of a D grade for its SATs L4 pupils across all the GCSE subjects taken? If three quarters of all its pupils are getting C+ in English and maths, this suggests that the performance in the other GCSE subjects must be even lower than indicated in this part of the performance tables.

In TVEI schools in the 1980s virtually all pupils were entered for GCSEs in what are now the EBacc subjects. I recall that the poorest GCSE grades in most schools were often in maths and that this was generally accepted on the basis that it was a 'harder' subject.

In October 2010, an 11-16 LA controlled community comprehensive in Birmingham (now an Academy) was widely featured in the national media as the ‘most improved school in the UK’ – ever. This followed a huge increase of 53 percentage points (from 21% to 74%) in the proportion of pupils gaining five good GCSE grades including English and maths over the period 2007-2010. This school made massive use of 'equivalents', but it was also necessary to increase the C grade pass rate in maths from 21% to at least 74% over the same period. So how was it done?

These were the maths grades as a percentage of the entry.

A* 1.9%, A 12.6%, B 9.4%, C 50.9%, D 3.8%, E 6.9%, F 11.9%, G 2.5%

Almost 14 times as many pupils obtained a C grade as obtained a D, but only about half as many pupils gained an E as gained an F. Comparing the school distribution with the national distribution suggests that the extra C grades were gained at the expense of B and D grades and an excess of F grades. So what, you may ask? Surely this shows the school's success in enabling its pupils to progress in Further and Higher Education and careers? But what about all those Fs that could have been Es and Ds? And what about all those Cs that could have been Bs?
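The stated ratios can be checked directly from the published distribution. A minimal sketch, using only the figures quoted above:

```python
# The maths grade distribution quoted above, as a percentage of the entry.
dist = {"A*": 1.9, "A": 12.6, "B": 9.4, "C": 50.9,
        "D": 3.8, "E": 6.9, "F": 11.9, "G": 2.5}

# 'Almost 14 times as many pupils obtained a C grade as obtained a D'
print(round(dist["C"] / dist["D"], 1))   # about 13.4

# 'only about half as many pupils gained an E as gained an F'
print(round(dist["E"] / dist["F"], 2))   # about 0.58

# Share of the entry at C or better, consistent with the 74% headline figure
print(sum(v for g, v in dist.items() if g in ("A*", "A", "B", "C")))  # 74.8
```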

119 pupils gained a C grade or better in maths, and 128 in English, yet just 9 gained the English Baccalaureate (C grades or better in English, maths, a humanities subject, a science and a modern language). These are the foundation subjects needed for almost all combinations of academic A levels. In 2010 there were 156 16+ leaver enrolments from this school onto 27 different AS level courses in seven post-16 institutions. However, despite 119 pupils obtaining C+ grades in GCSE maths, just 13 of these post-16 enrolments were for AS maths and 14 for AS English or English Literature.

After Cs in English and maths became required for league tables, the C grade pass rate rapidly increased in these subjects in the following years. There have been reports of expensive courses where GCSE maths examiners give 'advice' to teachers on how to maximise C grades. A GCSE maths syllabus might be thought of as a maze systematically strewn with 'nuggets' of knowledge, each labelled with the grade for which the nugget is needed. Perhaps these courses inform teachers of the optimum route through the maze in order to pick up the C grade nuggets needed without spending time on harder nuggets not needed.

The Independent of 23 January featured a new OECD report about girls lagging boys in maths.

"Andreas Schleicher, deputy director of education and skills at the OECD, criticised the way maths was taught in the UK - saying lessons in the UK concentrated on solving simple problems without attempting to get pupils to understand the underlying concepts, as happened in high performing Asian countries."

Then there is the issue of multiple early entries that result in pupils ceasing to study maths as soon as the necessary C grade is 'bagged'.

I have tried in this post to draw general attention to perverse outcomes of 'school improvement'. It is true that some of this has been recognised by Gove in his reforms of league tables. My response is that the outcome will just be a new set of high stakes targets generating new perverse incentives. My argument is that schools (and hospitals) are too complex and sophisticated for educational (and clinical) priorities and resources to be effectively managed by such crude, perverse-incentive-generating methods.

This is why I favour a return to continuous local monitoring of school effectiveness, in all its true subtlety and complexity, by something like LEA inspectors, themselves advised and regulated by a reformed and independent HMI.

Frustrated Teacher
Thu, 30/01/2014 - 17:06

Roger - many thanks for your post - it contains so much of importance. I completely agree with you about the false concept of 'school improvement'. I can give an example from my own experience. To get the % of maths C+'s up the school employed a range of strategies including the following:
Pupils began studying the GCSE curriculum in Y7 and, as soon as they were able to get a C, they sat the exam (many of them in Y8).
There were many, many resits until the magic C was achieved.
From Y9 the C/D borderline pupils were taught in small groups with multiple teachers - all other groups were larger with just one teacher (and the groups got bigger through the year after each round of exams).
Maths was given more time on the timetable at the expense of everything else. Maths teachers were 'encouraged' to provide daily 'maths intervention' classes in the morning before school and at the end of the day.
Pupils were rewarded for attendance with free take-away food.
C/D borderline pupils were 'paid' with shopping centre vouchers if they got their C in Y10 instead of Y11.
Pupils were withdrawn from other lessons to do extra maths in the fortnight leading up to the exam.
Pupils were entered for multiple exam boards.
Pupils were entered for multiple routes (linear and modular) at the same time.
Private tutors were brought in by the school to work one-to-one with individual C/D borderline pupils.

The overall effect is to increase the % getting C in maths, but at the expense of higher and lower achievers in maths. It also impacts on the results in all other subjects because of loss of timetable allocation and withdrawal of students from classes on an ad hoc basis. The pressure on pupils to achieve the pass was immense and destructive and led to lower levels of commitment and motivation in other subjects.

Regarding the relationship between use of equivalents and lower attainment in GCSEs - your point about less skilled teaching staff being employed is correct, but a more important point is that once pupils get used to a much lower level of demand in the 'equivalents' lessons they often find it very difficult to raise their game to the level needed to perform in a more demanding subject. 'Cut and paste' assignments and poorly structured, low-level brush-stroke analysis are often sufficient in BTecs but are no good in academic subjects like history, English literature or physics.
