DfE “Coasting” measures: the worst it could possibly have found?

Henry Stewart

When introducing the concept of “coasting” schools, it was said that the aim was “to identify schools that have previously fallen ‘beneath the radar’ in leafy areas such as Oxfordshire and Surrey; results seem good, as they have high-attaining intakes, but the department says children are not being pushed to reach their full potential.” It is hard to think of any set of measures the DfE could have used that would have been less suited to this purpose than those that were chosen. It is not clear whether this is deceit or incompetence, but all three measures being used in secondary schools are directly related to the intake. Schools with “high attaining intakes” are very unlikely to be defined as coasting.

GCSE 60% benchmark: Closely related to intake

The first measure classes as “coasting” any school where fewer than 60% of pupils achieve 5 GCSEs including English and Maths. There is a close correlation between how likely a school is to pass this benchmark and its average KS2 point score:

Of schools whose children have an average KS2 point score of 30 or more (where a 4a is equal to 29 points and a 5c to 31 points), precisely none would be caught by this measure. However, of schools with an average point score below 26 (where a 4b is equal to 27 points), 97% fail to reach the benchmark. The government claimed that the “coasting” measure will affect schools whose “results seem good” but whose children are not being pushed. However, schools whose results seem good are specifically excluded by this measure from being classed as coasting. Either the policy makers at the DfE have very limited numerical understanding or, for the first two years of this new measure, they never intended to include these schools.
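The attainment element of the interim definition is simple enough to state in a few lines. Here is a minimal Python sketch (the function name and constant are mine, not the DfE's) that makes the asymmetry concrete:

```python
# Sketch of the attainment element of the interim "coasting" definition:
# a school falls below it if fewer than 60% of pupils achieve 5 GCSEs
# including English and Maths. Names are illustrative, not official.

COASTING_ATTAINMENT_THRESHOLD = 0.60

def below_attainment_threshold(pct_5_gcse_incl_em: float) -> bool:
    """Return True if the school falls below the 60% GCSE benchmark."""
    return pct_5_gcse_incl_em < COASTING_ATTAINMENT_THRESHOLD

# Per the figures above, no school with an average KS2 score of 30+ fell
# below this line, while 97% of schools below 26 points did.
print(below_attainment_threshold(0.58))  # True
print(below_attainment_threshold(0.65))  # False
```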

Expected Progress: Also directly related to level of intake

The other two measures of coasting are the percentages of pupils making “expected progress” in English and in Maths. This chart shows how the likelihood of missing these thresholds relates to the school’s intake in terms of KS2 points:

Of secondary schools with an average pupil KS2 point score of less than 26 (where a 4c is 25 points and a 4b is 27), 70% in English and 82% in Maths fall below the threshold. In contrast, for schools with an average point score of 30 or more (where a 4a is 29 points and a 5c is 31), only 2% in English and 1% in Maths fall below the threshold. The 163 English grammar schools are all in that top category (along with 35 comprehensives) and are therefore certain to avoid being classed as “coasting”.

“3 levels of progress” does not stretch the “more able”

The reason is simple. The expected 3 levels of progress means a stretching grade C for a pupil entering with a 4c at age 11, but only a grade B for a pupil entering with a 5a. As Michael Wilshaw has argued, level 5 pupils (especially those on 5b or 5a) should be achieving As. Take an example: if a grammar school had 100% of its intake entering with level 5s (as some grammars do), then it should be expected to get them to As or A*s. However, “3 levels of progress” only requires them to achieve a B, and being above the median would in 2014 have meant getting 74% to a B in English or 67% in Maths. If this grammar got no As or A*s and got only 70% of its students to a B in Maths, then it would still not be deemed “coasting”.
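The underlying arithmetic is just a lookup. This is an illustrative Python fragment, not an official DfE table; it reflects the whole-level convention described above, where level 4 entrants are expected to reach a C and level 5 entrants only a B:

```python
def expected_grade(ks2_level: int) -> str:
    """Expected GCSE grade under '3 levels of progress' (illustrative).

    The measure works in whole KS2 levels only, so a 4c entrant and a 4a
    entrant share the same grade C target, and every level 5 entrant
    (5c, 5b or 5a) needs only a grade B to count as making progress.
    """
    level_to_grade = {3: "D", 4: "C", 5: "B"}
    return level_to_grade[ks2_level]

print(expected_grade(4))  # C: stretching for a 4c, much less so for a 4a
print(expected_grade(5))  # B: well short of the As the post argues for
```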

Expected progress: “the very worst indicator”

The independent Education Datalab has described the expected progress measure as “the very worst indicator routinely published about schools”. I would agree entirely and have previously written about how it is a “deeply flawed measure”, a “very silly measure” and an example of “how to use data badly”. This might look like a bit of an obsession, but the fact that all three posts are among our top 20 most viewed in 2015 suggests there is wide interest in the subject. Education Datalab has also shown how the likelihood of being classed as coasting, on the current definition, is directly related to the school’s intake: 30% of schools with the weakest intake are set to be rated as “coasting” but virtually none of those with the strongest intake.

Schools with disadvantaged intakes far more likely to be termed “coasting”

Nicky Morgan specifically stated that “For too long a group of coasting schools, many in leafy areas with more advantages than schools in disadvantaged communities, have fallen beneath the radar.” However under the current “coasting” definition, those schools are set to continue to fall below the radar. Taking again the “expected progress” element, those schools with low levels of disadvantaged children will be far less likely to fall below the threshold:

Secondary schools with over one in five students categorised as disadvantaged are far more likely than average to fall below the threshold for “expected progress”, with over 55% falling below at least one of the thresholds. Of the least disadvantaged schools, only 21% fall below the threshold in English and 11% in Maths. The reason is that the level of disadvantage in a school correlates with a lower average KS2 point score and, as we saw above, a lower KS2 point score makes it far harder to hit the threshold.

Will “coasting” be based on school intake or on school results?

There is some sense in finding schools where students are not achieving their potential, if those schools then receive the support they need to improve. However, the current definitions mean that whether a school is defined as “coasting” will depend more on its intake than its results. As the Education Bill continues through the Lords, Lord Lucas wrote to Lord Nash expressing concern that a grammar school that was “achieving respectable results but failing to stretch its pupils would never be caught by the proposed coasting definition”. In response, Lord Nash acknowledged that this was true and would continue to be true until 2018: “I do accept, however, that the attainment element of our proposed interim definition may prevent a school like this being defined as coasting in 2016 and 2017.” From 2016 the GCSE benchmark will be replaced, in the definition of coasting, by the new Progress8 measure. But as the coasting definition is based on three years of figures, it will take until 2018 for the GCSE benchmark to drop out of the definition entirely.
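The three-year logic can be sketched in a few lines of Python (the function name and structure are mine) to show why the benchmark lingers until 2018:

```python
def measures_for_judgement(year: int) -> list:
    """Which attainment measure feeds each of the three years of data
    behind a coasting judgement made in `year` (illustrative sketch)."""
    return ["Progress 8" if y >= 2016 else "60% GCSE benchmark"
            for y in range(year - 2, year + 1)]

# A 2016 judgement still rests partly on the 2014 and 2015 benchmark
# figures; only from 2018 do all three years use Progress 8.
print(measures_for_judgement(2016))
print(measures_for_judgement(2018))
```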

Can the DfE produce a genuine value added measure?

The DfE is currently consulting on its definition of “coasting” and may yet change it. (I will submit this post to the consultation.) However, since the abolition of Contextual Value Added, the DfE has found it very difficult to produce a measure that genuinely measures value added without being biased towards schools with stronger intakes. The standard current measure (before Progress8 comes in) is Best8 Value Added. This is meant to show the genuine value a school adds, regardless of student starting points, and is one of the main measures used by Ofsted to judge the performance of a school. However, if we plot the value added figure against the average KS2 points for each school, we find a clear correlation:

Under Best8 Value Added, 1000 is the national average and each GCSE grade represents 6 points. The 51-point difference between the top and bottom categories therefore represents a huge 8.5 grades. This indicates a strong bias towards schools whose students arrive with high grades.
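The grade arithmetic behind that claim, as a quick check (the 51-point gap is read from the chart above):

```python
# Best8 Value Added: 1000 is the national average and each GCSE grade
# is worth 6 points, so a points gap converts directly into grades.
POINTS_PER_GRADE = 6
points_gap = 51  # VA gap between the top and bottom KS2 intake categories

grades_gap = points_gap / POINTS_PER_GRADE
print(grades_gap)  # 8.5 grades of apparent "value added" from intake alone
```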

Why has the DfE produced such an inappropriate measure of coasting?

Did the government never intend to tackle "underperforming" schools in leafy areas? Or does it really not understand the measures it has chosen to use? This may change in 2018 with Progress8. However, major doubts are already being expressed about this measure: for instance, Schools Week found that it "undervalued low ability pupils" and Tom Sherrington (headguruteacher) has described it as "data garbage".




Janet Downs
Tue, 24/11/2015 - 08:13

There's been much concern lately about struggling schools in deprived coastal areas. But such schools are often judged without looking at their intake. Non-selective secondary schools in selective areas, for example, will be unlikely to achieve the current benchmark of 5 GCSEs at A*-C or the future Progress8 measure.

But schools with an intake skewed to the bottom end face not just being defined as 'coasting' but as 'failing' based on results alone. If they are non-academies, they can expect to be academised. If they are already academies, they're likely to receive Warning Letters from over-zealous Regional Schools Commissioners (leapfrogging the usual procedure of sending out pre-warning letters, as has already happened in Norfolk and Suffolk). This will happen irrespective of Ofsted judgements.

Schools with intakes skewed to the bottom are less likely to enter all pupils for 8 exams. This will reduce any Progress8 or Attainment8 score.

Janet Downs
Tue, 24/11/2015 - 08:15

Schools Week pointed out another flaw with Progress8. It is ' calculated by averaging pupils’ individual Progress 8 scores. The average is then reported as a plus or minus fraction of a GCSE to show how much the school’s pupils over- or under-achieve compared to what might be expected given their starting point. The Progress 8 measure is therefore inherently competitive since it depends not just on doing well, but doing better than other schools.'

Michele Lowe
Tue, 24/11/2015 - 08:40

Thanks Henry. While I sit on the other side of the Wales/England border and could argue that these matters are in many senses of the word 'academic' to us, it's fascinating to see how these arguments play out. This is not least because educational changes in England tend to filter over to us one way or another anyway. I'm slightly puzzled why Contextual Value Added was dispensed with. I don't hear it spoken of in Wales but came across it as a concept in use at Hereford 6th Form College, at a talk by the Principal during an Open Day two years ago. I thought it a useful measure, partly because it would highlight progress amongst the least academically able when it happens. The picture seems to be that the brightest are not being appropriately extended and the weakest in academic terms are being neglected. If you know what CVA's history is, I'd be very interested to know.

Linda Starkey
Tue, 24/11/2015 - 11:03

The use of statistics to justify any a priori view is now so common: create the question you need so that known numbers can justify the conclusion. "I can prove anything with statistics except the truth". That, and the ability to manipulate contract law, are the two bases for this government's manoeuvring of education into the private sector.

agov
Tue, 24/11/2015 - 11:20

The Importance of Teaching The Schools White Paper 2010 -

"6.12 We will put an end to the current ‘contextual value added’ (CVA) measure. This measure attempts to quantify how well a school does with its pupil population compared to pupils with similar characteristics nationally. However, the measure is difficult for the public to understand, and recent research shows it to be a less strong predictor of success than raw attainment measures. It also has the effect of expecting different levels of progress from different groups of pupils on the basis of their ethnic background, or family circumstances, which we think is wrong in principle."

This has a paragraph on it -


Janet Downs
Tue, 24/11/2015 - 12:45

Michele and agov - in 2011 the OECD said CVA was a 'step in the right direction' even though it was imperfect, and called for the use of 'more sophisticated measures'. The Gov't had already decided CVA would be dumped. Worse, it increased the benchmarking and the 'extensive focus on grades' which the OECD was concerned about.

The OECD said benchmarking was more extensively used in England than in most other OECD countries and that high-stakes tests could have negative consequences. It recommended there should be 'less reliance' on GCSE results, which it said should be primarily used to decide post-16 progression.

But Progress8 continues to rely on GCSE results and will use them to judge schools.

(See Chapter 3, 'Reforming Education in England', OECD Economic Surveys 2011)

Martin Johnson
Tue, 24/11/2015 - 14:54

Another problem with CVA: it compared a school's value added with all other schools. The very large majority of schools clustered around the mean (100 at first, then 1000). The only possible interpretation was that most schools did not vary significantly in their effectiveness. This confirmed a range of other research evidence, but was inconvenient for politicians wedded to a 'good school/bad school' market approach.
The problem remains. When people talk about 'good schools' they are in fact almost always describing the school intake. Unfortunately this concept is deeply embedded in English culture.

Michele Lowe
Wed, 25/11/2015 - 13:51

Thanks Janet, agov and Martin. I can see there are flaws with CVA, but I can also see it was an attempt to create a more sophisticated measure of a school/institution. Anything which succeeded in doing this would win my vote because, as Martin says, the habit of judging a school by its intake is so ingrained that it passes unchallenged. Rather like many other myths to do with education.
