Mind the gap - it can result in success or failure

Janet Downs
Schools and local authorities (LAs) are constantly told they must close the attainment gap between disadvantaged and advantaged children. Ofsted comments on how well schools and LAs are doing on this measure. It influences inspectors’ judgements.

It would be logical, then, to assume that LAs with good results for pupils who’ve been eligible for free school meals or in LA care any time in the last six years (FSM6) would be praised.

Not so. Bath and North East Somerset was described as a 'problem area' despite results for disadvantaged Key Stage 2 FSM6 pupils being slightly above the national average (68% achieved Level 4 in reading, writing and maths against a national average of 67%). This was because the gap between FSM6 11-year-olds and non-FSM6 pupils (18 percentage points) was still wider than the national average (16).

Why was the gap so wide? Because the results of non-FSM6 pupils were even further above the national average: 86% reached the benchmark against 83% nationally. The wide gap is caused by the higher-than-average score of non-FSM6 pupils. Despite the area’s FSM6 pupils doing slightly better than the national average, the LA is still deemed to be doing poorly in narrowing the gap. A cynic might say the best way for LAs and schools to close the gap would be to focus on disadvantaged pupils and less on advantaged ones so results of the latter would be depressed. This would allow disadvantaged pupils to catch up.
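The gap arithmetic itself is just subtraction of pass rates; a minimal sketch using the percentages quoted above:

```python
# Percentage-point attainment gap between non-disadvantaged and
# disadvantaged pupils, using the KS2 Level 4 figures quoted in this post.

def attainment_gap(non_fsm_pct, fsm_pct):
    """Gap in percentage points between non-FSM6 and FSM6 pass rates."""
    return non_fsm_pct - fsm_pct

banes_gap = attainment_gap(86, 68)     # Bath and NE Somerset (FSM6 definition)
national_gap = attainment_gap(83, 67)  # national averages

print(banes_gap, national_gap)  # 18 16
```

The area is 'behind' on the gap measure (18 against 16) even though its disadvantaged pupils outperform the national average, precisely because its advantaged pupils do too.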

However, there is another way of defining disadvantage: pupils who are eligible for free school meals or in LA care in the academic year when tests were taken (FSM1). Nationally, this increases the average gap between advantaged and disadvantaged pupils from 16 to 18 – just two points. The Local Authority Interactive Tool (LAIT), used by Ofsted for its figures in the speech criticising Bath and NE Somerset, defines disadvantage as FSM1.

But measuring Bath and North East Somerset on this FSM1 measure widens the gap from 18 to 26 - eight points. That’s because of the disproportionate effect of small sample size: there were just 160 FSM1 pupils* in Bath and NE Somerset, compared with 319 FSM6 pupils, and figures based on smaller cohorts are more variable.

This variability has the effect of increasing a modest gap, which was in any case caused by the better-than-average performance of non-FSM6 pupils, to a considerably wider one. On the latter measure, Bath and North East Somerset is indeed a ‘problem area’ where just 59% of the 160 FSM1 pupils reached the benchmark against a national FSM1 average of 68%.
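One way to see why the smaller FSM1 cohort produces more volatile figures is the standard error of an observed pass rate. This is a rough illustrative sketch, not the DfE's or Ofsted's methodology; it treats each pupil's result as an independent trial, which is a simplifying assumption:

```python
import math

def pass_rate_std_error(pass_pct, n):
    """Approximate standard error, in percentage points, of an observed
    pass rate from a cohort of n pupils (binomial approximation)."""
    p = pass_pct / 100
    return 100 * math.sqrt(p * (1 - p) / n)

# Cohort sizes and pass rates quoted in the post
fsm1 = pass_rate_std_error(59, 160)  # FSM1 cohort: roughly 3.9 points
fsm6 = pass_rate_std_error(68, 319)  # FSM6 cohort: roughly 2.6 points
print(round(fsm1, 1), round(fsm6, 1))
```

On these rough numbers, a swing of just a handful of pupils either way in a 160-pupil cohort moves the headline percentage, and hence the gap, by several points.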

The publicly-available School Performance Tables use the FSM6 definition of disadvantaged. I believe this is fairer. Children are in schools for several years and could sometimes be eligible for FSM and sometimes not. The effect of education is cumulative and not just the result of what happens in the year when exams are taken.

Does it matter if the Tables and LAIT use different definitions of disadvantaged? Yes, it does – the effect on Bath and North East Somerset shows this. A local authority which did slightly better for FSM6 pupils in School Performance Tables at KS2 becomes one that LAIT identifies as doing badly. It turns Bath and North East Somerset from an authority where all KS2 pupils do quite well into a ‘problem area’.

The amount of data available is burgeoning: School Performance Tables, LAIT, RAISEonline and the National Pupil Database (containing information about individual pupils). Data is used by politicians, inspectors, the media and schools themselves to judge ‘performance’. Yet few of these data users are statisticians. Data in the hands of the uninformed (or the unscrupulous) can be unsafe. My colleague Henry writes about ‘How to use data badly’ here. Icing On the Cake describes the ‘dangerously flawed’ data in RAISEonline here.

There’s something wrong when definitions in datasets don’t correspond. Conclusions drawn from them have far-reaching consequences for LAs, schools and academy chains: they can make the difference between success and failure. And in our prevalent culture of blaming and naming-and-shaming, it heaps on more negativity.

Failure, remember, is used to enforce academy conversions, oust heads and pillory local authorities. It’s essential that data is accurate, consistent and used with care. But above all, it’s important to remember that data doesn’t always capture what matters. This seems to have been forgotten – education has been reduced to number crunching.

UPDATE 4 April. Sir Mike Tomlinson, Chief HMI 2000-2002, says Ofsted inspections are inconsistent and too dependent on data. It’s important, therefore, that Ofsted realises the limitations of data. LAIT and school performance tables are two data sets but, as we’ve seen, they differ in definitions used. Interpretations can be equally valid depending on which data set has been used. But, as we see here, the conclusions can mean the difference between success and failure. And this can mean the difference between teachers and LA school improvement staff keeping or losing their jobs.

With stakes as high as this, data should be used with care.

*figures in email to author from Jayne Middlemas, from the Department for Education’s Primary Performance and RAISEonline School Performance Data Unit


John Mountford
Mon, 30/03/2015 - 21:31

Janet, I live in Bath and NE Somerset. My MP is Jacob Rees-Mogg. I have sent him a link to your report.

Dear Mr Rees-Mogg

We met some time ago on a visit you made to XXXXX School in Midsomer Norton. I was a governor at the time and during the discussions following your tour, governors were at pains to express to you their deep concern about how the number of children on free school meals and in receipt of the pupil premium plays such a significant part in all the school does in raising pupil attainment.

We were at pains to explain the ways in which pupils needing it gained additional support. We expressed our concern at how the small cohort of children entered for SATs in Yr 6 is another factor that depresses the school's results, because each individual represents such a disproportionately large percentage of the whole. Throughout, we questioned the fairness of a system that insists schools like this be measured against schools with such different, often less challenging, baseline characteristics.

The outcome of the latest Ofsted inspection confirmed our view that the major judgement about pupil attainment was made before the inspector crossed the threshold, simply by analysing the school's data (RAISEonline).

I am writing to share with you a posting on the Local Schools Network, written by Janet Downs on March 30 2015,


As the serving MP for Bath and NE Somerset, I would be grateful to receive your observations about the issues raised by Ms Downs in her report. It seems to me that the reputation of BANES is tarnished by the conclusions drawn and I see little scope for a substantially different interpretation.

Most Respectfully
John Mountford

John Mountford
Mon, 30/03/2015 - 22:32

Janet, with a few amendments in light of the dissolution of Parliament today, I have sent this and will update you in due course.

Being rather cynical, I fear your suggestion may prove accurate:

"A cynic might say the best way for LAs and schools to close the gap would be to focus on disadvantaged pupils and less on advantaged ones so results of the latter would be depressed. This would allow disadvantaged pupils to catch up."

In Henry's latest posting,


In response to a comment I made about the direction schools may be travelling in because of additional pressures generated by the new NC, I think Brian might have taken my comment, on the responsibility of schools to refuse to bow to government pressure to 'up the ante', as a harsh one.

I said:
" …. why are professionals seemingly aiding and abetting successive governments in turning education into little more than a number-crunching exercise where shallow, rote learning is apparently valued so highly"

To which he replied:

So much commentary at present is attending to the use/misuse of data. Whilst it is crucially important that attention is given to this matter, I feel there is a moral imperative for professionals to resist any government's attention-grabbing, short-term reforms to education. This is needed especially where policies clearly devalue professionals and damage the education experience for children. It should be part of a unified campaign of non-compliance orchestrated by all the professional bodies in close collaboration.

Andy V
Wed, 01/04/2015 - 15:37

It is also noteworthy that when the inspectors call they are bound to evaluate the progress of each cohort and sub-cohort of learners (e.g. low, middle and high ability, and the disadvantaged in relation to the progress of non-disadvantaged pupils). The undergirding principle is that pupils' progress is set against their respective starting points.

I would float into the mix that if a school attempted to deliberately depress the progress of one group to give the illusion of the disadvantaged making better progress - and thus appearing to close the gaps (CtGs) - it would fall foul of the requirement that all pupils make progress from their respective starting points.

Brian
Wed, 01/04/2015 - 15:54

I work with a school where progress for disadvantaged pupils so far this year has been excellent. However, the changes to provision for disadvantaged pupils have also benefitted other pupils, who have made even better progress. Result ... the school has 'failed to close the gap' between disadvantaged and other pupils.

Andy V
Wed, 01/04/2015 - 16:51

A common cause of that is a focus on generic improvements as opposed to targeting particular groups of learners. I have noticed an increasing frequency of a recommendation in S5 reports that an independent review of pupil premium spending is required. It appears that many schools do not focus the PP funding on the pupils attracting it, preferring to drive generic whole-school improvement strategies, which is not the purpose or goal of PP.

Brian
Wed, 01/04/2015 - 21:45

Not in this case. The school has a large number of 'pupil premium' children and is able to provide a range of interventions and initiatives focused specifically on them. Some of course are generic ... additional library provision, for example. But the main impact seems to be from providing additional targeted teaching, which of course means that the other pupils also find themselves in smaller groups with additional teacher or TA time.

I have no evidence that 'many' schools are spending pupil premium money on generic issues nor that this is, of itself, bad use of the funds.

Andy V
Thu, 02/04/2015 - 08:21

My evidence arises from monitoring Ofsted reports and then viewing school websites. That is to say, I receive inspection report alerts from Ofsted and, for each school recommended for an independent PP review, visit its website to see how it uses its PP funding. The most common findings are:

1. The independent review is invariably linked to schools graded 3 or 4
2. There are statutory minimum requirements regarding what information schools must publish on their websites, and PP is in that category. A handful of schools simply don't disclose, but the vast majority do and fall into three broad categories:

a. Generalised description of what PP is and what they spend it on
b. A fuller description of what it is spent on with a stated focus on generic strategies to uplift T&L for all
c. A mix of (a) and (b) above

The information that is commonly missing is an evaluation/analysis of the impact of each intervention/strategy for pupils attracting PP.

In schools graded 1 and 2 the PP funding is fully detailed and includes a breakdown expressed in terms of: what the intervention was; which group or topic was targeted (e.g. algebra or decimals, extended writing in English); how much it cost; the impact in terms of progress and achievement compared to non-PP pupils; and a statement about the overall impact on CtG.

I hope that helps in understanding where and how I get my evidence. :-)

Brian
Thu, 02/04/2015 - 09:47

Thank you. But I didn't say you had no evidence of less strong use of PP funding. I simply queried the comment that 'many' schools are not focusing PP spending on appropriate strategies for PP pupils. The schools I work with all are (and that includes some generic spending of PP funding with no targeted spend on PP pupils), but that's fewer than thirty schools, so I'm never going to attach 'many' or 'most' to that evidence.

Is it possible that, if your evidence comes from Ofsted alerts about schools where PP funding isn't having much impact, your 'many' is based on somewhat skewed evidence?

I also recognise that some schools don't report the PP spend and outcomes fully on their website. There are, I'm sure, a number of reasons for this, including it being yet another bureaucratic task which takes time and achieves little. Gives Ofsted inspectors something else easy to check on I suppose.

Roger Titcombe
Thu, 02/04/2015 - 11:01

As I have argued previously, the 'gap' does not exist. Where CATs scores exist (e.g. those in my possession for Mossbourne Academy, recently, and for the whole of Cumbria, some years ago), they show not only that the CATs score is an excellent cohort predictor (the bigger the cohort, the better the prediction), but crucially that this relationship is not significantly affected by FSM or other poverty indicators.

I have made this argument a number of times on LSN, as well as in my book, 'Learning Matters'. My latest summary of it can be found on my website here.


Not only does the 'gap' not exist, it would be quite wrong to close it, as this could only happen by depressing the attainment of the highest performing pupils.

However, this does not mean that there is not a serious problem. There most certainly is. The problem is not lack of equality in terms of attainment, but failure to fully develop cognitive ability at all levels and especially the lower ones. The argument in 'Learning Matters' is that this failure is a relatively recent consequence of marketisation and the consequent degradation and abandonment of teaching methods that produce deep understanding and which develop cognitive ability, in favour of behaviourist 'quick fixes' designed to hit key marketisation targets. If you don't accept this after reading my book, then fine. But please read the book first.

It is an irony that many of those who reject my argument come from the political left. They do not like the fact that areas of relative poverty correspond closely with areas of low mean cognitive ability. Furthermore, schools that serve pupils from such backgrounds have been forced by 'zero tolerance of failure' floor targets to deprive the very children most in need of cognitive development of the sort of curriculum and teaching that best develops it. Most of this cognitive 'heavy lifting' should take place in KS2 and KS3, but floor-target pressure has seen KS2 SATs cramming, and KS4 starting in Y8 or Y9 with lower cognitive ability pupils crammed for C grade passes through multiple annual entries until the C grades were hit, after which the subjects could be abandoned and forgotten.

Gove rightly put a stop to this, but we have yet to see the consequences within our marketised system. They may be unforeseen and perverse. Many heads are now saying that 'Best 8' simply brings about new forms of gaming that still deny all pupils access to a fully developed, broad and balanced curriculum up to the age of 16.

The cover of 'Learning Matters' shows pupils in rows listening to the teacher. One is passing the other a note on which is written, "Do you get it coz I don't".

Given a highly skilled teacher who gives a clear step-by-step explanation of, for example, Newton's Laws of Motion, why would some pupils 'get it' and some not? It certainly has nothing to do with FSM.

Assuming a compliant, attentive class, there is only one credible explanation, and Piaget and Vygotsky provide it. For facts that are individually comprehensible to make sense when combined into a concept, the pupil must already have developed a sufficiently sophisticated personal cognitive framework (a schema) that can meaningfully accommodate the new information. Vygotsky put it beautifully in 1986.

"As we know from investigations of concept formation, a concept is more than the sum of certain associative bonds formed by memory, more than a mere mental habit; it is a genuine and complex act of thought that cannot be taught by drilling, but can only be accomplished when the child’s mental development has itself reached the requisite level."

Developmental teaching strategies expose pupils to escalating cognitive dissonance by presenting them with successive factual challenges that require them to continuously refine their mental models. Appropriate 'social plane learning' support must be provided. This develops plastic intelligence.

Shayer and Adey have proved that this works. More recent theorists like Guy Claxton also accept this. Appropriate developmental teaching results in gains in general intelligence that are transferable to all subjects and contexts. Pupils gain in wisdom as well as intellectual ability and confidence. Crappy behaviourist cramming, by contrast, actually makes kids dimmer. That last claim is still only a hypothesis, but Shayer has certainly shown that kids have become less able to comprehend complex concepts, and in my book I offer degraded teaching methods caused by marketisation as the explanation.

All the references and arguments are in 'Learning Matters'. I am happy to debate this here or anywhere else, but it will be tedious unless you read the book first.

As for 'Closing the Gap', it is a distraction from the necessity to provide all pupils of all abilities with their entitlement to an individually developmental education. Every child's development should have an equal charge on the resources of the school, not just favouring those on the KS2 L3/4 and GCSE grade D/C borders.

Only a comprehensive school system can provide this but just abolishing selection at age 11 is not enough if selection for resource allocation and good developmental teaching is still taking place.


Andy V
Sat, 04/04/2015 - 09:50

In which case we should be praising Michael Gove for scrapping levels and sub levels and allowing the secondary sector in particular the freedom to explore and implement assessment without levels.

Janet Downs
Sat, 04/04/2015 - 10:30

Andy - but schools will still be judged against 'national standards' at Key Stage 2 and 'Progress 8' at Key Stage 4. The national standards are ambiguous, and Progress 8 impacts negatively on schools where pupils don't take 8 GCSEs. Henry describes how levels of progress can be a flawed measure here.

Andy V
Sat, 04/04/2015 - 10:57

Janet - I don't disagree but this has nothing to do with the phonics thread.

Andy V
Sat, 04/04/2015 - 13:14

Apologies, mea culpa! I got confused with exchanges from earlier this morning.

Janet Downs
Sat, 04/04/2015 - 11:44

Andy - we're not on the phonics thread. I'm going there now - at least I hope I'll end up there! Perhaps I'll fail to 'mind the gap' and fall through it. Or maybe end up back here. I'm beginning to feel dizzy.
