Last June the Chief Inspector, Sir Michael Wilshaw, complained that too many schools "failed to challenge the brightest" and talked of a "culture of low expectations". The basis of his criticism was that many students who achieved a Level 5 in SATs at age 11 failed to get an A or A* at GCSE at age 16.
This month Ofsted has again published a "data dashboard" for every state school in England, seeking to help parents and governors understand the performance of the school. Last year I explained how the dashboard used the wrong data, in a post that is (slightly surprisingly) one of LSN's dozen most visited ever. The key fault is well illustrated in the light of Wilshaw's comments:
Whether level 5 students get B, A or A* makes no difference to any of the figures in the Ofsted data dashboard.
In the dashboard a school getting all its level 5 students (even all its 5a students) to a B would appear to be doing just as well as a school getting all its level 5 students to an A*. (The measures used in the dashboard are the % achieving 5 A*-C grades including English and Maths, with no credit for any students getting above a C, and the % achieving “expected progress”, with no credit for getting above a B.) If Sir Michael wants to raise expectations, he should perhaps ensure the dashboard uses data that encourages those expectations.
The Problem: “Expected Progress”
The DfE has set "expected progress" for students in England at 3 levels from age 11 to age 16. This means going from a level 3 to a D, from a level 4 to a C and from a level 5 to a B. I actually agree with Sir Michael that schools should expect students on a 5b to get to an A at GCSE, and those on a 5a should aim for an A*. However, according to the DfE measure, a student who goes from a 5a at age 11 to a B at GCSE has made “expected progress”.
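The "levels of progress" arithmetic can be sketched in a few lines of Python. The grade-to-level mapping below is my own illustrative assumption, chosen so that level 3 to a D, level 4 to a C and level 5 to a B each count as exactly 3 levels:

```python
# Sketch of the DfE "expected progress" arithmetic. The grade-to-level
# mapping is an illustrative assumption consistent with the examples in
# the text (L3 -> D, L4 -> C, L5 -> B are each 3 levels of progress).

GCSE_LEVEL = {"G": 3, "F": 4, "E": 5, "D": 6, "C": 7, "B": 8, "A": 9, "A*": 10}

def levels_of_progress(ks2_level: int, gcse_grade: str) -> int:
    """Levels of progress from a KS2 level (age 11) to a GCSE grade (age 16)."""
    return GCSE_LEVEL[gcse_grade] - ks2_level

def made_expected_progress(ks2_level: int, gcse_grade: str) -> bool:
    """The DfE expectation is a flat 3 levels, whatever the starting point."""
    return levels_of_progress(ks2_level, gcse_grade) >= 3

# A level 5 student reaching only a B still counts as "expected progress",
# exactly as a level 5 student reaching an A* does:
print(levels_of_progress(5, "B"), made_expected_progress(5, "B"))    # 3 True
print(levels_of_progress(5, "A*"), made_expected_progress(5, "A*"))  # 5 True
```

The flat threshold is visible in the code: `made_expected_progress` asks only whether the difference is at least 3, so an A or A* from a level 5 start earns no extra credit.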
The national figures for the % making 3 levels of progress show the variation:
The % achieving 3 levels of progress varies according to the age 11 starting point. While 82% of students arriving with a Level 5 in Maths went on to make 3 levels of progress, just 21% of those on Level 2 and 46% of those on Level 3 made that amount of progress (2013 GCSE transition matrices). There is more variation among sub-levels: while 61% of 3a students achieve 3 levels of progress, just 21% of 3c students do so. (The reason is simple: both have to get a D to achieve 3 levels of progress, and this is a much bigger jump from 3c than from 3a.)
It may be the case that we should be challenging schools to have higher expectations of their level 2 and level 3 students. However, having a flat 3 levels as the “expected progress” for all students fails to provide any challenge for level 5 students. Among 5b Maths students, 59% make 4 levels of progress (to an A), and among 5a Maths students, 53% make 5 levels of progress (to an A*). Clearly the number of expected levels of progress should vary with the starting point of the student.
Ofsted does not use the dashboard data in inspections
Last December I sent a tweet suggesting that Ofsted used these flat measures of "expected progress" in its inspections as well as in its dashboard. HMI David Brown (national lead for ICT) tweeted back that I was wrong. Ofsted's Raise Online includes tables comparing progress for students at a school based on their KS2 starting point, and the guidelines for inspectors are very clear on this. He directed me to pages 35 and 36 of the School Inspection Handbook.
He is absolutely right on the guidance. For instance, for the outstanding judgement, the handbook states: "From each different starting point (my emphasis), the proportions of pupils making expected progress and the proportions exceeding expected progress in English and in mathematics are high compared with national figures." (I am not sure this is observed by all inspectors. If your school has more students starting from low KS2 levels, do make sure your inspector is comparing their levels of progress with national figures for those starting points and not for the overall average.)
This is curious. For inspections, Ofsted is very clear that a school should not be judged on an overall figure for the proportion making expected progress, but on the range of progress according to students' starting points. Yet this overall figure is, along with the % achieving 5 A*-C, the main measure used in the dashboard.
Ofsted, please change your dashboard
The idea of providing a simple set of graphs which show the strengths and weaknesses of a school, and help inform governors, is a good one. However, to be useful, it must use data based on real student progress. Fortunately Ofsted has this in abundance in its Raise Online data, in the form of value added figures for GCSEs only and for GCSEs with equivalents.
If the data dashboard was based on the value added figure, for all students and for “disadvantaged” students, it would give a fair and clear picture of the progress being made by students in the school.
It would also prepare schools for the changes to be introduced next year. In 2015 the key measure will change dramatically with the introduction of Best8, a measure similar to the one Ofsted displays in Raise Online. It will be based on the average progress made by all students, relative to their starting points, and will no longer encourage the threshold effect of focusing all effort on getting students across the C/D borderline.
Changing the focus from “expected progress” to value added would turn the charts in the dashboard into useful measures. Ofsted, please include these next year.
(Also, in 2014 the method of calculating the key measure, 5 GCSE A*-Cs including English and Maths, changes, with many GCSE equivalents (such as BTECs) no longer being included. Ofsted knows what every school would have achieved in 2013 on the new 2014 measure and has included it in Raise Online. It would have been very useful to include it in the dashboard so that governors, and others, were aware of what to expect in 2014.)