Ofsted Dashboard Uses the Wrong Data
When Sir Michael Wilshaw showed me round Mossbourne some years ago, one of the most impressive features was their use of student data. The school leadership and each individual teacher received reports, every six weeks, on student progress against expectation. No student could quietly slip behind, and intervention could be quickly targeted at those who needed it.
Given that data was at the core of his success at Mossbourne it is no surprise that Sir Michael has introduced a data dashboard, with the very sound aim of ensuring governors are well informed. However, in using the “expected progress” measure it is as likely to mislead governors as to inform them.
The Expected Progress Measure is Flawed
Ofsted is absolutely right to want to judge schools on the progress their students make and not on their absolute results. One secondary school with 65% achieving 5 A-Cs (including English and Maths) could be coasting on a good intake, while another only achieving 60% may be getting their students to make more progress, given where they started from at age 11. The use of the % achieving “expected progress” is intended to address that, and to stop schools being satisfied with getting students to a C grade.
The problem, as I outlined here, is that the "expected progress" measure is a very crude one. It is not based on the expected progress of each student but on a crude yardstick that all students, whatever grade they start on, should make three levels of progress. In fact, how many levels of progress students make is strongly influenced by their starting point. I reprint here the graph from the previous article. While in Maths over 80% of students with a level 5 at age 11 make the expected three levels of progress, only 32% of those on a level 3 at age 11 make that progress.
A Tale of Two Schools
To see the problem, let’s take two extreme schools, High Start and Low Start:
At High Start, all the students arrived with a level 5, and the school gets 85% of them to a GCSE B grade and 25% to an A in Maths. Any decent governor would recognise this as under-performance, since level 5 entrants should be pushing on to As and A*s, yet the dashboard would put the school in the top quintile for expected progress, because only a B grade is "expected" on this measure for level 5 students.
At Low Start, all the students arrived with a level 3. 64% make three levels of progress, to a D grade, and many go on to achieve a C. The school is getting twice the national average to three levels of progress, and many beyond it, and is probably in the top 10% in the country for value added. Yet this dashboard will show the school as under-performing.
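The crude yardstick behind these two examples can be sketched in a few lines of code. This is a minimal illustration of the logic described above, not Ofsted's actual methodology; the level-to-grade ladder (level 3 to D, level 4 to C, level 5 to B) is the three-levels-of-progress mapping the article describes.

```python
# Sketch of the "expected progress" yardstick: three levels of progress
# from each Key Stage 2 level, regardless of how likely that progress is
# from each starting point. Illustrative only.

# Three levels of progress from each KS2 level maps to a GCSE grade.
EXPECTED_GRADE = {3: "D", 4: "C", 5: "B"}

# GCSE grades ordered from worst to best, so grades can be compared.
GRADE_ORDER = ["G", "F", "E", "D", "C", "B", "A", "A*"]

def made_expected_progress(ks2_level: int, gcse_grade: str) -> bool:
    """True if the student's grade meets the crude three-levels yardstick."""
    expected = EXPECTED_GRADE[ks2_level]
    return GRADE_ORDER.index(gcse_grade) >= GRADE_ORDER.index(expected)

# High Start: every student entered on level 5. A B grade satisfies the
# measure, even though level 5 entrants could be expected to reach A or A*.
print(made_expected_progress(5, "B"))   # True - dashboard satisfied

# Low Start: every student entered on level 3. A D grade already counts,
# and a C is a level beyond it - yet nationally only 32% of level 3
# entrants make even three levels of progress in Maths.
print(made_expected_progress(3, "D"))   # True
print(made_expected_progress(3, "C"))   # True
```

The measure asks one binary question per student, with the bar set by starting level alone; it never asks whether clearing that bar was easy (level 5 to B) or exceptional (level 3 to D).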
The "expected progress" measure does not tell us how well a school is doing. Whether a school is above average on it is, statistically, much more likely to be down to the school's intake than to the actual progress its students make.
What Ofsted’s Dashboard Should Include
The irony is that Ofsted already collects data that accurately reflects student progress. Its annual RAISEonline report for each school gives a measure of value added in each subject. These figures are not based on the crude three-levels-of-progress-for-all yardstick, but on what each student should be expected to achieve, given where they started at age 11, and whether they are above or below that expectation. If Ofsted were to base its dashboard on these value added figures, it would be genuinely useful information for governors and parents.
This morning, on the Today programme, Emma Knight (Chief Executive of the National Governors Association) talked about the more sophisticated data that many governing bodies use. Learning from our visit to Mossbourne, we built a similar data tracking system to theirs at the school I chair. Every teacher knows, for each subject, whether their students are red (below expectation), yellow (at expectation) or green (above expectation). The Governing Body receives each term a report that shows, for each year and broken down by target groups, what % are on target and what % are above target. It ensures both the school and the Governing Body are aware of progress, and of any issues, long before the GCSE results, when it is too late to intervene for those students. This kind of system is probably now quite common and is, I think, the kind of data that Emma was referring to.
I am hugely grateful to Sir Michael for what we learned at Mossbourne all those years ago (even though it seems very obvious now). And I do realise that Ofsted does not have the data to produce that kind of detailed dashboard for progress in all years, since its data is principally based on GCSE results. But I ask him to review this dashboard, and to ask whether it informs or misleads governors and others. A switch from "expected progress" to "value added" would provide a genuinely useful resource for all.