Journalists, Please Treat DfE Press Releases with Caution

Henry Stewart
This week the Department for Education will release the school-by-school 2013 GCSE and A level data. This provides a treasure trove of information for data geeks like me. However, this post is for those journalists who, instead of going in for a bit of detailed data analysis, simply refer to the accompanying press release.

A word of warning, stating the pretty bloody obvious: DfE press releases are not unbiased. They represent the government's desired viewpoint and should not be taken as statements of fact.

(If you have any doubt on this, check this post on how there is one rule, and one set of data, for those schools that Gove likes and another for those he doesn't.)

Ignore any claims about 3 facilitating A level subjects

Last year the BBC led its report on the schools data with the claim that “almost a quarter of England’s sixth forms and colleges have failed to produce any pupils with the top A-level grades sought by leading universities”. This was based on the numbers of students getting 3 "facilitating" subjects (Maths, Sciences, English, History, Geography and Languages).

However, there was a slight problem with this. No university asks for 3 facilitating subjects, and the Russell Group suggests 2 such subjects as important (and had advised the DfE to use the figure for 2). The DfE had simply invented this new measure out of nowhere, put it in the tables and then attacked schools for not achieving it. As I put it last year, "Shock! Horror! Schools do badly at a measure nobody cares about". But sadly most of the media repeated this DfE criticism entirely uncritically, without even bothering to check this made-up and irrelevant statistic with the Russell Group. Please, journalists, don't be fooled like this again. (See also "A failure of journalism".)

The DfE later quietly added the figure for 2 facilitating subjects. It is still not clear whether the 3-subject measure was a deliberate attempt to deter students from taking "non-facilitating" subjects like Art, Music or Philosophy, or whether it was simply a cock-up. But this year the tables will still include the irrelevant 3 facilitating subjects figure, as well as the figure for 2 facilitating subjects.

Be cautious with other new statistics

The DfE may add another new statistic and criticise schools for doing badly on a measure they weren't even aware of, as they did with the 3 facilitating subjects and, before that, the English Baccalaureate. Please challenge any new data and check with other organisations whether it has any meaning.

(Note that Gove never talks about sponsored academy achievement in the English Baccalaureate. This is because in 2012 only 8% of students in sponsored academies achieved it, half the national average.)

Be cautious with claims of growth in academy results

The DfE will claim that sponsored academy GCSE results grew two times, five times or even ten times faster than those of other schools. This is due to what is known in statistics as a "floor effect": schools with a lower starting point will tend to grow faster than those with a higher one. So in 2012 the GCSE results of schools that had been in the 30-40% results band in 2011 grew by an average of 7.7%, while those in the 80-90% band fell by an average of 3.4%. This has nothing to do with whether a school is an academy or a maintained school, and everything to do with its starting point. As sponsored academies tend to start from a lower base, their increase looks greater when compared to all other schools.

Before publishing any claim about greater growth, ask how the sponsored academies did compared to schools previously at a similar level. Or find an Excel expert to do the analysis.
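
For those without an Excel expert to hand, here is a minimal sketch of that like-for-like check in Python. It assumes a hypothetical CSV extract of the DfE school-level tables; the file name and the columns (school_type, plus the headline GCSE percentages for 2011 and 2012) are my own labels, not the DfE's.

```python
import pandas as pd

# Hypothetical extract of the DfE school-level tables. Column names are
# illustrative, not the DfE's own: school_type is "sponsored academy" or
# "maintained", and gcse_2011 / gcse_2012 hold the headline 5 A*-C
# (inc. English and maths) percentage for each year.
schools = pd.read_csv("school_results.csv")

# Band each school by its 2011 starting point: 0-10%, 10-20%, ..., 90-100%.
schools["band_2011"] = pd.cut(schools["gcse_2011"], bins=range(0, 101, 10))

# Year-on-year change in the headline measure, in percentage points.
schools["change"] = schools["gcse_2012"] - schools["gcse_2011"]

# The like-for-like question: within each 2011 band, did sponsored
# academies actually improve faster than maintained schools that
# started at the same level?
comparison = (
    schools.groupby(["band_2011", "school_type"], observed=True)["change"]
    .agg(["mean", "count"])
    .round(1)
)
print(comparison)
```

If sponsored academies only look better when compared to all other schools, and not when compared to schools in their own starting band, then the headline claim is the floor effect at work, not academy status.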

Note: The DfE will only make this sort of claim for sponsored academies. In the case of converter academies they will only ever talk about the absolute level of results, not their growth. This is because converters started out at the upper end and will not grow significantly. (In fact last year their results, on average, fell. I don't think the DfE ever mentioned that fact.)

Check out the effect of GCSE equivalents

To their credit, it was the Daily Telegraph that first spotted, in the article "Academy schools 'inflate results with easy qualifications'", that sponsored academies were far more likely to get their GCSE scores from equivalents such as Btecs. In 2011 sponsored academy results dropped by 11.8 percentage points when these were taken out, almost twice the drop for maintained schools. In 2012 the drop increased to 14.7 percentage points, and in some of the most admired academy chains, such as ARK and Harris, it was over 20. Indeed it is possible that the entire increase in sponsored academy results between 2011 and 2012 was due to greater use of equivalents.
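
The same check is easy to script. The sketch below assumes the tables give the headline measure both with and without equivalents; the column names are again my own, hypothetical labels rather than the DfE's.

```python
import pandas as pd

# Hypothetical columns, named by me: gcse_with_equiv is the headline
# 5 A*-C (inc. English and maths) figure including equivalents such as
# Btecs; gcse_only is the same measure with equivalents stripped out.
schools = pd.read_csv("school_results.csv")

# Percentage points of each school's headline figure that come from
# equivalents rather than from GCSEs themselves.
schools["equiv_effect"] = schools["gcse_with_equiv"] - schools["gcse_only"]

# Average effect by school type: a markedly bigger drop for sponsored
# academies would replicate the Telegraph's finding.
print(schools.groupby("school_type")["equiv_effect"].mean().round(1))
```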

(Btecs may be appropriate for some students, but their widespread use is regarded by the DfE as "gaming" the results, and they will, in the main, not count towards GCSE results from 2014.)

Some questions to ask

(For a new measure) Does anybody care about this measure? Who?

How did the GCSE results of converter academies in 2013 compare to the year before they converted? Did they go up or down?

How did the increase in GCSE results for sponsored academies compare, not to all other schools, but to maintained schools starting from similar levels?

What % of sponsored academy results were due to GCSE equivalents rather than to GCSEs themselves? What would their % have been using the 2014 measure?

What % of students in sponsored academies achieved GCSEs in languages, humanities or the other subjects that Gove favours, and how did this compare to similar maintained schools? (Contrary to the impression Gove likes to give, pupils in sponsored academies were in 2012 less likely to take these "traditional" subjects.)

Let me help you

Education journalists, you will get this data a full day before the rest of us. I understand that you are not all data-savvy or Excel experts, and I'm happy to help. Contact me on henry@happy.co.uk or give me a call on 07870 682442 and I'll check out any DfE claims and see if there are alternative interpretations.

Comments

Jane Eades
Mon, 20/01/2014 - 21:43

Although I haven't looked at this systematically, I have noticed that some academies, whether sponsored or converter, claim credit for an improvement in results in the year they changed status. In other words, they claim credit for work which was primarily done in the predecessor school.

Again, although I haven't looked at the results from this perspective in the last year or so, much of the progress made in some academies was just a continuation of an improvement in results achieved before the predecessor closed and the academy opened.

A few years ago the headlines screamed about the improvement in results at Bristol Brunel Academy. However, the school which improved its results more than any other was a community school, but the DfE didn't see that as news. (There was no mention of BBA's worse results the following year.)

So, yes, Henry, you are right to warn journalists about the way in which the results are presented.

Janet Downs
Tue, 21/01/2014 - 07:54

This is an important and timely post. Far too often journalists churn press releases without bothering to ask questions about the content. I know journalists are up against deadlines but that's no excuse for uncritically publishing press releases.

