A Level Results - Guardian in First with Nonsense Articles

Henry Stewart
It is A level results day again and we brace ourselves for another set of spurious articles based on press releases from the independent sector. The Guardian is in first place with possibly the most spurious use of data yet seen:

"There are some indications that private schools may have widened the gap with state schools, my colleagues Jeevan Vasagar and Jessica Shepherd write. The Girls' School Association, which represents heads of independent girls' schools, said that of the first schools to announce their results, 32.9% of grades awarded were at A* level and 70% at A and A*. In last year's results, for all private schools, 17.9% of entries were awarded an A* and just over half secured the two top grades."

How much can one paragraph get wrong? A prediction of a widening gap based on 18 out of 2,500 schools? A prediction with no information at all from the state sector? A prediction based on one set of 18 schools, comparing them not with those schools' results last year but with the whole private sector? Did it not occur to the reporters that these 18 schools might be generally above average?

And why, year after year, does the Guardian choose to base its results coverage on press releases from the independent sector? Do they not realise that the people issuing those releases could, just possibly, have their own interests and agenda?

If you would like to comment on the Guardian's approach, please post thoughts on the Guardian's education blog.

To show how ridiculous the Guardian's article is, I did a quick analysis of the first 17 comprehensive schools to report results on the Guardian's own web site:

Indications that comprehensives have overtaken private schools in A level results
For the first 17 comprehensives submitting their results to the Guardian web site, an average of 28% of A level results were A*. For all private schools last year, just 18% got an A*. The first indication, therefore, is that state schools have not only closed the gap but raced ahead of private schools.

Of course this has no more basis than the Guardian's own article. It is just possible the first 17 schools reporting are not representative of comprehensives as a whole. But to acknowledge that would be to actually think about what data means, and that might be asking a lot of some journalists.
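The selection effect at work here is easy to demonstrate. Below is a purely illustrative simulation (the school counts and A* rates are invented for the example, not real results data): give 2,500 schools A* rates centred on last year's 18% sector average, then assume the schools that report earliest tend to be the strongest ones.

```python
import random

random.seed(1)

# Hypothetical sector: 2,500 schools with A* rates centred on 18%
# (roughly the private-sector average quoted above), clipped to 0..1.
schools = [max(0.0, min(1.0, random.gauss(0.18, 0.08))) for _ in range(2500)]

sector_mean = sum(schools) / len(schools)

# Suppose stronger schools tend to announce first: the "first 18 to
# report" are in effect drawn from the top of the distribution.
first_18 = sorted(schools, reverse=True)[:18]
early_mean = sum(first_18) / len(first_18)

print(f"Whole-sector A* rate:   {sector_mean:.1%}")
print(f"'First 18 to report':   {early_mean:.1%}")
# The early subset looks far ahead of the sector average,
# even though nothing in the sector has actually changed.
```

Under these assumptions the "first 18" come out well above the sector average, which is exactly the comparison the article treated as evidence of a widening gap.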



Francis Gilbert
Thu, 18/08/2011 - 09:32

The trouble is that no one has done a systematic "value-added" analysis of the private sector's results. Most private schools are highly selective and admit only the wealthiest pupils in the country; most comprehensives do very well with these sorts of pupils -- when they get them. Many private schools ask pupils to leave if they are going to get poor results as well. I know that private tutors are very regularly used in the private sector too.
Basically, we know that if you come from a wealthy background, you're likely to do well no matter what school you went to. That's all the private sector results tell us; there's a strong correlation between wealth and academic results.

Henry Stewart
Thu, 18/08/2011 - 10:24

Francis, that is absolutely right. I have just done some interesting analysis, looking at the strongest students at the school where I am a governor. I looked only at those achieving 5b or better in the three Key Stage 2 subjects (English, Maths, Science), these being the sort of children that would get into highly selective schools.

Looking at their GCSE results I found that 87% of the results obtained were A or A*. Now that is interesting because private schools publish the % getting A or A*.

Take Wellington College. This is an elite school, charging £30,000 a year and requiring an entrance test. So it's a fair assumption that the students there will have been at least 5b level at age 11.

So how does Wellington compare with my local comprehensive? At Wellington, 63% of GCSE results were A or A*. As I've said, the % of 5b students at my local school getting those grades was 87%. The local comprehensive appears to win hands down.

Davis Lewis
Wed, 21/09/2011 - 11:36

I doubt if Wellington College would have admitted pupils who attained level 2 or 3 at KS2.

Henry Stewart
Thu, 18/08/2011 - 10:50

Postscript: @jeevanvasagar, the author of the Guardian article, tweeted me this response:

@happyhenry you're right. it was ill-judged. sorry

Janet Downs
Thu, 18/08/2011 - 17:42

Well done, Henry, for getting a response. But it's a bit late for the author of the article to admit it was ill-judged. It should behove all journalists to check information and not just rely on press releases, which are always written to show their writers in a good light (whatever the organisation).

botzarelli
Tue, 23/08/2011 - 13:38

At the moment Value Add isn't particularly interesting or meaningful for parents - no child is likely to put their school's CVA score on their CV.

Perhaps it might be worth lobbying UCAS to change its points system to reflect value add, so that it was only possible to get 600 points if you received 3 A* grades from a school with the highest CVA in the country. Schools that do not publish value-add data would by default be allocated a UCAS value add score equal to the lowest published CVA in the country.

However, I can see that this might have a few unintended consequences, although not necessarily malign ones. Etonians might switch to their nearest comprehensive at 16 to game a few extra UCAS points (although this could be avoided by applying the CVA of a child's school up to GCSE to their A level results). Or private schools might work hard on adding value themselves. Or perhaps schools generally will focus on value add rather than raw exam scores.
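One way the scheme above might work in practice is sketched below. This is purely illustrative: the linear scaling rule, the CVA range, and all the numbers are my assumptions, not part of the proposal.

```python
# Illustrative sketch of a CVA-weighted UCAS score.
# The scaling rule and all figures here are assumptions for illustration.

MAX_RAW_POINTS = 600   # e.g. three A* grades
LOWEST_CVA = 940.0     # assumed lowest published CVA in the country
HIGHEST_CVA = 1060.0   # assumed highest published CVA in the country

def adjusted_points(raw_points, school_cva=None):
    """Scale raw UCAS points by the school's CVA.

    Schools that publish no value-add data default to the lowest CVA,
    as the proposal suggests.
    """
    if school_cva is None:
        school_cva = LOWEST_CVA
    # Map CVA linearly onto a 0..1 multiplier: only a pupil at the
    # highest-CVA school keeps the full raw score.
    weight = (school_cva - LOWEST_CVA) / (HIGHEST_CVA - LOWEST_CVA)
    return round(raw_points * weight)

print(adjusted_points(600, 1060.0))  # top-CVA school keeps all 600
print(adjusted_points(600, 1000.0))  # mid-CVA school keeps half: 300
print(adjusted_points(600))          # no published data: scored at 0
```

A linear weight is only one choice; a gentler curve would avoid zeroing out schools at the bottom of the CVA range while still rewarding value add.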
