Have sponsored academies increased their use of equivalents since 2010/11?

Janet Downs's picture
Sponsored academies were more likely to enter pupils for equivalent exams in 2013. These are usually vocational exams which might appeal to some pupils more than GCSEs. However, there is suspicion that schools have been using equivalent exams to inflate their GCSE results.

Education Secretary Michael Gove has removed many vocational exams from the list of exams which count towards league tables. He has also rightly lowered the equivalent value – it will no longer be possible for a vocational exam to be worth 4 GCSEs.

But has the use of equivalent exams by sponsored academies increased or decreased since the Coalition came to power in May 2010?

Not all sponsored academies open in 2013 submitted results in 2010/11. But where results were available I compared a selection of academy chains in 2010/11 and 2012/13. I looked at the gap between the topline figure (ie the proportion of pupils reaching the benchmark*) and the figure with equivalent exams removed. The larger the gap between results with and without equivalents, the greater the use of equivalent exams.
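The gap calculation is simple subtraction of percentage points, averaged across a chain. A minimal sketch in Python, using made-up figures rather than actual performance-table data:

```python
# Sketch of the gap calculation described above.
# All figures below are illustrative placeholders, NOT real performance-table data.

def equivalents_gap(with_equivalents: float, without_equivalents: float) -> float:
    """Percentage-point gap: topline benchmark figure minus the figure
    with equivalent (vocational) exams stripped out."""
    return with_equivalents - without_equivalents

# Hypothetical chain: {school: ((2010/11 with, without), (2012/13 with, without))}
chain = {
    "Academy A": ((55, 44), (60, 55)),   # gap falls from 11 to 5
    "Academy B": ((58, 36), (62, 26)),   # gap rises from 22 to 36
}

for year_index, label in enumerate(["2010/11", "2012/13"]):
    gaps = [equivalents_gap(*school[year_index]) for school in chain.values()]
    average = sum(gaps) / len(gaps)
    print(f"{label} average gap: {average:.1f}")
```

As the ARK figures show, the chain average can move one way while individual schools move the other, so the per-school gaps are worth inspecting alongside the average.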

ARK academies (5 sponsored academies in 2010/11)

2010/11 average gap: 21; 2012/13 average gap: 17

On average, pupils in ARK academies took fewer equivalent exams in 2013. However, the average figures hide significant differences. Burlington Danes, for example, made less use of equivalent exams in 2013 (gap reduced from 11 to 5), while Charter Academy increased its gap from 22 in 2011 to 36 in 2013, suggesting greater use of equivalent exams.

E-Act academies (8 sponsored academies in 2010/11, but see note below)

2010/11 average gap: 13; 2012/13 average gap: 17

On average, E-Act academies made greater use of equivalents in 2013.

Grace academies (3 sponsored academies in 2010/11)

2010/11 average gap: 7; 2012/13 average gap: 17

The use of equivalent exams in Grace academies appears to have increased significantly since 2011.

Harris (8 sponsored academies in 2010/11)

2010/11 average gap: 13; 2012/13 average gap: 10

On average, pupils in Harris academies took fewer equivalent exams in 2013.

Oasis academies (11 sponsored academies in 2010/11)

2010/11 average gap: 10; 2012/13 average gap: 19

On average, Oasis pupils were entered for far more equivalent exams in 2013.

Ormiston academies (7 sponsored academies in 2010/11)

2010/11 average gap: 16; 2012/13 average gap: 19

On average, Ormiston pupils took slightly more equivalent exams in 2013.

Priory Federation (4 sponsored academies in 2010/11)

2010/11 average gap: 10; 2012/13 average gap: 12

On average, pupils in Priory Federation academies took a few more equivalent exams in 2013.


Two large chains, ARK and Harris, reduced their use of equivalents. Two chains, Ormiston and Priory, used a few more equivalent exams. Three chains (E-Act, Grace and Oasis) increased their use of equivalent exams significantly. So five chains used more equivalent exams and two used fewer. If this sample is typical, it would appear that the use of equivalent exams by sponsored academies has increased since 2010/11.

But the sample represents only 45 of the 250 or so sponsored academies with results in 2010/11 – slightly less than a fifth. I’m not a statistician, so I don’t know whether that constitutes a representative sample. However, I included the major chains E-Act, Harris and ARK.

*5 A*-C (or equivalent) including Maths and English

Notes: the 2010/11 data for the Crest Girls’ Academy (E-Act) appeared to be wrong. The results without equivalents were given as 52% but school performance tables said the results with equivalents were 43%. The proportion without equivalents can’t be more than the proportion with equivalents. I omitted Crest Girls’ from the calculation above.

I have tried to be as accurate as possible. It was relatively easy to check results for academies where names included the sponsor’s name (eg Harris) but others (eg E-Act) had to be checked individually in their respective local authorities.

References: Topline figures are available for 2010/11 and 2012/13 from School Performance Tables. The 2012/13 figure with equivalents removed is available from the results for each local authority (or academy chain if the name is included in the name of the school) in School Performance Tables under KS4 results. The 2010/11 data for percentage of pupils who achieved 5+ GCSEs A*-C without equivalents can be downloaded here.


A Cooper's picture
Tue, 25/02/2014 - 19:13

Delightful news about ten of the E-ACT academies today, from the DFE.

Andy V's picture
Tue, 25/02/2014 - 19:56

The E-ACT news is a little hollow in that the 10 academies will simply be transferred to another sponsor/chain rather than returning to the state :-(

Chris Manners's picture
Tue, 25/02/2014 - 21:40

Credit to Roger OThornhill, Flythenest

Dates E-Act took over schools. Look at the rush that came under Gove.


Janet Downs's picture
Wed, 26/02/2014 - 08:52

Chris - the Academies Commission 2013 found some academy chains were growing too quickly and many of these had no coherent policy to improve their schools. They didn't name names but shortly afterwards AET was said to have been banned from taking on any more (but still took on schools up to January 2014 so little was actually done by the DfE to stop AET's expansion).

The EFA found AET had made "unusual" payments to its trustees. Lord Nash has sent AET at least 7 warning letters.

The EFA also found E-Act had financial irregularities which prompted the resignation of Sir Bruce Liddington who was once the highest-paid person in education.

In July 2011, Gove said he wanted academy chains to grow as fast as possible.


He's got his wish - and pupils are paying the price (and the taxpayer - £1 billion overspend on the academies programme).
