FAQs

Summaries of some of the most discussed issues on LSN.

If you would like to suggest a new topic, let us know.

About LSN

What do we believe in?

Changes to our public services are happening at a rapid pace under the new government. Coalition plans could mean the beginning of the end of the UK state education system as we have known it: the fracturing of a national system of locally administered schools, free for all but managed in the interests of all.

We believe in good local all-ability schools that exist within a system of fair admissions and fair funding. We don’t think better-off parents should be given a larger slice of the educational cake. We don’t believe new schools should be built if they are not needed, and we are opposed to schools being allowed to devise ways of attracting the more able or well-off students while shutting out those they don’t want to teach.

We believe every local school can be excellent and a place where all students can reach their potential and achieve the highest standards. However there are groups, especially those from families with low incomes, who do not always achieve what they should. The key challenge we face in education is to enable all children to succeed, whatever their background. Those schools that focus on the elite – both grammar schools and the independent sector – offer no solution here.

We are committed to building on the successes of the comprehensive movement, learning from the work of great teachers and students in thousands of flourishing local schools while supporting those that are in challenging circumstances and working hard to improve.

We recognise that in some parts of the country, many local comprehensive schools are now academies – independent state schools which are funded directly from Whitehall and subject to different legal requirements.

We regret that successive governments have chosen to create a “two-tier” system because this can make local collaboration, fair funding and balanced intakes more difficult. We are also opposed to the way that the present coalition government is imposing a “one size fits all” solution on many maintained schools (via its forced academy programme) and misrepresenting the evidence about the success of non-academy schools.

Fair admissions and fair funding need strong local accountability so that all schools can be self-governing but also obliged to play by the same rules. So we will be campaigning for locally accountable public bodies, not to run schools, but to act in the interests of all, with responsibility for planning places and making sure that all schools (whether academies or not) manage admissions fairly and receive the support and funding they need.


Academies and free schools

What did the NAO report on free schools (2013) recommend?

The DfE needs to:

1 Strengthen its analysis of applications to ensure they link with wider objectives.

2 Increase transparency about how it uses contextual and practical factors in deciding approvals.

3 Review the reasons why there has been no demand to open free schools in some areas with significant forecast need.

4 Learn from earlier projects – some approaches, such as opening in temporary accommodation or paying over valuation for properties, may not offer value for money.

5 Strengthen its framework for intervening in open free schools.

6 Apply lessons from open free schools to approved schools in pre-opening.

7 Assess the effects of open free schools on the quality and viability of other schools (this should be done during the application process to avoid situations where a proposed free school threatens the viability of an existing school or deliberately sets out to attract high-performing children, thereby skewing the intake of neighbouring schools).

Note: author’s comments are in brackets.

A summary of the report’s findings is here.

How do schools convert to academies? What’s an academy trust?

Academy conversion is complex. Full details are available on the Department for Education (DfE) website. In brief:

1 Academies are run by an academy trust, which is a charitable company limited by guarantee.

2 Schools can convert singly or as part of a chain.

3 A school applies to convert after its Governing Body has passed a resolution to convert.

4 Schools in a ‘hard’ Federation will already have a single Governing Body. This Body will apply on behalf of all Federation schools.

5 Chains are called multi-academy trusts (MATs).

6 Academy trustees take responsibility for the way academies are run, including staffing and curriculum.

7 Academies that don’t want to join a chain can still share governance and the procurement of services with other academies. These academies set up an Umbrella Trust (UT).

8 Schools register an interest with the DfE once they’ve made the decision to convert.

9 Schools should have started informal discussions with staff, parents and pupils before Governing Bodies pass a resolution to convert.

10 If the DfE approves the school’s application then the school must set up an Academy Trust. The school should complete legal documents called the Memorandum of Association and Articles of Association.

11 The Secretary of State must agree the Memorandum and Articles of Association.

12 The Academy Trust registers with Companies House as ‘Private limited by guarantee without share capital, exempt from using “Limited”’. (This ‘limited’ status limits the liability of the Trustees to a nominal sum, say £10, if the Trust collapses financially.) There is no share capital, so there are no shareholders who would expect a return on their investment. (However, for-profit firms could set up an Academy Trust as a “vehicle” by which to make a return to shareholders. See here.)

13 The Academy Trust would be listed at Companies House.

14 (The Academy Trust will not be registered with the Charities Commission because the Secretary of State for Education (SoS) is the Principal Regulator. The name of the Trust will not be found on the Charities Commission website even though the Trusts are charities.)

15 Schools should consult about whether to become an academy. Governing Bodies decide whom to consult, how they’re going to run the consultation and the timescale. Faith schools need to consult their diocesan board or relevant religious authority. This must be done before the SoS will sign the Funding Agreement.

Information retrieved from DfE website on 6 December 2013.  The words in brackets are the author’s own.

Do local authorities control their schools?

No. Local Authorities (LAs) do not “control” LA maintained schools. LA responsibility is limited to:

1 Back-room services (eg administration of payroll, legal requirements, brokering contracts for, say, IT services).

2 Co-ordinating admissions to all state schools in the area, including free schools and academies.

3 Responsibility for children with special educational needs (SEN) and education welfare services.

4 The legal duty to manage the supply of school places.

LAs retain a small part of the budget of their maintained schools to pay for these services. Academies and free schools receive this small part directly but are still expected to purchase the administration and legal services they need.

LAs do not tell schools how to spend their budget.

LAs do not tell schools what and how to teach.

LAs have no say in who is recruited by a school (with the exception of the head). Teachers are employed by the LA but not recruited by it – who is appointed and what job he or she does is the responsibility of the school’s governing body. An LA adviser may be present at job interviews to give legal advice but has no say in who is offered the job.

The Organisation for Economic Cooperation and Development (OECD) found that the UK was among only four countries which gave schools a large amount of autonomy. The Academies Commission (2013) confirmed this: LA maintained schools can do most things an academy can do and the extra “freedoms” available to academies aren’t very great. The Academies Commission received evidence from some heads in academies linked to academy chains that they had less autonomy now they were in a chain than they had when maintained by their local authorities.

Many politicians, including Education Secretary Michael Gove, and large sections of the media push propaganda that non-academies are “controlled” by the iron hand of local bureaucracy and that the only way to escape this intolerable burden is to claim academy “freedom”. This is untrue.

 

What were the findings of the final LGA/DfE 2012 report into the role of LAs in education?

Findings:

1 Most LAs thought they had a clear vision about how to support the quality of education although some had misgivings about whether this vision was universally shared.

2 Only half were confident about their capacity to implement this vision.

3 LAs have to balance their responsibilities to maintain non-academy schools with new demands from a changing system.

4 LAs are more confident about having discussions with academy sponsors rather than with converter academies (particularly “stand-alone” ones).

5 Pressure to reach a quick solution risks arriving at a “superficial consensus” which avoids difficult questions.

6 The present situation, in which LA responsibilities are changing, gives an opportunity to debate LA roles and consider evidence-based policies around school partnerships.

7 Good relations between LAs, schools and academies may be threatened when key individuals move on.

8 Schools feel that LAs have the opportunity to demonstrate confident leadership, particularly on difficult issues such as fair access.

9 Schools are very concerned about possible reductions in LA services.

10 Schools believe the future education system will be founded on the strength of partnerships but heads fear such links could be fragile.

11 LAs were developing three roles: organizing partnerships; commissioning services; championing pupils, parents and communities.

Managing supply of school places:

This is discussed more fully in the interim report summarised in the faq ‘What problems do Local Authorities face when schools become academies, according to the 2012 report?’

School improvement:

1 LAs remain accountable for the outcomes for all children and young people in the area.

2 LAs have a statutory duty to promote high standards.

3 LAs need to ensure that school-to-school support is coherent.

4 LAs and heads were both anxious about how the whole system can respond coherently to school failure.

5 LAs may lack enough intelligence in an autonomous school system to notice signs of declining performance.

6 Roles and responsibilities are unclear.

7 LAs were frustrated about perceived lack of transparency surrounding DfE choice of sponsors for poorly performing schools.

Supporting vulnerable children:

1 LAs are less confident about whether they will be able to offer good-quality support for vulnerable children.

2 Factors contributing to this low confidence include growing numbers of special needs children, high mobility levels and difficulties in finding places for each vulnerable child.

3 LAs keep responsibilities to manage Fair Access Protocols for hard-to-place children.

4 Academy conversion could lead to schools refusing to take their fair share of such children.

5 The threat of enforced academy conversion could result in schools near the borderline refusing admission to children likely to bring down the results.

6 The redistribution of the LA Central Spend Equivalent Grant (LACSEG) evens out distribution of funding for vulnerable children between academies without regard to the actual level of need.

7 Although many LAs were confident about their ability to commission support for vulnerable children, there was concern about instability caused by providers (especially of Alternative Provision) entering and leaving the market rapidly.

Looking forward:

1 Proportion of academies is increasing.

2 New Ofsted framework might result in more schools judged to be causing concern.

3 Proposed changes in school funding could have implications for high-needs provision.

Advice for local partners:

1 Develop ways in which relationships can be strengthened.

2 Focus on building a “local education culture” with a strong moral basis.

3 Identify ways of learning from other LAs.

4 Develop high-quality data analysis.

5 Ensure partners, including academies and sponsors, jointly understand the role of LAs in being “a champion of pupil and parents”.

6 Monitor closely whether support for vulnerable children is sufficient.

7 Identify ways in which further responsibilities can be delegated to schools within a strong framework combining partnership with “robust quality assurance.”

Advice for national partners:

1 There has been a wide range in LA performance.

2 This impacts on the ability of individual LAs to adapt to the new environment.

3 Sector-led improvement and the Children’s Improvement Board (CIB)* provide a way of sharing good practice.

4 There is no “obvious point of accountability in the system” if academies should develop problems.

5 The DfE should be clearer about how it assesses the suitability of academy sponsors and how it will monitor sponsors’ performance.

6 The DfE should review the process now in place for disputes around Fair Access to make sure it is still fit for purpose.

*The CIB had its funding axed with no warning in April 2013. Its expertise has, therefore, been lost. Read about it here.

Local authority role in education – policy briefing on final report for the Ministerial Advisory Group (July 2012) downloadable here.

 

What were the policy implications listed in the Sutton Trust 2008 report into the academies programme?

The Sutton Trust recognised these implications for the academies policy:

1 The role of academies within the education system needed to be clarified.

2 The Government should look again at the objectives of the academies programme and hone them if necessary.

3 Academies’ admission practices, along with those of other state schools, should be more closely monitored.

4 Banding, where used, had encouraged academies to become more inclusive but area-wide banding would ensure that the use of banding by an academy would not have a negative effect on the intake of neighbouring schools.

5 Consultation should take place to judge the demand and appropriateness of particular subject specialisms in a local area.

6 Staff and parents should be represented on academies’ governing bodies.

7 The emergence of academy chains could be potentially valuable but care would be needed to ensure that individual academies did not lose their ability to make decisions.

8 Any innovative good practice developed in academies should be shared by following the model of Professional Development Schools.

9 The involvement of universities in academies should be extended to other types of schools particularly where progression rates to university were low.

10 Politicians should not regard academies as “a panacea for a broad range of education problems”. Academy conversion “may not always be the best route to improvement. Care needs to be taken to ensure that Academies are the ‘best fit’ solution to the problem at hand.”

11 Pupil level data should be used with more sophistication to evaluate the success of academies.

12 Comparison between academies, and academies and other schools, should take context into account eg admission policies, demographics, performance of predecessor schools and changes in leadership.

The Sutton Trust report into the progress, problems and possibilities of the academies programme can be downloaded here.

 

What did the Sutton Trust say about academies in 2008?

Academies were a major plank in Labour’s education programme but there had been significant changes to the policy since the 2007 report.

Findings:

1 The picture was mixed about whether academies had raised standards.

2 Academies had increased “diversity” in the sense that they were a distinctive type of state school, but they were not as distinctive as they were when the programme began.

3 Academies were “inclusive” in one sense: they admitted more disadvantaged pupils than the national average although this proportion had fallen. This suggested that academies were more “inclusive” in another sense: they were admitting pupils from a wider social background.

4 There were concerns about high numbers of exclusions.

5 There were also concerns that some academies wouldn’t meet the deadline of 30% of pupils reaching the benchmark 5 GCSEs A*-C including Maths and English.

Independence:

1 There have been criticisms that academies were not accountable – they were autonomous and not maintained by local authorities (LAs).

2 But this autonomy had been curtailed by recent changes.

3 Academy Principals were paid an average of £18,000 – £32,000 more than state school heads. This led to concerns that talented heads would be lured away from the maintained sector.

Sponsorship:

1 The power of sponsors was considerable eg they owned the school estate.

Attainment:

1 GCSE attainment in academies tended to improve at a greater rate than in non-academies and similar schools.

2 But this coincided with a drop in the proportion of disadvantaged pupils.

3 There were still concerns about attainment. In 2007, GCSE results in the majority of academies (26 out of 36 which published results) did not meet the benchmark. Only 12 of the 20 academies who had results over a two year period had increased their performance from the previous year (2006).

Admissions and Inclusion:

1 There had been a fall in the average proportion of disadvantaged pupils in academies from 45.3% in 2003 to 29% in 2008.

Effects on neighbouring schools:

1 The drop in the proportion of disadvantaged pupils in academies did not seem to have affected the composition of intake in neighbouring schools.

Building and costs:

1 Many early academies had building costs which went over budget.

2 Building Schools for the Future (BSF) meant that academies’ distinctive quality of having a new building would be less distinctive as more schools of other types were improved or rebuilt.

Specialisms:

1 Academies tended to focus on a narrow range of subjects.

2 Most popular specialism (just over 50%) was business and/or enterprise.

3 However, this was changing as more academies were established.

Changes to the programme:

1 Sponsorship was no longer limited to private sponsors. New types included universities and independent schools, and academy chains were growing.

2 LAs were becoming more involved.

3 Academies were no longer able to opt out of the National Curriculum.

Emerging models:

It was no longer possible to view academies as a homogeneous type. They now included:

1 Replacements for failing schools;

2 New schools in areas of underachievement;

3 Conversion of City Technology Colleges (CTCs) and independent schools;

4 Conversion of failing schools as part of the National Challenge.

Alternatives to Academies:

1 Not all academies had unique characteristics.

2 Other non-academy schools showed some of these unique characteristics.

3 Not all academies had been successful despite having unique characteristics.

4 Non-academies appear to have been successful in similar circumstances.

The report contained Policy Implications. These will be dealt with in a separate faq.

The full report can be downloaded here.

 

What did the Public Accounts Committee say about the Academies Programme in 2013?

Commenting on the £1 billion overspend on the Academies Programme identified by the National Audit Office (NAO), the Public Accounts Committee (PAC) said that £400 million had been taken from money intended for intervention in underperforming schools.

Henry Stewart in his evidence to the PAC pointed out that the best part of this overspend was spent not on underperforming schools but on converting schools that were already Good or Outstanding.

The PAC was shocked to discover a lack of financial accountability in academies. Academies in chains were not required to make public their expenditure at school level. PAC member, MP Richard Bacon (Con), described this lack of accountability as “mind-blowing” and commented that chains “could hide something by taking a little bit from several pots, and the parents would never know at the individual academy level.” (While the PAC report was going to press, the chain E-Act was issued with a “financial notice to improve” by the Education Funding Agency, which discovered “weaknesses” in the reporting of its schools’ accounts.)

The PAC highlights dangers:

1 Governance, compliance and oversight arrangements for academies are inadequate and “remain vulnerable to failure”.

2 Forthcoming staff cuts at the Department for Education “may threaten effective oversight”.

3 Confusion about the “roles, responsibilities and accountabilities of different organisations across the changing schools system”.

4 Interventions in failing academies may be delayed because it is unclear what roles local government, central government, the academy or the academy trust should play in tackling the failure.

For further details click here. For the PAC Press Release click here.

24 April 2013

 

What problems do Local Authorities face when schools become academies, according to the 2012 report?

Local Authorities (LAs) have a statutory duty to manage the supply of school places. But a report by the Local Government Association and the Department for Education (LGA/DfE) in 2012 said LAs could find it difficult to fulfil their legal duty as more schools become academies. The challenges are these:

Ensuring a sufficient future supply of school places

1 Academies are their own admission authorities and are not obliged to expand in order to cater for a growth in the number of children locally.

2 Academies may require incentives to persuade them to accept more pupils. This would be “particularly problematic in view of the more tightly constrained capital budgets”.

3 Although free schools are viewed as one solution, the report found that “the very late notification” of some proposals was “counter-productive” and resulted in much “abortive work”. The report said that free schools were “often more related to the desire to meet parental choice than to ensure the provision of sufficient places at the right time and in the right place for the area”.

4 The process of finding an acceptable solution could be “both time-consuming and inefficient”.

Managing potential over-supply of school places

The ability of LAs to manage the number of school places locally is hampered by Government policy which encourages academy conversion and the establishment of free schools. This is because:

1 Academies can increase their size without consulting LAs.

2 Free schools can open without reference to LAs.

The report described two scenarios:

1 Where a Local Authority must manage the closure of a “poor” school when it becomes unviable because a better-performing school has expanded.

2 Where a Local Authority needs to respond to a situation where a well-performing school is put at risk because other schools have expanded.

Scenario One raised these questions:

1 How can LAs manage the smooth and speedy transition of pupils remaining in the “poor” school to places in other schools?

2 How can LAs work effectively with other schools, including academies, to find sufficient places for the displaced pupils especially when academies don’t have to take extra pupils?

3 What authority (LA or Department for Education) is responsible if the school threatened with closure is an academy or free school?

Scenario Two raised this question:

How can LAs deal with the possible closure of a school when this is not in the best interests of pupils or parents? This may happen where:

1 Pupil numbers in the area are static or declining.

2 Most parents send their children to local schools.

3 All local schools are good or better.

4 A rise in the number of selective places, or in schools with a similar academic emphasis, would narrow choice.

Both scenarios raised a further question: how would LAs respond to the possible closure of a school when school numbers are expected to rise in a few years’ time and the places available in the school threatened with closure would be needed? This is the problem faced by Lincolnshire County Council in February 2013. An Academy Trust wants to close a small rural secondary school and send the children to another of its academies in nearby Grantham. But the Council Leader says Grantham is a “growth area” and will need its secondary school places in 2020, when the extra primary pupils reach secondary school.

The report warned that in the autonomous system envisaged by the Government it would be schools not the LA which would decide “the future pattern of provision”.

 

The Academies Commission 2013: what did the Commission say?

The Academies Commission reported in January 2013. Its remit was to investigate the implications of an education system in which most schools were academies.

This page will be a reference point for summaries of points made by the Academies Commission. More links will be added when threads about the Commission are posted.

Accountability and social inclusion: Not all academies were committed to social inclusion. Some academies were not sufficiently accountable or responsive to parents. The Commission warned that academisation could have a negative effect on hard-to-place pupils.

Admissions: Academies are their own admissions authority. The Commission highlighted concerns about academies flouting the Admissions Code in order to select particular pupils.

Autonomy: One of the central planks of the academy programme is that academy conversion will result in greater “freedom”. The Commission found that:

1 Most things an academy can do, a maintained school can also do.

2 All UK schools already have considerable freedom.

Chains: There are fears that some academy chains are expanding too quickly and that some have no coherent policy for improvement. At the same time, some academy heads complained about centralised control. Primary schools were reluctant to convert – many valued their relationship with local authorities.

Innovation: Academy conversion is supposed to bring the freedom to innovate. But the Commission concluded that innovation is inhibited by league tables, not by lack of freedom.

Freedom, fragmentation and self-interest: Some academy heads had qualms about controversial “freedoms” such as being able to employ untrained teachers. Other witnesses feared the education system could become fragmented, with academies acting in their own interest.

Recommendations

School improvement, as measured by results. (Caveat: a rise in results does not necessarily mean that the education provided by a school has really improved. It may be, for example, that a school is teaching to the test and neglecting other skills, relying on “equivalent”, non-GCSE exams, or selecting those pupils most likely to succeed. Nevertheless, results are the Government’s chosen measure of school performance, so it is important to know whether the academy programme is succeeding according to the Government’s preferred measure. The evidence suggests it is not.)

The Commission found that many previously underperforming non-academies did as well as similar academies.

Comment piece (not necessarily the views of the Commission) about the link between academy conversion and school improvement.

Additional data which debunks the Government’s line that academy conversion is essential for school improvement can be found in the faq Do academies get better results, or improve more quickly, than other state schools? and here.

Structures, systems and initiatives: The Commission found that the avalanche of Government initiatives and a focus on structures and systems was diverting attention from the classroom.

What warnings about academy conversion were given in the Sutton Trust Report 2007?

The Sutton Trust found that most of the research into academies published up to 2007 was “broadly positive” about the academies programme. However, it did not fully endorse the policy and warned that “Academies are in danger of being regarded by politicians as a panacea for a broad range of educational problems… conversion to an academy may not always be the best route to improvement.”

The Sutton Trust’s findings are listed in more detail here.

 

What did the University of Birmingham find out about academy performance up to 2007?

In 2009, when the University of Birmingham published its report, the then Government was heavily promoting sponsored academies. The Birmingham researchers found that there was “no clear evidence that Academies produce better results than local authority schools with equivalent intakes.”

The researchers sympathised with the academy programme’s aim to improve the results of struggling schools and they expressed admiration for all those who worked in challenging schools whether these were academies or not. However, they concluded that it wasn’t clear that academy teachers were doing a better job than staff in non-academy schools. They found, like PriceWaterhouseCoopers in 2008, that when schools improved they used similar methods.  These had nothing to do with academy status but more to do with such factors as strong leadership and high expectations.

The Birmingham report concluded that evidence of the performance of academies by 2007 suggested that the academy programme was “a waste of time, effort and energy at least in terms of this rather narrow measure of KS4 outcomes”. It concluded that the money spent on academies would have been better targeted on refurbishing deprived schools or towards disadvantaged pupils in any type of school.

 

Do academies get better results, or improve more quickly, than other state schools?

At the end of January the Department for Education published a massive amount of information, 208 items of data on GCSE results for every school in England. This makes it possible for the first time to carry out a full analysis of how academies and non-academies compare. And, despite all the claims of government supporters, there is no evidence of better GCSE performance in academies.

The first thing to watch out for is the practice of policy-making by anecdote. Academy supporters have a tendency to focus on schools like Mossbourne and Burlington Dane, or the ARK chain. These are the best performing academies but the fact that these have done well does not mean academies as a whole have done well (though it would be good to study and learn from these schools, as from high-performing non-academies).

The government tends to quote growth figures for academies, which generally look impressive. However, the analysis below shows two faults in this. First, schools in disadvantaged areas have generally done well: when academies are compared to similar schools, there is no clear pattern of extra growth. Second, when GCSE equivalents (like BTECs) are removed, academy growth is generally less than in non-academies.

These are the key Local Schools Network posts analysing the data released by the DfE on the 2011 GCSE results, and comparing the performance of academies and non-academies:

Did academies grow more in 2011? Not when compared to similar schools

The main DfE claim about growth from 2010 to 2011 in the GCSE results of academies – that it was twice that of non-academies – does not stack up when they are compared to similar schools.

Sir Michael Wilshaw is right: Most outstanding schools are not academies

The evidence shows Sir Michael is right on two counts: most outstanding schools are not academies and many schools in disadvantaged areas are doing amazing work.

DfE Press Release condemns academies performance

The DfE criticism of schools where few students take academic subjects is, above all, a condemnation of academies.

DfE data shows success of local schools

The last three years have seen a transformation in the performance of schools in the most disadvantaged areas, with the percentage achieving 5 GCSEs including English and Maths rising from 35% to 50%.

Established academies: still no evidence of better performance

The DfE argues that a fairer comparison would be with academies that are at least 5 years old. The evidence shows that these also perform no better than comparable non-academies.

Academy chains: No case for expansion

The record of the academy chains is poor and gives no basis for expansion.

“Failing schools”: Do academies do better?

The answer is No. Even within this group, schools fare better as LA schools.

Students with low prior achievement: Inner city London comprehensives do best

Nationally only 6.5% of students of ‘low prior achievement’ get 5 A*-Cs including English and Maths. Inner London schools do over twice as well, with Hackney achieving 22% and Tower Hamlets 23%.

Academies: The evidence of underperformance

When compared to comparable schools (in terms of levels of disadvantage), the data show academies under-perform on a range of criteria.

Post-script

The above analysis was used as the basis of an article in the Observer on 26th February. Crucially, it states that a “DfE spokesman did not deny the accuracy of the statistics”.

Sources outside Local Schools Network

Liberal Conspiracy: Why more academies will make education worse

Anti-Academies Alliance: GCSEs, academies and disadvantage: a higher standard of government misinformation

Note: This is a reference page, and further links will be added to make this an easy-to-use link to all the 2011 GCSE data analysis. The date of publication is updated to keep it prominent.

Data Notes: The Academies figure, throughout these posts, refers to the category of sponsor-led academies, of which there are 249. It does not include the ‘converter academies’, of which there were just 25 at this point. Non-academies include those classified as community, foundation, CTCs or voluntary aided schools, 2,681 in total. Special schools are not included.

Data sources: The DfE data release can be obtained here:

http://www.guardian.co.uk/news/datablog/2012/jan/26/secondary-school-league-tables-data?INTCMP=SRCH#data

Some people have found it difficult to download this file. If you have difficulty, feel free to email me on henry@happy.co.uk and I will send you a copy of the file. The above analysis was generally done in Excel with Pivot tables.
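The pivot-table comparison described above can be sketched in a few lines of Python. Everything in this example is invented for illustration: the school rows, the FSM band labels and the growth figures are hypothetical, not drawn from the DfE release, and the real analysis was done in Excel on the full dataset.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical sample rows: (school_type, fsm_band, growth in percentage points).
# The real DfE release has one row per school; these values are made up
# purely to illustrate the grouping, not to reproduce any real result.
schools = [
    ("academy", "high FSM", 6.0), ("academy", "high FSM", 8.0),
    ("non-academy", "high FSM", 7.0), ("non-academy", "high FSM", 7.5),
    ("academy", "low FSM", 2.0), ("non-academy", "low FSM", 2.5),
]

def mean_growth_by_group(rows):
    """Average GCSE growth for each (FSM band, school type) cell --
    the same grouping a pivot table produces."""
    groups = defaultdict(list)
    for school_type, band, growth in rows:
        groups[(band, school_type)].append(growth)
    return {key: mean(values) for key, values in groups.items()}

for (band, school_type), avg in sorted(mean_growth_by_group(schools).items()):
    print(f"{band:9s} {school_type:12s} mean growth: {avg:.2f} pts")
```

The point of banding schools by disadvantage before averaging is that it compares academies with *similar* non-academies, rather than with all schools nationally, which is the comparison the headline DfE growth claim relied on.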

 
Is it true that schools with more autonomy tend to achieve better results?

Yes. The Organisation for Economic Co-operation and Development (OECD) wrote, “In countries where schools have greater autonomy over what is taught and how students are assessed, students tend to perform better.”

In 2009, the OECD found that the United Kingdom was one of four countries which granted the greatest freedom to schools: secondary head teachers could allocate resources, recruit staff and make decisions about what subjects and examinations to offer. But the present Government says that schools need to embrace academy conversion to gain autonomy, despite the OECD’s finding that UK schools already had considerable autonomy before the Coalition came to power.

Academies in chains actually risk losing much of this autonomy as John Burn, OBE, warned in his evidence to the Education Bill Committee. And the National Audit Office expressed concerns about sponsors putting pressure on their academies to purchase services from them when other alternatives might be better value.

UPDATE 16 January 2013. The Academies Commission (2013) found that maintained schools could do most of the things that academies can do. The Commission wrote, “The reality is that the increased [academy] freedoms are not nearly as substantial as is often suggested”. It also confirmed John Burn’s fears: some academies in chains had less autonomy than they enjoyed when they were Local Authority (LA) maintained schools. Other academies complained that the extra bureaucratic burden and legal responsibilities diverted money and attention from their core purpose: providing education.

 

Did the Public Accounts Committee 2011 report conclude that academies had been a resounding success?

Yes, but only up to a point. In 2011, the Public Accounts Committee endorsed academy conversion as a means of raising achievement. However, closer scrutiny shows that the Committee overlooked the downsides to the academies programme documented in the previously published evidence it consulted. The Committee looked at the Report by the Comptroller and Auditor General HC288 Session 2010-2011. This turns out to be the National Audit Office (NAO) report discussed here, which makes it clear that not all academies were raising results. Even where improvements had been made, there had not been a similar improvement in the results of disadvantaged pupils, the very children that academies had been set up to help.

The Committee also took evidence from ARK and ULT academy chains.  ARK has had success with its academies but, as the NAO noted, this success was not uniform across all academies.  ULT had been banned by the Labour government from sponsoring any more academies because some of their established ones had been judged by Ofsted to be inadequate.  The Coalition Government lifted the ban and allowed ULT to take control of the Emmanuel group of academies in a move welcomed by Mr Gove.  An ex-Principal of one of the Emmanuel academies, John Burn, OBE, would later condemn this takeover in his evidence to the Education Bill Committee.  Mr Burn said that ULT was “not properly accountable to its own schools, their leaders and their communities.”

The Committee’s conclusion that the academies programme had raised achievement can only be upheld by a selective reading of the NAO report.   However, the Committee did not wholeheartedly endorse the academies programme.  It had reservations about “potential financial and governance instability”.   The Committee recommended a “strong framework with which academies must comply to ensure probity and effective governance”  because many academies were not complying with the guidance then in place.

 

What did the National Audit Office (NAO) really say about academies in 2010?

The Government also cites the National Audit Office (NAO) report of 2010 to support its claim that academy conversion can raise standards.  But does the Government’s claim stand up to scrutiny?

The NAO found that although “the proportion of academy pupils achieving five or more A*-C GCSEs was improving at a faster rate than in maintained schools with similar intake…there were a small number of academies which made little progress, particularly when English and mathematics were considered”. NAO noted that some academies had been judged by Ofsted to be inadequate.

Academies were specifically set up to improve the results of disadvantaged pupils. NAO found that the performance of academy pupils who were eligible for free school meals, had English as an additional language or had special educational needs had improved. But, on average, the gap in attainment between more disadvantaged pupils and others had grown wider in academies than in comparable maintained schools.

When sponsored academies were established, the support of a sponsor was thought to be crucial to an academy’s success. NAO found that “Academy sponsorship can bring benefits such as a clear ethos, business and educational expertise and additional financial support.” But NAO discovered that a “significant proportion” of academies had not received the financial help originally pledged by their sponsors. [Note: sponsors are no longer required to pledge financial support].  NAO was also concerned about a conflict of interest when sponsors put pressure on their academies to purchase services from the sponsor.  NAO feared that this might not be good value for money.

Academies are supposed to benefit from being able to manage their entire budget. However, NAO reported that some academies were struggling to deal with their finances. It found that the Young People’s Learning Agency had identified that “just over a quarter of academies may require additional financial or managerial support to secure their longer term financial health.”

The NAO concluded: “Many of the academies established so far are performing impressively in delivering the intended improvements. It cannot be assumed, however, that academies’ performance to date is an accurate predictor of how the model will perform when generalised more widely. Existing academies have been primarily about school improvement in deprived areas, while new academies will often be operating in very different educational and social settings.”

Past performance in academies, which was not uniform in any case, is no predictor of future performance and it is, therefore, unwise to press forward with mass academy conversion, particularly with primary schools.

 

The Government cites a 2008 report by PricewaterhouseCoopers (PwC) to justify its academy conversion programme. Does this report wholeheartedly endorse academy conversion?

No.

1. The report considered only 27 academies. Three had previously been City Technology Colleges (CTCs) so were not used to compare attainment. Achievement analysis was based on the results of just 24 academies.

2. Although there had been positive overall progress in securing improvements in performance, this was not uniform across all measures of achievement.

3. Many academies performed better than the national average for progress from Key Stage 2 to GCSE when the background and previous attainment of the pupils was taken into account. This was less true for progress from Key Stage 2 to Key Stage 3.

4. When English and Maths were taken into account, the rates of progress were “less substantial”, although higher than the England average. (The rate of progress, however, is calculated from a lower base.)

5. The researchers found “considerable diversity across individual Academies in the levels and improvements achieved.”

6. Some Academies had used vocational courses to boost improvement more quickly. This was at the expense of ensuring a “broad and balanced curriculum” in some cases.

7. Where academies were improving, they were using similar methods to those found in improving LA schools. Outstanding leaders and stability in leadership were “critical” to improve standards. Sponsorship and new school buildings were seen as positive factors.

8. While 80% of academies provided extended programmes of instruction, most LA maintained schools also offered extended days and extra-curricular activities.

9. Ofsted found that teaching and learning was variable in academies. This was attributed to inexperienced middle management and a relatively high percentage of teachers without qualified teacher status (QTS).

PricewaterhouseCoopers concluded, “There is insufficient evidence to make a definitive judgement about the Academies as a model for school improvement”, and that the process of change was complex and varied and could not be ascribed to a simple uniform “Academy effect”.

 

Admissions and exclusions

Would bringing back selection at 11 result in higher educational attainment?

The best-performing school systems tend to be the most inclusive, according to research by the Organisation for Economic Co-operation and Development (OECD). High-performing countries do not segregate children academically or geographically. The OECD found that such selection does not result in higher performance overall but does increase the gap between advantaged and disadvantaged pupils.

References and further information:

Government policy

Trojan Horse – what action did Peter Clarke recommend for Birmingham City Council (BCC) in particular and for all local authorities generally?

Clarke recommended Birmingham City Council should:

1 Review its systems, procedures and policies covering the support given to its maintained schools: to ensure patterns in behaviour can be spotted; to achieve an appropriate balance between maintaining community cohesion and the educational and safeguarding needs of children; to ensure BCC is not influenced unduly by a vocal minority; and to ensure effective sharing of information with other agencies such as the police and the Department for Education (DfE).

2 Ensure it is responsive to whistleblowing; that it acts robustly and provides proper protection to whistleblowers.

3 Judge whether concerns indicate extremism; if so refer these to the relevant authority.

4 Consider setting up an independent process for governors and teachers to raise concerns.

5 Review all compromise agreements made within the last five years to assess their appropriateness and whether BCC exercised its duty of care.

6 Improve governor support services to ensure effective appointments after suitable vetting and to provide effective governor support. This must be done before BCC starts appointing local authority (LA) governors again.

All Local Authorities should:

1 Ensure governors take on no more than two schools at any one time except in “genuinely exceptional circumstances”.
2 Ensure no single individual has undue influence over several schools.

Trojan Horse – what action did Peter Clarke recommend for Ofsted?

Clarke recommended that Ofsted:

1 Ensure it is responsive to whistleblowing; that it acts robustly and provides proper protection to whistleblowers.

2 Judge whether such concerns indicate extremism; if so refer these to the relevant authority.

3 Consider whether the inspection framework and subsidiary guidance are capable of spotting “indicators of extremism” and of detecting whether a school’s character has changed “substantively without following the proper process”.

4 Extend the Ofsted framework to check whether headteachers have ensured Child Protection training takes place bi-annually, as required.

Trojan Horse – what action did Peter Clarke recommend for all schools?

Clarke recommended that all schools of whatever type should publish details of their governing body on their website. The following information should be included:

1 Full name of each governor.

2 Membership of committees.

3 Method of appointment (eg local authority appointment, elected parent governors).

4 Expected period of appointment.

Trojan Horse – what action did Peter Clarke recommend for Multi Academy Trusts?

Clarke recommended that Multi Academy Trusts (MATs) should:

1 Ensure governors take on no more than two schools at any one time except in “genuinely exceptional circumstances”.

2 Ensure no single individual has undue influence over several schools.

Trojan Horse – what action did Peter Clarke recommend for the Department for Education (DfE)?

Note: comments in brackets are those of the author

Clarke recommended the DfE should:

1 Look at the way schools help individuals gain Qualified Teacher Status (QTS) to make sure the system can’t be abused. (But academies and free schools don’t have to offer training towards QTS for untrained teachers – this is a weak point which would allow academies* to appoint unsuitable people as teachers.)

2 Extend the responsibilities of the teacher each school already designates as Child Protection Officer (CPO) to include the counter-terrorism strategy Prevent. The mandatory Child Protection training should include Prevent, and the CPO should ensure the training is “cascaded to every member of staff, governor or volunteer”.

3 Consider taking action against teachers who breach teacher standards. (This is another loophole: use of teacher standards in academies “depends on the specific establishment arrangements” of each school.)

4 Ensure it is responsive to whistleblowing; that it acts robustly and provides proper protection to whistleblowers.

5 Judge whether such concerns indicate extremism; if so refer these to the relevant authority.

6 The new Regional Schools Commissioners’ responsibilities should include receiving complaints about academies* and passing them on to the relevant authority for investigation.

7 Review the way in which schools are able to convert to academies and the way in which single academies can become Multi Academy Trusts (MATs). This should include making appropriate checks on MAT individuals and ensuring the MAT’s capability and capacity.

8 As a matter of urgency, consider how best to collect local concerns during conversion.

9 As a matter of urgency, review the “brokerage system” (or re-brokerage) whereby schools are matched with sponsors to ensure transparency.

10 Consider requiring academies to tell the Department about changes in governing bodies.

11 Consider giving stronger powers to the Secretary of State (SoS) to bar individuals from managing any type of school whether maintained or an academy. (This would need independent monitoring to ensure a SoS didn’t use this power to prevent individuals from being barred just because they opposed government policies or were, say, members of the so-called “Blob”.)

12 Review guidance of governors’ appointments so it includes clear expectations of a governor’s role.

13 Review and analyse the evidence collected during Clarke’s investigation of allegations made in the so-called Trojan Horse letter.

14 Take steps to understand areas of concern highlighted in the investigation – including possible financial malpractice – and consider further action.

15 Consider whether areas of the country are particularly vulnerable to the concerns raised in the Trojan Horse investigations. (Concentrating on particular areas, say cities with a sizeable Muslim population, has the potential to (a) alienate Muslims in the same way the first Prevent strategy did, by treating them all as a suspect group; and (b) draw attention away from other areas where governors may seek to impose their limited view of the world onto their schools.)

*academies include free schools.

What risks did the NAO describe in its report about delivering public services through markets?

The 2011 Government White Paper, Open Public Services, makes it clear that the Government prefers to deliver public services via user choice and competition. This method of delivery, says a recent National Audit Office (NAO) report, depends on market forces which are “generally considered effective at promoting efficient outcomes” when the market functions well.

However, the NAO warns that markets can fail for reasons which include users being poorly informed about providers, their quality and prices, or if providers form a cartel to keep prices high. “Left to their own devices, markets may also not be effective at delivering wider policy outcomes such as equity and universal services.”

NAO warned that commissioning services from the private or not-for-profit sector was different from “traditional ways” of delivering public sector services. Government departments and authorities would, therefore, need to have new skills because commissioning brought risks. These dangers are summarised below:

1 Markets can lead to fragmentation which reduces economies of scale.

2 This could result in higher costs if competition is ineffective.

3 Additional costs must be justified by “efficiency gains”.

4 Private provision can bring efficiency but it doesn’t “naturally provide universal services or equity of provision.”

5 Private providers will not offer a service if the cost to the provider is uneconomic.

6 Where end users purchase their own services through direct cash payments there may be an increased risk of mistakes and systematic fraud.

7 The Government retains responsibility if services fail but has less ability to intervene than when providing services directly.

Delivering public services through markets needed:

1 Rules, monitoring, enforcement and remedies when things go wrong.

2 Demand-side effectiveness. Demand relies on reliable information, efficient forecasting and accurate estimation of future demand. Service users need to be willing and able to make informed choices.

3 Supply-side effectiveness. Supply relies on having effective commercial, economic and analytical skills, an understanding of how to enter a market, expand or leave and a knowledge of effective marketing which includes setting fair prices which represent value for money.

4 Continuity – this requires financial and business skills. These include minimising taxpayers’ risk and limiting the dangers associated with interrupting services to vulnerable people.

5 Outcome monitoring – this requires analytical and economic skills.

The report makes it clear that government departments must effectively check how well the market is “delivering the required outcomes”. They must ensure they have the power and ability to intervene when necessary.

The NAO concludes, “Finally, if after the market has operated for a period, it is not delivering cost-efficient outcomes that represent value for money, the oversight body may wish to consider ways it can move away from a market delivery mechanism”.

 

Do market forces in education increase achievement and efficiency?

Academics found the evidence that market forces improve education achievement and efficiency was “fragmented and inconclusive”.

A major review looked at a huge amount of research. This had mostly taken place in the United States although some researchers looked at other countries including Sweden, Chile and the UK. The reviewers found:

1 The scope of much of the research was limited to test scores in reading and maths. This neglected other subjects or other kinds of achievement.

2 Any positive effects were small and statistically insignificant. They were usually limited to reading. Results for maths tended to be lower than reading.

3 Competition between schools carried a risk of increased segregation particularly when schools set out to attract certain types of students like faith groups.

4 Findings about the impact of market forces on efficiency – increasing achievement while lowering costs – were inconclusive.

5 Schools in competition with each other often shifted expenditure from teaching to non-teaching spending such as marketing.

6 No link was found between innovation and market mechanism. The opposite was true in many cases – schools in competition with each other tended to become more traditional.

7 Many researchers had argued about whether raw exam results were reliable indicators of school quality. They found it was not easy to separate “good”, “average” and “bad” schools using test scores alone. Some researchers concluded that published league tables, which were supposed to help parents choose between schools, were a “somewhat meaningless exercise.”

In conclusion: the authors admitted that little was known about the long-term effects of introducing market forces into the education system. However, they warned, “Compared with government aims when market mechanisms are introduced in education and the fierce tone of the political as well as the academic debate on these issues, the effects as reported in empirical research are modest to say the least.”

UPDATE 16 January 2013: The Academies Commission (2013) reviewed evidence about US charter schools, Swedish Free schools (friskolor) and Chile’s semi-privatised system. It found evidence was wide-ranging and issues were “methodologically exceptionally complex”. Much of this evidence remains disputed. The Commission found charter schools, friskolor and similar schools had been successful in some jurisdictions* but this success was countered by failure in many of the schools.

The Commission concluded that evidence about the effectiveness of these policies was difficult to establish. The countries that had most implemented these strategies had not reported a substantial increase in the Programme for International Student Assessment (PISA) tests, although they reported improvements on some national performance measures. (Note: these are normally restricted to performance in Maths and English, whereas PISA measures Reading, Maths and Science. Performance in these tests should not be regarded as a judgement on a country’s entire education system.)

The Commission found that reports of raised attainment are accompanied by other reports which showed increased social segregation.

*jurisdictions are not whole countries but parts of countries eg states, provinces, municipalities.

For the latest review of evidence, published 16 January 2013, about market intervention in education in Sweden and Chile, see the FAQ which deals with this in more detail.

The Academies Commission report can be downloaded here.

 

International comparisons

What did UNICEF say about UK education in 2013?

Overall, UNICEF ranked UK 24th out of 29 developed countries for “educational welfare”. This low ranking was caused by the large percentage of 15-19 year-olds who were not in education, employment or training.

UNICEF considered four indicators to measure educational welfare. These were:

1 Pre-school enrolment (% of children between 4 years and the start of compulsory schooling) in the UK was high – 95%.

2 UK was at the bottom of the table for participation in “further education” (defined as education for 15-19 year-olds). UK was the only one of the 29 developed countries where the participation rate for this age group fell below 75%. UNICEF thought the cause may be an emphasis on academic exams combined with a wide range of vocational qualifications which did not have ‘parity of esteem’ or an ‘established value’ in the workplace.

3 The percentage of young people not in education, employment or training (NEET) was high. 9% of UK 15-19 year-olds were NEET. UK ranked 25th out of 29 on this indicator.

4 The educational achievement of UK pupils at age 15 was measured by an average of the scores in the three PISA tests (reading, maths and science) taken in 2009. UK was ranked 11th out of 29.

The full report can be downloaded here.

BBC articles summarising the UNICEF report are here and here.

Further details of the “educational welfare” findings are here.

8 December 2013

How did English and Northern Irish pupils perform in the TIMSS?

Trends in Maths and Science Survey (TIMSS) 2007

In the Trends in Maths and Science Survey (TIMSS) 2007, English 10 year-olds and 14 year-olds scored the highest marks among European countries in maths and science. Northern Irish pupils did not take part in 2007.

Trends in Maths and Science Survey (TIMSS) 2011: Maths

Northern Irish 10 year-olds were 6th and English 10 year-olds were 11th= out of 57 participants in TIMSS 2011. (Note: some participants were not whole countries but jurisdictions such as US states. The word “countries” refers to whole countries not these jurisdictions.)

The relative position of English pupils fell although the score rose by one point (a statistically insignificant amount). TIMSS still ranks England among the top ten countries for primary maths. Northern Ireland did not enter pupils for the 2007 TIMSS tests so there can be no relative ranking between 2007 and 2011.

Northern Ireland did not enter pupils for the 2011 TIMSS tests for 14 year-olds. The performance of English pupils fell – English 14 year-olds are now placed with average TIMSS performers such as Finland (a top performer in PISA tests) and USA. England is one of several participants, including USA, Sweden and Hong Kong, where the high performance of primary pupils in TIMSS tests is not sustained into secondary school.

Trends in Maths and Science Survey (TIMSS) 2011: Science

Although the performance of English pupils in TIMSS 2011 science tests remains high, their position has fallen since 2007, when they were top of the European league at ages 10 and 14.

However, international rankings are “volatile” and performance shouldn’t be judged solely on league table position. English 10 year-olds scored significantly above the centre point of the international scale although eight countries, including Finland, USA and four Pacific Rim countries, scored significantly higher than England. Countries where the performance of 10 year-olds was similar to England’s include Hong Kong, Sweden and Germany. The 31 countries which scored significantly less included Slovenia, Australia and Northern Ireland where maths results had been particularly high.

The performance of English 14 year-olds in science has been consistently high since 1995. Only five countries scored significantly more in 2011: Singapore, Chinese Taipei, Korea, Japan and Finland. Five countries performed at a similar level, including Slovenia, Hong Kong and USA. The 31 countries which scored significantly lower than England included Australia, New Zealand and Sweden.

Despite this strong showing at age 14, the score was actually lower than should have been expected based on the 2007 results. The 14 year-olds taking the 2011 TIMSS test would have been in the 2007 cohort so it’s possible to measure against expected performance. 14 year-olds in Hong Kong and USA also didn’t reach the level predicted at age 10.

8 December 2013

How did English 10 year-olds perform in PIRLS?

PIRLS 2011 showed a relative rise in the performance of English 10 year-olds in reading comprehension from 19th out of 45 participants in 2007 to 10th= in 2011. 10 year-olds from Northern Ireland did even better – they were 5th. It should be borne in mind, however, that the participants taking part in PIRLS 2011 were not the same as in 2007 – new countries joined while others dropped out.

8 December 2013

How did the 2012 Learning Curve report rank UK education?

UK was sixth behind Finland, South Korea, Hong Kong, Japan and Singapore, second in Europe and second in the Western world according to analysis by the Economist Intelligence Unit (EIU). The EIU combined the results of three international education tests (PISA, PIRLS and TIMSS) with literacy and graduation rates. The results appeared in The Learning Curve (aka Pearson/EIU report) published by Pearson.

However, the EIU issued warnings about their analysis. Much data was missing and had to be estimated by, say, calculating missing test results by how well a country did in other tests. Other analysts have pointed out that some of the data (eg literacy and secondary school graduation rates) might have come from a country’s education ministry and might not necessarily be accurate. The TES, for example, wrote that Pearson’s report “relied heavily on graduate and adult rather than pupil performance.”

The Sutton Trust also cautioned against the report’s conclusions. It wrote, “a third of the score in the Pearson index, which placed England sixth in the world, was the result of data on secondary school completion and university graduation rates, which may owe more to policy decisions than attainment. When the latter data are excluded, England drops to 12th position, and could be lower if five other countries ahead of England on PISA, but not included in the Pearson index, were taken into account.”

8 December 2013

What does PISA say about UK 15-year-olds?

The Programme for International Student Assessment (PISA) has taken place every three years since 2000. The figures for the UK in 2000 showed UK pupils doing well. But the OECD, which runs the tests, later found the figures were faulty and withdrew them. The OECD did not accept figures for the UK in 2003 because of sampling problems. The only reliable figures for the UK are for 2006, 2009 and 2012.

PISA 2009

Between 2006 and 2009 there was a relative fall in ranking from 17th to 25th in reading, 23rd to 28th in maths, and 14th to 17th in science. But more countries took part in the 2009 tests – 65 compared with 58 in 2006. However, there is little statistical difference between the 2006 and 2009 scores.

In the 2009 PISA tests, UK pupils were at the OECD average for reading and maths, and above the OECD average for science.

In December 2010, when the 2009 PISA results were published, the Government and most of the media ignored the OECD warning not to compare the 2009 results with the faulty 2000 figures. This led to the “plummeting down league tables” myth. In October 2012, the UK Statistics Authority expressed concern about the way the Department for Education (DfE) had used PISA statistics. There has, however, been no apology from either the DfE or newspaper editors. The faulty figures continue to be used to portray state education in England as being in a pitiful state. Tory MPs Chris Skidmore and Priti Patel made the false comparison after the UK Statistics Authority had warned against it. And the Confederation of British Industry (CBI) published the 2000 figures in a graph in its 2013 Annual Report.

Programme for International Student Assessment (PISA) 2012

The results showed a slight improvement in the performance of UK 15 year-olds, but it was not statistically significant: performance has remained static. UK pupils still performed at the OECD average in reading and maths, and above the OECD average in science. See here for more detailed analysis.

8 December 2013

Should international test results be used with caution?

Yes. It’s important to keep international education test results in proportion. They test only a limited number of subjects at set age groups every three to five years.

Unfortunately, governments have a tendency to concentrate only on relative rankings (ie league table position) and to use an apparent fall in league table position to justify policies. The Government uses PISA rankings to find “evidence” from top-performing PISA countries to support its policies (see case study below).

The test results are sometimes contradictory. For example, the performance of English 14/15 year-olds in PISA 2009 was contradicted by TIMSS 2007. In top-performing Finland, 15 year-olds outscored most of the world in PISA 2009, but in TIMSS 2011 (maths) Finnish 14 year-olds were ranked as average performers along with England. The below-average performance of English 16-24 year-olds in literacy/numeracy in the OECD Adult Skills Survey 2013 is contradicted by the performance of UK 15 year-olds in PISA 2012, which showed UK pupils performing at the OECD average for reading and maths (and above average in science).

These international tests provide an important snapshot – but they should not be used as a sole justification for policies. Nor should they be seen as a judgement on a country’s whole education system. For example, top-performing South Korean teenagers are the unhappiest among OECD countries, while Finland’s top-performing system appears less effective in stretching the most able. The Learning Curve found that there’s no magic bullet for school improvement, so picking bits-and-pieces from countries which perform well in international tests and transposing these to very different countries is unlikely to improve performance.

The tests don’t just provide scores and rankings. The organisers research and analyse a range of other factors such as socio-economic background, maternal education, private and state schools and how many books are available in the home. However, this wealth of analysis is often ignored in favour of the raw test results and rankings.

The Sutton Trust (2013) warned against jumping to conclusions about a country’s education system based solely on league table rankings. It could be misleading.

Case study: How the exam system of just one top performer can be used as “evidence” to support the Government’s proposed changes to England’s exam system.

Singapore is a top performer in international education league tables. The Government argues that Singapore’s “rigorous” exam system (O levels at 16) causes Singapore’s success. The Government uses this to justify its overhaul of English exams at 16+. However, top-performing Finland doesn’t test pupils until age 18/19 when they take a minimum of four tests. In high-performing South Korea pupils receive either a High School Certificate or a Vocational High School Certificate at age 18+. Only those South Korean pupils wishing to attend university or junior college take the College Scholastic Aptitude Tests (CSAT). Top-performing Hong Kong has just replaced its O and A level type exams with the Hong Kong Diploma of Secondary Education, comprising 4 core subjects and 2-3 elective subjects, designed to be taken after 6 years of secondary school.

Singapore is out-of-step with other high-performing countries but because it’s in-step with the Government’s ideas about examinations it is used to promote the Government’s policies. (For more information about the exam systems in several other countries see the more detailed faq.)

Updated 6 December 2013

 

How did England do in the 2013 Adult Skills Survey?

Round One of the Organisation for Economic Cooperation and Development (OECD) Adult Skills Survey covered 23 countries. Participants were tested during the period 2008/13 and the results were published in Autumn 2013. The OECD warned that the results should be used with caution because of sampling problems. This caveat has mostly been ignored.

A second group of countries including Chile, Greece, New Zealand and Singapore will be tested during the period 2012/16. 

A third group will be tested 2014/18.  The OECD is inviting participation.

English participants took the survey between August 2011 and March 2012.  The response rate was 59% (NFER).  The OECD asked countries which didn’t meet sampling requirements to complete Non-Response Bias surveys.  England and Northern Ireland did not complete all of these but OECD still stands by the results.

ANALYSIS FOR ENGLAND FROM NFER

LITERACY

The mean score in England was not significantly different to the OECD average for literacy.

Adults aged 16-65: 8 countries significantly outperformed England.  8 countries performed significantly below.

Young adults aged 16-24: There was particularly poor performance amongst England’s youngest adults compared with other participating countries.

In England, the youngest adults (aged 16-18) scored an average of 6.1 score points less than the oldest (aged 55-65) but the difference was not significant.

NUMERACY

Adults aged 16-65: England’s performance in numeracy was significantly below the OECD average. There were 15 countries that significantly outperformed England and five countries that England significantly outperformed.

Young adults scored significantly below the OECD average.

In England, the difference between the score of the youngest adults (aged 16-18) and the oldest (55-65) was not statistically significant.  The pattern of decreasing numeracy skills in younger adults was found in some high scoring countries as well as in England, but these other countries are not falling so far below the OECD average as England.

PROBLEM SOLVING: England’s scores were significantly below the OECD averages.

AREA DIFFERENCES

LITERACY SCORES

England average: 273.  Highest average was in South East (283); the lowest in the North East (259).

NUMERACY SCORES

England average: 262.  Highest average was in South East (274); lowest in North East (247).

PROBLEM SOLVING

England average: 281.  Highest average was in the Eastern area (289); lowest in North East (268).

NFER repeated OECD warnings:

“…the possibility of biases associated with non-response cannot be ruled out. Readers should, therefore, exercise caution in drawing conclusions from small score point differences between countries or population groups, even if the differences concerned are statistically significant” (my emphasis).

Do English pupils spend fewer hours in class than in other countries?

Education Secretary Michael Gove told the 2013 Spectator conference that English pupils had to have longer school days and shorter holidays in order to keep up with children in the Far East.

And Chris Skidmore MP wrote that pupils aged 7-14 (inclusive) in Ireland, Canada, France and Australia spent more time in the classroom than English pupils of the same age.

Were they right?

The Organisation for Economic Cooperation and Development (OECD Education at a Glance 2012) surveyed 34 countries to discover the number of hours pupils aged 7-14 (inclusive) spent in class.

The average total number of hours spent in the classroom by 7-14 year-olds was 6,862.

The total number of hours that English 7-14 year-olds spent in the classroom was above the OECD average. English pupils were in class for a total of 7,258 hours.

How does this compare with other countries?

Pupils in 20 countries spent less time in the classroom while pupils in 13 other countries spent more.

What about high-performing Finland and Far Eastern countries?

Only two Far Eastern countries were included in the OECD survey, Korea and Japan.

The figures for these two countries plus Finland are as follows:

Finland: 5,637 Korea: 5,910 Japan: 6,501

English pupils spent more time in the classroom than pupils in these three countries.

What about Far Eastern countries missing from the OECD data?

Singapore, Shanghai and Hong Kong were not included. Data for these countries had to be found elsewhere.

Singapore: hours per day in school: Primary 5 hours including recess, Secondary 6 hours including recess. Holidays 2013: About 12 weeks in total

Hong Kong: *hours per day 7 hours including breaks. *School year is 190 days, same as England.

Shanghai: *hours per day in school: 8 including 1 ½ hours for lunch. Holidays: *about 14 weeks including a full two months for the summer holiday (1 July to 31 August)

What about those countries cited by Skidmore?

Skidmore was correct. Pupils in these four countries spent longer in the classroom, but it was only in Australia that the hours spent were significantly more. The figures are as follows:

Ireland: 7,362 (104 hours more over 8 years)

Canada: 7,363 (105 hours more over 8 years)

France: 7,148 (fewer hours than English pupils according to the OECD key facts for France, BUT the graph on page 424 of Education at a Glance 2012 shows French children as having slightly more total hours than Canada).

Australia: 7,907 (649 hours more over 8 years)
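The gaps quoted above are simple subtractions from England’s total of 7,258 hours over ages 7-14. A short sketch (using only the figures given in the text) reproduces them, and shows why France is the odd one out:

```python
# Total classroom hours for ages 7-14 (inclusive), as quoted from
# OECD Education at a Glance 2012 in the text above.
ENGLAND_HOURS = 7258

totals = {
    "Ireland": 7362,
    "Canada": 7363,
    "France": 7148,
    "Australia": 7907,
}

# Difference from England over the 8 years covered (ages 7-14).
for country, hours in totals.items():
    diff = hours - ENGLAND_HOURS
    print(f"{country}: {diff:+d} hours over 8 years")
```

Run as written, this gives Ireland +104, Canada +105, Australia +649 – matching the figures above – while France comes out at -110 hours, consistent with the note that the OECD key facts show French pupils spending fewer hours than English ones.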

Conclusion: English 7-14 year-olds spend more hours in the classroom than pupils in the same age group in Finland, Korea, Japan and 17 other OECD countries. They spend about the same as those in Singapore. They spend fewer hours than pupils in 13 other OECD countries including Ireland, Canada, France and Australia but of these four countries it’s only in Australia where pupils spend significantly more time. In the two Far Eastern countries which are not included in OECD figures, Hong Kong and Shanghai, pupils spend more time in school (although pupils in Shanghai get a longer lunch break and more holidays which offset the extra hours).

*Information re Hong Kong and Shanghai was difficult to track down. I had to rely on Wikibooks for the school day and information on a school’s website for Hong Kong. For Shanghai, I relied on a global expat website. The information might, therefore, not be accurate.

26 April 2013

 

What do schools systems which score highly in PISA have in common?

Every three years the Organisation for Economic Cooperation and Development (OECD) administers the Programme for International Student Assessment (PISA) tests. Some countries perform consistently well. Data from PISA found these countries tended to share many attributes:

• They combine quality with equity.

• They put a high value on education.

• Successful countries accept that pupils can succeed if they are given the opportunity and they put in the effort.

• They don’t divide pupils up at an early age.

• Teachers in successful countries employ a range of teaching practices to personalise learning opportunities.

• Pupils in 2012 were more likely to have attended at least one year of pre-primary education than their counterparts in 2003. But many of the pupils who did not attend pre-primary school were disadvantaged – the very pupils who could benefit most from pre-primary education.

• Successful countries value teachers. They are careful how they recruit and train them; they provide continual professional development and they seek ways of improving the performance of weak teachers. In short, they provide an environment in which teachers can collaborate and share best practice.

• High performing systems set high standards and enable teachers to decide how best to teach their own pupils.  Pupils know what’s expected in order to succeed.

• If offered a choice of schools, parents tended to prioritise safety and a good reputation over “high academic achievement of students in the school”.

• Successful school systems have moved from “professional or administrative forms of accountability and control” to “professional forms of work organisations”. The emphasis is not on outcomes but on the next stage in a pupil’s education: the next teacher, the next school, the pupil’s future life.

• In successful countries there is high performance across the entire system. Resources are directed where they are most needed. The most talented teachers teach the most challenging classes and the strongest heads lead the toughest schools.

• Good pupil/teacher relationships are strongly linked with positive attitudes towards school, which reduces truancy, disengagement and poor punctuality – three factors that depress performance.

Lastly, education policy is aligned with other public policies – they are coherent, sustained and consistently implemented.

A fuller description of the factors and link to sources are here and here.

Updated 8 December 2013

 

How are schools held accountable in OECD member countries?

In education, schools are accountable to a number of stakeholder groups including national and local governments, parents, pupils and the public for the quality of education they provide. The Organisation for Economic Co-operation and Development (OECD) published information about how schools are held accountable in its 34 member countries. For some unknown reason, the OECD sometimes gives figures out of 35, sometimes 31.

Accountability is measured in three main ways:

Performance accountability

Performance accountability focuses on outcomes not on processes. The OECD has warned there is too much emphasis in England on raw exam results. “Fair and effective measures of performance accountability take into account the needs of the students and the families they serve and the resources available to serve them,” the OECD said.

National examinations* are used by 23 out of 35 OECD member countries at the end of upper secondary school (18+). National assessment*, which informs teaching, is more common at lower secondary level and primary schools in OECD countries. However, England, Northern Ireland and Wales are unique in having a large number of high-stakes national examinations at the end of lower secondary (16+).

Regulatory accountability

This is the legal framework which governs how schools operate. The most common regulation is one which expects schools to provide information about pupil characteristics to authorities. Other “compliance” regulations govern such things as safety, curriculum, facilities and grounds, and teacher qualifications. In England, the Government allows academies to opt-out of rules governing teacher qualifications, the quality of school food and the National Curriculum. And planning laws are being relaxed for free schools to make it easier for them to find premises. Whether the premises would really be suitable for school pupils will be up to parents to decide.

School inspections are part of the regulatory framework. The areas covered by inspections were: compliance with the regulations described above, quality of teaching and pupil performance. 24 out of 31 OECD member countries had inspections at lower-secondary level.

School self-evaluation is required by 21 out of 32 OECD member countries. Self-evaluation is cheaper than full inspections and results can be judged in the light of local circumstances. The main disadvantage is that it is less credible to outside audiences.

Market accountability

This is based on the idea that when parents have “choice” then schools will improve in order to attract parents. However, the evidence linking market forces and educational outcomes is “fragmented and inconclusive”**.

The OECD said that there were five factors which were needed to make any market mechanism in education work. These were:

1 A range of school options. (This may work in a city, but it’s unclear how different types of school could be provided in less-populated areas).

2 Accurate information given to parents about schools.

3 Limited powers for schools to select or screen pupils.

4 Financial incentives, in the form of vouchers, scholarships, tuition tax credits or allowing parents to top up fees.

5 A funding mechanism whereby money followed the pupil.

*See faq What are the examination and assessment systems in OECD countries? for more information about exams and assessment systems in OECD member countries.

**See faqs Do market forces in education increase achievement and efficiency? and What does a January 2013 review of evidence say about market intervention in education in Sweden and Chile? for more information.

Note: GCSEs are not used in Scotland. Scotland is in the process of reforming its exam system so fewer subjects are taken at 16+. For details see here.

 

What are the examination and assessment systems in OECD countries?

The data below comes from OECD Education at a Glance 2011. There are 34 OECD member countries. However, OECD data below sometimes gives a figure out of 35. The reason is unclear.

The Organisation for Economic Cooperation and Development (OECD) doesn’t just administer international tests in reading, maths and science for 15 year-olds. It researches education systems.

In 2011, OECD reported their findings about the 34 OECD countries.

EXAMINATIONS

National examinations: primary level

Only 4 of the 34 countries used national examinations at primary level.

National examinations: lower secondary level (age 16)

15 of the 34 countries used national examinations at lower secondary level. The exams were set at state level by 10 of these. In 3 countries the exams were set at local authority level. 2 countries had a mixture of both state and local authority level exams.

The 2 subjects most commonly covered in lower secondary level exams were maths and national language or language of instruction. 3 extra subjects were used to a lesser extent: science, modern foreign languages and social studies.

National examinations: upper secondary level (age 18+)

More countries, 23 out of 35, use national examinations at upper secondary level.

Sharing data from national exams

14 of the 15 countries which had available data shared it with external audiences and education authorities. All 14 shared results with students. 13 of the 14 shared the data with school administrators. 12 shared it with parents and teachers. Only 8 shared it with the media.

ASSESSMENT

OECD made it clear that “the key purpose of assessment is to provide formative feedback to improve instruction and inform about the relative performance of students.”

Assessment: primary level

Most countries used assessment at primary level

Assessment: lower secondary (age 16)

Two-thirds used national assessment at lower secondary level. 17 out of 22 used national assessments devised and graded at central-authority level; 3 out of 22 used assessments set at local authority level. 13 out of 22 used criterion-referenced tests, 8 used norm-referenced tests, and 1 used a combination of both.

The subjects most covered in lower secondary assessment are maths and national language or language of instruction. Science and modern foreign languages are also commonly covered.

Assessment: upper secondary (age 18+)

Fewer than 1/3 of countries used national assessment at the end of upper secondary.

Sharing data from assessment

21 out of 22 countries shared assessment data with external audiences in addition to education authorities. 20 countries shared the data with school administrators, 15 with teachers, 14 with parents and students, and only 12 shared assessment data with the media.

12 February 2013

What does a January 2013 review of evidence say about market intervention in education in Sweden and Chile?

Sweden

The Academies Commission (2013) found it’s difficult to come to conclusions about the Swedish free schools programme because Sweden doesn’t routinely collect test and demographic data. The Commission cited Bohlmark and Lindahl (2012) who concluded the programme had improved educational performance and this was driven by the effects of competition. But Bohlmark and Lindahl warned against applying findings from Sweden to other countries because school types and external factors differed. The Commission cited Cook (2012) writing in the Financial Times: the improved educational performance attributed to the Swedish free school programme was extremely modest and the slight positive effects were “not very impressive given the scale of the policy intervention.”

Meanwhile, Bertil Ostberg, a senior civil servant in Sweden’s Education Ministry, told the BBC (2012) that his Government was setting up an inquiry into the profit-making companies running many of Sweden’s free schools. The latest information about the inquiry is here.

Chile

The Academies Commission (2013) found that Chile’s voucher system resulted in over 60% of pupils attending privately-run schools. These schools can charge extra fees above the value of the voucher. A 2000 study found no difference in performance between subsidized private schools and state-funded municipal schools. Chile’s performance in PISA tests remains relatively poor.

Meanwhile, students in Chile have been rioting for more than a year against Chile’s for-profit schools. And the Chilean education minister looks to England for inspiration – not to the present Government’s free school and academy programme but to the Cambridge Primary Review (CPR). The CPR, which took six years to produce, was rebuffed by the last Government. This rejection was described by the late Mike Baker as a “dismissive, knee-jerk response.” Nevertheless, 147 countries have accessed the CPR website for ideas to reform primary education.

The Academies Commission Report can be downloaded here.

Published 16 January 2013   Updated 30 June 2013

What are the examination systems in other countries?

Governments and pundits often compare the UK with graduation systems abroad. It’s important, therefore, to know how other countries and jurisdictions organise their school-leaving examinations. Below is information about high-performing countries/jurisdictions (as judged by PISA and TIMSS tests) most often cited by pundits.

Most of the evidence is from a December 2011 report prepared for the National Curriculum Review by the National Foundation of Educational Research which summarised information from the International Review of Curriculum and Assessment Frameworks Archive. Some jurisdictions (like Shanghai) were not included in the NFER report. Other sources of evidence are highlighted in blue.

Australia:

Pupils have external exams at 18. These exams act as a certificate of school completion and, depending on grade, entry into tertiary education.

Canada: Alberta

At age 15 pupils’ achievement is tested in 5 subjects: Maths, Science, Social Studies, English and French. This is not a formal leaving certificate but shows which pupils are eligible to attend senior high school. At age 18 pupils can obtain one of the following:

• Alberta High School Diploma ;

• Certificate of High School Achievement (for students enrolled in knowledge and employability courses);

• Certificate of Achievement (for students on specific integrated occupational programmes);

• Certificate of School Completion (for students with significant cognitive delays).

The majority of students receive the High School Diploma.

Canada: Ontario

To gain the high school diploma ‐ students must:

• obtain thirty credits in high school;

• successfully complete compulsory Grade 10 literacy test (students aged 15/16);

• complete 40 hours of community involvement.

China: Hong Kong

Until 2012, 16 year-olds took the Hong Kong Certificate of Education Examination (HKCEE) after 5 years of secondary education. HKCEE in Chinese Language and English was graded from 5 to 1, with 5 being the highest. Achievement in other subjects was graded A-F with A being the highest. A pass at grade C was equivalent to a GCE ‘O’ Level pass, while an E grade was the basic level of achievement for employment purposes. This structure mirrored that of the GCSE when it was first set in the UK in 1987 (except that Hong Kong omitted the G grade).

In 2012 the Hong Kong Diploma of Secondary Education (HKDSE), designed to be taken after 6 years of secondary school, replaced HKCEE. It comprises 4 core subjects – Chinese Language, English Language, Maths and Liberal Studies – and 2 or 3 elective subjects. There are 5 levels of performance: 5 being the highest and 1 being the lowest.  Level 5 pupils with the best performance can be awarded 5** or 5*. HKDSE grades 3 to 5* in all subjects apart from Maths are benchmarked against UK ‘A’ levels with grade 3 (UCAS 40 points) being equal to ‘A’ level grade E. 5* equals an ‘A’ level grade A or A* (UCAS 130 points). The points equivalence of 5** has yet to be decided. Maths is benchmarked separately – see link for more details.

The Hong Kong Advanced Level (HKALE) is being phased out. It is taken after two years in the sixth form. Pupils usually sit 5 subjects which for most candidates include Chinese Language & Culture and Use of English. HKALE is graded A-F with A being the highest grade. The last HKALE will take place in 2013 for private candidates only.

Hong Kong has established the Basic Competency Assessment (BCA), covering Chinese Language, English Language and Mathematics. It has two components: student assessment (an online system giving feedback to teachers and learners) and the Territory-wide System Assessment (pen-and-paper tests).

China: Shanghai

Pupils in China (except Hong Kong) sit examinations at the end of lower and upper secondary school although it is unlikely that pupils who do not attend elite lower secondary schools will perform well on upper secondary entrance examinations. Lower secondary exams are locally administered and their content differs across localities. The results of the lower secondary exams determine whether a pupil enters an academic or vocational upper secondary school or even leaves school altogether. 52.5% of students in China attend academic upper secondary school; in Shanghai, it is 97%.

Shanghai pupils sit a senior school entrance examination (Zhongkao) at age 15 and a High School Graduation Examination (Huikao) at the end of senior secondary school (age 18). Students wishing to enter university take a higher education entrance exam (Gaokao). In many provinces, the Gaokao is criticised for being based on too much memorisation, which leaves students ill-equipped to cope with the analysis required at university.

Shanghai and an increasing number of other provinces have the right to create their own higher education entrance examinations. Since 2001, the Shanghai exam has been based on testing what students can do, rather than what they can learn by rote. These tests include “integrated papers” which require candidates to show knowledge from multiple disciplines. The university entrance exam usually requires knowledge of the Chinese Language, English Language and Maths plus a fourth subject which can be examined in non-traditional ways (eg oral, written or even practical).

Shanghai universities are basing their admissions process less on test results and more on other criteria (eg overall student performance). Shanghai’s rigorous education system and expanded options for applying to university result in 80% of Shanghai students entering university, compared with 24% in the rest of China.

Estonia:

16 year-olds take three examinations at the end of compulsory education: either Estonian language and literature or Estonian as a second language, Maths and one subject chosen by the student from English, German, Russian as a foreign language, French, biology, chemistry, physics, geography, history, civic studies and Russian language and literature.

Upper secondary school pupils  (age 19) take a minimum of five upper secondary school final examinations chosen from Estonian, Estonian as a second language, Russian, Russian as a foreign language, civic studies, mathematics, English, German, French, biology, geography, chemistry, history, and physics. At least three of these must be state examinations. The remaining two can be school examinations or state examinations (the latter are integrated with higher educational institution entrance examinations).

Finland:

At age 18 to 19 students usually take the matriculation examination: This comprises at least four tests:

• mother tongue (compulsory)

• three other compulsory tests from second national language, foreign language, maths, and one test from sciences or humanities

• one or more optional tests.

France:

Lower secondary education ends at 15, when pupils take a lower secondary leaving exam, the brevet, which comprises tests in French, maths, history/geography and civic education, together with continuous assessment from 13-15. After one year of upper secondary education, pupils can leave or continue their education. Those that stay on can choose from a range of Baccalauréat options, a technical brevet or vocational certificates.

Japan:

No national assessment. Individual institutions arrange assessment at the end of lower secondary education (age 15) which may influence entry to senior high school although entry tests for these are often administered by municipal boards of education. Each local senior high school selects its own pupils under supervision of boards of education and in accordance with individual board’s regulations. National and private upper secondary schools conduct their own entrance exams. Individual institutions issue a Certificate of Upper Secondary Education at age 18. This Certificate is just one of the requirements for entry to higher education.

Korea (South):

No national examination on completion of lower secondary phase education. However, students may need to take an entrance exam at age 15+ for some upper secondary schools.

At age 18+, pupils receive either a High School Certificate or a Vocational High School Certificate.  All students who wish to go to junior college after upper secondary school (high school) have to take the national College Scholastic Ability Test (CSAT). CSAT involves written tests in subject domains: Korean; Maths; English; Social studies, science and vocational education (pupils choose 9 tests from a range of options); and a second foreign language.

Higher education institutions announce annually their student admission criteria which include elements such as CSAT score, comprehensive high school records, institution‐administered examinations, interviews, essays and recommendation letters.

The CSAT is currently being revised (applicable from 2014). The pressure on students taking CSAT will be significantly reduced ready for when the college entrance system is changed to one centred on an admissions office system.

The Netherlands:

At age 15, schools assess whether students have acquired the knowledge, understanding and skills described in the attainment targets for basic secondary education (known as the first cycle). Pupils then enter the second cycle which prepares students for specific, differentiated terminal examinations:

• VMBO, pre-vocational secondary education qualification: comprises a compulsory common component (Dutch, English, social studies I, physical education and arts I), an optional component, and a sector‐specific component (chosen from: engineering and technology, care and welfare, business or agriculture)

• HAVO, senior general secondary education qualification: common component as above, specialised components and an optional component chosen from subject combinations: “science and technology”, “science and health” and “economics and society”. As well as terminal exams, pupils have to write a project which is expected to take 80 hours.

• VWO, a pre-university qualification: as HAVO but with a greater study load.

New Zealand:

The National Certificate of Educational Achievement (NCEA) is the main qualification at all levels of the senior secondary school. It allows a diverse range of students in an increasingly wide variety of courses in schools to have their achievements recognised and reported. Students completing Year 11 – the final year of compulsory education (age 15/16) – obtain credits towards the NCEA. Approved courses are listed in the New Zealand Qualifications Framework (NZQF).

Students can achieve the NCEA from a wide range of studies within the school curriculum and beyond. Each subject is assessed externally and by internal assessment (externally moderated) using achievement standards developed by education, industry and national standards bodies. These standards are in place for general/academic (school curriculum) subjects and for vocational and technical subjects.

NCEA is intended to be a comprehensive record of what pupils achieve and a ‘launching pad’ for their ongoing learning and future careers. It is standards‐based and complements external assessment with internal assessment in all conventional school subjects at three levels (Level 1 NCEA, Level 2 NCEA, and Level 3 NCEA, broadly equivalent to Year 11, Year 12 and Year 13).

Updated 20 January 2013

Singapore:

Pupils take GCE ‘N’ Level or GCE ‘O’ level at the end of lower secondary education (age 16). Pupils in the Normal Academic stream take a maximum of 8 subjects leading to ‘O’ level. Pupils in the Normal Technical stream study a maximum of 7 subjects that are more technical, eg Design and Technology.

Students with good GCE ‘O’ Level passes are normally admitted to junior colleges, where they complete GCE ‘A’ Level after two years, or to centralised institutes to complete ‘A’ Level in three years. Pupils who complete pre-university education also receive a School Graduation Certificate which includes details of personal qualities, academic and non-academic achievements.

Both GCE ‘O’ Level and GCE ‘A’ Level students can apply for entry to polytechnics. Pupils with GCE ‘N’ Level or GCE ‘O’ Level may also apply for various technical or business study courses.

Students who have completed secondary school education and taken the GCE ‘N’ Level or GCE ‘O’ Level examinations, but who do not qualify for the next higher level examination, usually seek employment.

Update Singapore 23 June 2012:  The Ministry of Education in Singapore clarifies the information given in the NFER report.  Pupils take the Primary School Leaving Examination (PSLE) at the end of primary school (age 12).  PSLE results are used to select pupils for secondary education.  Express Stream pupils spend 5 years studying for ‘O’ level.  Normal (Academic) stream pupils take Normal Academic exams after 4 years.  Normal (Technical) stream pupils take Normal Technical exams also after 4 years.  In theory, NA and NT students who do particularly well in their exams can remain for a further year and take ‘O’ levels.  It is unclear how this transition works in practice when NA and NT pupils study for fewer examinations.  In 2013 NT students will not be offered examinations in Humanities, Literature in English, separate Sciences, or other languages (apart from ‘local’ languages: Basic Tamil, Basic Chinese, Basic Malay).  In 2013 ‘O’ level candidates, on the other hand, can choose from a far wider range of subjects.

Sweden

During lower secondary schooling, Sweden has multiple layers of assessment controlled by schools and teachers. Students receive grades in each term of year 8 (age 15) and at the end of the autumn term of year 9 (age 16). These grades – pass, pass with distinction, or pass with special distinction – are based on the goals of the syllabi and on nationally approved assessment criteria.

Update Sweden 3 July 2012: This grading system will cease in Autumn 2012 and be replaced with a scale from A to F.  A to E are passing grades and F is a fail.  Grades will be assigned in Autumn 2012 starting with year 6 (age 13).

Schools can also use diagnostic materials from years 6 (age 13) to 9 (age 16).

Pupils take nationally approved exams in year 9 in Swedish, Swedish as a second language, English and Maths. Attainment in these exams is one factor in determining students’ grades. Although these exams are compulsory for schools, they are not compulsory for all pupils. Sweden uses these test scores to ensure that grades are consistent with national standards.

Update Sweden 3 July 2012:  From 2012 mandatory national subject tests are held in years 3, 6 and 9 of compulsory school to assess student progress.  There are also new qualification requirements for areas including high-school studies.

Towards the end of upper secondary schooling, Swedish students receive a grade in each course and a final grade or “learning certificate” that comprises all course and project grades, coursework, teacher designed assessment based on syllabi and nationally approved exams for core subjects (Swedish, English, Maths) and some other areas.

Taiwan (as at May 2010)

Primary school (six years from age 6): Students graduate from primary school with a primary school diploma. There is no test to enter junior high school.

Junior high school (three years from age 12):  Traditionally junior high pupils spend three years preparing for admission exams for entry into senior high schools, vocational schools and junior college.  Despite the introduction of the more holistic Nine-Year Integrated Curriculum in 2004 and the desire of the Taiwanese government to lift stress from junior high pupils, it’s still common for pupils to memorize facts by rote, attend cram schools and take school-based “optional supplementary classes” during holidays as well as after normal school hours.

The Taiwanese Government is attempting to introduce a new “examination-free admission system” for entry into senior high schools and vocational schools.  It hopes the removal of high-stress exams will encourage the broad-based learning-outcome goals of the Nine-Year Integrated Curriculum.  Routes under the exam-free admission system include:

  • Recommendation by a pupil’s school;
  • Direct application by the pupil;
  • Direct application by district registration.

At the same time the number of entrance tests (Basic Competency Tests*) would be reduced from two to one.  It’s envisaged the exam would become a supplemental tool with less weighting thereby reducing student stress.

However, there’s been considerable opposition to the plans from junior high teachers, parents and pupils, and groups which say they haven’t been sufficiently consulted.  Critics complain the new exam-free admission system, due to begin in the 2012 academic year, will not solve the problems of the current exam-based system.  Pupils will still have to take tests (18 in all) during their three years at junior high.  Critics say this would result in more stress as pupils would have to achieve consistently high test scores because they are likely to form the basis of any recommendation for future progression. Critics also say the relentless focus on regular tests would not encourage creativity or research beyond that required for the examinations.

* Basic Competency Test (BCT): Multiple-choice exam taken at end of junior high school.  Pupils are assigned to senior high schools based on results. The BCT comprises: Chinese, English, Maths, natural science and social sciences.  The test is scored out of 300 – there is no Pass or Fail.   There is a separate test for pupils wishing to attend vocational school.

Senior high schools: three years at either senior high school or senior vocational school.  The main academic focus is to score well in the national university entrance exams at the end of three years.

Academic track: Senior High School Leaving Certificate (Diploma) is awarded to students who successfully graduate from high school.

Vocational track:  Students graduate with 162 credits and the Senior Vocational School Certificate of Graduation (Diploma).

University entrance: students can be admitted to university by:

  • Recommendation from senior high school and a test set by college/university departments OR
  • Taking the central university admissions examination.

Both of the above routes require students to take two exams:

  • Subject Competency Test taken in the last term of senior high school.  It includes 100-minute exams in Chinese, English, Maths, natural and social sciences;
  • Designated Subject(s) Examination to test knowledge of particular specialities (1-3 subjects).

The more competitive universities also expect students to have been involved in extra-curricular activities eg student societies, non-governmental organisations and international competitions.

USA – Massachusetts:

No qualification is awarded at the end of compulsory education (age 16). However, one of the requirements for a high school graduation diploma – received on completion of Grade 12 (age 18) – is that students pass the Massachusetts Comprehensive Assessment System (MCAS) Grade 10 “competency determination” tests in English, Maths, Science and technology.

18 year olds who complete high school and have passed the MCAS tests are awarded the high school graduation diploma. This is the minimum requirement for US higher education. However, university applicants are also judged on their high school record, courses taken and marks received, teachers’ recommendations and marks in college/higher education admission tests.

Updated: 8 March 2013

 

Is the UK tumbling down the international league tables?

Scare stories about UK pupils plummeting down league tables are exaggerated and not statistically robust.

Programme for International Student Assessment (PISA) 2009

“We used to be fourth in the world for our science education, now we are 16th. We used to be seventh in the world for the quality of our children’s literacy. Now we are 25th. We used to be eighth in the world for the quality of our maths, now we are 28th,” said Michael Gove to the Conservative Conference in 2011. But these figures can only be upheld by using the results of the PISA tests for the UK in 2000. These showed the UK apparently doing well. However, OECD has since found that the PISA statistics for the UK in 2000 were flawed and warned that they should not be used for comparison.

The only figures which can be used to show UK performance are those for the years 2006, 2009 and 2012. The results for the UK in 2012 showed a slight, but statistically insignificant, rise in performance (see faq re PISA for detailed data).

The Learning Curve 2012

This analysis by the Economist Intelligence Unit (EIU) and Pearson ranked the UK in 6th place behind Finland, South Korea, Hong Kong, Japan and Singapore (see faq re The Learning Curve 2012 for more information).

Progress in International Reading Literacy Study (PIRLS) 2011

PIRLS 2011 showed a relative rise in the performance of English 10 year-olds in reading comprehension from 19th out of 45 participants in 2007 to joint 10th in 2011. 10 year-olds from Northern Ireland did even better – they were 5th. It should be borne in mind, however, that the participants taking part in PIRLS 2011 were not the same as in 2007 – new countries joined while others dropped out (see PIRLS faq).

Trends in Maths and Science Survey (TIMSS) 2007

In the Trends in Maths and Science Survey (TIMSS) 2007, English 10 year-olds and 14 year-olds scored the highest marks among European countries in maths and science.

Trends in Maths and Science Survey (TIMSS) 2011: Maths

TIMSS still ranks England among the top ten countries for primary maths. 

English 14 year-olds were placed with average TIMSS performers such as Finland (a top performer in PISA tests 2009 and 2012) and USA (a weak performer in PISA 2012).

Trends in Maths and Science Survey (TIMSS) 2011: Science

Although the performance of English pupils in TIMSS 2011 science tests remains high, their position at the top has fallen since 2007 when they were top of the European league at ages 10 and 14.

The performance of English 14 year-olds in science has been consistently high since 1995.  (See faq re TIMSS for detailed analysis)

UNICEF 2013

UNICEF ranked the UK for, among other things, “educational welfare” (see UNICEF faq for information).

OECD Adult Skills Survey 2013

The first OECD Adult Skills Survey 2013 placed England/Northern Ireland at 18th out of 20 countries. The USA was bottom (See FAQ How did England do in the 2013 Adult Skills Survey for data).

Updated 8 December 2013

How are UK pupils measured against children in other countries?

Three major surveys which measure pupil achievement worldwide are TIMSS, PIRLS and PISA.

TIMSS: Trends in International Mathematics and Science Study, first conducted in 1995, reports every four years on the maths and science achievement of fourth grade (year 5, age 10) and eighth grade (year 9, age 14) pupils worldwide. For 2007 and 2011 results see faq about whether the UK is “plummeting” down league tables.

PIRLS: Progress in International Reading Literacy Study, first conducted in 2001, reports every five years on the reading achievement of fourth grade (year 5) pupils worldwide.  PIRLS targets primary school pupils and assesses the reading skills needed to make the transition to “reading to learn.”  For 2011 results see faq about whether the UK is “plummeting” down league tables.

PISA:  Programme for International Student Assessment (PISA), run by the Organisation for Economic Co-operation and Development (OECD), measures the knowledge and skills of 15 year-olds in reading, maths and science.  PISA tests, first conducted in 2000, are taken every three years.  Each PISA survey has a particular focus: in 2009 it was reading, in 2006 science.  The focus of the 2012 PISA tests was maths.  In the UK only the results for 2006 and 2009 are valid.  OECD found the response rate for 2003 was too small and, on checking the 2000 results, which had previously been published, OECD found that these, too, were flawed.  OECD has warned, therefore, that the 2000 PISA results for the UK should not be used for comparison.  The government has ignored this warning and continues to compare the 2000 figures with those of 2009 to show English state education in a negative light, despite the fact that UK pupils achieved the OECD average in reading and maths, and were above average for science.  In late 2012 the UK Statistics Authority expressed “concern” about the Government’s use of PISA figures.  For further details see the faq about whether the UK is “plummeting” down league tables.

Both PISA and PIRLS focus on an expanded idea of reading, ie “reading literacy” rather than simply “reading”.  Both surveys regard reading “as an interactive, constructive process and emphasise the importance of students’ ability to reflect on reading and to use reading for different purposes”.  PISA and PIRLS do not just test decoding in the way that the Year One Phonics Screening test does.

Updated 1 January 2013

 

Narrowing the Gaps

How successful have recent policy initiatives been in closing the social class gap?

According to this RSA report, The Social Class Gap for Educational Achievement: A review of the literature, initiatives over the last 20 years have failed to make a significant impact on the social class gap, and more innovative thinking is needed about how schools work with children from working class families and how the curriculum is organised.

School performance

How can school governors help in holding schools to account?

Background: The accountability system in England (Ofsted, league tables) exerts a “tight grip” on schools. This can focus attention on maximising results rather than on the needs of learners.

Two secondary school governors*, Chris Williamson and Jo Field, believe a good school is more than just high results. Parents want their children to become “confident, well-rounded and employable.”

Governors are essential in holding school leaders to account for the quality of all-round education. But there’s a fine line between offering support and holding school leaders to account, and intruding in ways that make it more difficult for them to do their job.

Williamson and Field said school governors should:

1 Work in partnership with school leaders to develop a strategic plan;

2 Analyse regularly-received reports;

3 Ask challenging questions;

4 Monitor all-round performance not just test results;

5 Appoint the head and carry out reviews of the head’s performance;

6 Get an accurate picture of school life by visiting regularly;

7 Act as a “critical friend”;

8 Monitor the “social, moral, spiritual and cultural development” of pupils.

The future will see greater collaboration between schools to drive school improvement.

Chris Williamson and Jo Field are both governors of the Howard of Effingham Secondary School, Leatherhead, Surrey. Their advice was published in the Wellcome Trust report, Effects from accountabilities (2013).

More information about the role of school governors is here.

 

Is Ofsted friend or foe? How did an ex-Chief Inspector of Schools answer this question in 2013?

Sir Mike Tomlinson, ex-Chief Inspector of Schools, writing particularly about science teaching, said Ofsted had improved the quality of teaching in science.

However, he said Ofsted had also had a negative impact which resulted in teaching to the test, over-simplified use of data, and the stifling of innovation.

Sir Mike concluded that the “weight of all the accountability measures needs to be reduced and test and examination requirements overhauled”. Ofsted should rely less on test data and more on direct observation.

This change would give teachers the room and the confidence to innovate and develop good teaching practices.

Note: the Academies Commission (2013) also found that another accountability measure, league tables, stifled innovation.

A longer summary of Sir Mike’s article is here.

*Sir Mike Tomlinson’s essay, “Inspection: friend or foe?” is published in the Wellcome report Effects from Accountabilities (2013) available here.

 

What are the key factors of a school accountability system? The OECD lists ten.

Andreas Schleicher, OECD, listed ten key findings in the Wellcome Trust’s report on accountability (2013).

Schleicher’s Introduction

Autonomy and accountability go together. School systems which allow schools a high degree of autonomy in allocating resources, deciding what’s taught and how it’s assessed, tend to do better than school systems which give little autonomy to schools. But this autonomy needs to be tied to accountability.

Pupils tend to do better overall in countries which use external, standards-based exams. However, there isn’t a clear-cut relationship between pupil performance and using standardized tests. The gap between advantaged and disadvantaged pupils tends to be lower in countries which use external exams. (This isn’t the case in the UK, which has one of the largest gaps between advantaged and disadvantaged pupils.)

Schools which compete for students tend to raise performance. However, this is often a by-product of pupils’ socio-economic status. Parents with a higher socio-economic status are more likely to choose schools with high academic results. Where exam results are published, this data can influence choice. But published statistics don’t identify what factors influence performance. Such data can, therefore, give inadequate help to politicians and educators in planning and running effective assessment and accountability systems.

The OECD Review of Evaluation and Assessment Frameworks for Improving School Outcomes* lists 10 key findings:

1 The pupil must be at the centre of a framework for evaluation and accountability. There needs to be coherence between pupil assessment, teacher and school leader appraisal, judging schools and evaluating the school system as a whole. This coherent framework can then feed back into classroom practice.

2 Planning and intervention should be informed by a “balance of components” such as pupil outcomes, the performance of the entire school system and contextual information.

3 There should be agreed general principles for school evaluation, pupils’ formative assessment and so on. But there should be sufficient flexibility to allow for local variations.

4 The non-public sector should be integrated into the accountability system. Some countries require private schools to comply with the national framework. Or there could be “protocol agreements” which lay down general principles which private schools are encouraged to follow.

5 There should be a balance struck between evaluation and accountability. The relationship between assessment designed to inform teaching (formative assessment) and “criterion-based summative assessment” needs to be managed.

6 There needs to be sufficient communication between different parts of the accountability framework so that evaluation feeds back into effective practice.

7 Reputable and authoritative agencies should provide advice on implementing evaluation. These agencies should encourage innovation based on research.

8 Priority should be given to developing and sustaining an effective accountability framework. This takes considerable time and resources.

9 Emphasis should be placed on how evaluation and accountability can improve teaching practice.

10 The purpose behind evaluation and its results must be communicated clearly. A long-term vision is essential if stakeholders and society as a whole are to understand the reasons behind the evaluation framework and its role in national strategy.

The essential focus of any accountability system is the pupil. This should never be forgotten. There are huge challenges in building effective evaluation.  It’s essential that any accountability system does not have “unintended negative consequences” which undermine the quality of learning.

(The OECD warned in its Economic Survey UK 2011 that there was too much emphasis on raw test results in England. This had negative effects such as teaching to the test, “gaming”, neglecting other essential skills and producing incentives for schools to discourage applications from parents of children likely to reduce a school’s exam performance.  But the consultation document on secondary school accountability published in 2013, no longer available on the DfE website, promised even more emphasis on exam results.  It remains to be seen whether the proposed accountability system will be changed after the consultation results are analysed.)

(Note: the words in brackets are those of the author not the OECD.)

2 May 2013

 

 

Is it true that two thirds of businesses in 2012 complained about school leavers’ basic literacy, numeracy and computer skills?

“Last year, the CBI reported that two thirds of businesses were complaining that too many school leavers were struggling with basic literacy and numeracy and were unable to use a computer properly. Does my right hon. Friend agree that it is unacceptable to ask our employers to set up remedial classes in these most core basic skills?” said Jonathan Lord, MP for Woking, on 22 April 2013, at Education Questions.

But are his figures correct?

The 2012 CBI report found that 35% of businesses expressed dissatisfaction with the standard of literacy among school and college leavers, and 30% were not satisfied with numeracy. The CBI gave no data on dissatisfaction with school leavers’ IT skills, although it did find that 66% of employers were not satisfied with the IT skills of their existing employees.

CONCLUSION: It’s inaccurate to say that two-thirds (66%) of employers are not satisfied with the literacy, numeracy and IT skills of the school leavers they employ.

But what about the number of employers who set up remedial classes for school and college leavers? Do 66% of them provide remedial classes?

The CBI said that 42% of employers reported that they organized some kind of remedial training for some school/college leavers. This is broken down as follows (some employers provided remedial training in more than one area):

18% of employers provided remedial training in numeracy.

20% provided training in literacy.

23% provided remedial training in IT.

What we don’t know is what form this remedial training took. Was it on-the-job or day-release at a college? Was it really “remedial”, or was it because the employer wanted to improve a school leaver’s skills, not just correct a deficit? Was it part of progression, such as raising a school leaver’s skills from Level 1 to Level 2?

CONCLUSION: It’s misleading to imply that two-thirds of employers provide remedial classes.

26 April 2013

 

Would Performance-Related Pay improve educational outcomes?

There is little evidence that Performance-Related Pay (PRP) improves the quality of teaching.

The Sutton Trust found that relating pay to performance had “Low or no impact for moderate cost, based on very limited evidence.” It concluded that “investing in performance pay would not appear to be a good investment without further study” and “Performance pay has been tried on a number of occasions, however the evidence of impact on student learning does not support the approach.”

The Organisation for Economic Cooperation and Development (OECD) studied PRP after the 2009 PISA tests. It concluded:

“Performance-based pay is worth considering in some contexts; but making it work well and sustainably is a formidable challenge.”

The Project on Incentives in Teaching (POINT) researched the use of bonuses by the Metropolitan Nashville School System over three years. Researchers found the use of bonuses linked to performance did not contribute to improved pupil outcomes, and the POINT incentives had little effect on teachers’ classroom practice. The researchers pointed out that they studied only one particular model of performance pay; other systems, such as linking incentives with professional development, might work, but this had not been tested.

Dan Pink, author of Drive: The Surprising Truth About What Motivates Us, found that encouraging workers or learners with external rewards like money was a mistake. What really motivates people is:

(a) Autonomy – self-direction is more motivating than top-down direction;

(b) Purpose – people like to feel they are making a contribution;

(c) Challenge – people are motivated by being able to improve their own competence.

Pink’s philosophy can be summarized as the APC of motivation.

CONCLUSION: There is little evidence that PRP would improve educational outcomes and may be counter-productive if it leads to a narrow focus on measurable goals like test results.

See this thread for more details.

Updated 26 March 2014

Has there been grade inflation in GCSEs and A levels?

The evidence is inconclusive.

The BBC Radio 4 programme, More or Less, (21 August 2009, about 20 minutes in) found there had been grade inflation at A level of two grades.

A year later, in August 2010, FullFact investigated allegations of grade inflation. It concluded “on the testing question of grade inflation in the UK, it seems difficult to offer any kind of conclusive answer.”

In 2011, the Organisation for Economic Cooperation and Development (OECD)* said the apparent rise in GCSE grades in England was not matched by a similar rise in the Programme for International Student Assessment (PISA) scores which had remained static.

The media regularly reports that there has been grade inflation, but this is not always upheld by evidence. For example, FullFact cast doubts on claims in a Daily Telegraph article that maths standards had declined. The academic whose research had been used by the Daily Telegraph admitted that the Trends in Maths and Science Survey (TIMSS) actually showed a different trend – that English pupils’ maths ability was improving. TIMSS is a smaller survey than PISA; nevertheless it showed that English pupils were among the top performers globally.

Witnesses to the Education Select Committee (January 2012) gave mixed answers to the question of what caused rising GCSE results and whether this increase was actually grade inflation. One academic claimed there was “not a great deal of evidence for grade inflation” – there had been “interesting” research but “all had methodological issues”. The cause of rising results was explained variously as “teaching to the test”, “more efficient teaching”, “more appropriate provision in schools”, pupils choosing courses they were more motivated to study and schools concentrating on pupils at the C/D borderline. One witness said that outcomes would go up over time when exams test “a pre-defined set amount of subject matter” and where question papers and mark schemes were available. He thought the term “grade inflation” when applied to these rising outcomes was a “kind of a negative terminology”.

Written evidence to the Education Select Committee (28 February 2012) said that “English public examinations are respected internationally and emulated in many countries…there is a great deal of public confidence in the examination system in England. Nonetheless, recent press reports will reduce confidence levels, at least temporarily.” The authors said that falling exam standards were a “prevailing media narrative”. Although the authors did not say so, this perception has been fuelled by the present Government. One recent example is Nick Gibb, former Minister for Schools, who told the BBC that “There is evidence from academic researchers that show that there has been grade inflation over the decades.” But evidence to the Education Select Committee did not back this up.

CONCLUSION: there is no firm evidence to support fears about grade inflation.

*OECD Economic Surveys 2011, not available freely on the internet, but details of how to obtain a copy are here.

UPDATE 24 April 2013

The Oxford University Centre for Educational Assessment (OUCEA) published a report summarising evidence about examination systems.  It found that grade inflation at GCSE had not been established.


 

Do academies get better results, or improve more quickly, than other state schools?

At the end of January the Department for Education published a massive amount of information: 208 items of data on GCSE results for every school in England. This makes it possible for the first time to carry out a full analysis of how academies and non-academies compare. And, despite all the claims of government supporters, there is no evidence of better GCSE performance in academies.

The first thing to watch out for is the practice of policy-making by anecdote. Academy supporters have a tendency to focus on schools like Mossbourne and Burlington Dane, or the ARK chain. These are the best performing academies but the fact that these have done well does not mean academies as a whole have done well (though it would be good to study and learn from these schools, as from high-performing non-academies).

The government tends to quote growth figures for academies, which generally look impressive. However, the analysis below shows two faults in this. First, schools in disadvantaged areas have generally done well, whether or not they are academies: when academies are compared to similar schools, there is no clear pattern of extra growth. Second, when GCSE equivalents (like BTECs) are removed, the academy growth is generally less than in non-academies.

These are the key Local Schools Network posts analysing the data released by the DfE on the 2011 GCSE results, and comparing the performance of academies and non-academies:

Did academies grow more in 2011? Not when compared to similar schools

The main DfE claim about growth from 2010 to 2011 in the GCSE results of academies, that it was twice that of non-academies, does not stack up when they are compared to similar schools.

Sir Michael Wilshaw is right: Most outstanding schools are not academies

The evidence shows Sir Michael is right on two counts: most outstanding schools are not academies and many schools in disadvantaged areas are doing amazing work.

DfE Press Release condemns academies performance

The DfE criticism of schools where few students take academic subjects is, above all, a condemnation of academies.

DfE data shows success of local schools

The last three years have seen a transformation in the performance of schools in the most disadvantaged areas, with the percentage achieving 5 GCSEs including English and Maths rising from 35% to 50%.

Established academies: still no evidence of better performance

The DfE argues that a fairer comparison would be with academies that are at least 5 years old. The evidence shows that these also perform no better than comparable non-academies.

Academy chains: No case for expansion

The record of the academy chains is poor and gives no basis for expansion.

“Failing schools”: Do academies do better?

The answer is no: even within this group, schools fare better as LA (local authority) schools than as academies.

Students with low prior achievement: Inner city London comprehensives do best

Nationally only 6.5% of students with ‘low prior achievement’ get 5 A*-Cs including English and Maths. Inner London schools do over twice as well, with Hackney achieving 22% and Tower Hamlets 23%.

Academies: The evidence of underperformance

When compared to comparable schools (in terms of levels of disadvantage), the data show academies under-perform on a range of criteria.

Post-script

The above analysis was used as the basis of an article in the Observer on 26th February. Crucially, it states that “a DfE spokesman did not deny the accuracy of the statistics”.

Sources outside Local Schools Network

Liberal Conspiracy: Why more academies will make education worse

Anti-Academies Alliance: GCSEs, academies and disadvantage: a higher standard of government misinformation

Note: This is a reference page, and further links will be added to make this an easy-to-use link to all the 2011 GCSE data analysis. The date of publication is updated to keep it prominent.

Data Notes: The academies figure, throughout these posts, refers to the category of sponsor-led academies, of which there are 249. It does not include the ‘converter academies’, of which there were just 25 at this point. Non-academies include those classified as community, foundation, CTCs or voluntary aided schools, 2,681 in total. Special schools are not included.

Data sources: The DfE data release can be obtained here:

http://www.guardian.co.uk/news/datablog/2012/jan/26/secondary-school-league-tables-data?INTCMP=SRCH#data

Some people have found it difficult to download this file. If you have difficulty, feel free to email me on henry@happy.co.uk and I will send you a copy of the file. The above analysis was generally done in Excel with Pivot tables.
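For readers who would rather script the comparison than use Excel, the same kind of pivot-table analysis could be sketched in Python with pandas. The column names below (school_type, fsm_band, gcse_5ac_em) and the figures are illustrative stand-ins, not the actual headings or values in the DfE release:

```python
# Sketch: compare mean GCSE performance of sponsor-led academies and
# non-academies within bands of disadvantage (free school meal eligibility),
# mimicking an Excel pivot table. All column names and numbers are
# hypothetical placeholders for fields in the DfE data release.
import pandas as pd

schools = pd.DataFrame({
    "school_type": ["academy", "non-academy", "academy", "non-academy",
                    "academy", "non-academy"],
    "fsm_band":    ["high", "high", "medium", "medium", "low", "low"],
    "gcse_5ac_em": [42.0, 45.0, 51.0, 53.0, 60.0, 61.0],  # % 5 A*-C incl. E&M
})

# Pivot: mean result per school type within each disadvantage band, so that
# academies are compared only with similar (equally disadvantaged) schools.
pivot = schools.pivot_table(index="fsm_band", columns="school_type",
                            values="gcse_5ac_em", aggfunc="mean")
print(pivot)
```

The point of grouping by a disadvantage band before comparing is the one made above: a raw academy vs non-academy average conflates school type with intake.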


What makes a successful school?

OECD found that the best-performing school systems tend to be those that are most equitable – they don’t segregate children academically or by virtue of where they live. The first question, therefore, is how equitable the English education system is and whether government policies will make it more or less equitable. OECD (1) warns that the free school/academy programme needs careful monitoring if it is not to impact negatively on disadvantaged children.

The second question is what makes an effective school. But before that there needs to be a consensus about how to measure effective schools. OECD (1) has warned that there is too much emphasis on raw examination results in English education and praised the contextual value added measure (CVA) as a step in the right direction. However, the Government has now abolished CVA and pushes a “no excuses” mantra which fails to take into account the context in which a school operates.

Professor Pamela Sammons, writing in late 2007, identified processes found in effective schools. These were effective leadership, effective teaching, focus on learning, a positive school culture, high and appropriate expectations for pupils and staff, emphasising responsibilities and rights, monitoring progress at all levels, developing staff skills and involving parents in productive and appropriate ways. More recent research from Harvard identified five factors: giving pointers to pupils more frequently, using assessment to plan instruction, meeting pupils in small tutor groups more often to discuss their progress, teaching for more days and longer hours than other New York schools, and having high expectations. The Harvard report, however, came with warnings: the researchers did not look at other possible factors, such as school leadership, and they focused only on a small number of charter schools.

The third question is what is effective teaching. Muijs and Reynolds (2) noted:

“In the absence of a substantial body of knowledge about effective practices from the research community, much use was made of definitions of what ‘effective teachers’ did as judged by Ofsted, the General Teaching Council and/or the Department for Children, Schools and Families, yet these judgements have not always been research based and may be open to political manipulation.”

Instead of focussing on a one-size-fits-all solution to raising standards (ie academy status) it would be better if there were a full, unbiased and rigorous assessment of the research into what makes a school effective and what comprises effective classroom practice. The Government says it wants its policies to be evidence-based. It should, therefore, look at all the evidence and not just that which supports its narrow, preconceived ideas.

(1) OECD Economic Survey UK 2011, not freely available on the internet but details of how to obtain a copy are here.

(2) p6, Muijs, D, and Reynolds, D, “Effective Teaching: Evidence and Practice”, 3rd Edition, 2011, London, Sage Publications

 

What about maths? How many children leave school innumerate?

The threshold level for functional numeracy is Entry Level 3 (which is actually lower than Level One). Anyone who reaches Entry Level 3 in mathematics has just enough numerical skills to cope with everyday life. Level One is achieved by any school pupil who gains a GCSE grade G in mathematics. So how many school leavers failed to gain a GCSE grade G in Maths in 2012? The answer is 1.8%.  And some of these could still be at Entry Level 3 – the threshold level for functional numeracy.

But there will be some 16 year-olds who were not entered for GCSE Mathematics. According to Skills for Life, “51,000 pupils (around 8%) left school without Level 1 in Mathematics in 2006/7.”  But a later Skills for Life survey (2011) found that 27% of 16-18 year-olds were functionally innumerate in numeracy tests taken after leaving school.  Skills for Life 2011 expressed concern about possible loss of skill between gaining GCSE and being tested for numeracy at a later date.  No reasons were given, but they could include such things as GCSE Maths covering the full range of mathematics (geometry and algebra as well as numeracy) or pupils having been drilled to pass GCSE while lacking deep knowledge*.

A Sheffield University study in 2010 said about 22% of 16-19 year-olds had insufficient numeracy skills for full participation in today’s society.  However, it went on to say that most young people had functional skills and “those with the highest skills are up with the best in the world.”

CONCLUSION: The figures for functional innumeracy in 16-19 year-olds range from 8% (school leavers, published 2007) to 27% (16-18 year-olds taking numeracy tests after leaving school, published 2011).  However, these figures are contradicted by the low proportion (1.8%) of GCSE Maths entrants who failed to achieve a GCSE Grade G.  Of course, not all 16 year-olds would have been entered for GCSE, for reasons including illness, insufficient grasp of English, gaps in school attendance and disaffection*.  Nevertheless, the figures for functional innumeracy are a cause for concern.  Skills for Life found that those who stay in school beyond 16 are less likely to be functionally illiterate or innumerate.  The participation age for education and training will soon be 18.  It’s to be hoped that there will be a corresponding fall in the number of school leavers with poor numeracy skills.

That said, it would be wise to heed the warning given at the end of the Sheffield research.  It said comments about functional illiteracy and innumeracy should be made with “due humility” because judgements about necessary skills were made by “experts” and not by people themselves.  The latter may consider that they can function quite well in society even though “experts” think they cannot.

 

*NOTE: these sentences are opinion.  They are not backed up by any research.  The apparent loss of skill between GCSE and later tests, and the reasons why some pupils don’t take GCSE would be fruitful areas for investigation.

UPDATED 7 June 2013

Are 20% of school leavers illiterate?

No.  Illiteracy means being unable to read and write.  Illiteracy is often confused with “functional illiteracy”.  Functional illiteracy means being able to read and write but below the level of competence needed for everyday living.  Someone who doesn’t reach this basic level, called the threshold level, is functionally illiterate.  According to the Government the threshold level is Level One.  Any school leaver who gains a GCSE G or above in English has reached Level One.  Pupils who gain GCSE C or above in English have reached Level Two.
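The grade-to-level mapping described above can be sketched as a small function. This is an illustrative simplification only: the grade sets below follow the rule of thumb in the paragraph (G or above reaches Level One, C or above reaches Level Two), not any official specification.

```python
# Sketch of the GCSE-grade-to-literacy-level mapping described above.
# The grade sets are an illustrative simplification, not official rules.
def literacy_level(gcse_english_grade: str) -> str:
    """Map a GCSE English grade (A*-G, or U) to a functional-literacy level."""
    level_two = {"A*", "A", "B", "C"}      # C or above: Level Two
    level_one = {"D", "E", "F", "G"}       # G or above: at least Level One
    if gcse_english_grade in level_two:
        return "Level Two"                 # above the threshold
    if gcse_english_grade in level_one:
        return "Level One"                 # the threshold level itself
    return "Below Level One"               # e.g. a U grade: functionally illiterate

print(literacy_level("C"))  # Level Two
print(literacy_level("G"))  # Level One
print(literacy_level("U"))  # Below Level One
```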

So how many pupils did not gain at least a GCSE G in English in Summer 2012?  Only 0.7% did not gain this basic grade.  So only 0.7% of pupils entered for GCSE English in Summer 2012 failed to reach Level One*.  In the November 2012 resits, 0.2% failed to reach Level One*.  In 2011, 1.3% of GCSE English candidates were awarded a U.  The number of pupils failing to reach Level One in GCSE English in 2012 is therefore lower than in 2011.

There will be some 16 year-olds who were not entered for GCSE English at all. According to the Government’s Skills for Life Survey in 2011, 15% of 16-18 year-olds didn’t reach Level One.  This is a far higher figure than the percentage that didn’t reach Level One in GCSE English exams in 2011 (1.3%).  However, there may be good reasons why pupils were not entered for GCSE.  Reasons can include severe educational needs, profound disability, illness, missing months of schooling for whatever reason, inability to speak English, and so on.

Another figure for functional illiteracy comes from the Programme for International Student Assessment results for the UK in 2009.  According to the OECD, which administers the PISA tests, “Students who do not attain the PISA baseline proficiency Level 2 in reading lack the essential skills needed to participate effectively and productively in society.”  So PISA Level 2 is the same as the Government’s Level 1 – the threshold of functional literacy.

How many UK pupils did not reach PISA Level 2 in reading in 2009?  The answer is 18%.  Note: this figure is for “functional illiteracy”, NOT absolute illiteracy.  The figure for the UK is slightly less than the OECD average of 19%.  Other countries with the same level of functional illiteracy among 15 year-olds as measured in PISA tests are Ireland, Sweden, USA, Germany and France.

A Sheffield University report** came up with a figure of 17% in 2010: “In particular, about 17% of young people age 16-19 have poorer literacy…than is needed for full participation in today’s society.”  But the Sheffield academics gave a warning: literacy has been defined by “experts” based on what they think other people should be able to do rather than on surveys about what people actually need to be able to do for their own purposes. The report ends:

“Meanwhile, all ascriptions of poor literacy and numeracy, whether to 13- to 19-year-olds or to adults, should be made with due humility – those who have the power to decide what other people should be able to do have imposed their views on those who do not.”

So illiteracy, functional illiteracy and poor literacy and numeracy are elastic terms that can vary according to the person doing the judging.  Based on GCSE results, the number of school leavers who are functionally illiterate is less than 1%.  But this of course doesn’t include pupils who didn’t take GCSE.  The Department for Business, Innovation and Skills (BIS) Skills for Life Survey 2011 gave a higher figure of 15%.  Sheffield University said 17% but warned that definitions of “functional illiteracy” should be made with humility – those who are said to be “functionally illiterate” may not think themselves to be so.  PISA put the figure higher still, at 18%.

Conclusion: claims that 20% or more of children leave school unable to read and write are false.  Being “unable to read and write”, ie being “illiterate”, is not the same as “functional illiteracy”, ie the ability to read and write but not to a sufficiently high standard to cope with everyday life.  The figures for functional illiteracy among school leavers range from less than 1% to 18%.  Confusing the two terms “illiteracy” and “functional illiteracy” is misleading.

*GCSE results for English/English Language for 2012 and other years downloadable here.

**Sheffield University 2010 report, “THE LEVELS OF ATTAINMENT IN LITERACY AND NUMERACY OF 13- TO 19-YEAR-OLDS IN ENGLAND, 1948–2009”, downloadable here

Updated 16 February 2013

What effect, if any, does disadvantage have on school children?

All pupils, whether advantaged or disadvantaged, tend to do worse in schools with a high concentration of disadvantaged children.  In a school with a majority of advantaged children, all children do better.  It doesn’t matter whether an individual child is advantaged or disadvantaged – all are more likely to gain higher grades in a school with a majority of advantaged children.

The ideal school system, therefore, would be one where every school had a majority of advantaged children.  But this is not always possible and is likely to be unpopular with parents especially if children have to be bussed around.   However, the Government will need to address this problem of disadvantaged schools and not just judge them as “failing” because their results are not as high as those of more advantaged schools.  The Pupil Premium which gives extra money to schools for each pupil eligible for free school meals is a step in the right direction.

So what, if anything, can be done to help all pupils in schools where there is a large number of disadvantaged children?

Firstly, resources need to be directed where they are most needed: to disadvantaged children.  The Pupil Premium is one way of addressing this, as it might encourage advantaged schools to take in more disadvantaged children.  Another way is to attract the most talented teachers to disadvantaged schools.  The Labour government tried to do this by offering “golden handcuffs” to young teachers with high grades who worked in disadvantaged schools, although this would not have attracted the most experienced teachers.  In any case, the Coalition government has abolished the scheme.  High quality teachers raise test scores, so the government should consider measures to attract and retain the best teachers.

Another way in which all pupils in disadvantaged schools could be helped is to recognise that being in a disadvantaged school impacts on results.  However, when an examination board, AQA, published a discussion paper suggesting a way in which A level results could be weighted according to the type of school, the suggestion drew howls of protest from advantaged schools (particularly independent ones), from politicians including Mr Gove, the Secretary of State for Education, and from large sections of the media.


Teachers

What is a qualified teacher?

According to the Department for Education (April 2012) a qualified teacher is one who has been awarded qualified teacher status (QTS) either by successfully completing a course of initial teacher training (ITT) or through other approved routes.

Teachers comprise the following:

a. Teachers with QTS or with the equivalent gained elsewhere in the European Economic Area (EEA);

b. Teachers without QTS, but with a professional qualification gained outside the EEA, who have been in service for less than four years (beyond which QTS gained in the UK is required);

c. Instructors without QTS, but with special qualifications in, or experience of, a particular subject. These can be employed only for so long as a qualified teacher is unavailable.

An unqualified teacher is either a trainee working towards QTS; an overseas trained teacher who has not exceeded the four years they are allowed to teach without QTS; or an instructor with a particular skill, who can be employed for so long as a qualified teacher is not available.

 

How can the Government attract talented people into the teaching profession?

Attracting and retaining high quality teachers starts with training.  In Finland, the top-performing European country in PISA tests, teachers are trained to a high level in both subject content and teaching theory.  The English Government says it wants high quality graduates to enter training, then undermines its own rhetoric by declaring that teaching is a “craft” and that free schools need not employ trained teachers.  There also needs to be high quality in-service training and professional development.

Other measures could include improving pay and working conditions.  Starting salaries for teachers are relatively high in England, but pay at the top of the scale is low, which may discourage experienced teachers from remaining in the profession.  Attacks on teacher pensions and a rise in the number of “zero-hours” contracts are not likely to encourage graduates to become teachers when they can earn more, and have a better work/life balance, elsewhere.