Earlier this year, Professor Robin Alexander, perhaps our foremost authority on curriculum reform, provided a definitive analysis of how international comparisons between schools could be put to much better use than they currently are. In particular, Alexander critiqued the ways in which politicians, policy “wonks” and other interested parties abuse these international comparisons to suit their own agendas.
His full article is linked at the end of this post. It needs to be read in full if you are going to absorb its full import, but a few highlights are worth picking out now, because I firmly believe this analysis should change the way all of us think about international comparisons between schools.
Alexander’s paper examines why we compare school systems in the first place. He notes how such comparisons have become ‘political and media obsessions, generating celebration in some quarters and panic and blame in others.’ He then shows how this obsession has a long history and provides this gem of a quote from a British educationalist, Michael Sadler, writing over a century ago: ‘In studying foreign systems of education we should not forget that the things outside the schools matter even more than the things inside the schools, and govern and interpret the things inside… No other nation, by imitating a little bit of German organisation, can thus hope to achieve a true reproduction of the spirit of German institutions… All good and true education is an expression of national life and character…’
Using Sadler’s views as a starting point, Alexander queries the approach taken by many educationalists and politicians who have sought to cherry-pick ‘little bits’ of what they perceive to be best practice from countries all over the globe. In particular, Alexander is troubled by the ways in which PISA and TIMSS data – perhaps the best-known ways of comparing school systems – are used by the media and politicians to paint our schools in either a positive or, as is increasingly the case, a negative light. For Alexander it is “ludicrous” to infer that students are failing because they have done badly on a brief PISA test, and yet, as he points out, media and politicians in many countries, not just our own, do exactly this.
Alexander’s paper explores and critiques the methodologies of various international comparisons, unmasking their strengths and weaknesses. He finds the categorisations used by the National Research Council (NRC) of the US National Academies particularly useful. The NRC identifies three main types of international comparison:
Type I = large-scale international student achievement studies such as TIMSS and PISA, which are typically “quantitative”.
Type II = desk-based extrapolations of existing international data in order to propose policy, such as the McKinsey reports with which Michael Barber was closely involved.
Type III = other, more descriptive and “qualitative” accounts of different education systems, which may involve ethnographic studies of schools across different countries, such as Patricia Broadfoot’s exploration of being a learner in England, France and Denmark (Osborn et al., 2003).
The NRC noted that while the vast majority of studies are Type III, the research that receives the most funding is Type I and Type II, with reports like McKinsey’s receiving millions. But as Alexander says: “when Type II comparative studies are linked to the imperatives of policy they can be highly selective in their use of evidence.” Alexander is particularly withering about McKinsey, pointing out that its conclusions are, first, very obvious (such as that good teaching makes a difference) and, second, offer an “impoverished view of teaching and learning, a thin evidence base, implausible arguments about the mechanisms and processes of school improvement, technocratic and authoritarian language and a pervasive neglect of culture and political context.”
In his conclusion, Alexander says: “a world class educational vision needs to be matched by a world class approach to defining and assessing educational performance, whether of students, schools or nations. Is our account of educational performance consistent with what education should be about? I don’t think so.”
Alexander is not critical of PISA and TIMSS per se – he acknowledges that those involved in this research are fully aware of its limitations – what he does have a problem with is the way such research has been so narrowly interpreted. He makes a strong case for us to look more closely at the Type III reports. In this research you are much more likely to find teaching and learning explained, analysed and explored in context. Such research often gives a much richer and more nuanced picture of what is happening in schools in other countries and, as a result, can be a better place to start when considering school improvement. And yet governments are not funding such reports.
He ends his paper with a plea: “we must replace the rampantly supremacist or narrowly nationalist view of education by a vision which is more in tune with the true complexities of globalisation, with the perilous condition of our world and with the needs of the world’s children.”
For me, Alexander’s report is a “game-changer” because it conclusively shows how these international comparisons are increasingly abused by politicians and media in order to pursue narrow and dogmatic agendas.
The full paper can be read here:
Robin Alexander on making better use of international comparisons between schools
Comments
So, one law for the rest of us and quite another for the junketing education establishment, eh?
Priceless.
When I was at the HMC last week there was a strong sense - explicitly confirmed by Graham Stuart, Chair of the Education Select Committee, who spoke in another session - that there's far too much ill thought out change going on in relation to education. And now Boris Johnson has jumped on the bandwagon...
There is indeed, as Melissa said, "too much ill thought out change going on in relation to education" - rushed and careless, relying on cherry-picked bits of evidence which are misrepresented or quoted out of context by politicians eager for glory who appear to have half an eye on their future careers.
The last Government launched the London Challenge in 2003. There were several elements to this but the three most important were:
Sponsored Academies
The use of outstanding schools to mentor others
A focus on improving the quality of teaching – especially through Teach First
Each of these strands has had a profound effect on performance and on my thinking. In each case this Government has learnt the lessons and is spreading the benefits of these reforms across the rest of the country.
http://www.education.gov.uk/inthenews/speeches/a00210308/michael-gove-at...
The self-styled "school reformers" would have us believe that a hurricane wiped away all the rubbish public schools and teachers unions and, in their wake, brand new shiny high performing charters sprang up and, virtually overnight, transformed education in New Orleans.
This chart (educatenow.net/wp-content/uploads/2011/07/2011_Voucher_vs_RSD_Performance_by_School.pdf) shows the proportion of students in each charter school and voucher school who were rated basic or above on state tests in reading and maths.
The statewide average (in a low-performing state, Louisiana) of students who reached basic or above was 75%.
Only six of the charter-voucher schools met or exceeded this average (and two more came close).
The average proportion of students in the New Orleans Recovery School District that reached basic or above was 49%.
Something like 80% of the poor black population never returned after Katrina, so the "reformers" can't claim that the charter effect performed a miracle in raising attainment for poor black children.
It was Gove, then, who said he had been inspired by "Singapore, Finland and New Orleans". Gove's praise of schools in New Orleans doesn't stand up to the evidence, as this post revealed:
http://www.localschoolsnetwork.org.uk/2012/06/gove-looks-to-new-orleans-...
Gove went on to praise the London Challenge, which a recent report has shown was more successful than the academy programme in raising achievement. That, however, is not what Gove said: he claimed sponsoring academies was the main strategy in the London Challenge, when it was actually the support given to weaker schools that raised standards. The report into the City Challenge (called the London Challenge in the capital) concluded:
"...pupil attainment in underperforming schools supported by City Challenge improved “significantly more” than in other weak schools, including sponsored academies."
But Gove used the success of City Challenge to plug academies and academy conversion.
http://www.localschoolsnetwork.org.uk/2012/09/city-challenge-was-more-su...
I experienced a little of this when my school was selected (at random) by PISA. The pupils selected (based on birth dates) from Year 11 were really fed up. The tests were taken very close to GCSE exams and I had several outraged letters from parents, most of very bright children, about this additional burden at what, as I recall, was very short notice.
The key point was the total lack of motivation on the part of the children for the tests compared with their attitude to GCSE. It's difficult to believe this didn't affect their performance.