10 Things You Need to Know about International Assessments

By James Harvey
School Myths and the Evidence that Blows Them Apart by Melissa Benn and Janet Downs should be required reading for everyone concerned about high-quality education for all. The negative narrative about public (state) schools holds, according to Downs, that “Public education is broken. Only competition, choice, standardization of outcomes and test-based accountability can fix it.”

Precisely that narrative also governs discussion of schools in the United States. It is a story line underpinned by international large-scale assessments (ILSAs) such as PISA, which pretend, in defiance of the scholars who created these assessments, that a single assessment number reveals all you need to know about national school systems. Officials throughout Europe and North and South America are making decisions about their schools on the basis of very questionable ILSA results.

Here are Ten Things You Need to Know about International Assessments. You probably know that they compare apples and oranges. Still, you may be surprised to learn that these assessments were never intended to create football league tables that line up and rank nations against each other. Nor is the “international average” a weighted average representing all the students in participating nations.
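That second point deserves a concrete illustration. The published “international average” is typically an unweighted mean of country means, which is not the same thing as an average taken over all the students those countries contain. The countries and figures in the sketch below are invented purely to show the arithmetic; they are not drawn from any actual PISA table.

```python
# Illustrative only: invented countries and figures, not actual PISA data.

# country: (mean score, number of 15-year-olds represented)
countries = {
    "Smallland": (550, 500_000),
    "Midland": (500, 3_000_000),
    "Bigland": (470, 12_000_000),
}

# An unweighted "international average": every country counts equally,
# however many students it represents.
unweighted = sum(score for score, _ in countries.values()) / len(countries)

# A student-weighted average, i.e. what an average over all the students
# in the participating nations would look like.
students = sum(n for _, n in countries.values())
weighted = sum(score * n for score, n in countries.values()) / students

print(f"Unweighted mean of country means: {unweighted:.1f}")  # 506.7
print(f"Student-weighted mean:            {weighted:.1f}")    # 478.4
# The two diverge whenever large and small systems score differently.
```

The gap between the two figures is the whole point: a headline “average” built country by country tells you nothing about how the typical student across those nations actually performed.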

There’s more in that vein, much more. The “Ten Things . . . ” grew out of an analysis, School Performance in Context: The Iceberg Effect, completed in January by the National Superintendents Roundtable and the Horace Mann League in the United States. The analysis compared the educational systems in nine nations (Canada, China, Finland, France, Germany, Italy, Japan, the United Kingdom, and the United States) on 24 critical indicators divided into six broad dimensions: equity, social stress, support for young families, support for schools, student outcomes, and system outcomes. In many ways, the picture that emerges from the study of schools in the United States and the United Kingdom is similar.

Below is a table summarizing the results by the six dimensions for each of the nine nations. It has been described as a sort of “consumer’s guide” to the quality of national school systems. The figure in each box is the number of points the nation received on that dimension, with a minimum of 8 and a maximum of 40; the higher the number, the better. Blue marks a positive finding, gray is neutral, and nations don’t really want too many negative maroon boxes. What stands out here? Finland is the only one of the nine nations with five blue dimensions. The United States is the only one with three maroon dimensions. And China is the only one of the nine nations for which it is impossible to draw conclusions on three of the broad dimensions.

[Table: summary scores for each of the nine nations on the six dimensions, shaded blue (positive), gray (neutral) or maroon (negative).]
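The report’s exact scoring rubric is not reproduced here. As a rough guide, the figures above (24 indicators spread across six dimensions, with dimension totals running from 8 to 40) suggest that each dimension simply sums four indicators scored from 2 to 10. The sketch below is a hypothetical reconstruction built only from those figures; the actual rubric may well differ.

```python
# Hypothetical reconstruction, inferred only from the figures quoted above
# (24 indicators, six dimensions, totals from 8 to 40); the report's actual
# scoring rubric may well differ.

def dimension_score(indicator_points: list[int]) -> int:
    """Sum four indicator scores, each assumed to run from 2 (worst) to 10 (best)."""
    if len(indicator_points) != 4:
        raise ValueError("each dimension is assumed to contain four indicators")
    if not all(2 <= p <= 10 for p in indicator_points):
        raise ValueError("each indicator score is assumed to lie between 2 and 10")
    return sum(indicator_points)  # total lies between 8 and 40

# Invented example for one nation on one dimension:
print(dimension_score([6, 8, 4, 10]))  # 28 out of a possible 40
```
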
Why should educators in England worry about an analysis completed in the United States? Apart from the fact that the study examines the United Kingdom, the most direct answer comes, surprisingly, from Brazil. Commenting on “Ten Things,” Professor Luiz Carlos de Freitas of the State University of Campinas agrees that the rankings need to be “demystified.” Regarding this North American analysis, he tells his colleagues in Brazil, “Valem para nós.” — “This applies to us too.”

And so it does. It applies to every nation that misguidedly focuses on the numbers generated by a single assessment. Nowhere is the damage wrought by these international large-scale assessments greater than it is in England and the United States.

Comments

Janet Downs
Sun, 15/02/2015 - 09:03

For readers puzzled as to how 'system outcomes' in the UK and US can be in the highest category when economic equality, social stress and, in the case of the US, family support are in the lowest category, I have an explanation.

'System outcomes' measures the historic performance of national systems. This includes such things as:

1. How many years of schooling do adults possess?
2. What proportion have a high school diploma (or equivalent)?
3. How many have a Bachelor's degree?

The UK and US do well on this dimension because they have a strong record on these measures stretching back decades, when many other countries did not. The average US person over 25, for example, has 13 years of formal education.

It should not be assumed, therefore, that because the UK and US have historically high system outcomes, no steps need be taken to address poverty, inequality and social stress.

Thanks to James Harvey for explaining this to me.

John Mountford
Sun, 15/02/2015 - 19:09

Fascinating reading, James. The fact is, there should be a health warning posted with ALL these international large-scale assessments. It could read something along the lines of "Interpreting the Data - Not to be taken as advice for meddling politicians on how to run and reform a national education service."

In a recent blog on this site we touched on this subject. My conclusion was that we can maybe appreciate how compelling it is for 'busy' people to take things at face value when influential commentators (Andreas Schleicher not least among them) interpret the findings of PISA and other tests, but they would be better advised not to.

http://www.localschoolsnetwork.org.uk/2015/02/schleichers-list/

I will repeat here what I said there about the 'apparent' strength of the evidence from international research, "Schleicher is prepared to settle for ‘good enough’ on a lot of counts and in so doing overlooks some key facts, thus bringing into question the validity of his ‘evidence’."

The last word, James, should be with you:

"It applies to every nation that misguidedly focuses on the numbers generated by a single assessment. Nowhere is the damage wrought by these international large scale assessments greater than it is in England and the United States."

James Harvey
Mon, 09/03/2015 - 18:27

Thank you so much for the clarification, Janet, and for your comment, John. I apologize for taking so long to respond, but I was overwhelmed with this, that, and the other since the "Iceberg Effect" report appeared.

As you note in your earlier comment, John, Mr. Schleicher is economical with the truth about his PISA data. Most of the myths he offers are simply straw men, constructed to be knocked down. The most obvious example is his claim that educators believe the disadvantaged are "doomed" to fail in school. I don't know any reputable educator who believes that canard.

The main point, I think, is that Mr. Schleicher has had an easy time convincing people that his data represent comparisons of like populations with like populations. (Of course he has an easy time with that argument; the expectation with which we come to research is that the comparisons are legitimate.) What people in Europe and the United States need to understand is that the PISA data compare results for the entire student population in most western nations with results for the small proportion of Shanghai's 15-year-olds who have survived a savage culling process, one that eliminated somewhere between 27% and 50% of the potential cohort before it ever reached the age of 15.
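To see why testing only the survivors of such a culling flatters the comparison, here is a small, purely illustrative simulation. The score distribution, the one-third cut, and the mechanism (simply dropping the lowest scorers) are all invented for the sake of the arithmetic; they are not the actual Shanghai or PISA figures.

```python
# Illustrative simulation only: invented numbers, not actual PISA or Shanghai data.
import random

random.seed(0)

# A full cohort of 15-year-olds with some spread of achievement.
cohort = [random.gauss(500, 100) for _ in range(100_000)]
full_cohort_mean = sum(cohort) / len(cohort)

# Suppose roughly a third of the cohort never reaches the tested population.
# Here, crudely, the lowest-scoring third is removed; the real exclusion
# mechanisms are different, but the arithmetic point is the same so long as
# the excluded students would, on average, have scored lower.
survivors = sorted(cohort)[len(cohort) // 3:]
tested_mean = sum(survivors) / len(survivors)

print(f"Mean of the whole cohort:     {full_cohort_mean:.0f}")  # about 500
print(f"Mean of the tested survivors: {tested_mean:.0f}")       # noticeably higher
# Comparing the survivors' mean with another nation's full-cohort mean
# is not a like-for-like comparison.
```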

The coalition government in England and the Democratic-Republican consensus in the United States may persuade the English and American people that results in Shanghai are desirable and what western nations should strive for. But before yielding to that impulse, the people of both nations need to understand how those Shanghai results were achieved -- and at what cost.

I appreciate your comments. And I deeply appreciate the opportunity to be a part of this lively and well-informed blog.
