The Power of PISA

This space had been reserved for reporting the results of an interview I did last Thursday with Nova Scotia Education Minister Zach Churchill. I thought I’d have some new insight into the government’s determination to push through the recommendations in educational consultant Avis Glaze’s administrative review of the Nova Scotia education system, Raise the Bar.

Zach Churchill

Unfortunately — and despite a general trend away from rote learning and memorization in education — Churchill simply repeated the talking points I’d heard him share with CTV’s Steve Murphy a few days earlier and would hear him repeat to the CBC’s Steve Sutherland the next morning (“We need bold decisive action,” “Principals have told me they’ve found themselves conflicted because of their membership in the Teachers’ Union,” “We’re not doing enough for our students.”).

Left with a blank page and nothing with which to fill it, I asked myself, “What can the Spectator add to this debate, on which buckets of ink and hours of air-time are currently being expended? Is there anything in the Glaze report that might bear closer scrutiny that hasn’t already been scrutinized beyond recognition?”

And the answer, I think, is to start at the very beginning which, as a gifted (and musical) Austrian educator once said, is a very good place to start.

Both Glaze’s report and Nova Scotia’s 2014 Report of the Minister’s Panel on Education are premised on the notion that Nova Scotia’s students are not performing as well as they should be — or as well as many of their Canadian and international peers are performing. It’s a conclusion based on the results of two sets of standardized tests: the Pan-Canadian Assessment Program (PCAP) and the Program for International Student Assessment (PISA).

I hope to write about the PCAP tests in a future article, but for today, let’s focus on PISA, about which I’ve discovered more than enough information to fill my blank page.

 

Why the OECD?

PISA is overseen by the Organization for Economic Cooperation and Development (OECD), a 35-member transnational organization which, according to its website, is currently focused on helping governments around the world:

  • Restore confidence in markets and the institutions that make them function.
  • Re-establish healthy public finances as a basis for future sustainable economic growth.
  • Foster and support new sources of growth through innovation, environmentally friendly ‘green growth’ strategies and the development of emerging economies.
  • Ensure that people of all ages can develop the skills to work productively and satisfyingly in the jobs of tomorrow.

That last goal? The one about “developing skills” that allow you to work “productively?” That’s what the OECD means by “education.” It’s a definition not without its critics; in fact, the OECD’s very involvement in student assessment is not without its critics. In 2014, 83 academics signed an open letter to Dr Andreas Schleicher, director of PISA, in which they raised a number of red flags about the tests:

As an organization of economic development, OECD is naturally biased in favor of the economic role of public schools. But preparing young men and women for gainful employment is not the only, and not even the main goal of public education, which has to prepare students for participation in democratic self-government, moral action and a life of personal development, growth and wellbeing.

Unlike United Nations (UN) organizations such as UNESCO or UNICEF that have clear and legitimate mandates to improve education and the lives of children around the world, OECD has no such mandate. Nor are there, at present, mechanisms of effective democratic participation in its education decision-making process.

For Gita Steiner-Khamsi, a professor of comparative and international education at Columbia University, even our reliance on those UN organizations for educational assessment bears scrutiny:

Over the course of twenty years (only), we have come to accept the existence of transnational regimes in education, such as the OECD, IEA (International Association for the Evaluation of Educational Achievement), the World Bank, UN organizations (notably, UNESCO and UNICEF), and many non-governmental organizations, that influence policy agenda setting at national levels. These organizations have existed for a long time, most multilateral organizations since World War II and IEA and OECD since the late 1960s, but it is only for the past few years that experts and policy makers instrumentally evoke them as sources of authority whenever there is a need for an (international) stamp of approval to push through domestic reforms that otherwise would be contested. (Nordisk Pedagogik 1/2009)

Reading comprehension: Do you know any “experts” or “policy makers” who are evoking the OECD to “push through domestic reforms that otherwise would be contested?” Discuss.

 

Reliable estimates

PISA, as noted above, hasn’t been around that long: the OECD introduced the program in 2000 and testing — which, in Canada, is conducted under the aegis of the Council of Ministers of Education, Canada (CMEC) — is done every three years in all 35 OECD countries (and lots of non-OECD member countries as well). According to the OECD website:

The Programme for International Student Assessment (PISA) is a triennial international survey which aims to evaluate education systems worldwide by testing the skills and knowledge of 15-year-old students.

In 2015 over half a million students, representing 28 million 15-year-olds in 72 countries and economies, took the internationally agreed two-hour test. Students were assessed in science, mathematics, reading, collaborative problem solving and financial literacy.

Typically, between 5,000 and 10,000 students from at least 150 schools are tested in each country, but in Canada, approximately 20,000 15-year-olds from about 900 schools participated across the 10 provinces in 2015. According to CMEC:

The large Canadian sample was required to produce reliable estimates representative of each province and for both French- and English-language school systems in Nova Scotia, New Brunswick, Quebec, Ontario, Manitoba, Alberta, and British Columbia. PISA was administered in English and in French according to the respective school system.

Obviously, this reflects the key role played by the provinces in education. Interestingly, in the US in 2015, where education is chiefly a state responsibility, only Massachusetts, North Carolina and Puerto Rico “elected to participate” as individual education systems, meaning they paid to have the organizers draw samples large enough to assess their systems separately.
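(For the statistically curious, here’s a back-of-the-envelope sketch, in Python, of why those subgroup estimates demand a bigger sample. The score spread and the cell sizes are invented for illustration, and the formula assumes simple random sampling within each group, which the real PISA design, a stratified, multi-stage affair, is not. The basic intuition still holds: precision depends on the size of the smallest group you want to report on, not on the national total.)

# Illustration only: why reporting by province and language system needs more students.
# Assumes simple random sampling within each group; real PISA sampling is more complex.

import math

SCORE_SD = 100  # assumed spread of scores on the PISA scale (roughly its design value)

def margin_of_error(n_students: int) -> float:
    """Approximate 95% margin of error for a group's mean score, in points."""
    return 1.96 * SCORE_SD / math.sqrt(n_students)

# A "typical" national sample of ~5,000 gives a tight national estimate...
print(f"n = 5,000 (one national estimate): +/- {margin_of_error(5_000):.1f} points")

# ...but split it across 10 provinces and two language systems and a small cell
# might hold only a few hundred students.
print(f"n = 250 (a small provincial/language cell): +/- {margin_of_error(250):.1f} points")

# Canada's ~20,000-student sample buys back precision for those smaller groups.
print(f"n = 2,000 (the same cell within a larger sample): +/- {margin_of_error(2_000):.1f} points")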

As for when and how the Canadian tests were conducted:

The 2015 PISA assessment was administered in schools during regular school hours in April and May 2015. The assessment was a two-hour computer-based test. Students also completed a 35-minute student background questionnaire providing information about themselves and their home, while school principals completed a 20-minute questionnaire about their schools. As part of PISA 2015, international options could also be implemented. Canada chose to add a one-hour financial literacy assessment as well as a five-minute paper-based questionnaire to collect information on the attitudes of 15-year-old students toward trades; however, only some provinces chose to participate in these options.

 

Financial literacy

I asked the CMEC and the Nova Scotia Department of Education and Early Childhood Development how much Canada and the province, respectively, paid for PISA in 2015 and whether Nova Scotia opted for the model with the financial literacy assessment and the trades-attitudes questionnaire. As of press time, I had not received a response.

All is not lost, though, because PISA does provide per-country cost estimates (and helpful ways to think about them):

Though PISA is conducted every three years, it’s easiest to think about the cost to countries per annum.

I actually thought this might be a test of my financial literacy, which I think I passed with flying colors by thinking: “But, that’s just a way to convince yourself PISA is less expensive than it really is.”

Apparently, there are “international” and “national” PISA costs. For an OECD member country, like Canada, the international costs:

…vary widely by country, reflecting the original agreement with the OECD when the country joined the Organisation, with an average per annum cost of around €150,000.

If we take that average “per annum” figure of €150,000 and convert it to Canadian dollars according to the exchange rate from May 2015, we get $204,633. If we then multiply that by three, we get a total of $613,900.
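(If you’d like to check my work, here’s that arithmetic as a few lines of Python. The exchange rate is an approximation back-calculated from that $204,633 figure, so the totals are ballpark only.)

# The OECD's average "international" cost, converted to Canadian dollars and
# scaled to PISA's three-year cycle. The exchange rate is an approximation.

EUR_TO_CAD_MAY_2015 = 1.36422   # approximate Canadian dollars per euro, May 2015
PISA_CYCLE_YEARS = 3

international_eur_per_year = 150_000   # OECD's stated average for member countries

per_year_cad = international_eur_per_year * EUR_TO_CAD_MAY_2015
per_cycle_cad = per_year_cad * PISA_CYCLE_YEARS

print(f"Per year:  ${per_year_cad:,.0f}")    # about $204,633
print(f"Per cycle: ${per_cycle_cad:,.0f}")   # about $613,900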

The “national costs,” according to the OECD:

…also vary by country, according to factors such as population size, the number of languages in use and the nature of the political system: a small country might spend around €75,000 per annum and a medium-sized country €300,000 per annum; a large country could spend up to two or three times the latter amount.

I can’t make a precise calculation here: we’re not a small country, we test a very large number of students, and we do it in two languages, all of which is going to drive the price up. Our “national” costs, then, are somewhere between $306,950 and $3.7 million. (Those are the “per annum” estimates converted to Canadian dollars at 2015 rates, then multiplied by three. Seriously, I think this is a financial literacy test. I’d better not get billed by the OECD.)
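(Same exercise in Python for the “national” costs, using the same assumed exchange rate; the bounds come straight from the OECD’s description.)

# The "national" costs: a small country at 75,000 euros per annum, a large one at
# up to three times the 300,000-euro "medium-sized" figure, converted and scaled
# to the three-year cycle. The exchange rate is the same approximation as above.

EUR_TO_CAD_MAY_2015 = 1.36422
PISA_CYCLE_YEARS = 3

low_eur_per_year = 75_000
high_eur_per_year = 3 * 300_000

low_cad = low_eur_per_year * EUR_TO_CAD_MAY_2015 * PISA_CYCLE_YEARS
high_cad = high_eur_per_year * EUR_TO_CAD_MAY_2015 * PISA_CYCLE_YEARS

print(f"National costs per cycle: ${low_cad:,.0f} to ${high_cad:,.0f}")
# roughly $306,950 to $3.7 million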

Bottom line, testing is a) expensive for the participating country; b) lucrative for PISA.

 

Fish farms

Each time the tests are given, they focus on one of the three “core” competencies — science, math or reading. The 2015 “cycle” of tests focused on “scientific literacy.” The PISA site offers sample questions and I totally recommend you try them. Especially the one that allows you to participate in a “group chat” with two imaginary classmates, Alice and Zach.

The three of you are a team tasked with answering a number of questions about the geography, people and economy of an imaginary country. You are competing against your classmates and the first team to answer all the questions wins. Alice wants to discuss how best to approach the problem but Zach — I could not make this up — just wants to do everything as quickly as possible:

Source: PISA Interactive question sample, 2015

I don’t want to give anything away, but let’s just say Zach almost drags his team down to defeat before learning a valuable lesson about planning and cooperation. (Seriously, I could not make this up).

Another sample question, about “sustainable fish farms,” begins with the assumption that fish farms are, in fact, sustainable (tell that to the good people of the State of Washington).

Students are presented with the following information, which they are expected to digest and respond to while on the clock. The idea, you see, is not that they’ve been studying fish farming and are being tested on what they’ve learned; it’s that they can take in the information on the screen in front of them, combine it with stuff they just know (because they’re 15 and it’s FISH FARMING), and answer questions like this:

Source: PISA 2015

I am not going to admit how long it took me to come up with an answer to this — and I wasn’t presented with it as one question on a two-hour examination.

 

PISA Consortium

Okay, enough testing, it’s hard on the blood pressure.

So, who designs the PISA tests? Well, it’s the PISA Consortium, which is led by the Australian Council for Educational Research (ACER) and which, in 2015, included a number of other private testing and research organizations.

In other words, the tests are designed by what researchers like Gita Steiner-Khamsi call the Global Education Industry or GEI — a fact not lost on those 83 educators who wrote to the OECD’s Andreas Schleicher in 2014:

To carry out Pisa and a host of follow-up services, OECD has embraced “public-private partnerships” and entered into alliances with multi-national for-profit companies, which stand to gain financially from any deficits—real or perceived—unearthed by Pisa. Some of these companies provide educational services to American schools and school districts on a massive, for-profit basis, while also pursuing plans to develop for-profit elementary education in Africa, where OECD is now planning to introduce the Pisa programme.

Take Pearson, which won the tender to design the 2018 PISA “frameworks” for the OECD. In announcing the win, Pearson chief executive John Fallon said in a press release:

We are developing global benchmarks that, by assessing a wider range of skills, will help more young people to prosper in the global economy. We are very pleased to be supporting the OECD and academic colleagues in this crucial work.

Pearson is also, of course, selling a full range of high-priced products designed to help improve student performance — performance to be assessed by tests devised by Pearson. Pearson will also evaluate your teachers for you if, say, you’re a College of Educators looking for a way to “ensure teacher quality.” Pearson really has your back.

 

Lost in translation?

PISA critics, it will not surprise you to hear, question the enormous power granted the OECD and the PISA Consortium to a) decide what skills every 15-year-old in the world needs, b) design a test for these skills that works across multiple languages and cultures and c) interpret the results of these tests.

Sharon Murphy

In theory, the private companies in the PISA Consortium are advised and assisted in this by representatives of the countries on the OECD’s governing board. But in a 2010 paper called “The Pull of Pisa: Uncertainty, Influence, and Ignorance,” Sharon Murphy of York University in Toronto suggests this isn’t really how it works. For one thing, the board meetings are conducted in either English or French, which immediately puts many members at a disadvantage. For another, interviews done with English and Scottish participants in PISA governing board and National Project meetings in 2009 suggest the OECD’s Andreas Schleicher runs a tight ship:

In the Board meetings…the members from each country often appear to represent national ‘stereotypes’ and argue for ‘national’ recognition…In a sense, the Board meetings were described as the place where national differences and traditions are ‘ironed’ out, in order to reach a consensus. Nonetheless, it was also very interestingly noted that the ideas put forward are those that are more likely to lead to a compromise amongst the members. The meetings therefore were described as heavily managed and controlled by the OECD Secretariat and Andreas Schleicher himself.

In terms of technical issues arising in meetings, another interviewee commented that technical issues are almost never discussed — instead, ACER experts often offer technical presentations which Board members never challenge but ‘trust.’

In the board members’ defense, mastering the technical issues involved in designing and interpreting the tests would be a full-time job — the Technical Report for the 2015 PISA tests ran to 468 pages.

Murphy points to a number of design flaws in the tests, particularly in the tricky area of translation — for example, the tests are timed and are based on a reading of the English version at a rate of 3,500 words per hour:

The technical manual for PISA 2000 notes that, in some instances, lengthier versions of texts resulted from translation and, while the burden on test takers for timely completion did not ‘seem to be substantial…the hypothesis of some effect on the students’ performance cannot be discarded.’
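(To make the translation point concrete, here’s a toy calculation; the word counts and the 20 per cent inflation factor are invented for illustration.)

# A toy illustration of why longer translated texts matter on a test timed around
# a reading rate of 3,500 words per hour. The word counts and the 20% inflation
# factor are hypothetical.

READING_RATE_WPH = 3_500   # the English-language rate PISA's timing is said to assume

def reading_minutes(word_count: int) -> float:
    return word_count / READING_RATE_WPH * 60

english_words = 1_000                          # hypothetical length of one reading unit
translated_words = int(english_words * 1.2)    # assume the translation runs 20% longer

print(f"English version:    {reading_minutes(english_words):.1f} minutes of reading")
print(f"Translated version: {reading_minutes(translated_words):.1f} minutes of reading")
# The extra three-and-a-half minutes come out of the same fixed time budget, which
# is the possible "effect on the students' performance" the technical manual says
# cannot be discarded.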

Then there’s the question of culture. Officially, PISA’s goal is to represent cultural diversity rather than cultural neutrality. Murphy says that during test development, country representatives are asked to comment on cultural diversity with respect to the test, which can lead to some units being dropped (although, as she points out, it is not clear who makes the decision to drop the units). Murphy writes:

An example of a unit that was deemed inappropriate was one about the alternative independent singer Ani DiFranco. That this unit was deemed inappropriate, because “feminism, the focus of the song’s lyrics, was an unfamiliar concept to students in certain countries”…makes one wonder how cultural diversity was imagined in general.

For Murphy, such flaws mean that “[l]arge-scale assessments like PISA are flawed instruments. They are not robust enough to bear the kinds of comparisons and policy decisions that flow from them.”

 

Selective attention

Murphy also makes a really interesting point about how nations (or, in the Canadian context, provinces) use PISA results.

Gita Steiner-Khamsi

Remember, the OECD’s stated interest in education is ensuring students “develop the skills to work productively and satisfyingly in the jobs of tomorrow.” If this link between education and productive, satisfying work exists, then it’s reasonable to assume that the countries that lead the PISA rankings should also lead the world in economic terms (which Murphy represents as gross national income per capita as calculated by the World Bank).

But if you look at the 2015 PISA tables, Singapore led the world in science, math and reading — but came in 18th in gross national income per capita. The United States, which didn’t crack the Top 10 in science, math or reading, ranked 14th. Canada, which ranked 7th in science, 10th in math and 3rd in reading, came in 25th.

I am not, let me be clear, arguing against the importance of education generally, but it seems countries are “selective” in the way they view this claim of a connection between the science, math and reading skills of 15-year-olds and their economic futures.

Steiner-Khamsi provides an even more interesting example of a country paying “selective attention” to international assessments. She compares Germany’s response to its performance on two different standardized tests in 2001.

The first was the IEA Civic Education Study which ranked German 14-year-old students “lowest with regard to positive attitudes towards immigrants as compared to students from the other 27 countries that participated in the study.”

The second was the PISA study in which German students scored below the OECD-average in reading literacy.

The release of the PISA study, says Steiner-Khamsi:

[L]ed to a major uproar in the media, whereas the alerting findings from the IEA Civic Education study did not find much resonance in German public debate. Given that the German students did far worse with regard to xenophobia in the Civic Education Study, why was there such a political spectacle about the reading literacy scores of German students? Why did German politicians, policy makers, and educational researchers criticize the German education system for failing to prepare their students for reading literacy but remained silent when it came to scrutinizing the effects of schooling on attitudes towards immigrants?

Steiner-Khamsi says the issue is one in need of further research but presents three possible answers. The first is that civic education is not considered a “core” subject in schools, so it attracts less public interest. The second is that civic education is not simply a matter for schools — “political literacy” is developed via family, peers, media, clubs, etc. — and changing it would require much more than a school curriculum overhaul.

But Steiner-Khamsi points to a third reason, which is interesting and possibly applicable in the current Nova Scotia context:

References to international comparative studies or to league tables tend to be made if (and only if) they resonate with ongoing domestic policy debates…In Germany, debates about introducing standards, accountability measures, quality monitoring, expanding school choice and school-based management, etc., already existed prior to the release of the PISA findings.

Murphy also comments on the way PISA results are used. Governments on the national stage (or, I think we can say, provincial stage):

…are using PISA results towards their own ends, and are taking advantage of the cloak of educational assessment discourse to do so. In this scenario, the opportunity for democratic participation is doubly thwarted; it is thwarted once by the use of a test which, despite the facade of encouraging participation in development by nation-states, seems to be managed and controlled by elites behind the scenes, and it is thwarted again by elected political representatives who take advantage of the professionalization of assessment to use test data to satisfy their own political agendas.

 

Gold mine

Test data can be useful to politicians but supplying that test data is a bonanza for the companies in the Global Education Industry.

According to Antoni Verger, Christopher Lubienski, and Gita Steiner-Khamsi in the World Yearbook of Education 2016: The Global Education Industry:

The GEI can be considered as an industry sector in expansion. Merrill Lynch-Bank of America calculated in 2014 that the value of the education sector, globally speaking, is $4.3 trillion…And GSVAdvisors (2012) consider that the market size of the education for-profit sector is expected to grow by 17 percent in the next five years. In the US alone, the for-profit education industry revenues more than doubled in the last decade, going from $60 billion in 1999 to $125 billion in 2012 (BMO Capital Markets 2014). Traditionally, there have been more market opportunities in those sub-sectors and educational levels, such as pre-kindergarten and post-secondary education, where the state is not so present. Nonetheless, in the last decade, we have witnessed a significant penetration of primary and secondary education levels by the for-profit sector as well. In fact, this is a development that has been apparent in both the global north and the global south.

This was the image used to advertise the 2017 Global Education Industry Summit in Luxembourg. I don’t know why. (http://globaleducation.onetec.eu/)

Getting a handle on just what is included in the GEI is not easy. According to Verger, Lubienski and Steiner-Khamsi, it includes companies “selling educational resources and services of a technological nature to schools, including e-books, software, courseware, learning devices, learning platforms or dedicated IT solutions” but it also includes “school improvement services, on-line education, tutoring or supplemental/‘shadow education,’ edu-marketing, consultancy services for governments and schools, testing preparation services, and so on.” (Researchers, by the way, have a fabulous term for those offering consultancy services to governments and schools — the Consultocracy.)

And the reason the GEI is so lucrative — and poised to become so much more lucrative — is precisely because it is “global.” That’s why education companies are so keen on standardized international tests: they create a massive market for their products. Students taking the same PISA exams can all potentially benefit from the same courseware or tutoring services. (Never mind that the translation of the exams and the elimination of all problematic cultural references from the tests are both bones of contention for PISA critics.) And school systems that have been advised to reform themselves based on poor (or moderate) PISA results can be advised on those reforms by the same consultancy firms.

 

You Now Know

At the end of every chapter in my Grade 8 science book (from a course taught by a great teacher, the late Mike Hawrylak of Sheriff Junior High School) was a section called “You now know,” which basically recapped everything you were supposed to have learned in that chapter.

If I’ve done my work well, it’s possible you now know a few things you didn’t when you began this article.

If you’ve only learned one thing, however, I hope it’s that what’s playing out in the Nova Scotia education system is not playing out in a vacuum — it’s playing out within the broader world of the Global Education Industry and it really doesn’t hurt to keep that in mind while you’re evaluating what you’re being told — from any side — about the state of education in this province.

The Cape Breton Spectator is entirely reader supported. Please consider subscribing today!