Programme for International Student Assessment (2000 to 2012)

The Programme for International Student Assessment has been conducted several times, most recently in 2012. The first PISA assessment was carried out in 2000. The results of each assessment cycle take about a year and a half to analyse: the first results were published in November 2001, and the raw data, technical report and data handbook followed in spring 2002. The triennial repeats follow a similar schedule, so a single PISA cycle takes over four years from start to finish. In PISA 2009, 470,000 15-year-old students representing 65 nations and territories participated; an additional 50,000 students representing nine nations were tested in 2010.[1]

Each assessment cycle focuses on one of three domains: reading, mathematics or science. The other two are tested as well. A full rotation is completed after nine years: reading, the main domain in 2000, was again the main domain in 2009.

Period Focus OECD countries Partner countries Participating students Notes
2000 Reading 28 4 + 11 265,000 The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003 Mathematics 30 11 275,000 UK disqualified from data analysis. Also included test in problem solving.
2006 Science 30 27 400,000 Reading scores for US excluded from analysis due to misprint in testing materials.[2]
2009[3] Reading 34 41 + 10 470,000 10 additional non-OECD countries took the test in 2010.[4]
2012[5] Mathematics 34 31 510,000

Results

PISA 2012

OECD members as of the time of the study are in boldface.
Mathematics Science Reading
1 Shanghai, China 613
2  Singapore 573
3  Hong Kong, China 561
4  Taiwan 560
5  South Korea 554
6  Macau, China 538
7  Japan 536
8  Liechtenstein 535
9   Switzerland 531
10  Netherlands 523
11  Estonia 521
12  Finland 519
13=  Canada 518
13=  Poland 518
15  Belgium 515
16  Germany 514
17  Vietnam 511
18  Austria 506
19  Australia 504
20=  Ireland 501
20=  Slovenia 501
22=  Denmark 500
22=  New Zealand 500
24  Czech Republic 499
25  France 495
26  United Kingdom 494
27  Iceland 493
28  Latvia 491
29  Luxembourg 490
30  Norway 489
31  Portugal 487
32  Italy 485
33  Spain 484
34=  Russia 482
34=  Slovakia 482
36  United States 481
37  Lithuania 479
38  Sweden 478
39  Hungary 477
40  Croatia 471
41  Israel 466
42  Greece 453
43  Serbia 449
44  Turkey 448
45  Romania 445
46  Cyprus 440
47  Bulgaria 439
48  United Arab Emirates 434
49  Kazakhstan 432
50  Thailand 427
51  Chile 423
52  Malaysia 421
53  Mexico 413
54  Montenegro 410
55  Uruguay 409
56  Costa Rica 407
57  Albania 394
58  Brazil 391
59=  Argentina 388
59=  Tunisia 388
61  Jordan 386
62=  Colombia 376
62=  Qatar 376
64  Indonesia 375
65  Peru 368
1 Shanghai, China 580
2  Hong Kong, China 555
3  Singapore 551
4  Japan 547
5  Finland 545
6  Estonia 541
7  South Korea 538
8  Vietnam 528
9  Poland 526
10=  Liechtenstein 525
10=  Canada 525
12  Germany 524
13  Taiwan 523
14=  Netherlands 522
14=  Ireland 522
16=  Macau, China 521
16=  Australia 521
18  New Zealand 516
19   Switzerland 515
20=  Slovenia 514
20=  United Kingdom 514
22  Czech Republic 508
23  Austria 506
24  Belgium 505
25  Latvia 502
26  France 499
27  Denmark 498
28  United States 497
29=  Spain 496
29=  Lithuania 496
31  Norway 495
32=  Italy 494
32=  Hungary 494
34=  Luxembourg 491
34=  Croatia 491
36  Portugal 489
37  Russia 486
38  Sweden 485
39  Iceland 478
40  Slovakia 471
41  Israel 470
42  Greece 467
43  Turkey 463
44  United Arab Emirates 448
45  Bulgaria 446
46=  Serbia 445
46=  Chile 445
48  Thailand 444
49  Romania 439
50  Cyprus 438
51  Costa Rica 429
52  Kazakhstan 425
53  Malaysia 420
54  Uruguay 416
55  Mexico 415
56  Montenegro 410
57  Jordan 409
58  Argentina 406
59  Brazil 405
60  Colombia 399
61  Tunisia 398
62  Albania 397
63  Qatar 384
64  Indonesia 382
65  Peru 373
1 Shanghai, China 570
2  Hong Kong, China 545
3  Singapore 542
4  Japan 538
5  South Korea 536
6  Finland 524
7=  Taiwan 523
7=  Canada 523
7=  Ireland 523
10  Poland 518
11=  Liechtenstein 516
11=  Estonia 516
13=  Australia 512
13=  New Zealand 512
15  Netherlands 511
16=  Macau, China 509
16=   Switzerland 509
16=  Belgium 509
19=  Germany 508
19=  Vietnam 508
21  France 505
22  Norway 504
23  United Kingdom 499
24  United States 498
25  Denmark 496
26  Czech Republic 493
27=  Austria 490
27=  Italy 490
29  Latvia 489
30=  Luxembourg 488
30=  Portugal 488
30=  Spain 488
30=  Hungary 488
34  Israel 486
35  Croatia 485
36=  Iceland 483
36=  Sweden 483
38  Slovenia 481
39=  Lithuania 477
39=  Greece 477
41=  Russia 475
41=  Turkey 475
43  Slovakia 463
44  Cyprus 449
45  Serbia 446
46  United Arab Emirates 442
47=  Thailand 441
47=  Chile 441
47=  Costa Rica 441
50  Romania 438
51  Bulgaria 436
52  Mexico 424
53  Montenegro 422
54  Uruguay 411
55  Brazil 410
56  Tunisia 404
57  Colombia 403
58  Jordan 399
59  Malaysia 398
60=  Argentina 396
60=  Indonesia 396
62  Albania 394
63  Kazakhstan 393
64  Qatar 388
65  Peru 384

PISA 2012 was presented on 3 December 2013, with results for around 510,000 participating students in all 34 OECD member countries and 31 partner countries.[5] This testing cycle had a particular focus on mathematics, in which the OECD mean score was 494. A sample of 1,688 students from Puerto Rico took the assessment, scoring 379 in mathematics, 404 in reading and 401 in science.[6] A subgroup of 44 countries and economies, with about 85,000 students, also took part in an optional computer-based assessment of problem solving.[7]

Shanghai had the highest score in all three subjects. It was followed by Singapore, Hong Kong, Chinese Taipei and South Korea in mathematics; by Hong Kong, Singapore, Japan and South Korea in reading; and by Hong Kong, Singapore, Japan and Finland in science.

The participating students were a sample drawn from about 28 million 15-year-olds in 65 countries and economies,[8] including the OECD countries, several Chinese cities, Vietnam, Indonesia and several countries in South America.[5]

The test lasted two hours, was paper-based and included both open-ended and multiple-choice questions.[8]

The students and school staff also answered a questionnaire to provide background information about the students and the schools.[5][8]

Across the OECD, the mean score in PISA 2012 was 494 in mathematics, 496 in reading and 501 in science.[5]

The results show distinct groups of high performers in mathematics: the East Asian participants, led by Shanghai with the top score of 613, followed by Singapore, Hong Kong, Chinese Taipei, South Korea, Macau and Japan. Among the Europeans, Liechtenstein and Switzerland performed best, with the Netherlands, Estonia, Finland, Poland, Belgium, Germany and Austria all posting mathematics scores "not significantly statistically different from" one another. The United Kingdom, Ireland, Australia and New Zealand were similarly clustered around the OECD average of 494, with the USA trailing this group at 481.[5]

Qatar, Kazakhstan and Malaysia showed the greatest improvement in mathematics, while the USA and the United Kingdom showed no significant change.[9] Sweden had the greatest fall in mathematics performance over the preceding ten years, with a similar decline in the other two subjects, and leading Swedish politicians expressed great concern over the results.[10][11]

On average, boys scored higher than girls in mathematics, girls scored higher than boys in reading, and the two sexes had quite similar scores in science.[9]

Indonesia, Albania, Peru, Thailand and Colombia were the countries where most students reported being happy at school, while students in Korea, the Czech Republic, the Slovak Republic, Estonia and Finland reported the least happiness.[8]

PISA 2009


The PISA 2009 cycle included results in mathematics, science and reading for all 36 OECD member countries and 37 partner countries.[3][12][13]

Of the partner countries, only selected areas of three countries—India, Venezuela and China—were assessed. PISA 2009+, released in December 2011, included data from 10 additional partner countries which had testing delayed from 2009 to 2010 because of scheduling constraints.[4][14]

OECD members as of the time of the study are in boldface. Participants in PISA 2009+, which were tested in 2010 after the main group of 65, are italicized.
Mathematics Science Reading
1 Shanghai, China 600
2  Singapore 562
3  Hong Kong, China 555
4  South Korea 546
5  Taiwan 543
6  Finland 541
7  Liechtenstein 536
8   Switzerland 534
9  Japan 529
10  Canada 527
11  Netherlands 526
12  Macau, China 525
13  New Zealand 519
14  Belgium 515
15  Australia 514
16  Germany 513
17  Estonia 512
18  Iceland 507
19  Denmark 503
20  Slovenia 501
21  Norway 498
22  France 497
23  Slovakia 497
24  Austria 496
25  Poland 495
26  Sweden 494
27  Czech Republic 493
28  United Kingdom 492
29  Hungary 490
30  Luxembourg 489
31  United States 487
32  Portugal 487
33  Ireland 487
34  Spain 483
35  Italy 483
36  Latvia 482
37  Lithuania 477
38  Russia 468
39  Greece 466
40  Malta 463
41  Croatia 460
42  Israel 447
43  Turkey 445
44  Serbia 442
45  Azerbaijan 431
46  Bulgaria 428
47  Uruguay 427
48  Romania 427
49  United Arab Emirates 421
50  Chile 421
51  Mauritius 420
52  Thailand 419
53  Mexico 419
54  Trinidad and Tobago 414
55  Costa Rica 409
56  Kazakhstan 405
57  Malaysia 404
58  Montenegro 403
59  Moldova 397
60 Miranda, Venezuela 397
61  Argentina 388
62  Jordan 387
63  Brazil 386
64  Colombia 381
65  Georgia 379
66  Albania 377
67  Tunisia 371
68  Indonesia 371
69  Qatar 368
70  Peru 365
71  Panama 360
72 Tamil Nadu, India 351
73 Himachal Pradesh, India 338
74  Kyrgyzstan 331
1 Shanghai, China 575
2  Finland 554
3  Hong Kong, China 549
4  Singapore 542
5  Japan 539
6  South Korea 538
7  New Zealand 532
8  Canada 529
9  Estonia 528
10  Australia 527
11  Netherlands 522
12  Liechtenstein 520
13  Germany 520
14  Taiwan 520
15   Switzerland 517
16  United Kingdom 514
17  Slovenia 512
18  Macau, China 511
19  Poland 508
20  Ireland 508
21  Belgium 507
22  Hungary 503
23  United States 502
24  Norway 500
25  Czech Republic 500
26  Denmark 499
27  France 498
28  Iceland 496
29  Sweden 495
30  Latvia 494
31  Austria 494
32  Portugal 493
33  Lithuania 491
34  Slovakia 490
35  Italy 489
36  Spain 488
37  Croatia 486
38  Luxembourg 484
39  Russia 478
40  Greece 470
41  Malta 461
42  Israel 455
43  Turkey 454
44  Chile 447
45  Serbia 443
46  Bulgaria 439
47  United Arab Emirates 438
48  Costa Rica 430
49  Romania 428
50  Uruguay 427
51  Thailand 425
52 Miranda, Venezuela 422
53  Malaysia 422
54  Mauritius 417
55  Mexico 416
56  Jordan 415
57  Moldova 413
58  Trinidad and Tobago 410
59  Brazil 405
60  Colombia 402
61  Tunisia 401
62  Montenegro 401
63  Argentina 401
64  Kazakhstan 400
65  Albania 391
66  Indonesia 383
67  Qatar 379
68  Panama 376
69  Georgia 373
70  Azerbaijan 373
71  Peru 369
72 Tamil Nadu, India 348
73  Kyrgyzstan 330
74 Himachal Pradesh, India 325
1 Shanghai, China 556
2  South Korea 539
3  Finland 536
4  Hong Kong, China 533
5  Singapore 526
6  Canada 524
7  New Zealand 521
8  Japan 520
9  Australia 515
10  Netherlands 508
11  Belgium 506
12  Norway 503
13  Estonia 501
14   Switzerland 501
15  Poland 500
16  Iceland 500
17  United States 500
18  Liechtenstein 499
19  Sweden 497
20  Germany 497
21  Ireland 496
22  France 496
23  Taiwan 495
24  Denmark 495
25  United Kingdom 494
26  Hungary 494
27  Portugal 489
28  Macau, China 487
29  Italy 486
30  Latvia 484
31  Greece 483
32  Slovenia 483
33  Spain 481
34  Czech Republic 478
35  Slovakia 477
36  Croatia 476
37  Israel 474
38  Luxembourg 472
39  Austria 470
40  Lithuania 468
41  Turkey 464
42  Russia 459
43  Chile 449
44  Costa Rica 443
45  Malta 442
46  Serbia 442
47  United Arab Emirates 431
48  Bulgaria 429
49  Uruguay 426
50  Mexico 425
51  Romania 424
52 Miranda, Venezuela 422
53  Thailand 421
54  Trinidad and Tobago 416
55  Malaysia 414
56  Colombia 413
57  Brazil 412
58  Montenegro 408
59  Mauritius 407
60  Jordan 405
61  Tunisia 404
62  Indonesia 402
63  Argentina 398
64  Kazakhstan 390
65  Moldova 388
66  Albania 385
67  Georgia 374
68  Qatar 372
69  Panama 371
70  Peru 370
71  Azerbaijan 362
72 Tamil Nadu, India 337
73 Himachal Pradesh, India 317
74  Kyrgyzstan 314

PISA 2006

OECD members as of the time of the study are in boldface. Reading scores for the United States were disqualified.
Mathematics Science Reading
1  Taiwan 549
2  Finland 548
3  South Korea 547
4  Hong Kong, China 547
5  Netherlands 531
6   Switzerland 530
7  Canada 527
8  Macau, China 525
9  Liechtenstein 525
10  Japan 523
11  New Zealand 522
12  Belgium 520
13  Australia 520
14  Estonia 515
15  Denmark 513
16  Czech Republic 510
17  Iceland 506
18  Austria 505
19  Slovenia 504
20  Germany 504
21  Sweden 502
22  Ireland 501
23  France 496
24  United Kingdom 495
25  Poland 495
26  Slovakia 492
27  Hungary 491
28  Norway 490
29  Luxembourg 490
30  Lithuania 486
31  Latvia 486
32  Spain 480
33  Russia 476
34  Azerbaijan 476
35  United States 474
36  Croatia 467
37  Portugal 466
38  Italy 462
39  Greece 459
40  Israel 442
41  Serbia 435
42  Uruguay 427
43  Turkey 424
44  Thailand 417
45  Romania 415
46  Bulgaria 413
47  Chile 411
48  Mexico 406
49  Montenegro 399
50  Indonesia 391
51  Jordan 384
52  Argentina 381
53  Colombia 370
54  Brazil 370
55  Tunisia 365
56  Qatar 318
57  Kyrgyzstan 311
1  Finland 563
2  Hong Kong, China 542
3  Canada 534
4  Taiwan 532
5  Japan 531
6  Estonia 531
7  New Zealand 530
8  Australia 527
9  Netherlands 525
10  Liechtenstein 522
11  South Korea 522
12  Slovenia 519
13  Germany 516
14  United Kingdom 515
15  Czech Republic 513
16   Switzerland 512
17  Austria 511
18  Macau, China 511
19  Belgium 510
20  Ireland 508
21  Hungary 504
22  Sweden 503
23  Poland 498
24  Denmark 496
25  France 495
26  Croatia 493
27  Iceland 491
28  Latvia 490
29  United States 489
30  Slovakia 488
31  Spain 488
32  Lithuania 488
33  Norway 487
34  Luxembourg 486
35  Russia 479
36  Italy 475
37  Portugal 474
38  Greece 473
39  Israel 454
40  Chile 438
41  Serbia 436
42  Bulgaria 434
43  Uruguay 428
44  Turkey 424
45  Jordan 422
46  Thailand 421
47  Romania 418
48  Montenegro 412
49  Mexico 410
50  Indonesia 393
51  Argentina 391
52  Brazil 390
53  Colombia 388
54  Tunisia 386
55  Azerbaijan 382
56  Qatar 349
57  Kyrgyzstan 322
1  South Korea 556
2  Finland 547
3  Hong Kong, China 536
4  Canada 527
5  New Zealand 521
6  Ireland 517
7  Australia 513
8  Liechtenstein 510
9  Poland 508
10  Sweden 507
11  Netherlands 507
12  Belgium 501
13  Estonia 501
14   Switzerland 499
15  Japan 498
16  Taiwan 496
17  United Kingdom 495
18  Germany 495
19  Denmark 494
20  Slovenia 494
21  Macau, China 492
22  Austria 490
23  France 488
24  Iceland 484
25  Norway 484
26  Czech Republic 483
27  Hungary 482
28  Latvia 479
29  Luxembourg 479
30  Croatia 477
31  Portugal 472
32  Lithuania 470
33  Italy 469
34  Slovakia 466
35  Spain 461
36  Greece 460
37  Turkey 447
38  Chile 442
39  Russia 440
40  Israel 439
41  Thailand 417
42  Uruguay 413
43  Mexico 410
44  Bulgaria 402
45  Serbia 401
46  Jordan 401
47  Romania 396
48  Indonesia 393
49  Brazil 393
50  Montenegro 392
51  Colombia 385
52  Tunisia 380
53  Argentina 374
54  Azerbaijan 353
55  Qatar 312
56  Kyrgyzstan 285

PISA 2003

The results for PISA 2003 were released on 14 December 2004. This cycle tested 275,000 15-year-olds on mathematics, science, reading and problem solving, involving schools from 30 OECD member countries and 11 partner countries.[15] Note that the science and reading means displayed are for all students, even though not every student answered questions in those domains; the 2003 OECD Technical Report (pages 208 and 209) gives different country means for students who were actually exposed to these domains.[16]

PISA 2003
OECD members at the time of the study are in boldface. The United Kingdom was disqualified due to a low response rate.
Mathematics Science Reading Problem solving
1  Hong Kong, China 550
2  Finland 544
3  Korea 542
4  Netherlands 538
5  Liechtenstein 536
6  Japan 534
7  Canada 532
8  Belgium 529
9  Macau, China 527
10   Switzerland 527
11  Australia 524
12  New Zealand 523
13  Czech Republic 516
14  Iceland 515
15  Denmark 514
16  France 511
17  Sweden 509
18  Austria 506
19  Germany 503
20  Ireland 503
21  Slovakia 498
22  Norway 495
23  Luxembourg 493
24  Poland 490
25  Hungary 490
26  Spain 485
27  Latvia 483
28  United States 483
29  Russia 468
30  Portugal 466
31  Italy 466
32  Greece 445
33  Serbia and Montenegro 437
34  Turkey 423
35  Uruguay 422
36  Thailand 417
37  Mexico 385
38  Indonesia 360
39  Tunisia 359
40  Brazil 356
1  Finland 548
2  Japan 548
3  Hong Kong, China 539
4  Korea 538
5  Liechtenstein 525
6  Australia 525
7  Macau, China 525
8  Netherlands 524
9  Czech Republic 523
10  New Zealand 521
11  Canada 519
12   Switzerland 513
13  France 511
14  Belgium 509
15  Sweden 506
16  Ireland 505
17  Hungary 503
18  Germany 502
19  Poland 498
20  Slovakia 495
21  Iceland 495
22  United States 491
23  Austria 491
24  Russia 489
25  Latvia 489
26  Spain 487
27  Italy 486
28  Norway 484
29  Luxembourg 483
30  Greece 481
31  Denmark 475
32  Portugal 468
33  Uruguay 438
34  Serbia and Montenegro 436
35  Turkey 434
36  Thailand 429
37  Mexico 405
38  Indonesia 395
39  Brazil 390
40  Tunisia 385
1  Finland 543
2  Korea 534
3  Canada 528
4  Australia 525
5  Liechtenstein 525
6  New Zealand 522
7  Ireland 515
8  Sweden 514
9  Netherlands 513
10  Hong Kong, China 510
11  Belgium 507
12  Norway 500
13   Switzerland 499
14  Japan 498
15  Macau, China 498
16  Poland 497
17  France 496
18  United States 495
19  Denmark 492
20  Iceland 492
21  Germany 491
22  Austria 491
23  Latvia 491
24  Czech Republic 489
25  Hungary 482
26  Spain 481
27  Luxembourg 479
28  Portugal 478
29  Italy 476
30  Greece 472
31  Slovakia 469
32  Russia 442
33  Turkey 441
34  Uruguay 434
35  Thailand 420
36  Serbia and Montenegro 412
37  Brazil 403
38  Mexico 400
39  Indonesia 382
40  Tunisia 375
1  Korea 550
2  Hong Kong, China 548
3  Finland 548
4  Japan 547
5  New Zealand 533
6  Macau, China 532
7  Australia 530
8  Liechtenstein 529
9  Canada 529
10  Belgium 525
11   Switzerland 521
12  Netherlands 520
13  France 519
14  Denmark 517
15  Czech Republic 516
16  Germany 513
17  Sweden 509
18  Austria 506
19  Iceland 505
20  Hungary 501
21  Ireland 498
22  Luxembourg 494
23  Slovakia 492
24  Norway 490
25  Poland 487
26  Latvia 483
27  Spain 482
28  Russia 479
29  United States 477
30  Portugal 470
31  Italy 469
32  Greece 448
33  Thailand 425
34  Serbia and Montenegro 420
35  Uruguay 411
36  Turkey 408
37  Mexico 384
38  Brazil 371
39  Indonesia 361
40  Tunisia 345

PISA 2000

The results for the first cycle of the PISA survey were released on 14 November 2001. In it, 265,000 15-year-olds in 28 OECD countries and 4 partner countries were tested on mathematics, science and reading. An additional 11 countries were tested later, in 2002.[17]

PISA 2000
OECD members as of the time of the study are in boldface. The 11 partner countries tested in 2002 after the main group of 32 are italicized.
Mathematics Science Reading
1  Hong Kong, China 560
2  Japan 557
3  Korea 547
4  New Zealand 537
5  Finland 536
6  Australia 533
7  Canada 533
8   Switzerland 529
9  United Kingdom 529
10  Belgium 520
11  France 517
12  Austria 515
13  Denmark 514
14  Iceland 514
15  Liechtenstein 514
16  Sweden 510
17  Ireland 503
18  Norway 499
19  Czech Republic 498
20  United States 493
21  Germany 490
22  Hungary 488
23  Russia 478
24  Spain 476
25  Poland 470
26  Latvia 463
27  Italy 457
28  Portugal 454
29  Greece 447
30  Luxembourg 446
31  Israel 433
32  Thailand 432
33  Bulgaria 430
34  Argentina 388
35  Mexico 387
36  Chile 384
37  Albania 381
38  Macedonia 381
39  Indonesia 367
40  Brazil 334
41  Peru 292
1  Korea 552
2  Japan 550
3  Hong Kong, China 541
4  Finland 538
5  United Kingdom 532
6  Canada 529
7  New Zealand 528
8  Australia 528
9  Austria 519
10  Ireland 513
11  Sweden 512
12  Czech Republic 511
13  France 500
14  Norway 500
15  United States 499
16  Hungary 496
17  Iceland 496
18  Belgium 496
19   Switzerland 496
20  Spain 491
21  Germany 487
22  Poland 483
23  Denmark 481
24  Italy 478
25  Liechtenstein 476
26  Greece 461
27  Russia 460
28  Latvia 460
29  Portugal 459
30  Bulgaria 448
31  Luxembourg 443
32  Thailand 436
33  Israel 434
34  Mexico 422
35  Chile 415
36  Macedonia 401
37  Argentina 396
38  Indonesia 393
39  Albania 376
40  Brazil 375
41  Peru 333
1  Finland 546
2  Canada 534
3  New Zealand 529
4  Australia 528
5  Ireland 527
6  Hong Kong, China 525
7  Korea 525
8  United Kingdom 523
9  Japan 522
10  Sweden 516
11  Austria 507
12  Belgium 507
13  Iceland 507
14  Norway 505
15  France 505
16  United States 504
17  Denmark 497
18   Switzerland 494
19  Spain 493
20  Czech Republic 492
21  Italy 487
22  Germany 484
23  Liechtenstein 483
24  Hungary 480
25  Poland 479
26  Greece 474
27  Portugal 470
28  Russia 462
29  Latvia 458
30  Israel 452
31  Luxembourg 441
32  Thailand 431
33  Bulgaria 430
34  Mexico 422
35  Argentina 418
36  Chile 410
37  Brazil 396
38  Macedonia 373
39  Indonesia 371
40  Albania 349
41  Peru 327

Comparison with other studies

The correlation between PISA 2003 and TIMSS 2003 grade 8 country means is 0.84 in mathematics and 0.95 in science; the values drop to 0.66 and 0.79 if the two worst-performing developing countries are excluded. Correlations between different scales and studies are around 0.80. These high correlations indicate common causes of country differences (e.g. educational quality, culture, wealth or genes) or a homogeneous underlying factor of cognitive competence. European Economic Area countries perform slightly better in PISA, while the Commonwealth of Independent States and Asian countries perform slightly better in TIMSS. Content balance and years of schooling explain most of the variation.[18]
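The figures quoted above are ordinary Pearson correlations between paired country means. As a minimal sketch of the computation (the country scores below are hypothetical placeholders, not the actual PISA or TIMSS means):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical country means for two assessments (illustrative only).
pisa = [544, 532, 527, 503, 483, 417]
timss = [586, 570, 558, 504, 508, 411]

r = pearson(pisa, timss)  # close to 1: the two rankings largely agree
```

Excluding outlying countries changes the means and deviations, which is why the reported correlations fall when the two lowest-scoring participants are dropped.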

Reception

The results from PISA 2003 and PISA 2006 were featured in the 2010 documentary Waiting for "Superman".[19]

China

Education professor Yong Zhao has noted that PISA 2009 did not receive much attention in the Chinese media, and has argued that China's high scores are due to excessive workload and testing, adding that it is "no news that the Chinese education system is excellent in preparing outstanding test takers, just like other education systems within the Confucian cultural circle: Singapore, Korea, Japan, and Hong Kong."[20]

Students from Shanghai, China, had the top scores of every category (Mathematics, Reading and Science) in PISA 2009. In discussing these results, PISA spokesman Andreas Schleicher, Deputy Director for Education and head of the analysis division at the OECD’s directorate for education, described Shanghai as a pioneer of educational reform in which "there has been a sea change in pedagogy". Schleicher stated that Shanghai abandoned its "focus on educating a small elite, and instead worked to construct a more inclusive system. They also significantly increased teacher pay and training, reducing the emphasis on rote learning and focusing classroom activities on problem solving."[21]

Schleicher also states that PISA tests administered in rural China have produced some results approaching the OECD average. Citing further, as-yet-unpublished OECD research, he said, "We have actually done Pisa in 12 of the provinces in China. Even in some of the very poor areas you get performance close to the OECD average."[22] Schleicher says that for a developing country, China's 99.4% enrollment in primary education is "the envy of many countries". He maintains that junior secondary school participation rates in China are now 99%; in Shanghai, senior secondary school enrollment has reached 98%, and admissions into higher education cover 80% of the relevant age group. Schleicher believes that this growth reflects quality, not just quantity, which he contends the top PISA ranking of Shanghai's secondary education confirms.[22] He also believes that China has expanded school access and moved away from learning by rote.[23] According to Schleicher, Russia performs well in rote-based assessments but not in PISA, whereas China does well in both rote-based and broader assessments.[22]

Denmark

University of Copenhagen professor Svend Kreiner, who examined PISA's 2006 reading results in detail, noted that in 2006 only about ten percent of the participating students were tested on all 28 reading questions. "This in itself is ridiculous," Kreiner said. "Most people don't know that half of the students taking part in PISA (2006) do not respond to any reading item at all. Despite that, PISA assigns reading scores to these children."[24]

Finland

The consistently high marks of Finnish students have attracted much attention. According to Hannu Simola,[25] the results reflect a paradoxical mix of progressive policies implemented in a rather conservative pedagogical setting, where teachers' high levels of academic preparation, social status, professionalism and motivation for the job coexist with adherence to traditional roles and methods by both teachers and pupils in Finland's changing, but still quite paternalistic, culture. Others point to Finland's low poverty rate as a reason for its success.[26][27] Finnish education reformer Pasi Sahlberg attributes Finland's high educational achievement to its emphasis on social and educational equality and its stress on cooperation and collaboration, as opposed to the competition among teachers and schools that prevails in other nations.[28]

India

Of the 74 participants tested in the PISA 2009 cycle, including the "+" nations, the two Indian states placed 72nd and 73rd in both reading and mathematics, and 72nd and 74th in science. India's poor performance may not be linguistic, as some have suggested: 12.87% of US students indicated that the language of the test differed from the language spoken at home, while a significantly higher 30.77% of Himachal Pradesh students did so.[29] However, unlike American students, the Indian students who spoke a different language at home did better on the PISA test than those who spoke the same language.[29] India's poor PISA performance is consistent with its results in the only other instance in which India's government allowed an international organization to test its students,[30] and with India's own testing of its elite students in a study titled Student Learning in the Metros 2006.[31] These studies were conducted using TIMSS questions. The poor PISA result was greeted with dismay in the Indian media.[32] The BBC reported that as of 2008, only 15% of India's students reach high school.[33]

United States

Two studies have compared high achievers in mathematics on PISA with those on the U.S. National Assessment of Educational Progress (NAEP), matching the NAEP "advanced" and "proficient" levels in mathematics to the corresponding performance on PISA. Overall, 30 nations had higher percentages of students at the "advanced" level in mathematics than the U.S.; the only OECD countries with worse results were Portugal, Greece, Turkey and Mexico. Six percent of U.S. students reached the "advanced" level, compared with 28 percent in Taiwan. The highest-ranked U.S. state, Massachusetts, would have placed just 15th in the world if compared with the nations participating in PISA. For the "proficient" level, 31 nations had higher percentages than the U.S.; Massachusetts was again the best U.S. state, but would have ranked just ninth in the world.[34][35]

Comparisons with results from the Trends in International Mathematics and Science Study (TIMSS) appear to give different results, suggesting that U.S. states actually do better in world rankings.[36] This can likely be traced to the different material covered and to the United States teaching mathematics in a style less aligned with the "Realistic Mathematics Education" approach that forms the basis of the PISA exam.[37] Countries that commonly use this teaching method score higher on PISA and less highly on TIMSS and other assessments.[38]

Poverty

Stephen Krashen, professor emeritus at the University of Southern California,[39] and Mel Riddile of the NASSP attributed the relatively low performance of students in the United States to the country's high rate of child poverty, which exceeds that of other OECD countries.[26][27] However, individual US schools with poverty rates comparable to Finland's (below 10%), as measured by reduced-price school lunch participation, outperform Finland; and US schools in the 10–24% reduced-price lunch range are not far behind.[40]

Reduced-price school lunch participation is the only school-level poverty indicator available for US schoolchildren. US schools in which less than 10% of students qualified for free or reduced-price lunch averaged PISA scores of 551, higher than any other OECD country. This can be compared with the other OECD countries, for which figures on children living in relative poverty are available:[27]

Country Percent of reduced school lunches (US)[27] or relative child poverty (other OECD countries)[41] PISA score[42]
United States < 10% 551
Finland 3.4% 536
Netherlands 9.0% 508
Belgium 6.7% 506
United States 10%–24.9% 527
Canada 13.6% 524
New Zealand 16.3% 521
Japan 14.3% 520
Australia 11.6% 515
United States 25–49.9% 502
Estonia 40.1% 501
United States 50–74.9% 471
Russian Federation 58.3% 459
United States > 75% 446

Sampling errors

In 2013 Martin Carnoy of the Stanford University Graduate School of Education and Richard Rothstein of the Economic Policy Institute released a report, "What do international tests really show about U.S. student performance?", analyzing the 2009 PISA database. They found that U.S. PISA scores had been lowered by a sampling error that over-represented adolescents from the most disadvantaged American schools in the test-taking sample.[43] The authors cautioned that international test scores are often "interpreted to show that American students perform poorly when compared to students internationally", leading school reformers to conclude that "U.S. public education is failing." Such inferences, made before the data have been carefully analyzed, they say, "are too glib"[44] and "may lead policymakers to pursue inappropriate and even harmful reforms."[45]

Carnoy and Rothstein observe that in all countries, students from disadvantaged backgrounds perform worse than those from advantaged backgrounds, and that the US has a greater percentage of students from disadvantaged backgrounds; the sampling error lowered U.S. scores for 15-year-olds even further. They add, however, that whereas in countries such as Finland the scores of disadvantaged students have tended to stagnate, in the U.S. they have been steadily rising over time, albeit still lagging behind those of more advantaged peers. Adjusted for social class, the PISA scores of US students would still trail those of the highest-scoring countries; nevertheless, the scores of US students of all social backgrounds have improved over time, notably in mathematics, a circumstance they say PISA's report fails to take into account.
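The social-class adjustment described above amounts to reweighting group mean scores by a benchmark population composition rather than the country's own. A minimal sketch with hypothetical numbers (none of these group means or shares come from the Carnoy and Rothstein report):

```python
# Hypothetical group mean scores and population shares (illustrative only).
us_group_means = {"low": 455, "middle": 495, "high": 540}
us_shares = {"low": 0.40, "middle": 0.40, "high": 0.20}       # actual composition
benchmark_shares = {"low": 0.25, "middle": 0.50, "high": 0.25}  # comparison composition

def weighted_mean(means, shares):
    """Population mean as a share-weighted average of group means."""
    return sum(means[g] * shares[g] for g in means)

raw = weighted_mean(us_group_means, us_shares)
adjusted = weighted_mean(us_group_means, benchmark_shares)
# With more weight shifted away from the disadvantaged group,
# the adjusted national mean rises even though no group mean changed.
```

This illustrates the report's core point: a country's national average conflates how each social-class group performs with how large each group is.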

Carnoy and Rothstein write that PISA spokesman Schleicher has been quoted as saying that “international education benchmarks make disappointing reading for the U.S.” and that “in the U.S. in particular, poverty was destiny. Low-income American students did (and still do) much worse than high-income ones on PISA. But poor kids in Finland and Canada do far better relative to their more privileged peers, despite their disadvantages” (Ripley 2011).[46] Carnoy and Rothstein state that their report's analysis shows Schleicher's and Ripley's claims to be untrue. They further fault the way PISA's results have persistently been released to the press before experts have had time to evaluate them, and they charge the OECD reports with inconsistency in explaining such factors as the role of parental education. Carnoy and Rothstein also note with alarm that U.S. Secretary of Education Arne Duncan regularly consults with PISA's Andreas Schleicher in formulating educational policy before other experts have been given a chance to analyze the results.[47] Carnoy and Rothstein's report (written before the release of the 2011 database) concludes:

We are most certain of this: To make judgments only on the basis of national average scores, on only one test, at only one point in time, without comparing trends on different tests that purport to measure the same thing, and without disaggregation by social class groups, is the worst possible choice. But, unfortunately, this is how most policymakers and analysts approach the field.

The most recent test for which an international database is presently available is PISA, administered in 2009. A database for TIMSS 2011 is scheduled for release in mid-January 2013. In December 2013, PISA will announce results and make data available from its 2012 test administration. Scholars will then be able to dig into TIMSS 2011 and PISA 2012 databases so they can place the publicly promoted average national results in proper context. The analyses we have presented in this report should caution policymakers to await understanding of this context before drawing conclusions about lessons from TIMSS or PISA assessments.[48]

References

  1. PISA 2009 Technical Report, 2012, OECD, http://www.oecd.org/dataoecd/60/31/50036771.pdf
  2. Baldi, Stéphane; Jin, Ying; Skemer, Melanie; Green, Patricia J; Herget, Deborah; Xie, Holly (2007-12-10), Highlights From PISA 2006: Performance of U.S. 15-Year-Old Students in Science and Mathematics Literacy in an International Context (PDF), NCES, retrieved 2013-12-14, PISA 2006 reading literacy results are not reported for the United States because of an error in printing the test booklets. Furthermore, as a result of the printing error, the mean performance in mathematics and science may be misestimated by approximately 1 score point. The impact is below one standard error.
  3. PISA 2009 Results: Executive Summary (PDF), OECD, 2010-12-07
  4. ACER releases results of PISA 2009+ participant economies, ACER, 2011-12-16, archived from the original on 2014-10-08, retrieved 2016-04-15
  5. PISA 2012 Results in Focus (PDF), OECD, 3 December 2013, retrieved 4 December 2013
  6. CB Online Staff. "PR scores low on global report card" Archived 2015-01-03 at the Wayback Machine, Caribbean Business, September 26, 2014. Retrieved on January 3, 2015.
  7. OECD (2014): PISA 2012 results: Creative problem solving: Students’ skills in tackling real-life problems (Volume V), http://www.oecd-ilibrary.org/education/pisa-2012-results-skills-for-life-volume-v_9789264208070-en
  8. PISA 2012 Results OECD. Retrieved 4 December 2013
  9. Sedghi, Ami; Arnett, George; Chalabi, Mona (2013-12-03), Pisa 2012 results: which country does best at reading, maths and science?, The Guardian, retrieved 2013-02-14
  10. Adams, Richard (2013-12-03), Swedish results fall abruptly as free school revolution falters, The Guardian, retrieved 2013-12-03
  11. Kärrman, Jens (2013-12-03), Löfven om Pisa: Nationell kris, Dagens Nyheter, retrieved 2013-12-03
  12. Multi-dimensional Data Request, OECD, 2010, archived from the original on 2012-07-14, retrieved 2012-06-28
  13. PISA 2009 Results: Executive Summary (Figure 1 only) (PDF), OECD, 2010, retrieved 2012-06-28
  14. Walker, Maurice (2011), PISA 2009 Plus Results (PDF), OECD, archived from the original (PDF) on 2011-12-22, retrieved 2012-06-28
  15. Learning for Tomorrow’s World First Results from PISA 2003 (PDF), OECD, 2004-12-14, retrieved 2014-01-06
  16. PISA 2003 Technical Report (PDF), OECD
  17. Literacy Skills for the World of Tomorrow: Further Results from PISA 2000 (PDF), OECD, 2003, retrieved 2014-01-06
  18. M. L. Wu: A Comparison of PISA and TIMSS 2003 achievement results in Mathematics. Paper presented at the AERA Annual Meeting, New York, March 2008.
  19. "Waiting for "Superman" trailer". Retrieved 8 October 2010.
  20. Yong Zhao (10 December 2010), A True Wake-up Call for Arne Duncan: The Real Reason Behind Chinese Students Top PISA Performance
  21. Gumbel, Peter (7 December 2010), "China Beats Out Finland for Top Marks in Education", Time, retrieved 27 June 2012
  22. Cook, Chris (7 December 2010), "Shanghai tops global state school rankings", Financial Times, retrieved 28 June 2012
  23. Mance, Henry (7 December 2010), "Why are Chinese schoolkids so good?", Financial Times, retrieved 28 June 2012
  24. https://ifsv.sund.ku.dk/biostat/biostat_annualreport/images/c/ca/ResearchReport-2011-1.pdf
  25. Simola, Hannu (2005), "The Finnish miracle of PISA: Historical and sociological remarks on teaching and teacher education" (PDF), Comparative Education, 41 (4): 455–470, doi:10.1080/03050060500317810
  26. "The Economics Behind International Education Rankings" National Educational Association
  27. Riddile, Mel (15 December 2010), PISA: It's Poverty Not Stupid, National Association of Secondary School Principals, archived from the original on 22 January 2014, retrieved 15 April 2016
  28. Cleland, Elizabeth. "What Americans Keep Ignoring About Finland's School Success – Anu Partanen". The Atlantic.
  29. "Database – PISA 2009". Pisa2009.acer.edu.au. Archived from the original on 2016-03-22. Retrieved 2016-04-15.
  30. http://ddp-ext.worldbank.org/EdStats/INDprwp08b.pdf
  31. Initiatives, Educational (November 2006), "Student Learning in the Metros" (PDF), Educational Initiatives
  32. Vishnoi, Anubhuti (7 January 2012), "Poor PISA ranks: HRD seeks reason", The Indian Express
  33. Masani, Zareer (27 February 2008). "India still Asia's reluctant tiger". BBC News.
  34. Paul E. Peterson, Ludger Woessmann, Eric A. Hanushek, and Carlos X. Lastra-Anadón (2011) "Are U.S. students ready to compete? The latest on each state's international standing." Education Next 11:4 (Fall): 51–59. http://educationnext.org/are-u-s-students-ready-to-compete/
  35. Eric A. Hanushek, Paul E. Peterson, and Ludger Woessmann (2011) "Teaching math to the talented." Education Next 11, no. 1 (Winter): 10–18. http://educationnext.org/teaching-math-to-the-talented/
  36. Gary W. Phillips (2007) Chance favors the prepared mind: Mathematics and science indicators for comparing states. Washington: American Institutes for Research (14 November); Gary W. Phillips (2009) The Second Derivative: International Benchmarks in Mathematics For U.S. States and School Districts. Washington, DC: American Institutes for Research (June).
  37. "PISA Mathematics: A Teacher's Guide" (PDF).
  38. Loveless, Tom. "International Tests Are Not All the Same". Brookings Institution.
  39. quoted in Valerie Strauss, "How poverty affected U.S. PISA scores", The Washington Post, December 9, 2010.
  40. "Stratifying PISA scores by poverty rates suggests imitating Finland is not necessarily the way to go for US schools". Simply Statistics. 23 August 2013.
  41. "Child poverty statistics: how the UK compares to other countries", The Guardian. The same UNICEF figures were used by Riddile.
  42. Highlights From PISA 2009, Table 3.
  43. See, Martin Carnoy and Richard Rothstein, "What do international tests really show about U.S. student performance?", Economic Policy Institute, January 28, 2013.
  44. Valerie Strauss, "U.S. scores on international test lowered by sampling error: report", Washington Post, January 15, 2013.
  45. Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", Economic Policy Institute, January 28, 2013
  46. Schleicher was quoted by Amanda Ripley to this effect in her 2011 book, The Smartest Kids in The World (Simon and Schuster).
  47. Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", EPI, January 28, 2013. Another scholar, Matthew di Carlo of the Albert Shanker Institute, criticized PISA for reporting its results in the form of national rankings, since rankings can give a misleading impression that differences between countries' scores are far larger than is actually the case. Di Carlo also faulted PISA's methodology for disregarding factors such as margin of error. See Matthew di Carlo, "Pisa For Our Time: A Balanced Look", Albert Shanker Institute website, January 10, 2011.
  48. Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", EPI, January 28, 2013.

Further reading

Official websites and reports

  • OECD/PISA website
    • OECD (1999): Measuring Student Knowledge and Skills. A New Framework for Assessment. Paris: OECD, ISBN 92-64-17053-7
    • OECD (2001): Knowledge and Skills for Life. First Results from the OECD Programme for International Student Assessment (PISA) 2000.
    • OECD (2003a): The PISA 2003 Assessment Framework. Mathematics, Reading, Science and Problem Solving Knowledge and Skills. Paris: OECD, ISBN 978-92-64-10172-2
    • OECD (2004a): Learning for Tomorrow's World. First Results from PISA 2003. Paris: OECD, ISBN 978-92-64-00724-6
    • OECD (2004b): Problem Solving for Tomorrow's World. First Measures of Cross-Curricular Competencies from PISA 2003. Paris: OECD, ISBN 978-92-64-00642-3
    • OECD (2005): PISA 2003 Technical Report. Paris: OECD, ISBN 978-92-64-01053-6
    • OECD (2007): Science Competencies for Tomorrow's World: Results from PISA 2006
    • OECD (2014): PISA 2012 results: Creative problem solving: Students’ skills in tackling real-life problems (Volume V)

Reception and political consequences

  • A. P. Jakobi, K. Martens: Diffusion durch internationale Organisationen: Die Bildungspolitik der OECD. In: K. Holzinger, H. Jörgens, C. Knill: Transfer, Diffusion und Konvergenz von Politiken. VS Verlag für Sozialwissenschaften, 2007.

France

  • N. Mons, X. Pons: The reception and use of Pisa in France.

Germany

  • E. Bulmahn [then federal secretary of education]: PISA: the consequences for Germany. OECD observer, no. 231/232, May 2002. pp. 33–34.
  • H. Ertl: Educational Standards and the Changing Discourse on Education: The Reception and Consequences of the PISA Study in Germany. Oxford Review of Education, v 32 n 5 pp 619–634 Nov 2006.

United Kingdom

  • S. Grek, M. Lawn, J. Ozga: Study on the Use and Circulation of PISA in Scotland.

Books

  • H. Brügelmann: Vermessene Schulen - standardisierte Schüler. Beltz-Verlag, Weinheim (in German; English summary: https://www.academia.edu/15203894/Evidence-Based_Pedagogy ).
  • S. Hopmann, G. Brinek, M. Retzl (eds.): PISA zufolge PISA. PISA According to PISA. LIT-Verlag, Wien 2007, ISBN 3-8258-0946-3 (partly in German, partly in English)
  • T. Jahnke, W. Meyerhöfer (eds.): PISA & Co – Kritik eines Programms. Franzbecker, Hildesheim 2007 (2nd edn.), ISBN 978-3-88120-464-4 (in German)
  • R. Münch: Globale Eliten, lokale Autoritäten: Bildung und Wissenschaft unter dem Regime von PISA, McKinsey & Co. Frankfurt am Main : Suhrkamp, 2009. ISBN 978-3-518-12560-1 (in German)
