
Thursday, 23 February 2017

The data deficit effect

In my latest Sutton Trust blog, how a dearth of data in Scotland propelled a Sutton Trust report onto the front pages.
A funny thing happened with the Sutton Trust’s Global Gaps report a couple of weeks ago. John Jerrim’s excellent look at the differing performance of highly able 15-year-olds from different social backgrounds gained some good – but not spectacular – coverage in the London media.
But on the same day it became the top political news story in Scotland. The report included breakdowns for the four UK nations and the Trust had targeted stories at outlets in each.
The Scottish data was marginally worse than that for England – and, crucially, it showed that science results had dipped significantly over the last ten years – but this was enough to create front-page splashes in some papers and much bigger stories in Scottish editions than in their English counterparts.
Crucially, too, the opposition took the data and ran with it. The two-year gap in performance between poor and better-off teenagers hit a nerve, and fed a narrative that the Scottish government has been failing on education. So much so that both Ruth Davidson, the Scottish Conservative leader, and Kezia Dugdale, the Labour leader, majored on the report at First Minister’s Questions.
That took the story into a second day of front page news and saw the BBC’s Scotland political editor filing a lengthy report for the evening news bulletins. By the time last Thursday’s Question Time was broadcast from Glasgow the story was still fresh enough to warrant a separate discussion.
I’ve been reflecting on why this happened. There were some strong political reasons. Opposition politicians clearly leapt on the report with a vigour long lacking in their London counterparts, and that certainly gave the story more legs than it would have had as a Sutton Trust press release and report alone.
Education is also a much bigger issue in Scotland, not least because Nicola Sturgeon and her education secretary, John Swinney, have made narrowing the attainment gap their big issue for this term, which means that any sign of failure gets seized upon.
But I think another factor is just as important – the data deficit North of the border. I became acutely aware of this when I served last year on the Commission on Widening Access in Scotland. The dearth of data was the main reason I subsequently commissioned researchers at Edinburgh to produce the Access in Scotland report for the Sutton Trust.
At school level, this data deficit is particularly significant. Scotland, along with Wales, scrapped national testing in the mid-2000s; Swinney is now introducing a more rigorous – if controversial – testing system this autumn. The results of abolition were predictably disastrous in Wales, which has been edging back towards testing, and the PISA results suggest Scottish performance slid too.
Potentially, the reintroduction of national testing could do a lot for research into social mobility in Scotland – something the critics of testing often wilfully ignore – as well as helping to ensure that aspirations for able disadvantaged students are suitably stretching.
Combined with the introduction of a Scottish version of the Teaching and Learning Toolkit, currently being developed by the Education Endowment Foundation with Education Scotland, this could have a genuinely beneficial impact on less advantaged pupils’ results.
Contrast the dearth of data in Scotland (and Wales) with its abundance in England. The National Pupil Database is an invaluable resource with the potential to improve social mobility: it shows schools how others succeed in similar circumstances, and, with linkage to other databases including HMRC’s, it allows researchers to measure how well students from different backgrounds progress from the start of school to the workplace.
PISA is useful for its comparability in that respect, but is not sufficient – hence the excitement surrounding our recent report. Gratifying as it was to have such great coverage, I look forward to the day when such data doesn’t cause so much of a stir in Scotland because there is much more data available on the progress of Scottish children – and teachers have the tools to compare their pupils with similar pupils elsewhere in the country.

Thursday, 6 December 2012

Turning the global league tables

Last week’s publication of a new global education league table by the Economist Intelligence Unit and Pearson raised some eyebrows with its claim that the UK’s education system now ranks sixth in the developed world.

After all, on the same day, the Chief Inspector, Sir Michael Wilshaw, was using data from the OECD’s Programme for International Student Assessment (PISA) to make the case that English schools must do better if we are to match global competitors in the future.

The UK scores around average in PISA for reading and mathematical literacy, and a little above average for scientific literacy, based on tests of 15-year-olds. This places UK schools 25th for reading, 28th for maths and 16th for science, out of 65 countries.

Pearson has aggregated this PISA data with results from the Trends in International Mathematics and Science Study (TIMSS), which measures international trends in the mathematics and science achievement of 9- and 13-year-olds, and the Progress in International Reading Literacy Study (PIRLS), which focuses on the reading achievement of 9-year-olds – those in fourth grade.

The Pearson Learning Curve report also includes some UNESCO data to create a ranking of countries that looks at both cognitive skills and educational attainment. For cognitive skills, they use PISA, PIRLS and TIMSS, and for attainment they use UNESCO data on adult literacy and OECD data on graduation rates at the end of secondary school and at university.

PISA and TIMSS/PIRLS measure different things and do so at different ages. The primary difference is that TIMSS/PIRLS looks at what you have been taught in a particular subject and how much you have learnt, whereas PISA looks at what you are able to do with what you have been taught. PISA is more about the practical application of your knowledge.

The Learning Curve report also draws wider conclusions and lessons for education policymakers, including the importance of good teachers and a strong pro-education culture, and the best ways to engage parents.

But how does it reach so different a conclusion from PISA on the comparative health of the UK education system? It does so quite easily, in fact, and it is all in the weightings. And in looking at how it reaches the conclusions it does, there are also lessons on how one should use such league tables.

First, on PISA, the Pearson/EIU index includes fewer countries than PISA does, so removing those countries and using only Hong Kong for China (PISA also includes Shanghai and Macao) lifts the UK four places in reading and science and six places in maths. The new index is more interested in a country’s relationship to the mean than in its ranking, so bunching around the mean in PISA would also reduce ranking differentials.

Second, on PIRLS and TIMSS, England and Wales score significantly better on these tables (some of which exclude other major developed nations such as France, Finland and New Zealand) than on PISA. Although Scotland scores lower, the UK average remains strong. And because TIMSS has both Grade 4 and Grade 8 tests, the combined weight of PIRLS and TIMSS is greater than that given to PISA.

And finally, although the UK is rated 6th overall, it is only ranked 11th for cognitive skills – those measured by PISA, TIMSS and PIRLS – and its higher rating on the overall table owes more to adult literacy and graduation rates.

Add in all these factors, and then look at the weighting given to each in the report. The default weighting for the Index is two-thirds to cognitive skills and one-third to educational attainment. Within the cognitive skills category, the Grade 8 tests account for 60% and the Grade 4 tests for 40%, with reading, maths and science weighted equally.

[Table: Pearson Learning Curve index weightings]


So the PISA reading and maths scores combined, where the UK is weakest, account for only 20% of a country’s ranking, and PISA science for another 6.7%. But because PIRLS and TIMSS are available at both Grade 4 and Grade 8, they are worth 26.7% at Grade 4 and 13.3% at Grade 8, a total of 40%. The HE and adult literacy scores are worth a further 33.3% between them.
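To see where those headline percentages come from, here is a rough back-of-the-envelope calculation, written as a short Python sketch. It simply applies the default weightings described above; the weights are the report’s stated defaults, but the variable names and layout are mine:

    # Implied shares of the overall Learning Curve index, assuming the
    # default weightings: two-thirds cognitive skills, one-third attainment;
    # within cognitive skills, 60% to Grade 8 tests and 40% to Grade 4 tests;
    # reading, maths and science weighted equally at each level.
    COGNITIVE = 2 / 3
    ATTAINMENT = 1 / 3
    GRADE8 = 0.6 * COGNITIVE      # PISA plus TIMSS at Grade 8
    GRADE4 = 0.4 * COGNITIVE      # PIRLS plus TIMSS at Grade 4

    subject = GRADE8 / 3          # each of reading, maths, science at Grade 8

    # At Grade 8, reading is tested only by PISA, while maths and science
    # are split between PISA and TIMSS.
    pisa_reading = subject
    pisa_maths = subject / 2
    pisa_science = subject / 2
    timss_grade8 = pisa_maths + pisa_science   # TIMSS maths and science

    print(f"PISA reading + maths: {100 * (pisa_reading + pisa_maths):.1f}%")  # 20.0%
    print(f"PISA science:         {100 * pisa_science:.1f}%")                 # 6.7%
    print(f"PIRLS/TIMSS Grade 4:  {100 * GRADE4:.1f}%")                       # 26.7%
    print(f"TIMSS Grade 8:        {100 * timss_grade8:.1f}%")                 # 13.3%
    print(f"Attainment measures:  {100 * ATTAINMENT:.1f}%")                   # 33.3%

These five shares sum to 100%, which is why the UK’s relatively weak PISA scores can be offset by its stronger PIRLS, TIMSS and attainment figures.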

Perhaps a bigger issue than the rankings is what the tables do not reveal about education in the UK, particularly the absence from the Pearson/EIU table of any measure of the attainment gap or of social mobility. One recent OECD report, for example, said that only Russia and the Czech Republic had a more socially segregated school system.

The big gaps in GCSE attainment between pupils on free school meals and their peers are another important indicator – other countries have narrower gaps in attainment, as we demonstrated at our social mobility summit in May.

And the rankings should also look at how well countries perform with their most able students. Sutton Trust analysis earlier this year showed that, in maths, just 1.7% of 15-year-olds attained the very highest PISA level (level 6), compared with an OECD average of 3.1%, placing England 26th out of 34 countries.

The new index is an important step forward in consolidating international data. But any such league table is dependent on the quality and range of inputs. As the Economist Intelligence Unit and Pearson develop the index, they should consider adding measures of mobility and the achievements of the most able to give a fuller picture of the success of national education systems.

This posting first appeared at the Sutton Trust blog. It was also quoted by Anne McElvoy on the Economist Blighty blog.

Tuesday, 4 December 2007

Reading behind the international surveys (2)

Coming a week after the PIRLS reading survey, today’s PISA results appear to confirm a trend. It is not, as the more excitable right-wing commentators suggest, that our students’ performance has dipped significantly because of central government policies – in fact, on reading and science, only seven countries score significantly higher than England (though our maths performance is rather less good). Rather, it is that some other countries that have introduced strong reforms are improving faster.

As I argued around PIRLS, the intensity of the strategies around literacy and numeracy (and this is as true at Key Stage 3 – age 14 – as it has been at Key Stage 2 – age 11) was lost between 2001 and 2005. In primary schools, the focus is being re-energised; in secondaries, the requirement that English and Maths results are included in the five good GCSEs recorded in the league tables is starting to have a similar effect (it is noticeable that far more pupils get a grade C or above at GCSE in English than in Maths – see Excel Table 7).

But the lesson of these international surveys is not that we should abandon testing, leave teachers alone or give up. Have a look at the dismal performance of Wales – against which 16 countries do significantly better on reading – to see what that approach brings. It is instead that – as the 2000 and 2001 surveys showed – when government has a clear overall strategy, schools are clear about their priorities and public accountability is strong, pupils will achieve not just at the average level of their international counterparts, but significantly above it.