
Prior to pandemic, was England getting worse?

In the latest blog in our series on outline principles for the future of education, Tim Oates CBE, Director of Assessment Research and Development at Cambridge University Press & Assessment, challenges ‘Year Zero’ thinking in many of the discussions about future learning.

In all the comments about the apparent lack of resilience in the education system - its disruption in the face of the pandemic - I think we need to remind ourselves of three things.

Firstly, teachers, pupils and families were extraordinary in their response to the unprecedented level of disruption.

Secondly, to have prepared entirely adequately would have involved an eyewatering level of redundancy in arrangements and sustained preparatory effort.

Thirdly, the fact that we were unable to respond immediately and without huge effort does not mean that what we were doing before was wrong.

This last point is extremely important. There is a lot of ‘Year Zero’ thinking manifest in the discussions of ‘future learning’ at the moment. I want to challenge that, principally because I think we were doing a lot of things right. Let’s look at the question of whether England’s performance was improving or deteriorating (spoiler - it’s been improving in key areas, and we should celebrate and learn from that).

How can we assess education systems?

It is frighteningly hard to know what is happening in an education system in a nation the size of England. I am part of a group of researchers, from around the world, who specialise in looking at system performance. We have qualitative studies of practice in the classroom; we have national qualification and national assessment data; we have the big international surveys, PISA, TIMSS and PIRLS in particular; we have commercial tests and special tests such as the National Reference Test. None of these is infallible; all of them are helpful. But we need to be cautious in using these sources, and the outcomes don’t always line up neatly.

Qualifications data can be misleading for a variety of reasons, but a principal problem of the past - instability in the standards for grading - has been replaced by limitations of the very method which we have used to curb grade inflation: comparable outcomes. The method doesn’t lock down results in the way that norm referencing does, but it does dampen the rise in exam results that should follow a genuine overall improvement in system performance. Qualifications data is useful, and indicative, but with known limitations.

To detect any improvement at system level, Government introduced the National Reference Test in 2017, a useful sample-based assessment of 16-year-olds which gives us more insight into overall performance. It is well managed and is a good additional source of data.

The big international surveys - PISA at age 15, TIMSS at grades 4 and 8, and PIRLS around fourth grade - have various known limitations. Some analysts regard these as fatal, but in our research group we monitor carefully the way in which these surveys are being conducted and feel that although people tend to overinterpret small changes, if a nation has a pronounced trend of deterioration over a number of cycles of these surveys or pronounced improvement - and this lines up with policy action and other indicators - then there’s probably something real going on.

You always have to take time lags into account - and people often don’t. Pupils in PISA are 15, so if you want to know why their score is the way it is, it’s important to look at the previous 10 years of schooling and at the sort of educational provision and experience which they have had over that time. If a new policy on reading is introduced to 8-year-olds, it’s not going to show up in PISA data until seven years later. A failure to understand these time lags lies behind our account of why Finland’s high performance in PISA in 2000 has been almost universally misunderstood. But that’s another story.

What has been happening in England?

Let’s turn to England. If we had been deteriorating over the last decade, we would have been a member of quite a large club: a substantial group of developed nations have experienced significant declines. But we haven’t declined - we have been improving. 

PISA 2018 showed gains in maths in England. The improvements coincide with large-scale intervention and support on maths education, stimulated by Government and ably led by the National Centre for Excellence in the Teaching of Mathematics - including the maths hubs, approved maths materials and the Shanghai exchange. What we have been doing over the last decade in at least 50% of our primary schools has been focussed, evidence-based and well-grounded. And PISA suggests that it is working.

Mathematician Tony Gardiner rightly urges some caution about whether this is transferring to secondary education, since TIMSS data at grade 8 shows an attenuation of the gains we are making in primary. But that’s where the National Reference Test is valuable. In maths it is showing gains in secondary, as per NFER’s excellent analysis.

So…it’s not easy to improve education systems; we can and should congratulate ourselves on the gains made in maths, prior to pandemic. We were doing the right things, and it was working. And Tony G is right…we need to look at how the gains in primary are being carried over into Key Stage 3. 

The reference test also shows a worrying overall stasis in English. No improvement, despite all the effort we have been putting into literacy, particularly reading. And this is replicated in the PISA data. But looking into this in detail gives us pause for thought. We held steady while other key nations were declining, with OECD average scores in literacy down significantly since 2012. And I have looked into the international historical record. There has been protracted and serious decline in reading efficiency in the USA since the 1960s, well before digital media further disrupted reading habits.

So we are doing well by holding steady, and we really should congratulate ourselves on that too, alongside maths. And note that the only improvement in Scotland was in literacy, which we can trace to an initiative on phonics. Dan Willingham’s work endorses the approach we have been taking in literacy, and that too was going well before the pandemic.

Science genuinely is a different story. TIMSS and PISA both show that national attainment is simply flat - we are performing at the same level as we did in the 1990s. And that makes sense: we have no initiatives in primary science equivalent to the support and interventions made in literacy and maths since 2010. No focussed actions; no change.

Many of the high performing systems around the world are small - Hong Kong, Singapore, Estonia. We are big, and improvement is hard. But prior to the pandemic, in key areas, I believe we were doing the right, evidence-based things, and that was showing in elevated outcomes or better relative performance internationally. We have much more to do, not least on variable quality, displayed in our high within-school variation in PISA. If we can shift that, we will consolidate our position amongst the high-equity systems. But let’s not succumb to Year Zero thinking. We were doing good things prior to the pandemic and we need to get back to doing them as soon as we can.


About the author
Tim Oates CBE is Director of Assessment Research and Development at Cambridge University Press & Assessment. Tim, who joined Cambridge Assessment in May 2005, was Head of Research and Statistics at the Qualifications and Curriculum Authority for the best part of a decade. In 2011 he chaired the Expert Panel as part of the Department for Education's National Curriculum Review. Tim was awarded a CBE in the 2015 New Year's Honours for services to education.