
The new Metric Martyrs? Are digital exams really the same as paper ones?

England’s school exam owners are preparing to offer GCSE digital exams and their paper equivalent in parallel. But is that fair? Is one easier than the other? Could we end up with analogue ‘Manuscript Martyrs’, just like Britain’s own Metric Martyrs, campaigning to keep paper exam delivery? Or is it complete bananas to compare the two?


Paper or digital is a choice. That's a good thing, right? Digitising liberates students currently locked out by paper delivery – claimed to be around 10% of candidates. So offering digital to these learners is equitable. But are the questions asked in a digital exam the same as paper ones? Is equal time permitted? How will standardisation and appeals work?


What’s the big deal? Loads of exams give you the choice of paper or computer. Many exam owners, both regulated and not, have offered dual modes for some time, including IELTS and SAT, although they are marked differently with different outcomes. Some have already completely transitioned to digital, such as GMAT and AAT. Even then, many retain paper delivery for locations with weak internet connectivity and/or infrastructure.


But they're the same questions, so how can it be different? In an exam room, it may be possible to see two students sitting the same exam, but in two very different modes. School exam officers potentially have to cope with radically different service encounters and support.


Does exam mode influence results? Broadly yes, but it’s heavily caveated. Essentially, if you’re more used to a computer, you can score better than taking the same exam on paper. If you’ve rarely used a computer, paper gets you a better score. The issue of handwriting legibility, without which papers cannot be marked effectively, is ignored by most commentators.


What does the research say? The research picture is mixed, to say the least. Betty Bergstrom's 1992 meta-study covering 20 studies from 8 research reports found tests were generally comparable, but the mean scores on paper were consistently better than on digital.


Rose Clesham’s 2010 paper considered both modes to be ‘internally reliable’ (i.e. to have consistent outcomes), "...but they’re not always measuring the same thing". Students were found to perform better on paper, yet teachers and students alike thought digital had greater validity.

“A student’s mastery of the exam delivery mode matters.”

In 2016, Steve Graham claimed that the most effective mode is the one that students are proficient with. “A student’s mastery of the exam delivery mode matters.”


Could a digital student get their grade before their ‘paper colleague’? It’s technically possible. I’ve written before about basing job and further study applications on real grades, not ‘halo and horns’ biased guesstimates. If some digital English Language high-stakes exams claim results (for a 4-facet exam) in less than 2 days, that’s a massive change.


What’s the potential fallout in delivering paper and digital exams together? Explaining why digital exams are equivalent to paper is a massive public relations challenge, especially in litigious times. The ‘sausage-making’ of how we get an exam grade is a dry, stats-heavy subject. It bamboozles 99.9% of the population. The equivalence issue needs to be discussed and resolved, but the public wants it simple. Just like an exam grade.

“Analogue stragglers and digital refuseniks could be the new Metric Martyrs, or Manuscript Martyrs, if you prefer.”

Nothing to See Here? Explaining that there’s no cause for concern has uncanny parallels with Britain’s journey to metrication. With a lack of academic consensus and minimal mainstream awareness of dual-mode delivery (or digital exams, to be honest), we could end up with stories about ‘analogue stragglers’ or ‘digital refuseniks’ becoming the education world’s Metric Martyrs, or Manuscript Martyrs, if you prefer. Even though this isn’t a ‘Big Bang’ and more of a ‘sunsetting’. And that’s definitely not bananas.








