Do exam owners declare their energy usage?
- Geoff Chapman
- Oct 10
- 3 min read
I listened to Rita Bateson this week at RM’s Assessment Summit in London, and was struck by how niche a topic energy usage remains for the assessment sector.
How do UK companies report energy use? Since April 2019, the UK’s Streamlined Energy and Carbon Reporting (SECR) policy has required large UK companies to report annually on their ‘all-up’ energy usage and carbon emissions. So I asked the question: should exam owners be mandated to declare their energy usage for test development and delivery?
Is there any research on the carbon footprint of exams? In 2023, England’s exam regulator, Ofqual, tried to estimate the carbon footprint of a GCSE school exam. It was a fair first attempt, but muddled messaging involving washing machine cycles clouded many of the issues, not least the missed opportunity to benchmark against digital exam delivery, which Ofqual has regulated for over 20 years.
Ofqual missed an opportunity to benchmark and compare the carbon footprint of regulated paper and digital exams.
I’d love to be proved wrong, but I’ve yet to find a UK exam owner that assigns its energy usage to each element of the end-to-end test experience. Only studies from AQA and NEBOSH have explored some of the environmental data. I’ve blogged on this issue before: the data is very limited, and few understand the full end-to-end impact.
How does AI impact exam production energy use? As more sector suppliers build use cases for AI-generated questions (even for England’s school SAT tests), and proctoring companies overlay AI systems to prevent malpractice, the energy cost of AI is starting to reach the mainstream.
Rita detailed Ireland’s recent experience, where foreign direct investment has led to 55% of Dublin’s energy going to data centres. While this investment has brought jobs and revenue, there appears to be discomfort around the development of parallel private infrastructure, as well as the extraordinary energy soak concentrated in one specific area.
Does the assessment sector have 'energy blindness'? Rita was right to suggest that there’s a cumulative impact at play: the iterative trialling and production of items across text, image, and video is becoming significant, but mostly invisible to the assessment professional. And the energy cost appears to be leaping.
The cumulative energy cost of producing assessments is becoming significant, but remains invisible to the sector.
I’d also argue that the e-waste ‘black hole’ within digital test delivery is still a taboo subject. While consumers and businesses scramble to extend their Windows 10 support and consign PCs to e-waste, too many take the path of least resistance. Spending someone’s money (exam candidates’?) on brand-new hardware bakes in unsustainable business costs as well as environmental ones. On test development, Rita used an example to give a little colour to its energy impact.

Does video item production use more energy? Wasn’t video supposed to be the saviour of on-screen assessment? Many assessment professionals want more video within item types. Including me! AI can now produce video clips and simulations from text, so is this the breakthrough? Or are we just creating more energy headaches?
Given that the US and the UK have prioritised data centres, in parallel with the movement towards digital sovereignty, perhaps this mainstream investment gives the sector enough comfort that the energy question has been taken care of?
Will AI make exams more energy intensive? If test development practices start to lean more on AI-powered tools, even just to expedite item cloning, then the statistics from Rita’s presentation suggest it will. The growing security layers around remote proctoring, often powered by AI, add to the load. However, with nothing to benchmark against, will this just become an acceptable cost of doing business?
The assessment sector appears to be in denial. Nobody really knows the absolute economic and environmental cost of the entire exam production process, whether paper or digital.
Are digital exams better for the environment? Assertions that ‘digital exams are better for the environment’ don’t (yet) stand up to serious scrutiny. The data simply isn’t there, and exam owners are not currently mandated or motivated to audit and publish complete data sets. The atomisation of many exam programmes into operational silos (e.g. job task analysis, test delivery, psychometric analysis), even with paper delivery, means that understanding the costs is complex and far from transparent.
Thank you to Rita for a helpful and detailed presentation, and to RM Assessment for the kind event invitation.