English Language Testing (ELT) is a rapidly growing, 'booming' sector, driven by learners who need to develop their language skills for work, higher education entry, settlement, or pleasure. Disruption is under way, with exam owners attempting innovative digital delivery and establishing new distribution channels for English language learning.
While ELT is well known, it is also controversial. The sector witnessed a 'bust' with the 2014 UK Panorama TV investigation, which exposed severe malpractice at ELT test centres. In 2019, a report by the UK government's National Audit Office (NAO) severely criticised both the test owner and the supplier.
Reuters values the sector at US$9.9 billion, rising to US$22 billion by 2024. This post looks at the challenges awaiting the next generation of ELT solutions, and at how the sector is attempting to meet that demand.
The Bust There was definite cheating on English language tests. The NAO's final 2019 report concluded that the evidence "...strongly suggests that there was widespread abuse of the student visa system" and that "it is reasonable to conclude there was cheating on a large scale – because of the unusual distribution of marks, and high numbers of invalid tests in test centres successfully prosecuted for cheating".
Twenty-five people were convicted of criminal offences for organising the cheating. "Organised crime groups" were blamed, with the report citing one crime ring that controlled three non-compliant colleges in Manchester:
"Evidence from unannounced visits, computers, and documents proved the fraud had taken place. The investigation found the crime group had used a set of proxy test-takers to take the [English] speaking test on behalf of people willing to pay. The group was making considerable income from the fraud, charging around £750 for each [test], which normally costs £180."
600 centres. 25,000 candidates. A roughly 300% mark-up on the fraudulent tests, and £21 million spent by the government on... getting a result (pun intended). So what on earth happened?
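As a quick sanity check on the mark-up quoted above: the prices come from the NAO report excerpt, and the percentage is simple arithmetic.

```python
# Sanity check on the mark-up quoted above, using the prices from the
# NAO report excerpt: the crime group charged ~£750 for a test that
# normally costs £180.
fraud_price = 750   # £ charged per proxy-taken test
list_price = 180    # £ normal cost of the test

markup = (fraud_price - list_price) / list_price
print(f"Mark-up: {markup:.0%}")  # prints "Mark-up: 317%", roughly the "300%" quoted
```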
Licence Revoked At the height of the problems, the licences of over 600 centres were revoked or suspended. What was the centres' motivation? And what's changed, if anything?
Gaining and maintaining a reputation for successful student outcomes is key for a sustainable training organisation. Almost 100 institutions had their licence to sponsor foreign students suspended, and of those, 89 ultimately lost their licence. Was this fair?
On one hand, the immigration inspector found that the Home Office investigation of colleges had been "handled well". On the other, the colleges complained that they had only taken the students on because those students had passed a Home Office-approved test. So, they argued, it was a bit rich to blame them when it was the test itself that was compromised.
And bizarrely, in some cases, according to sector association Universities UK, "the Department had given sponsors a 'clear' audit rating shortly before investigating them again with a view to suspending or revoking their licence".
Under what circumstances does a test owner state that 97% of delivered tests are invalid or questionable? On the face of it, that is an appalling statistic.
The NAO report has given us the numbers. What's striking is that the amount spent on recovering the situation was probably more than the contract itself was worth. Classic horse-has-bolted clichés notwithstanding, if even such an august test owner can't offer assurance and security, how does the sector deal with the fall-out?
Who has been impacted by this?
Exam Programme Impact The NAO report suggests the Home Office "didn't care" whether people were wrongly labelled as cheats: it was "supremely relaxed" about innocent candidates being caught up in the ETS dragnet. It took two years before the Home Office commissioned its own quality assurance of ETS's system for identifying candidates involved in malpractice.
The report also claims that the NAO, “...saw no evidence that the Home Office considered whether ETS had misclassified individuals or looked for anomalies…It had not investigated the reasons why people with invalid scores had low marks, won appeals, or gained leave to remain…”
The NAO, being more curious, burrowed into the evidence and found that ETS had treated at least 3,700 written tests as suspicious "...even though the exam marks suggest candidates were not fed the answers...".
It concluded that, “...the Department’s course of action against TOEIC students carried with it the possibility that a proportion of those affected might have been branded as cheats, lost their course fees, and been removed from the UK, without being found guilty of cheating, or given an adequate opportunity to clear their names.”
It's clear from the NAO report that the Home Office spent huge amounts of public money, acted only when prodded, and, when it was already too late, still managed to botch its efforts.
Solution Provider Impact For the test owner, the NAO states that it "...denied people the chance to clear their name." Denied them access to justice, if you prefer a stronger term.
The Home Office took enforcement action against 25,000 people. Those who protested their innocence asked ETS for the information it used to brand them cheats, such as the recording of the suspicious oral exam.
The company’s response was unhelpful: to quote the NAO, "...even when students have had legal representation, they had difficulty obtaining all of their personal data, including the original recordings and other test materials. Some students also had difficulties obtaining the voice clips that were used as evidence against them."
Bindmans, the legal firm representing dozens of TOEIC candidates, told the NAO that ETS had failed to provide information and documents to which the individuals were entitled.
Also, in at least one case, the NAO reported that ETS destroyed the evidence instead of handing it over. So ETS's contribution was, at best, to obfuscate, frustrate, and deny its paying customers (the candidates) access to their own evidence. I'll merely point out that the UK's Crown Prosecution Service has clear guidance on Abuse of Process relating to failing to obtain, losing, or destroying evidence.
Candidate Impact While tens of thousands of people were accused of cheating, the NAO was unimpressed by the investigation process. ETS administered the tests and did try to sort out who had cheated on them: it deemed 34,000 tests invalid, and another 22,000 "questionable".
In total, that represented 97% of all its English exams taken in the UK between 2011 and 2014, as mentioned previously. ETS arrived at that figure with the help of automated voice recognition software; the suspicion was that candidates had hired someone with better English to take their oral exam for them.
The NAO pointedly does not accept the ETS analysis at face value, stating that voice recognition technology had not been used before in this context, and there was no control group to see whether it actually worked. You’d think someone would conduct research on that beforehand, wouldn’t you?
Independent experts who tried to assess ETS’s methods, “...did not know what software had been used, have access to many voice recordings, or know the performance of human verifiers”.
Nor could they explain why, in some cases, the supposed cheating candidate had actually failed the test. The report concludes that, “...it is difficult to estimate accurately how many people may have been wrongly identified [as cheats].”
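To illustrate why the missing control group matters, here is a minimal sketch. The flagged total (34,000 invalid plus 22,000 "questionable") comes from the NAO report; the error rates below are purely hypothetical, since, as the NAO notes, the system's real misclassification rate was never established.

```python
# Illustrative only: how the number of wrongly accused candidates scales
# with the (unknown) misclassification rate of the voice-matching system.
# The flagged total is from the NAO report; the rates are hypothetical.
flagged = 34_000 + 22_000   # invalid + "questionable" tests

for error_rate in (0.01, 0.05, 0.10):
    # If this fraction of the flagged results were honest candidates
    # misclassified as cheats:
    wrongly_accused = int(flagged * error_rate)
    print(f"Error rate {error_rate:.0%}: ~{wrongly_accused:,} wrongly accused")
```

Even a small error rate puts hundreds or thousands of innocent candidates in scope, which is why the absence of any validation study is so central to the NAO's criticism.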
According to NAO analysis of Home Office data, around 3,700 people accused of cheating have won appeals in the First-Tier Tribunal. Astonishingly, the report notes that "...people usually had to appeal on human rights grounds, because they could not appeal the decision directly".
This implies the figure could have been much higher had there been a direct right of appeal. The NAO's assessment was that around 11,400 people caught up in the scandal were not permitted to remain in the UK.
The next part of this post sprints through six sectors that use English testing, ending with a quick case study on a key country.
Academic Admission English language testing for higher education is widespread, most of it for admissions purposes: institutions need to know whether an international student has the language ability necessary to succeed in an academic environment.
The three most dominant tests are TOEFL (ETS), IELTS (Cambridge/British Council/IDP), and PTE Academic (Pearson), with other players such as Telc of Germany and LanguageCert of Greece also in the mix.
TOEIC was initially designed to test English language proficiency for candidates working in an international environment. While it remains in use for job entry by corporates and some government agencies and departments, TOEIC is also used in a number of higher education settings.
Aviation English is the international language of aviation. The International Civil Aviation Organization (ICAO) requires all pilots and air traffic controllers to demonstrate competence in spoken English.
Aviation has distinctive, domain-specific English requirements, such as the ICAO spelling alphabet ('Alfa, Bravo, Charlie...') and a core vocabulary of around 300 key words such as 'roger', 'affirm', and 'approach'. Pilots have their licences endorsed with their English language capability.
The ICAO's Aviation English Language Test Service measures the speaking and listening ability of pilots and controllers. For UK pilots, five main English exams form part of the ICAO's programme: Anglo-Continental's TEAP; the ELPAC 'Level 6 test'; RMIT REALTA; the Mayflower College TEA test; and the Versant Aviation English test.
Military Post-9/11, significant investment has been made in military language training and testing. NATO's Standardization Agreement on language proficiency (STANAG 6001) sets the language standards taught and spoken across NATO members. Proficiency across Listening Comprehension, Speaking, Reading Comprehension, and Writing is codified into six levels, from Nil Proficiency to Highly-Articulate Native.
For example, the Benchmark Advisory Test (BAT) is for NATO's non-native English-speaking military and civilian personnel. The BAT has a computer-adaptive written part of 60 multiple-choice questions within a two-hour test window, plus a speaking part, conducted over the phone and administered by a test centre proctor, within a 40-minute window.
Most US federal government agencies rely on the Defense Language Proficiency Test and the Oral Proficiency Interview, delivered via the Defense Language Institute, the language training wing of the US military.
The UK Defence Centre for Languages and Culture (DCLC) is based within the Defence Academy at Shrivenham. Language training is delivered for military personnel through two in-house language delivery functions: Foreign Language Wing (FLW) and English Language Wing (ELW).
Medical & Healthcare English language testing for entry into medicine and healthcare is relatively well established. The UK's Nursing and Midwifery Council and General Medical Council recognise the Occupational English Test (OET), which is marketed as a specific test of medical English.
This is a sector which has hitherto been the domain of generalist tests such as IELTS, but with Covid-19, many sectors that require rapid staff on-boarding and deployment need alternatives.
Contact Centre There are few studies of spoken language assessment approaches for contact centres, but a common theme is that corporate employers are frustrated by employees who have good TOEIC/IELTS scores yet underperform in the workplace.
The need to assess second-language English-speaking contact centre agents is relatively new. Assessing spoken performance in a specific exchange context requires a good understanding of that context and its target texts, and relies on subject matter expert (SME) input.
Legal English English language testing for the legal sector is a niche area, but it falls into two broad categories: assessment for lawyers (practising in a setting where they need a second language) and for court interpreters (translating and interpreting for speakers of languages other than those used by the courts and police services of the country concerned).
Demand from England-based law firms has driven the development of the Test of Legal English Skills (TOLES) exam suite, offered by Global Legal English (GLE) as three papers up to six times a year.
Other exam owners offer regulated and non-regulated Legal English exams. The DPSI exam, set by the Chartered Institute of Linguists (CIoL), has three pathways: Law, Health, and Local Government.
Interestingly, though, the UK Solicitors Regulation Authority does not yet require candidates to provide separate evidence of English language skills.
China The Chinese Ministry of Education claims there are over 300 million in-country English learners and over 50,000 English-training organisations, or roughly one centre for every 6,000 learners. Data from CI Consulting puts the total market value at over 30 billion yuan (around £4 billion).
These 50,000 training organisations can be broadly segmented into three areas: test preparation for TOEFL, IELTS, or National College Entrance Exam candidates; corporate training for professionals; and school-age training.
The Chinese government's National Education Examinations Authority intends to launch a National English Testing System this year. The incumbent College English Test has only limited speaking assessment. With China's education model extremely exam-oriented, Chinese candidates are usually good at English reading and listening, but their spoken and written English skills are still sub-optimal.
So by volume, China is the largest market for tests of English. For example, ETS claims that TOEFL test numbers have reached 300,000 domestic Chinese candidates per annum.
Summary ELT is controversial, important, high-profile, and rapidly growing, with a very intriguing risk/reward ratio.
Huge amounts of public money have been spent dealing with the fall-out of ELT malpractice, with, at best, limited results. The sector needs to do better at showing and telling how to stop malpractice using the digital exam toolkit, and at customer advocacy.
The sector also needs to wise up on stakeholder impact assessments; good ones are very rare. Linking these to the user stories generated when a programme is being developed is helpful, and aids understanding of the impact our work has on different communities.
And finally: I've pointed out six addressable ELT sectors where digital exams can make a big impact. If you take away anything at all from this post, you'll have some new sectors to explore!