Fixing UTME at scale: What JAMB must do before the 2026 exams

Every year, Nigeria conducts one of the largest standardised examinations in the world. Close to two million candidates sit the Unified Tertiary Matriculation Examination (UTME) simultaneously across hundreds of computer-based testing centres nationwide. By sheer logistical scale alone, it is an impressive undertaking. By engineering standards, it is a system operating well beyond its design limits, and the consequences fall entirely on the young people it is supposed to serve.
The 2025 UTME crisis made this undeniable. A software glitch compromised the exam experience of nearly 380,000 candidates. The cause was a missed server update. Most centres applied the latest patch, but 157 centres, predominantly in Lagos and Nigeria’s South-East, were left running older software that could not correctly score the shuffled exam questions. The result was frozen screens, mid-exam logouts, corrupted scores, and a national outcry that lasted weeks. JAMB Registrar Professor Ishaq Oloyede took public responsibility, apologised, and offered free resits to affected candidates. That accountability was commendable. But accountability after failure is not a substitute for systems designed to prevent it.
What is less discussed is that this was not an isolated incident. Candidates have struggled with power outages, system crashes, and login errors across multiple exam years, including in 2015, 2023, 2024, and 2025. The 2026 mock exam again sparked anger as candidates and parents reported glitches and power outages at multiple centres. The pattern suggests not bad luck but structural underinvestment in the reliability of the platform itself.
Despite a 2.7 billion naira budget for CBT infrastructure in 2024, critical gaps in network architecture were left unaddressed, including the fact that the entire South-East region routes its exam sessions through a server cluster in Lagos, nearly 1,000 kilometres away, with no local failover. That single architectural choice meant that when the Lagos servers had problems in 2025, nearly 200,000 candidates in the South-East lost their sessions along with them.

This is not primarily a technology problem. It is a governance and procurement problem wearing the clothes of technology. Organisations that build mission-critical systems, such as banks, telecoms, and hospitals, invest in redundancy not because failure is likely, but because the cost of failure to their users is too high to accept. JAMB’s CBT platform is as mission-critical as any system in Nigeria: it controls access to higher education for two million young people each year. Yet it is procured and managed with the tolerance for failure of a low-stakes internal tool.
The human cost deserves naming directly. A 19-year-old named Timilehin Faith Opesusi reportedly died by suicide after receiving a score her family believes did not reflect her ability — a score later suspected to be among those corrupted by the glitch. Behind the statistics of 380,000 affected candidates are individual lives, students who spent months preparing, families who sacrificed to fund registration and transport, young people whose futures are structured around a single number generated by software that was not updated correctly. That is an unacceptable governance outcome, and it demands more than an apology. What would it take to fix this properly?
Three things, specifically. First, every CBT centre must operate a local exam server that caches each candidate’s session on login. If connectivity to the national server is lost mid-exam, the session continues locally and syncs when it is restored. This is how resilient offline-first systems have been built for years, in banking, in healthcare, in the very edtech platforms serving Nigerian students outside JAMB. There is no technical barrier to doing it in JAMB’s infrastructure. There is only a procurement and engineering prioritisation barrier.
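The offline-first pattern described here is simple enough to sketch. The Python fragment below is illustrative only (the `ExamSession` class and `remote_send` callback are hypothetical names, not JAMB's actual system): every answer is written to local storage first, so the session survives a dropped connection, and a sync queue drains to the national server whenever it becomes reachable again.

```python
from collections import deque

class ExamSession:
    """Illustrative offline-first exam session (hypothetical design).

    Answers are committed locally before any network call, so a lost
    connection to the national server never interrupts the candidate.
    Unacknowledged answers queue up and sync once connectivity returns.
    """

    def __init__(self, candidate_id, remote_send):
        self.candidate_id = candidate_id
        self.remote_send = remote_send   # callable; raises ConnectionError when offline
        self.local_store = {}            # stand-in for the centre's on-disk store
        self.pending = deque()           # answers not yet acknowledged remotely

    def record_answer(self, question_id, answer):
        # Local write first: the session continues even if the network is down.
        self.local_store[question_id] = answer
        self.pending.append((question_id, answer))
        self.try_sync()

    def try_sync(self):
        # Drain the queue while the national server is reachable.
        while self.pending:
            question_id, answer = self.pending[0]
            try:
                self.remote_send(self.candidate_id, question_id, answer)
            except ConnectionError:
                return  # stay offline; the candidate keeps working locally
            self.pending.popleft()
```

The design choice that matters is the ordering: the local write happens unconditionally, and the network is treated as an optimisation, not a dependency.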
Second, exam session state must save continuously — every 30 seconds — to local storage. A candidate whose computer restarts, freezes, or loses power should resume from their last saved answer, not lose everything and face the choice between starting over or going home. This is a design decision, not a hardware requirement. Platforms serving hundreds of thousands of Nigerian learners already implement it. JAMB should.
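That design decision can be made concrete in a few lines. This sketch (hypothetical file layout and state shape, not JAMB's implementation) checkpoints session state atomically, so a crash or power loss mid-write can never corrupt the last good save, and a restarted machine simply resumes from it:

```python
import json
import os
import tempfile

SAVE_INTERVAL_SECONDS = 30  # the cadence argued for above

def save_state(path, answers, last_question):
    """Atomically checkpoint session state to local storage.

    Write to a temp file in the same directory, then rename over the
    target: os.replace is atomic, so a crash mid-write leaves the
    previous checkpoint intact rather than a half-written file.
    """
    state = {"answers": answers, "last_question": last_question}
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(state, f)
    os.replace(tmp, path)

def load_state(path):
    """Resume from the last checkpoint; start fresh if none exists."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return {"answers": {}, "last_question": None}
```

On restart, the client calls `load_state` and places the candidate at `last_question` with all prior answers intact.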
Third, JAMB must invest in automated deployment verification. The 2025 glitch was triggered because a software update was not confirmed across all centres before exam day. Any modern release management system would flag this automatically. A deployment to 700+ centres should not proceed until every single centre has confirmed the update was received and validated. This is not an advanced capability, but a standard release engineering practice. Nigeria has the engineering talent to build all of this. Several Nigerian-founded edtech platforms currently operate at scale, with better uptime, more resilient infrastructure, and more candidate-centred design than JAMB’s CBT system. The gap is not capability. It is institutional will.
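The deployment gate described in that third fix amounts to a small amount of logic. In this sketch (hypothetical names; a real pipeline would collect signed build reports from each centre), rollout is blocked unless every single centre has confirmed the expected build:

```python
def verify_rollout(expected_checksum, centre_reports):
    """Gate a deployment on confirmation from every centre.

    centre_reports maps centre_id -> the build checksum that centre
    reported after applying the update, or None if it never reported.
    Returns (ok, failures): ok is True only when all centres match.
    """
    failures = [centre_id
                for centre_id, reported in centre_reports.items()
                if reported != expected_checksum]
    return (len(failures) == 0, sorted(failures))
```

The point is not the code but the policy it encodes: exam day does not proceed on a deployment that any centre has failed to confirm, which is exactly the check that would have caught the 157 un-updated centres in 2025.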
JAMB has done important work: the shift from paper to digital, the crackdown on miracle centres, the introduction of CCTV and score randomisation. These are genuine contributions to examination integrity. But integrity of process and integrity of infrastructure are both necessary. One without the other fails the students it is supposed to protect.
Two million candidates a year cannot afford for the technology to be an afterthought.
Oluwasegun Ige is Engineering Lead and Head of Operations at Class54, a Nigerian edtech platform, and Tech Lead at BudgIT, a civic technology organisation.
