This week, as thousands of Grade 9 learners sit for the Kenya Junior School Education Assessment (KJSEA), I wish them courage and composure. Behind every examination paper lies a quiet dream: of belonging, of opportunity, of a fair chance.
When I think back to my own school selection years ago, choices were made around kitchen tables; they were imperfect, slow, occasionally unfair, but undeniably human. There was space for conversation, for persuasion, for hope. Today, that space exists where electricity meets the internet: in school labs, cyber cafés, or on borrowed devices. Hope is now submitted through a digital form, though for those offline, the path remains uncertain. In the end, it is the digital divide that decides who gets to submit at all.
From Kitchen Tables to Algorithms
As a data scientist working in education policy, I have seen how technology can bring order and transparency to processes once vulnerable to manipulation. The Ministry of Education’s digital selection platform, selection.education.go.ke, promises to do just that: align student placement with ability, interest, and opportunity.
The Kenya National Examinations Council’s presentation on the process outlines a system that places learners into five clusters: national, extra-county, county, sub-county, and private schools. Placement is based on performance, learner preferences, school capacity, and chosen career pathways (STEM, Social Sciences, or Arts and Sports Science). Learners will select up to 12 schools, balancing aspiration with realism. It is a model designed for inclusivity, equity, and transparency, a digital leap forward.
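To make those mechanics concrete, here is a minimal sketch in Python of what a capacity-constrained, preference-based assignment could look like. It is illustrative only: the learner fields, the school data, and the greedy score-ordered rule are my own assumptions for the sake of the example, not the Ministry's published matching logic, which has not been released in this form.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Set

# Illustrative sketch only: the Ministry's actual matching rules are not public.
# Assumes a simple score-ordered, capacity-constrained walk down each learner's
# ranked choices (up to 12 schools), respecting the chosen career pathway.

@dataclass
class Learner:
    name: str
    score: float                  # composite assessment score
    pathway: str                  # "STEM", "Social Sciences", or "Arts and Sports Science"
    choices: List[str] = field(default_factory=list)  # up to 12 ranked school codes


def place(learners: List[Learner],
          seats: Dict[str, int],
          pathways: Dict[str, Set[str]]) -> Dict[str, Optional[str]]:
    """Assign each learner, in descending score order, to the highest-ranked
    school on their list that still has a seat and offers their pathway.
    Learners with no feasible choice are left unplaced (None) for review."""
    placements: Dict[str, Optional[str]] = {}
    for learner in sorted(learners, key=lambda l: l.score, reverse=True):
        placements[learner.name] = None
        for school in learner.choices[:12]:
            if seats.get(school, 0) > 0 and learner.pathway in pathways.get(school, set()):
                seats[school] -= 1
                placements[learner.name] = school
                break
    return placements
```

Even a toy model like this exposes the value judgments hiding inside the mechanics: the order in which learners are processed, how ties are broken, and what happens to those whose chosen schools are all full are policy decisions, not technical details.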
Yet here lies the tension. The Ministry’s digital platform promises fairness through data. But fairness is not mathematical; it’s moral. And that’s the question at the heart of Kenya’s school placement reform.
Psychometrics measures what test designers define as ability. But ability is shaped by context. A child’s score reflects not just what they know, but what they have been exposed to: qualified teachers, textbooks, electricity, even peace of mind. The algorithm does not see these differences. It only sees the number.
Political theorists have long argued that fairness is not the equal treatment of unequals. True fairness is contextual justice: recognizing structural disadvantages and responding to them. Kenya’s challenge, then, is not only to design a fair algorithm, but to design one that understands inequality.
This year marks a symbolic handover between two eras: the old Kenya Certificate of Primary Education (KCPE) system, where one exam determined destiny, and the new Competency-Based Education era, where continuous assessment and digital selection promise holistic fairness. The promise is noble, but the question still lingers: Can fairness be automated?
Proponents argue that a systematic digital process is better than the opaque, paper-based chaos of the past. They are not wrong, but “better than before” is not the same as “good enough.”
Learning from Past Mistakes
Kenya has seen this story before. When the university funding model shifted to data-driven “banding,” it sparked widespread criticism. Students were sorted by socio-economic status and performance, but the logic of the algorithm remained opaque. Many families felt misclassified, unseen, and unheard.
The lesson is clear: data without dialogue breeds mistrust. As we roll out digital placement for KJSEA learners, transparency must be the foundation. The public deserves to know which data are used, how they are weighted, and what values guide the design.
One practical step forward is back-testing: using historical data from previous placements to simulate how the new algorithm would have performed. This means analyzing prior placement records, performance data, and school choices to measure rates of accurate placement, satisfaction, and retention. KNEC should work with independent researchers and data ethics experts to conduct this testing before full implementation.
Such testing would identify potential bias, flag data gaps, and confirm that the system places learners accurately before the real rollout. It could also reveal what many fear: that the platform might misbehave at the worst possible moment. When systems fail, the cost is not abstract; it is thousands of lives delayed and dreams deferred.
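What might such a back-test look like in practice? Here is a rough sketch, assuming a historical dataset with one row per learner holding a composite score, ranked school choices, the school actually assigned, and a retention flag; the column names and metrics are hypothetical stand-ins, not KNEC's actual records.

```python
import pandas as pd

def backtest(history: pd.DataFrame, simulate) -> dict:
    """Replay a past placement cycle through a candidate algorithm and compare
    it with what actually happened. Assumed columns (hypothetical): 'learner_id',
    'score', 'choices' (ranked list), 'actual_school', and 'retained' (whether
    the learner stayed enrolled after the first year). `simulate` is the
    placement function under test, returning a learner_id -> school mapping."""
    proposed = history["learner_id"].map(simulate(history))
    history = history.assign(proposed=proposed)

    return {
        # How often the candidate algorithm reproduces the historical outcome.
        "agreement_rate": (history["proposed"] == history["actual_school"]).mean(),
        # Share of learners who would have landed one of their top three choices.
        "top3_rate": history.apply(
            lambda row: row["proposed"] in row["choices"][:3], axis=1).mean(),
        # Retention among learners whose historical school matches the proposal,
        # a rough proxy for whether "accurate" placement keeps learners in school.
        "retention_if_matched": history.loc[
            history["proposed"] == history["actual_school"], "retained"].mean(),
    }
```

None of these numbers is decisive on its own, but disaggregated by county, gender, or school category, they are exactly the evidence independent reviewers would need before a full rollout.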
What This Moment Requires
Fairness in education is achieved not by the elegance of the algorithm, but by the integrity of those who design, test, and govern it. The Ministry’s move toward digital placement deserves credit: it curbs corruption, limits political interference, and expands access. Yet it must be anchored in human judgment.
In this new system, psychometrics can illuminate potential, but only people (teachers, parents, communities) can interpret it with empathy. If we combine the power of data with the wisdom of humanity, this could become one of Kenya’s most transformative education reforms. But if we surrender fairness to formulae, we risk creating a new elite of algorithmic privilege.
What does this require in practice? Policymakers must publish the logic of the placement algorithm and let transparency build trust. Civil society organizations should independently monitor placement data for regional and gender equity. Researchers need to conduct back-testing and bias audits before we fully commit to this new system. And parents and teachers must guide learners through the process, ensuring no child is left behind by a lack of information or internet access.
As this year’s KJSEA candidates, the first CBE cohort preparing for senior school placement, complete their exams, the question is simple but profound: will each learner be seen as a datapoint to be sorted, or as a human possibility to be nurtured? The real test isn’t whether the technology works. It’s whether it serves justice. Does it deliver fairness and equity to those historically denied both?
The answer won’t be found in algorithms alone, but in our collective moral imagination: in whether we make data a tool for dismantling inequality, or allow it to become another mechanism for reproducing the exclusions we aim to rectify.
This article was written by Kennedy Kamau, a Data Scientist at the African Population and Health Research Center (APHRC). He works at the intersection of education data, AI, and assessment, exploring how data-driven insights can make school systems more equitable and responsive to learners’ needs.









































































