The computer says “no”

The computer says, “No, Josh from a community comprehensive in Mexborough, you shan’t go to Oxford”. Nor shall you, Amber from Bransholme, Hull’s largest council estate. But the computer says “yes” to Harry from Rishworth and Olivia from Ampleforth. As for grades, the computer says “yes” to downgrading Josh’s and Amber’s, and “yes” to inflating Harry’s and Olivia’s.

The A-level results scandal has exposed a major problem with so-called ‘automated’ decision-making. While the computer spits out an automated result, the algorithms, assumptions and prejudices that feed the automation are all human. And they are riddled with prejudice and bias, deliberate or otherwise. It is plain stupid not to be open about the fact that cultural prejudices are built into every computer algorithm that is written and then used to automate decisions about just about everything.

For example, you could write an algorithm to sift out anyone not ordering an in-flight meal and automatically allocate them the middle seat of the plane. You could decree that all Sheffield Eagles games be played at Featherstone Rovers’ ground. You could have a self-service checkout use the store’s video cameras to estimate customers’ weight and refuse to sell them biscuits. You could have an algorithm sift out all British passport holders and stick them in the longest passport control queues (not that we’ll need that post-Brexit). Or devise one to sift out all bald men and pay them lower wages than those with a fine head of hair, even if it were an especially expensive wig.
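
None of this needs anything clever behind it. A minimal sketch (in Python, with entirely invented field names and thresholds, not any real system) shows how a human prejudice becomes an ‘automated decision’ in a couple of lines:

```python
# Hypothetical illustration: an "automated decision" is just a rule a human wrote down.
# The field names and the weight threshold below are invented for this sketch.

def seat_allocation(passenger):
    # Prejudiced rule from the examples above: no in-flight meal ordered -> middle seat.
    if not passenger.get("ordered_meal", False):
        return "middle seat"
    return "window or aisle"

def biscuit_decision(customer):
    # Prejudiced rule: camera-estimated weight above an arbitrary threshold -> no biscuits.
    if customer.get("estimated_weight_kg", 0) > 90:
        return "refuse sale"
    return "approve sale"

print(seat_allocation({"ordered_meal": False}))        # middle seat
print(biscuit_decision({"estimated_weight_kg": 95}))   # refuse sale
```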

Automated decision-making is meant to be discriminatory. But it doesn’t have to be unfair, unjust or preposterous. The current A-level grade fiasco shows what happens when no ethical assessment is applied to the design of the algorithm. That is an even grosser injustice than rejecting genuine expertise.

The suggestion that this algorithm’s designers rejected input from statisticians who refused to sign five-year non-disclosure agreements fans suspicion that this is not a one-off. Fears that there are links with Cambridge Analytica and Cummings fan anxiety that whoever commissioned the algorithm planned for this outcome, dismissed the prospect of public objections, and expects to employ the same algorithm regardless over the next few years.

Worse, the government must have cynically decided that young people’s objections could be easily disregarded, and that universities facing cuts wouldn’t have the places or resources to accommodate requests for re-appraisal of results. In short, the decision-makers seem to attach as little value to genuine merit as they do to expertise. So far, the post-Brexit failures and covid lies have shown that ministers can get away with whatever they think expedient at the time. However, the bare-faced injustice of the A-level results highlights the government’s contempt for the people on whom the country will depend in future.

It is hard to escape the conclusion that the A-level scam was wholly intentional and is what the government wants. Algorithms are always designed to do something specific.

Who would accept an actual human university admissions tutor applying the algorithm’s ‘socially discriminatory logic’ to decide who should get a place on a particular course at a top university? If Prof X had decided to reject all students from postcodes in deprived areas, or from state schools that had relatively poor results over the past few years, who would have said that was fair? If Prof Z had decided to allocate her department’s quota of places only to students from expensive private schools regardless of their grades, who would have said that was just?

It is implausible that admissions tutors would weigh up the facts, disregard them, and allocate places based only on results manipulated by an algorithm. If the government is allowed to get away with the pretence that the computer got it right, it will have successfully chipped away at the idea that any decisions need to be moderated by a human being.


This is highly risky. Some things require critical analysis and a human being in charge. A human should look at another human being, weigh up the pertinent facts, and then make a reasoned decision based on those facts and relevant information not necessarily captured by computer code.

The public isn’t so gullible as to believe that there wasn’t an actual human behind the algorithm. But the depth of turpitude behind entrenching automated social engineering for the rich and privileged, procured by the rich and privileged, has yet to sink in. Not challenging the A-level results amounts to allowing the government to dismiss the value of education as a path to social mobility and to start allotting jobs to children based on their social class and where they live.

Bright students from the ‘computer says no’ postcodes have been displaced by probably less-able ones from ‘approved’ postcodes. The longer-term detrimental effect of this is already compromising the UK’s reputation as a place of excellence for innovation and research.

The French are incredulous at the idea of letting their government get away with anything as noxious as allocating grades by postcodes or past school performance. Equally preposterous is the idea that last year’s school results will be exactly replicated this year. Humans are not identical. All kinds of personal and systemic factors affect how well a student does, and they too vary from person to person and year to year. An algorithm does not see the individual.
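
As a purely illustrative sketch (in Python, with invented names, grades and rules rather than the actual moderation model), the logic objected to above fits in a few lines: rank this year’s cohort and hand it last year’s grade distribution for the school, so no individual can outperform the school’s history.

```python
# Hypothetical sketch: fit this year's cohort to last year's school-level grades.
# Names, grades and the ranking are invented; this is not the real moderation model.

GRADE_ORDER = ["A*", "A", "B", "C", "D", "E", "U"]  # best to worst

def moderate(ranked_students, last_years_grades):
    """Hand this year's rank order last year's grades, best rank first."""
    history = sorted(last_years_grades, key=GRADE_ORDER.index)               # best grade first
    history = (history + [history[-1]] * len(ranked_students))[:len(ranked_students)]
    return dict(zip(ranked_students, history))

# A school that got B, C, C, D last year caps even its strongest student at a B,
# however well that student would actually have done this year.
print(moderate(["Amber", "Josh", "Sam", "Alex"], ["B", "C", "C", "D"]))
# {'Amber': 'B', 'Josh': 'C', 'Sam': 'C', 'Alex': 'D'}
```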

No admissions tutor would, or should, ever award a place on the basis of which school a prospective student attended, their parentage, wealth, ethnicity, gender, whether they wear green or can stand on one leg for five minutes before wobbling. None of those things predict university attainment.

Yet making suppositions based on class prejudice is still shamefully common. Even today, some public-school-educated students and staff gasp in amazement that a working-class woman from Hull has got into medical school purely on merit from a mere comprehensive/academy. This says everything about wealth and entitlement, and nothing about the hard work that working-class or socially less-privileged students in particular have to put in to get to university in the first place.

In the past, all prospective students got a personal or group interview with the admissions tutor in their chosen department and university. As universities had their budgets steadily cut, personal interviews were dropped, but students were invited to open days and to have a chat with the lecturing staff. Even when most places were allocated without interview, the tutors would look at grades achieved before A-levels, as these were a good predictor of whether someone was likely to do well in their final university exams. School references were helpful but not the final arbiter. If in doubt, students were invited to virtual or face-to-face interviews.

A human made the decision in all cases – not some skewed algorithm that could be programmed to reject anyone from a family drawing universal credit, but accept anyone living in a five-bedroomed detached house in a posh area, regardless of educational attainment.

Is the current algorithm unfit for purpose? From a moral perspective, certainly. But possibly not from the perspective of the political biases that those charged with creating it either chose or were required to insert into it. Social engineering has many faces, and this is perhaps its latest digital one.

So what should be done to rectify the gross injustice foisted on the covid class of 2020? All university applicants whose grades seem artificially inflated or downgraded should be offered virtual interviews with the tutors. Those tutors should then decide whether or not to award a place. Time-consuming, yes. Unfair? No.

If the computer said “no”, the human may just say “yes”.
