Mutant algorithms and exam results – what a state we’re in…

Posted on 2nd December 2021

I was intrigued to read on Sky News this week that the government “has launched an algorithmic transparency standard amid concerns that biased algorithms are impacting the way Britons are treated by the state.”

There are several schools of thought on this. One, possibly a bit of a conspiracy theory, is that the government is deliberately using the country’s citizens to test whether its algorithms actually work. Another is that it’s more cock-up than conspiracy, with the government trying its best but not actually succeeding. A good example of the latter is last year’s exam results being predicted by algorithms.

As this article from Wired shows, a lot of effort went into getting this right. Ofqual produced a 317-page report setting out all the reasoning behind the plan, but at the end of the day it was clear that they got it wrong. Basically, the results individual pupils received were based as much on the school they attended as on their innate intelligence, wit and sagacity. Thousands were not chuffed to find their exam results downgraded, and eventually the government had to resort to using teachers’ predicted grades, which led, unsurprisingly, to more grade inflation and thus caused problems for universities’ admissions departments as they tried to separate the wheat from the chaff.

Boris Johnson didn’t help when he called it a “mutant algorithm.” I have this image of a rogue computer program with evil powers seeking to wreck teenagers’ aspirations, but in reality Ofqual’s program did what it was designed to do. The problem was that its projections were, of necessity, based on the past. That means the predicted future, by default, looked like the past. And as we know, while history is important, the past does not always predict the future...
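To see why that matters, here is a deliberately simplified sketch in Python of what happens when this year’s grades are anchored to a school’s historical grade distribution and a teacher-supplied rank order. To be clear, this is not Ofqual’s actual model; the function, grade shares and pupil names are all invented for illustration.

# A deliberately simplified sketch (not Ofqual's actual model) of grading
# anchored to a school's history. All names and numbers are invented.

def predict_grades(historical_distribution, ranked_pupils):
    """Map each pupil's rank percentile onto the school's historical
    grade distribution, best grade first."""
    # Build cumulative grade boundaries from the historical shares.
    boundaries = []
    cumulative = 0.0
    for grade, share in historical_distribution.items():
        cumulative += share
        boundaries.append((cumulative, grade))

    results = {}
    n = len(ranked_pupils)
    for rank, pupil in enumerate(ranked_pupils):
        percentile = (rank + 0.5) / n  # midpoint percentile of this pupil
        for cum, grade in boundaries:
            if percentile <= cum:
                results[pupil] = grade
                break
        else:
            # Guard against float rounding: fall back to the lowest grade.
            results[pupil] = boundaries[-1][1]
    return results

# A school where, historically, only 10% of pupils achieved an A.
school_history = {"A": 0.10, "B": 0.30, "C": 0.40, "D": 0.20}
pupils = ["Asha", "Ben", "Cara", "Dev", "Eve",
          "Finn", "Gita", "Hugo", "Iris", "Jack"]

print(predict_grades(school_history, pupils))
# {'Asha': 'A', 'Ben': 'B', 'Cara': 'B', 'Dev': 'B', 'Eve': 'C', ...}

Run it and only the single top-ranked pupil can receive an A, however strong the cohort actually is: the ceiling is set by what the school achieved in the past, not by the pupils sitting the exams. That, in miniature, is why so many individual results were downgraded.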

It’s not just education where the public sector gets this wrong. Privacy campaigners are complaining about the databases used to predict whether children might get involved in crime. This year, Sky News tells us, a report by Big Brother Watch on “the hidden algorithms shaping Britain's welfare state” claimed that its investigation demonstrated “that many predictive systems are no more than glorified poverty profiling, working to datify the long-held prejudices authorities in society hold against the poorest.” Moreover,

"A common thread across all these automated systems is a lack of due attention paid by councils to the serious risks of bias and indirect discrimination."

As a result, and following a review into bias in algorithmic decision-making, a new algorithmic transparency standard has been developed by the Cabinet Office's Central Digital and Data Office, with help from the Centre for Data Ethics and Innovation (CDEI). They are, though, somewhat late to this particular party. In 2017, a review of algorithmic accountability, published on ResearchGate, examined the extent to which government in the USA allowed its algorithms to be made public. It concluded that there were “inconsistencies in government policies and practices related to algorithmic disclosure” and suggested “a need for better mechanisms to hold government algorithms accountable.”

In the UK, the CDEI recommended "that the UK government should place a mandatory transparency obligation on public sector organisations using algorithms to support significant decisions affecting individuals." This means the public sector will need to make plain not just which algorithms it is using but also how they work. That, in my view, can only be a good thing, especially if you are waiting for your exam results…

Scott Bentley, Be-IT

Posted in News, Opinion

