The algorithm undermining Black History Month

One year ago, the #BlackLivesMatter protests over the terrible killing of George Floyd were shining a spotlight on racism. In the parliamentary debate on Black History Month I spoke of the possibility of its eradication: “I look forward to a day when parents will explain racism to their children in the same way that they now explain hanging, drawing and quartering: as a barbaric practice of our past... We want a world in which success is open to all, and Black History Month can help to achieve that by remembering all our history in colour and making racism history.”

A year on, I am not nearly so optimistic. Racial justice seems once again to be a niche interest. And rather than being eradicated, racism is being entrenched – by algorithms.

An algorithm is just a set of instructions that acts on data, entered in a particular format, to make a decision. Increasingly, algorithms control our lives: through Amazon, Facebook, Google and other platforms they tell us what to buy and what to read, whom to hire and fire, whom to vote for and who should get a visa.
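To make that concrete, here is a minimal sketch in Python of an algorithm in this sense: fixed instructions applied to structured data to produce a decision. The rules, thresholds and field names are invented for illustration; no real visa or platform system works exactly like this.

```python
# A toy decision algorithm: fixed rules applied to a structured record.
# Every rule, threshold and field name here is hypothetical.

def visa_decision(applicant):
    """Score an applicant record against fixed rules and return a decision."""
    score = 0
    if applicant["income"] >= 25_000:
        score += 2
    if applicant["years_employed"] >= 3:
        score += 1
    if applicant["has_sponsor"]:
        score += 1
    return "approve" if score >= 3 else "refuse"

print(visa_decision({"income": 30_000, "years_employed": 5, "has_sponsor": False}))  # approve
print(visa_decision({"income": 20_000, "years_employed": 1, "has_sponsor": True}))   # refuse
```

The point is that once rules like these are written down, they are applied automatically, at scale, to everyone – and whoever chose the rules chose the outcomes.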

Critically, algorithms are only as good as their design and the data they are trained on. Software engineers tend to come from a very narrow demographic: few are women, come from ethnic minorities or have working-class backgrounds. Algorithms reflect the limitations of their designers.

And the data algorithms are trained on too often has limitations of its own. Six years ago Google’s photo recognition algorithm identified Black people as gorillas because only white people had been used to train it, while Facebook’s ad-delivery system has been shown to be biased against Black people and women in both recruitment and housing adverts.
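To sketch how the training-data problem works, here is a toy recogniser, with invented numbers, that labels an input by how close it is to the average of its training examples. Trained only on examples from one group, it simply fails to recognise the other.

```python
# A toy "recogniser": label an input by its distance to the average of
# the training examples. The feature vectors are hypothetical stand-ins
# for image features.

def centroid(vectors):
    """Average the training vectors into a single reference point."""
    dims = len(vectors[0])
    return [sum(v[d] for v in vectors) / len(vectors) for d in range(dims)]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Training set: feature vectors drawn from group A only.
training = [[0.9, 0.1], [0.85, 0.15], [0.95, 0.05]]
reference = centroid(training)

def recognise(features, threshold=0.2):
    """Label as 'person' only inputs that resemble the training data."""
    return "person" if distance(features, reference) < threshold else "not recognised"

print(recognise([0.9, 0.1]))  # group A input -> 'person'
print(recognise([0.3, 0.7]))  # group B input -> 'not recognised'
```

Nothing in those instructions mentions race; the bias enters entirely through what the training set leaves out.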

Design rules, oversight and accountability can help protect against such biased outcomes, but right now there are no regulatory requirements and no business incentives to put them in place.

An effective regulatory framework for algorithmic bias requires a government that accepts such bias exists. But the Sewell Report commissioned by the government deliberately denied the overwhelming body of evidence on structural racism, and it sought to apply that denial to tech.

The Sewell Report mentions algorithmic bias but proposes that the solution is to “define fairness mathematically”. Racism is not an equation; it is a lived reality. And whilst computational approaches can produce tools that help identify and mitigate bias, racism cannot be coded out: fairness is a social and political problem first and foremost.
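To show what “defining fairness mathematically” can mean in practice, and how little it settles, here is a minimal sketch of one common criterion, demographic parity: whether two groups receive positive decisions at the same rate. The outcome data is invented.

```python
# Demographic parity: do two groups receive positive decisions at the
# same rate? 1 = approved, 0 = rejected; all outcomes are hypothetical.

def selection_rate(decisions):
    """Fraction of a group that received a positive decision."""
    return sum(decisions) / len(decisions)

group_a = [1, 1, 0, 1, 1, 0, 1, 1]  # 75% approved
group_b = [1, 0, 0, 1, 0, 0, 0, 1]  # 37.5% approved

gap = selection_rate(group_a) - selection_rate(group_b)
print(f"demographic parity gap: {gap:.0%}")  # 38% – the metric flags a disparity
```

A metric like this can flag a disparity, but it cannot say why the disparity exists or what should be done about it. And it is a known result that competing mathematical criteria such as demographic parity, equalised odds and calibration cannot, in general, all be satisfied at once. The equation does not choose; people do.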

I fear that this Government lacks the ability to contextualise racism as a human problem, with human victims, one that requires diverse and resource-intensive human consideration if we are truly to eradicate it.

That lack of understanding applies to the Online Safety Bill currently before Parliament in draft form. The previous secretary of state often said he wanted what is illegal offline to also be illegal online. There are at least four problems with that. Firstly, you cannot separate the world into online and offline as simply as Oliver Dowden believed. Secondly, online drives offline: public acts of racism may have their roots in online material promoted by algorithms, and racist far-right “pile-ons” like those experienced by many of England’s Black footballers are promoted to other racists by platform algorithms. Thirdly, some things are not illegal offline but their harm is industrialised, and therefore amplified, online. And fourthly, why limit ourselves to the failings of the offline world, rather than seek to create a better world online, one which can drive a better world offline?

The Online Safety Bill’s definition of harm is vague, largely left to the platforms, and does not consider algorithms at all. The Bill will be out of date before it is even law.

Tech, I believed, was neutral. That was one of the reasons I wanted to become an engineer and spent over two decades in tech. People could call me names and deny my qualifications, but the code I wrote either worked or it didn’t; the signal processing I designed did the job or it didn’t. It was bias-free, unlike the people around me. Unfortunately, it appears that is no longer the case. And this Government hasn’t even begun to deal with it.