When we consider bias in the design of technology (and much else besides, as we’ll come to), there is no hiding place, because technology is by its nature specific and binary: built from 1s and 0s, literal in its interfaces and unambiguous in its functionality. This has an important and intractable impact on how fairly the users of software are treated.
It’s not that digital tech is any more or less morally pure than the humans who design and build it; it’s that what may be implicit and temporary in the behaviour of humans becomes explicit and permanent in the experience of technology.
The teenager who felt excluded from their peer group in the 1980s had to deal with insinuation, in-the-moment comments and social exclusion; the excluded teenager in the 2020s has to deal with the specifics of no likes on their social media posts, snide comments from peers that last indefinitely and a real-time count (with associated peer league table) of friends, connections and followers. That teenager can return to the digital footprints of their exclusion at any time and feel that rejection just as acutely as they did the first time they encountered it.
A UK visa applicant in the 1990s may have had a feeling that their application wasn’t being dealt with fairly because of the colour of their skin or their nationality; an applicant in 2020, however, doesn’t need to rely on feelings, because they can be objectively sure that nationalistic racism has been explicitly programmed into the online application process. Chai Patel, legal policy director of the Joint Council for the Welfare of Immigrants, summarised this state of affairs accurately: “This streaming tool took decades of institutionally racist practices, such as targeting particular nationalities for immigration raids, and turned them into software”.
Software, therefore, strips away the ambiguity, innuendo and insinuation of human bias and makes it unequivocal, categorical and definitive. Left unchecked, this means that the prejudice and partiality of the humans who design and build software get baked into the product by default.
From a commercial perspective this makes software ineffective.
From an ethical perspective this makes software unfair.
And from a diversity, equity and inclusion perspective this makes software discriminatory.
Ultimately, biased design is bad design, predominantly because it commits the cardinal sin of designing for the designer and people like the designer, not the user. Unless they upset this status quo, the designer has no feedback loop to make them aware of this, because their peers like the design, their friends and family like the design and people within their demographic like the design. So, in their little cocoon, they think they’ve done a good job.
It’s only when they invest time in understanding people not like them that they can truly understand the effectiveness or otherwise of their work.
Examples of designers designing for people like them are legion:
Cities are (inadvertently) designed for six-foot-tall men despite the fact that lots of people who aren’t six-foot-tall men need to use them. This is great news if you are a six-foot man, but less good news if you are a child, a pregnant woman, infirm or have a disability. This focus on athletically-imagined and perfectly-sculpted men influenced the height of door handles, the scale of stairs and the size of blocks. The Matrix Feminist Design Cooperative launched in the 1980s to challenge city architects to consider the needs of those with prams and shopping trolleys and those who want to navigate underpasses and subways with a feeling of safety.
Crash test dummies are based on male physiology, meaning that for decades women have been more likely to be killed in car crashes. Shamefully, female front passengers are 17% more likely to be killed in a car crash than a male occupant of the same age, and females wearing seatbelts have a 73% greater chance of being seriously injured in a frontal car crash than a male equivalent. Female crash test dummies were developed as far back as 1966; however, they were just ‘scaled-down men’ and thus weren’t anatomically accurate.
Police stab vests and body armour favour men, to the extent that some female police officers have had breast-reduction surgery in order to wear them. When Jackie Smithies made her procedure public, 700 other officers contacted her within six months to seek guidance and support. They were concerned that they were being bruised by their kit belts, that there was no space for their breasts and that the vests came up too short, leaving key parts of their bodies exposed. Anatomical differences in chests, hips and thighs were barely considered in the design and deployment of the kit.
Voice recognition devices understand white American males better than any other group. A 92% accuracy rate for white American males falls to 79% for white American females and to 69% for mixed-race American females. This happens because the products are designed predominantly by white American males who haven’t escaped their cocoon, and even the 13.5% of the machine learning workforce who are female are swimming against the tide.
The COVID-necessitated A Level grading algorithm introduced by the UK government in 2020 effectively graded the school, not the student, meaning that posh yet unintelligent Jemima got her marks increased whereas poor but smart James got his marks decreased. Critical lobbying led by the excellent Foxglove, married with some common sense, forced the government to reconsider and scrap the algorithm.
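The mechanism behind that outcome can be caricatured in a few lines of code. This is a deliberately simplified sketch (not Ofqual’s actual model, which had many more moving parts), and the function name and data are invented for illustration: a student’s grade is read off their school’s historical grade distribution at the student’s rank position, so the individual’s own work barely registers.

```python
# Toy illustration of "grading the school, not the student": the grade
# comes from the school's past results, indexed by the student's rank.

def algorithmic_grade(student_rank, class_size, school_history):
    """Assign the grade the school 'usually' achieves at this rank percentile.

    school_history: past grades, best first, e.g. ["A", "B", "B", "C"].
    student_rank: 0 = top of the current class.
    """
    percentile = student_rank / class_size
    index = min(int(percentile * len(school_history)), len(school_history) - 1)
    return school_history[index]

# A top student at a school with a weak track record is capped by that record:
weak_school = ["B", "C", "C", "D", "D", "E"]
print(algorithmic_grade(0, 30, weak_school))     # prints "B", however brilliant the student

# ...while a middling student at a historically strong school is lifted by it:
strong_school = ["A*", "A", "A", "A", "B", "B"]
print(algorithmic_grade(15, 30, strong_school))  # prints "A"
```

The sketch makes the unfairness mechanical and visible: nothing the individual student did appears anywhere in the calculation.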
One could go on, but you get the picture. Left unchallenged, design can prop up privilege and reinforce discrimination. We see that truth play out in physical products, in service delivery, in algorithms and in software.
Those of us who make our living from software design need to do much, much better.
Average males are not average humans.
Average Caucasians are not average humans.
Average middle–class people are not average humans.
Average designers are not average humans.
Average programmers are not average humans.
It feels fitting that the last word should go to the Matrix Feminist Design Cooperative: “Consciously or otherwise, designers work in accordance with a set of ideas about how society operates, who or what is valued, who does what and who goes where. The question is who gets included, whose values we prioritise, and what kind of world we want to create.”