Technology can no longer be considered ethically neutral by default

A remark widely attributed to Einstein observes that "it has become appallingly obvious that our technology has exceeded our humanity".

If only we had listened more closely.

It is difficult to believe, or even remember, but the arrival of Web 2.0 and its early-stage blogging and sharing platforms in the mid-2000s was greeted with optimism and fanfare. We weren't to know that by the Web 3.0 era of the early 2020s, communities of interest would harden into political polarisation, citizen journalism would curdle into misinformation and fake news, and the read-write web would descend into cyberbullying and uncorroborated conspiracy theories.

Global tech has evolved in this unwelcome direction because of an intoxicating mix of power, money and influence, a cocktail from which good things rarely emerge. The state of the industry in the early part of the 21st century is no exception.

I have long held the view that technology is by its nature ethically neutral; that it is, in and of itself, neither good nor bad, but can be used for good or bad purposes. This position didn't emerge from reviewing evidence and analysing data; it simply settled in as the default perspective, a holding pattern, if you will, until a more informed view might emerge.

I have recently changed my mind. I now believe we can create a better digital world only if we start from the presumption that all technical advances are morally and socially negative, and that they become good for humankind only when humanity is baked into the design and development process, and into the commercial model which underpins the innovation.

When technology makes money from hatred, division and rage, and is incentivised to keep users hooked by any means necessary, only bad things can happen. Bad things like those we see in the news constantly:

In Ethiopia, an academic is leading a $2bn lawsuit against Meta, alleging that its algorithms spread hate and violence during a civil war.

In 2020, Facebook apologised for its role in Sri Lanka's 2018 anti-Muslim riots, where incendiary content spread unchecked, fuelling the violence and contributing to the declaration of a state of emergency.

In the UK, a mother is distraught at how her 15-year-old son has been radicalised by the toxic masculinity of Andrew Tate, promoted and amplified on social media. His teachers concur that they have observed a marked rise in the number of children, particularly boys, sharing his pernicious poison without shame.

In 2017, 14-year-old Molly Russell ended her life after viewing suicide and self-harm content online. The mealy-mouthed, sickly, PR-led responses from major social networks are vomit-inducing to all right-thinking people.

Rather than redoubling its efforts to protect vulnerable people like Molly Russell, Twitter has disbanded its Trust and Safety Council, a body of around 100 independent organisations, including the Samaritans and the UK Safer Internet Centre, which advised it on self-harm, child abuse and hate speech.

Perhaps it is too busy dealing with the airing of its dirty laundry by Bari Weiss, founder and editor of The Free Press, in the so-called "Twitter Files", which show how a small group of people held huge sway over speech and reach.

Not to be left out in the cold, crypto is of course playing its part, first with the plunge in value of its various coins and tokens, and then with the arrest in the Bahamas of FTX boss Sam Bankman-Fried, following what US authorities called "one of the biggest financial frauds in US history".

Digital product ethics and impact are clearly not confined to the digital world; their reach extends all over the globe at personal, social and governmental levels. The only beneficiaries of these endless, unedifying sagas appear to be social network shareholders and online bullies.

We have reached a stage where the ethical challenges posed by technology are so great that the identification of societal impact, personal risk to users and unintended consequences should be part of the discovery and research phase of all software design. The findings of this work can define ethical lines which the product must not cross.

This may or may not make a positive impact, but at least it ensures that the perpetrators of the worst offences cannot claim ignorance when the inevitable happens.

American writer and editor Stewart Brand has been pondering long-term thinking for over half a century. In 2003 he founded SALT (Seminars About Long-term Thinking), whose contributors include luminaries such as Kevin Kelly, Sam Harris and Brian Eno, and he has summed up the challenge perfectly: "Once a new technology rolls over you, if you're not part of the steamroller, you're part of the road".

When I think of the promise and optimism of teenage Web 2.0, and how it is evolving into adult Web 3.0, I am reminded of the Del Amitri song "When You Were Young":

So look into the mirror
Do you recognise someone?
Is it who you always hoped you would become
When you were young

We probably all sit somewhere on the spectrum between young, naïve, idealistic teenagers and rich, cynical old guys at the country club bar, waving our fists at clouds and wondering where it all went right. While we learn with age that the answers which satisfied us in our teenage years no longer do, we do well to cling on to the best of that idealism when the realities of life kick in.

The market may have no soul, but product designers do.  We need to bring it to our profession like never before.

By Gareth Dunlop

Gareth formed Fathom in 2011 and has been in the business of design performance for over two decades.
