Tracey Spicer On How To Take Action On AI Being Predisposed To 'Misogyny, Ableism, Homophobia And Ageism'

By Robyn Foyster
27 December 2023

Here is the second in a series of extracts from award-winning journalist Tracey Spicer's book, Man-Made: How the bias of the past is being built into the future. In this extract, Tracey explains how we can take action on AI being predisposed to 'Misogyny, ableism, homophobia and ageism.'

Start a conversation or create an event centred on AI bias. Write emails to your local politicians. Call talkback radio whenever the topic is discussed, and pitch stories to the media. We're always seeking interesting, future-focused content.

There are simple things you can do in your home, car, or workplace. Change the voices of servile assistants to male instead of female. Better still, find a 'gender-neutral' option. Siri now has a non-binary voice. Open the 'Settings' app on your iPhone or iPad and go to 'Siri & Search'. Choose 'Siri Voice' then select 'Voice 5' to switch to gender-neutral.

Apple iPhone 15

When buying a new vehicle, ask questions about the voice controls. What's the accuracy rate for female voices? For people whose first language isn't English? Or those with an accent? Take it a step further and ask about the composition of the upper echelons of the company manufacturing the car. Most are still stuck in the 1950s.

Perhaps you're realising your absent partner knows where you are, what you're doing and the conversations you're having. Ask yourself these questions: why are they so keen for me to use the Alexa? Are the lights, air conditioning and/or heating being turned on and off? What about the time heavy metal music screamed from the smart speaker for no apparent reason? You could be experiencing high-tech gaslighting.

If you work in a women's shelter or law enforcement, be on alert for men using tracking devices like the Apple AirTag. As small as a coin, it can be attached to keys and handbags or slipped into a woman's pocket. Of course, you're probably all too aware of the latest methods being used to stalk women and girls.

Are you reeling after being rejected for a job, credit card or home loan? Don't take it lying down. Ask follow-up questions. Was this decision made by a human or software? If it's the latter, are the algorithms audited regularly for bias? Does the software use historical data to make decisions? Is machine learning involved in this process? What recourse do I have to lodge a complaint, or reapply?

It might be worthwhile contacting the Australian Human Rights Commission or the Banking Ombudsman. Or consider taking legal action. This book is a reminder that we live in a democracy and you have rights. Know where you stand, take a deep breath and push back.

When you receive the results of your medical tests, maybe they donโ€™t seem quite right. Ask your doctor whether the diagnostic machine is as accurate for women as it is for men. Or people with disabilities, Indigenous communities, or transgender populations.

If a clinician recommends a new drug, device or treatment, ask about the science. Are there peer-reviewed studies? If so, what's the gender/race/age make-up of the cohort? For example, much of the international long COVID research is based on a male experience, through the US Department of Veterans Affairs.

With mass automation occurring in the workplace, join a union and agitate to retain people – or at least to retain human oversight. Seek legal recourse if you're sidelined or sacked by the machines. Ask your boss whether these robots will be as accurate, effective and productive as the people they replace. Play devil's advocate: what happens if something goes wrong?

Managers, executives and board members, I hope this book gives you some ideas about the questions you should be asking. Who are our technology suppliers? What's their record on diversity and inclusion? Where does the data come from? What auditing are we going to do in-house? Should bias-testing be done externally for a more robust result? How can we continuously monitor our algorithms to ensure they're up-to-date with society's mores?
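To make that in-house auditing question concrete, here is a minimal sketch of one common check: comparing a model's approval rates across demographic groups, sometimes called a demographic parity or 'four-fifths rule' test. The column names, groups, sample data and 0.8 threshold below are hypothetical illustrations, not any vendor's actual tooling.

```python
# Minimal sketch of an in-house bias audit: compare approval rates
# across demographic groups. All column names, groups and the 0.8
# threshold (the "four-fifths rule") are hypothetical examples.

import pandas as pd

def demographic_parity_report(df, group_col, outcome_col, min_ratio=0.8):
    """Flag groups whose approval rate falls below min_ratio of the
    best-off group's approval rate."""
    rates = df.groupby(group_col)[outcome_col].mean()
    report = rates.to_frame("approval_rate")
    report["ratio_vs_best"] = report["approval_rate"] / report["approval_rate"].max()
    report["flagged"] = report["ratio_vs_best"] < min_ratio
    return report

# Hypothetical decision log from an automated loan-approval system.
decisions = pd.DataFrame({
    "gender":   ["female", "female", "female", "male", "male", "male"],
    "approved": [0, 1, 0, 1, 1, 1],
})

print(demographic_parity_report(decisions, "gender", "approved"))
```

Re-running a check like this on every new batch of decisions, and sending any flagged group to a human for review, is one way to operationalise the continuous monitoring those boardroom questions call for.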

Features of Garmin Fēnix 7 Pro and Epix Pro Series

In an era in which employers use workplace surveillance to monitor our productivity, we often forget that we can turn the tables. Futurists predict workers will soon be monitoring their bosses using the same tools. For example, employees can check and record stress levels on their Garmin, Fitbit or Apple Watch.
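As a toy illustration of how an employee might record and summarise that data, here is a sketch that reads heart-rate samples exported from a watch (most vendors allow a CSV export) and counts elevated readings during work hours. The file name, the 'timestamp' and 'bpm' columns, and the 100 bpm threshold are assumptions for the example, not a real vendor's schema.

```python
# Toy sketch: count elevated heart-rate readings during work hours
# from a wearable's exported CSV. The file name, the "timestamp" and
# "bpm" columns, and the 100 bpm threshold are hypothetical.

import pandas as pd

def elevated_work_readings(csv_path, bpm_threshold=100):
    hr = pd.read_csv(csv_path, parse_dates=["timestamp"])
    work = hr[hr["timestamp"].dt.hour.between(9, 17)]   # 9am-5pm samples
    elevated = work[work["bpm"] > bpm_threshold]
    # Number of elevated readings per working day.
    return elevated.groupby(elevated["timestamp"].dt.date).size()

if __name__ == "__main__":
    print(elevated_work_readings("heart_rate_export.csv"))
```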

Perhaps there's a place for a website to publish the results of smartwatch data from employees at particular companies. Kinda like a high-tech Glassdoor, the website where employees anonymously review companies. It'd be fascinating to see the results from the likes of Apple, Google and Facebook. Talk about turning the tables on the tech titans. Solidarity forever!

TRACEY SPICER
Man-Made: How the bias of the past is being built into the future by Tracey Spicer. Published by Simon & Schuster, RRP AU$34.99 / NZ$39.99

You can read the first extract of Man-Made: How the bias of the past is being built into the future here.

You can buy Man-Made: How the bias of the past is being built into the future here.

Readers can also buy a signed copy via this link: https://traceyspicer.com.au/signedbook
