Here is the third in a series of extracts from award-winning journalist Tracey Spicer’s book, Man-Made: How the bias of the past is being built into the future. In this extract, Tracey explains how to have uncomfortable conversations about AI and bias.
Bias in artificial intelligence is seen as a technical issue. But it stems from existing biases within society. If you’re uncomfortable talking about tech, bring the dialogue back to historical inequity. Intersectional discrimination pervades every aspect of society. If we can change the culture from the ground up, we’ll solve the problem.
Taking a television network to court for maternity discrimination, presenting documentaries about the rights of women and girls, and working on investigative stories as part of the #MeToo movement: these are not solo efforts. It may seem like a pebble in a pond, but one person can embolden others to speak out. This creates a ripple effect that can change the world.
During filming for the documentary Silent No More, I learn about bystander training at Queensland’s Griffith University. The MATE program teaches people how to safely call out unacceptable behaviour. This empowers victim-survivors and puts perpetrators on notice. What does this have to do with data, algorithms and machine learning?
Creating environments where discrimination is not tolerated is crucial to shifting the mindset of blithe acceptance of biased bots. Have a quiet chat with like-minded colleagues about how you can question AI bias at the next team meeting. Standing up for yourself is like building a muscle at the gym: the more you do it, the stronger you become.
Hopefully, you have enough ammunition to counter the inevitable backlash: ‘Now you think even the robots are biased? Crazy, hairy-legged, woke, left-wing feminazi!’ When trolls say this, my latest response is to treat it as a compliment. Well, on the days when I have the energy to bother responding. ‘Why, thank you for noticing! Yes, I support equality in all forms. How about we sit down for a nice cup of tea to discuss this?’ Of course, they never get back to me.
If they don’t believe feminists, maybe they’ll listen to big business. As early as 2018, business insights giant Gartner was predicting that by the year 2030, 85 per cent of artificial intelligence projects would provide false results caused by bias. But it doesn’t have to be this way. I invite you to join a movement with its sights set on a fairer future. A future for all of us; a future made for humans. Collectively, we have the power to determine our destiny.
You can buy Man-Made: How the bias of the past is being built into the future, here
Readers can also buy a signed copy via this link: https://traceyspicer.com.au/signedbook
You can read more extracts from Tracey Spicer’s book here on ‘How we can take action on AI being predisposed to misogyny, ableism, homophobia and ageism’.