Exploited, addicted and groomed: mother of two Elise Strachan pulls no punches when it comes to the safety of teens online.
In November 2024, the Australian government took a groundbreaking step by introducing new laws to protect teenagers from the harmful effects of social media. Spearheaded by Michael Wipfli (Wippa) and supported by more than 127,000 Australian voices including parenting expert Maggie Dent, comedian Hamish Blake, and tech advocate Felicity McVay, this movement marks a pivotal moment in Australia—and perhaps the world.
If you’re a parent and haven’t yet discovered Maggie Dent, do yourself a favour and look her up. She’s the fiercely informed grandma we all need, unflinching in her criticism of tech platforms that exploit our children’s vulnerability.
I had the pleasure of watching a lively panel featuring the above advocates with my 13-year-old son at SXSW Sydney in October, and it profoundly changed the way he sees social media. The discussion pulled back the curtain on how unaware even the most astute parents are of the content and messaging their kids are consuming.
I strongly encourage you to watch the discussion, which Women Love Tech captured in its entirety here.
Mixed Reactions to the News
Since the announcement, responses have varied. A rural teenager noted that social media helped her “find her tribe” in a way real-world connections seemingly could not. And yes, there’s a safe connection aspect we want for our children, but what we currently have is not it.
For more than a decade, parents across the country have shared devastating stories of families shattered by preventable tragedies linked directly to unregulated digital spaces.
These are not isolated incidents—they are part of an alarming trend exacerbated by platforms prioritising profits over children’s safety.
The Duality of Social Media
As a career content creator who embraced YouTube for its learning and education capabilities early on, I’ve experienced both sides of social media. It can be a wonderful, connective, and educational space, but let’s not ignore the dark corners. Trolls, unregulated ads, scams, explicit content, and harmful algorithms are thriving in environments where vulnerable young minds are left unprotected.
I’m a mother in this age group – my boys are 10 and 13 – and I see first-hand the constant stream of harmful messaging they’re exposed to. Ads for gambling, alcohol, hyper-sexualised cartoon content, and shockingly one-sided political messaging are being served alongside innocent Minecraft or Fortnite videos and within rewards-driven, “free” kids’ games from the app store. It’s insidious conditioning of their young, impressionable minds, while 18+ brands invest in turning today’s children into the next generation of gamblers and alcohol consumers, all to bolster their bottom line.
Here’s the harsh truth: Kids aged 13 to 16 are not ready to navigate this world of adult social media safely. The dopamine-driven reward systems, addictive algorithms, and exploitative advertising are designed to manipulate and brainwash, not empower.
“Freedom for All” vs. “Safety for the Vulnerable”
Some critics argue that Australia’s move threatens individual freedom. Will governments start controlling what movies we watch or conversations we have? Here’s the kicker: we’re already being controlled by companies far less invested in the wellbeing of future generations than our own government. Tech giants monitor our data, manipulate our emotions, and feed us fake news, often without accountability.
The difference? Adults, for all our struggles, are better equipped to discern reality from manipulation. Kids aren’t. They’re turning to social media for education, validation, and community, often with devastating consequences.
The Content Gap: A Crisis in Digital Spaces
Beyond social media, other platforms you might deem ‘safe’ are contributing to this crisis. For instance, under a paid Spotify Premium Family Plan with explicit filters enabled, searching the word “sex” reveals extreme, violent, and graphic content.
If a content creator (like sex-positive podcaster Abbie Chatfield) has had the decency to mark their content as explicit (which means fewer views, sponsors and revenue), it can’t be played to kids. But Spotify’s self-certification model lacks oversight, so a great deal of explicit content remains available. From erotic audiobooks to podcasts glorifying abuse, the content is still appallingly accessible to children.
Imagine a 13-year-old listening to narratives that normalise sexual violence, assault, and coercion as entertainment, while well-intentioned parents think they’re jamming to Taylor Swift’s latest album drop. This isn’t about the occasional swear word in a song—it’s about shaping worldviews in profoundly damaging ways.
Inside free app store games obviously aimed at children, I observed an ad in which two women begin fighting over a man, until he encourages them to passionately ‘kiss and make up’ before they all go off to… well, your 10-year-old will have to download the game to find out.
Parents who complain are often sent in circles. “We’ll escalate this,” they’re told. Yet nothing changes.
Tech Giants Must Do Better
Let’s call it what it is: negligence. Platforms like YouTube know exactly who their audience is, and while they’re doing better than most, having launched YouTube Kids in 2015, that platform is designed for toddlers and very young children.
From 10 up, kids seek out their favourite Minecraft or Fortnite videos on YouTube proper (often under parental accounts) and are bombarded with gambling, alcohol and adult-content ads.
YOU don’t see these ads when you browse your favourite recipe or how-to video in the same account. Brands are targeting your kids through gaming content they know is consumed by a largely juvenile audience.
Creating “YouTube Teens”, “Snapchat Minor”, or “Instagram Teen” would be a straightforward solution. But that would mean sacrificing millions in ad revenue and gambling dollars from industries with endlessly deep pockets allocated to conditioning their impressionable underage army of future addicted revenue generators (that’s your child, by the way).
Australia: A Global Leader for Change
Thanks to trailblazers like Wippa, Maggie, Felicity and the tireless voices of grieving parents, Australia is forcing a conversation that tech platforms can no longer ignore. The world is watching, and we have a rare opportunity to set the standard for protecting our children in digital spaces.
36 Months has just launched similar initiatives in the UK, NZ and Japan. At the time of writing, only a handful of supporters have signed.
Head here to show your support: https://www.change.org/m/2502
While you’re there, take a moment to view some of the personal video testimonies from concerned parents within the Australian petition: https://www.change.org/m/2502
What Comes Next?
At Women Love Tech, we’re committed to driving this conversation forward. We’ll dive deep into the platforms you use every day, ask the tough questions, and push for real solutions.
We’re calling on you—our readers, parents, and community members:
- Share your real-world experiences.
- Send screenshots of inappropriate ads or unanswered complaints.
- Let’s document these issues and present a united front for change.
Together, we can help our government hold tech giants accountable. We can demand safer, smarter solutions for our kids and teens.
Let’s Be the Example the World Needs
Australia is leading the way, but this is just the beginning. By staying constructive and solution-focused, we can create impactful change—not just for our kids but for generations to come.