The Alannah & Madeline Foundation welcomes the release of the report of Australia’s Online Safety Act review, which was tabled yesterday by the Minister for Communications, the Hon. Michelle Rowland MP. This groundbreaking (and lengthy) report calls for key changes to the legislation to make the digital environment a safer place for children and young people.
One of the most important suggestions is that online services adopt an overarching duty of care for their users. And the good news is that the Australian Government has already given this idea the green light in principle.
So, what does duty of care mean in the online world? Well, right now, the approach taken by tech platforms is mostly reactive: they take down harmful or illegal content after it’s been posted. But by then, the damage is often already done. This system leaves too much responsibility on families and individuals, especially children themselves, to avoid harms or report issues. That’s just not good enough.
With a duty of care in place, online services would need to get ahead of the game. They’d be required to:

- proactively identify the risks of harm their services could pose to users
- take reasonable steps to prevent and mitigate those harms before they occur.
Most importantly, platforms would need to put the best interests of children front and centre when managing these risks. This isn’t just about protecting children who are using the platforms—it’s also about safeguarding them from harmful activities happening in the background, like the distribution of child sexual abuse material.
The report demands real accountability and is especially tough on the big players: high-risk, high-reach platforms would be required to conduct yearly risk assessments and publicly report on the steps they’re taking to protect users.
Making these platforms safer could mean adjusting everything from design and moderation to algorithms and advertising systems. And if companies don’t comply? They could face serious consequences: fines of up to $50 million or 5% of their global annual turnover. That’s enough to make even the biggest tech giants sit up and pay attention.
But here’s the thing: it’s not just about stopping cyberbullying or shielding children from violent content. Many of the risks children and young people face online are baked into the very business models of these platforms. Features like low default privacy settings, endless notifications, and that addictive ‘never-ending scroll’ are all designed to keep us hooked, but they also open the door to harm.
The report doesn’t shy away from this. It points out that a digital duty of care could address the deeper issues, including algorithms, recommender systems, addictive designs, AI, and more. We’re hopeful these areas will finally get the attention they deserve and that any changes will reinforce the upcoming Children’s Online Privacy Code.
The report also recommends a major overhaul of the online safety regulator, giving the eSafety Commissioner more power to create and enforce mandatory rules. Given how fast digital technology is evolving, it calls for the eSafety office to become an independent commission with stronger governance arrangements, potentially comprising up to nine commissioners. And in a bold move, it recommends that the regulator’s work be funded by the tech industry itself through a ‘cost recovery mechanism’. This sounds fair to us.
With age-appropriate safety by design that prioritises the best interests of children over commercial interests, safe online spaces for children and young people are possible. The proposed changes bring us a big step closer to creating a digital environment where children and young people can safely explore, learn, and connect with confidence. And that is something to look forward to.
The Alannah & Madeline Foundation will continue to advocate for the right of all children and young people to be safe in all places where they live, learn and play, including online spaces.