The Alannah & Madeline Foundation acknowledges the Government's announcement today confirming it will proceed with a minimum age limit of 16 years for social media use in Australia.
This is a step towards making the online world safer for children; however, age limits alone will not prevent online harms from affecting Australian children.
In drafting the legislation and holding tech companies to account, it's imperative that the new rules address the underlying factors that make the online environment unsafe for children.
This includes:
- Safety by Design: implement age-appropriate and safety-by-design requirements for all digital services accessed by children, not just social media.
- Default Privacy & Safety Settings: all digital products and services must provide the highest level of privacy and safety settings by default for users under 18 years.
- Age-appropriate Content: ensure all content is aligned to the user's developmental stage, cognitive abilities and emotional maturity, as indicated by their age.
What is 'age assurance'?
‘Age assurance’ encompasses a range of approaches for determining a user’s age with varying levels of certainty; no perfect solution yet exists. This is a rapidly evolving area, and it is essential that Australia be proactive in regulating age assurance technologies.
While age assurance can help identify when a child is present, it is not a cure-all. Platforms that rely solely on age-gating to restrict access for under-16s may still expose older teens (and younger children who bypass age checks) to significant risks.
As age assurance technologies become more prevalent, it is vital to ensure they protect children’s rights and adhere to the highest standards of safety, privacy, and accessibility. Companies such as Roblox, Google, Yubo, and Meta have begun to experiment with ID scans, selfies, and facial age estimation to confirm age, but the unregulated use of these technologies poses serious privacy risks. Without proper oversight, there is a danger that children’s data, including biometric information, could be exploited for commercial purposes, undermining their privacy and security.
We need a multi-pronged approach to keep children safe online
- We believe it’s the government’s responsibility to ensure children’s digital rights are upheld and realised – setting minimum standards based on community expectations and holding tech companies to account when they fail to meet those standards.
- It’s tech companies’ responsibility to prevent their products and services from being used in ways that violate children’s rights or expose children to harm.
- And it’s up to the rest of us to take responsibility to upskill and educate ourselves and our children on how to navigate tech and the online world safely and confidently; to participate with them in their digital worlds, not just police them.
We must avoid repeating past mistakes, where children’s rights were overlooked in social media regulation. Safeguards must be in place from the very beginning to ensure that new technologies benefit children rather than harm or exploit them.
Keeping children safe online requires collective action from governments, tech companies, and regulators to create a safer, more responsible digital environment that prioritises children's safety and well-being over commercial interests.
The Alannah & Madeline Foundation will continue to advocate for the right of all children and young people to be safe in all places where they live, learn and play – including in online spaces.
Learn more about our Advocacy work.