Privacy leader releases smart…

The Alannah & Madeline Foundation welcomes new guidance from the Privacy Commissioner to social media platforms on restricting access to users aged 16 and over. The advice offers clear, practical steps to meet the Australian Government’s age limit while respecting children’s privacy and complying with privacy law.

From 10 December 2025, social media platforms must take ‘reasonable steps’ to deactivate accounts held by children under 16 and to prevent under-16s from creating new ones. While the final list of affected platforms has not been confirmed, restrictions are expected to apply to Instagram, Snapchat, TikTok, Facebook, X/Twitter and YouTube. The goal is to delay children’s exposure to online risks until they have developed greater digital literacy and maturity.

However, concerns remain about risks to children’s privacy. At present, we do not know which tools platforms will use to check their users’ ages. The Government has confirmed that platforms may only request official ID if they also offer a reasonable alternative, and a trial found that technical solutions to age assurance are possible, although not foolproof.

The Commissioner’s guidance is a welcome step. It anticipates implementation challenges and offers case-based examples and ‘dos and don’ts’ for industry. Crucially, it promotes a ‘privacy by design’ approach that is proportionate and data-minimising, including:

  • Assessing the privacy risks associated with any approach to age assurance and then taking steps to manage, minimise or remove these risks
  • Using the least intrusive approaches and handling the least sensitive data possible in order to assess age
  • Collecting only information that is necessary to meet the age limit requirement
  • Holding onto personal information only for legitimate, limited purposes – otherwise, destroying the data promptly
  • Limiting the ‘secondary’ uses of data collected for age assurance purposes and ensuring users understand these secondary uses and can refuse them easily
  • Explaining age assurance clearly to users.

However, while the guidance is positive, we remain concerned about the extent to which industry will follow it. For social media platforms, data equals dollars: their products are designed to collect and use as much personal information about their users as possible. Strengthening privacy protections for children is unlikely to be a priority for these platforms unless they are required to do so.

Unfortunately, the regulator’s ability to require compliance is constrained by its modest resourcing and by outdated aspects of Australia’s Privacy Act, a law passed in 1988 which needs reform to meet the challenges of the digital age.

In this context, the Australian Government’s commitment to introduce a Children’s Online Privacy Code is especially important. The Privacy Commissioner is now drafting the Code. It should provide a stronger regulatory framework, raising privacy standards and setting clear rules for how industry may and may not collect, use, share and hold children’s data.

However, the Code will not be finalised until next year, and enforcement will require both legal authority and operational capacity for the Privacy Commissioner to guide industry, monitor compliance, investigate breaches, and respond to complaints and emerging issues.

We applaud the Privacy Commissioner’s leadership in developing their guidance on the social media age limit and we look forward to a comprehensive Code backed by proper resourcing. Together, these measures can help ensure that children enjoy their rights to both safety and privacy online.
