Response to Meta’s new Instagram…

Meta has announced it will introduce ‘teen accounts’ on its Instagram platform, signalling some positive intent to help make its platform safe by design for children and young people.

The move comes after many weeks of community discourse about age limits for social media access, and the Australian Government’s announcement last week that it will develop a Children’s Online Privacy Code.

Meta says the new ‘teen accounts’ will be set to private by default and include age-inappropriate content restrictions.

The CEO of the Alannah & Madeline Foundation, Ms Davies, said this is a positive step in the right direction from the social media giant.

“We are pleased to see Meta playing a more active role in addressing the risks faced by children online,” said Ms Davies.

“For too long, responsibility for children’s safety online has been pushed onto parents, schools and children themselves.

“Meta’s proposed changes aim to tackle critical issues surrounding contact, content, conduct, and compulsion on its platforms through default settings for teens under 16.

“This goes some way to building in safety by design and ensuring that children’s privacy and safety online are put ahead of commercial interests,” added Ms Davies.

The Alannah & Madeline Foundation continues to advocate for a collective approach that recognises the critical role government, tech companies, and community play in making our online world safe for children to enjoy.

“We look forward to seeing more details on Meta’s age assurance strategy,” said Ms Davies.

“Key questions remain about how the company will identify children on its platforms and manage their personal information, including sensitive data like biometrics.

“As age assurance technologies evolve, it’s crucial that these measures don’t inadvertently introduce new risks, such as data misuse or increased targeted marketing.”

Ultimately, Australia needs clear, enforceable national regulation governing what all digital platforms used by children may do with children’s personal information, with a focus on upholding the rights of the child.

The Foundation supports the Australian Government’s establishment of the Children’s Online Privacy Code, spearheaded by the Privacy Commissioner. This code has the potential to transform the digital landscape, ensuring a safer online environment for all children in Australia.

Keeping children safe online

Keeping children safe online requires collective action from governments, tech companies, and regulators to create a safer, more responsible digital environment that prioritises children's safety and well-being over commercial interests.

Multi-pronged approach to keep children safe online

  • We believe it’s government’s responsibility to ensure children’s digital rights are upheld and realised – setting minimum standards based on community expectations and holding tech companies to account for not meeting these standards.
  • It’s tech companies’ responsibility to prevent their services and products from being used in ways that violate children’s rights or expose children to harm while they use those services and platforms.
  • And it’s up to the rest of us to take responsibility to upskill and educate ourselves and our children on how to navigate tech and the online world safely and confidently; to participate with them in their digital worlds, not just police them.


A broader safety net to address the underlying causal factors must include:

  1. Default Privacy & Safety Settings: all digital products and services must automatically provide the highest level of privacy and safety settings for users under 18 years, by default.
  2. Data: data is the currency for many tech services and products, but we must ban the commercial harvesting, collection, scraping, sale and exchange of children’s data.
  3. Privacy: prohibit behavioural, demographic and biometric tracking and profiling of children, and therefore prohibit profiled commercial advertising to children under 18.
  4. Recommender Systems (the algorithm): ban the use of recommender systems – software that suggests products, services, content or contacts to users based on their preferences, behaviour, and similar patterns seen in other users.
  5. Reporting Mechanisms: require child-friendly, age-appropriate reporting mechanisms and immediate access to expert help on all platforms.
  6. Independent Regulation: remove self- and co-regulation by the tech industry and establish a fully independent regulator with the power and resources to enforce compliance.
  7. Safety by Design: implement age-appropriate and safety-by-design requirements for all digital services accessed by children.
  8. Public Data Access: ensure public access to data and analytics for regulatory, academic, and research purposes.

Together, we can make a meaningful difference and gift children a digital world where they are free to live, learn and play safely. Read more about our Advocacy work related to the digital rights of children.