Update from the Joint Select Committee on Social Media and Australian Society

The Alannah & Madeline Foundation acknowledges the release of the Second Interim Report from the Joint Select Committee on Social Media and Australian Society. We commend the Committee for a well-written and accessible report.

However, we note with disappointment that recommendations addressing online safety and the risks faced by children and young people in the digital space are still pending.

That said, the Foundation is encouraged by the progress being made on several key issues, and we remain resolute in our advocacy for the following:

  • Children’s Online Privacy Code: Provision for this Code is included in the Privacy and Other Legislation Amendment Bill 2024 now before Parliament. We are eager to see this key reform progress under the leadership of the Office of the Australian Information Commissioner, with the right resourcing and enforcement powers to bring it to life. This is a once-in-a-generation opportunity to introduce robust privacy protections for children.
  • Revisions to the Online Safety Act 2021: With the review of the Act due to conclude this month, we are pushing for crucial amendments, including ending the co-regulatory approach by shifting responsibility for developing industry safety codes from industry to the eSafety Commissioner, and requiring industry to identify and address threats to children that occur on and through their services. The eSafety Commissioner, as the expert, independent regulator, must have the authority and resources to access relevant industry data and to enforce penalties for all breaches.
  • Review of the Basic Online Safety Expectations (BOSE): The ‘best interests of the child’ must be a mandated primary consideration for the industry, not merely an aspiration. We call for all digital platforms to be held accountable to this standard.
  • Age Assurance: We support a trial of age-assurance technologies to prevent children’s access to harmful content such as pornography.

Our Expectations Moving Forward

It is imperative that the next set of recommendations from the Joint Select Committee prioritises online safety for children. Specifically, we call for:

1. Default Privacy & Safety Settings

All digital products and services must provide the highest level of privacy and safety settings by default for users under 18.

2. Data

Data is the currency of many tech services and products, but the commercial harvesting, collection, scraping, sale, and exchange of children’s data must be banned.

3. Privacy

Prohibit behavioural, demographic, and biometric tracking and profiling of children, and consequently all profiled commercial advertising targeted at children under 18.

4. Recommender Systems (the algorithm)

Control the use of recommender systems – software that suggests products, services, content or contacts to users based on their preferences, behaviour, and similar patterns seen in other users.

5. Reporting Mechanisms

Require child-friendly, age-appropriate reporting mechanisms and immediate access to expert help on all platforms.

6. Independent Regulation

End self-regulation and co-regulation by the tech industry and ensure that independent, expert regulators accountable to the public have the power and resources to enforce compliance.

7. Safety by Design

Implement age-appropriate and safety-by-design requirements for all digital services accessed by children.

8. Public Data Access

Ensure public access to data and analytics for regulatory, academic, and research purposes.

Media and Digital Literacy Education

Additionally, there must be ongoing investment in effective, evidence-based digital literacy and data privacy education for children, parents, carers, teachers, and the broader community.

Children and young people must be consulted at every step to ensure their voices are not only heard but are integral to shaping policies, frameworks, and solutions that directly impact their digital experiences. Their lived realities in navigating online spaces offer invaluable insights, and any approach that fails to incorporate their perspectives risks being fundamentally incomplete and disconnected from the very people it seeks to protect.

Furthermore, the widespread exploitation of children’s data through commercial harvesting must be addressed. We support the creation of mechanisms to ensure the deletion of existing data that has been collected without proper safeguards, following the example of the "right to be forgotten" established in Europe.

The responsibility to uphold and realise children's digital rights lies with the Government. However, accountability for ensuring that products and services do not violate these rights rests squarely with tech companies, and they must face consequences when they fail to protect the most vulnerable members of society.

The Alannah & Madeline Foundation will continue to advocate for the right of all children and young people to be safe in all places where they live, learn and play – including in online spaces.

Learn more about our Advocacy work.
