Join us in calling for stronger child… | Alannah & Madeline Foundation

Do you ever stop to think about how safe the internet is for children and young people? With digital technology all around us, we at the Alannah & Madeline Foundation believe that it's important to make sure children are protected from harm while they're online.

But there isn't enough being done to keep them safe.





What you need to know


Understanding the risks

While technology brings many benefits to our lives, including the lives of children and young people, the way online services are designed can create risks for all of us, particularly children.

One of the areas that we’re particularly concerned about right now is something called ‘Relevant electronic services’ (RES), a wide category of services that allow people to communicate online (including email, SMS, MMS, chat, instant messaging, various online games and dating services).

When these services are misused, children can face risks such as contact from strangers or exposure to harmful content.

It's time to make sure the internet is a safer place for them to explore.


What is being done to keep children safe?

The eSafety Commissioner is doing some incredible work crafting an industry safety standard for RES, focused on the most seriously illegal material, including child sexual exploitation material. Still, extra support is needed to ensure these rules are strong enough to deliver the kind of protection we know is necessary.


Why online safety is important

Online child sexual exploitation poses a growing threat to children, their families and communities. In 2022-23, 40,232 reports of online child sexual exploitation were made to the Australian Centre to Counter Child Exploitation. Worldwide, reports of abuse have increased by 87% since 2019.

The impacts are severe, with each new viewing and sharing of the images a fresh violation, perpetuating the suffering of real children and emboldening offenders.

Furthermore, the Mind the Gap report, released by the eSafety Commissioner in 2022, found that exposure to potentially negative content and sexual content online is prevalent among young people aged 14–17: almost two-thirds (62%) said they had been exposed to potentially negative user-generated content online.

Seven in ten young people aged 14–17 (71%) have seen sexual images on the internet, while just under half (47%) have received a sexual message from someone online.



The part you can play in keeping children safe

We strongly support eSafety’s development of a robust industry standard for relevant electronic services to prevent and address the most seriously illegal content, including child sexual exploitation material.

This is a great first step.

Before it is finalised, we believe the standard should be strengthened to provide better protections for children from sexual exploitation and abuse.

We’re asking you, our community, to use your voice and networks to raise awareness about this.

eSafety’s draft standard promises to demand greater action against child sexual exploitation by gaming services, dating services, end-user hosted services, and messaging services which use end-to-end encryption.

The draft standard promises to put digital providers under new pressure to disrupt and deter child sexual abuse material (new and previously identified) and to detect and remove known child sexual abuse material. Large, high-risk providers would also be required to fund programs to detect and deter child sexual abuse material, including AI-generated material.

These are significant steps in the right direction. However, the standard should be strengthened before being registered.

Below are the additions that we believe are needed.

Is this a cause that you care about too? Add your name and comment (optional) to ensure that your voice is heard.

We believe that children's safety must be a priority – join us in calling for action today.

In addition to the measures proposed by eSafety, we believe the final industry safety standard should:

  1. Define a child as an individual who has not reached 18 years, in line with the Online Safety Act and the United Nations Convention on the Rights of the Child.
  2. Set default protections for children up to age 18, not 16 as currently proposed. This would apply to higher-risk services, which must set children’s accounts to ‘private’ by default and, unless the child’s parent or guardian has agreed otherwise, hide children’s location from other service users and prevent adults from using the service to contact the child.
  3. Require providers which use end-to-end encryption to demonstrate that the actions they take to detect and remove child sexual abuse material are meaningful and effective.
  4. Require that programs funded by providers to detect and deter child sexual abuse material are effective and not tokenistic.
  5. Require that services have clear, accessible, age-appropriate mechanisms for children to report illegal content themselves if they wish and be connected to appropriate professional support.
  6. Ensure all service providers are bound by the highest standards for reporting child sexual exploitation in Australian jurisdictions. We are concerned that the proposed reporting threshold of ‘evidence of a serious and immediate threat to the life or physical safety of a person in Australia’ is too high to capture a lot of child sexual exploitation.

Pledge your support

Let's take up this powerful opportunity to make the digital environment a better place for children.

Add your name and comment below to ensure your voice is heard and that children can be safe. You can also download our shareable tiles to post on social media and let your network know about this important cause.

The submission closes on 6 June 2024. Pledge your support by adding your name before this date to ensure we can make the impact needed to create a safer online world for children and young people.

Why is this important? As a Child Safe organisation, we value our young supporters. We want to make sure we don't send you anything we shouldn't.

By adding your voice, you give permission for the Alannah & Madeline Foundation to contact you about more ways to support our campaigns.



How industry organisations can support

If you are a community services organisation or an organisation that cares about the best interest of children, you can show your support by adding your voice to the Australian Government’s statutory review here.
