As a parent, I worry about my children’s safety and well-being online. Like many families, my husband and I are navigating the balance between fostering our 11- and 9-year-olds’ independence and our responsibility to keep them safe.
Working in child protection, Save the Children’s teams are used to preventing, mitigating and responding to risks. To do this effectively, we prioritise listening to children and empowering children and their families in the diverse contexts in which we work. When I speak to young people and their parents about the online world, I hear differing perspectives; while globally many parents – including myself – are concerned about the significant risks that young people face online, children themselves often see opportunities – ways to meet others, to be social, access information and support they might not feel comfortable talking to their parents about. Increasingly, online spaces are where children learn, play, and seek help – as well as where they may be targeted and harmed.
As policymakers across the world grapple with how to keep children safe online, a growing number are recommending age-based social media 'bans' as a tool to help keep children safe. For example, Australia’s Online Safety Amendment bans anyone under 16 in Australia from making or keeping accounts on social media apps, including TikTok, Instagram, YouTube, Snapchat, X and Facebook. While laudable in intent, at Save the Children, we are concerned that laws banning children’s access to online spaces – particularly if used in isolation – risk creating unintended harms, and a false sense of safety, as well as curtailing the opportunities that online environments offer to children. There are better alternatives.
Abstinence-only approaches are largely ineffective, as we know from alcohol use and sexual and reproductive health messaging. Teenagers are naturally curious and skilled at circumventing rules that do not reflect the realities of their lives. If we simply ban children from accessing online spaces, they will most likely still go online, but out of our sight; they won’t seek help when they need it, and they won’t be ready to use these spaces appropriately when they become adults.
Safety by design, empowering children and their caregivers
Online spaces were not created with children in mind, and we are fully aware of the harms that unfettered industry practices and weak regulation have caused. Technology companies must be held accountable – through strong and enforceable rules – for preventing harms. This includes making their products genuinely safe and age-appropriate, and ensuring that existing age limits are enforced.
Rather than banning children from these spaces, we should redesign the spaces so that children can participate safely and in age-appropriate ways. That means platforms making their products genuinely safe and age-appropriate, alongside education and support for children and caregivers to become more confident digital citizens.
Safety features and child accounts should come with default-high privacy and safety settings, making them easier for parents to manage. We are also calling for age-appropriate interfaces with revised algorithms, and for the prohibition of features that expose children to serious harms and inappropriate content.
But this must also be accompanied by education and support for children and their caregivers to help them become more aware digital citizens. This is much like teaching a child to ride a bike and gradually reducing supervision as their skills grow. To do this effectively, lawmakers should integrate comprehensive, age-appropriate digital literacy, social and emotional learning and online safety education into school curricula, including a critical understanding of platform design, algorithms and commercial incentives. Accessible guidance and support should be provided to parents, caregivers and professionals like teachers, social workers and health practitioners, so they can recognise digital-related distress and respond, without blame, helping children seek the support they need.
We are also seeing harmful gender and power dynamics that exist offline being amplified online, and we need to continue to work with children, caregivers and teachers to address harmful norms, technology-facilitated gender-based violence, and issues of consent and bullying, as part of a comprehensive digital literacy programme, encouraging peer-to-peer learning and bystander intervention.
While I want more than anything to keep my children safe, I firmly believe the best way to do this is by working with and for children, holding technology companies accountable for making safe and age-appropriate products and investing in comprehensive digital literacy programmes for children and their parents and caregivers. We must protect children from online harms while upholding their rights and not cutting them off from the wide potential benefits of digital technology.
Save the Children strongly urges policymakers to adopt an inclusive rights-based approach to online safety by making digital products and services safe by design, progressively building children's and parents’ capacities in how to navigate online spaces safely, and embedding meaningful child participation into the process.