Governments around the world are beginning to recognize the need to intervene in the digital space to protect children from the negative effects of social media.
Many countries have enacted, or are in the process of implementing, laws that set age restrictions and require parental consent before children can use social media platforms.
These laws increasingly address the complex intersection of privacy, data protection, and mental health in the digital age.
One of the primary concerns for governments is the ability of social media companies to collect and monetize data from children. The General Data Protection Regulation (GDPR) in the European Union, for example, mandates that companies obtain parental consent for the processing of personal data for children under the age of 16.
This regulation aims to ensure that children’s privacy is safeguarded while also preventing companies from exploiting their data for marketing and other purposes.
Similarly, several countries have raised the minimum age for creating social media accounts, aiming to shield children from inappropriate content and harmful interactions.
United Kingdom
The UK’s Online Safety Act, passed in 2023, requires platforms such as Facebook and TikTok to adopt age-appropriate designs and transparency measures.
While the UK has not introduced Australia-style restrictions, Digital Minister Peter Kyle has indicated that “everything is on the table” to ensure children’s safety online.
United States (Tech’s Self-Regulation)
Although the United States has not taken comparable legislative action, U.S.-based platforms such as Facebook, TikTok, and Snapchat set a minimum sign-up age of 13, in line with the Children’s Online Privacy Protection Act (COPPA).
Child safety advocates argue this self-regulation is insufficient, as studies show significant numbers of children under 13 are active on these platforms.
Australia
Australia has approved the world’s toughest social media restrictions for children. Under legislation passed in November 2024, tech giants like Meta and TikTok must prevent minors under 16 from accessing their platforms or face fines of up to AUD 49.5 million (USD 32 million). A trial to test enforcement mechanisms begins in January 2025, with the ban itself due to take effect by the end of that year.
European Union
As noted above, the GDPR mandates parental consent for processing the personal data of children under 16, though member states can lower this threshold to 13, giving the bloc a broadly consistent baseline.
Belgium
Belgium enacted a law in 2018 requiring children to be at least 13 years old to create a social media account without parental permission. However, critics argue that enforcement has been lax, and compliance remains inconsistent.
Italy
Italy requires parental consent for children under 14 to sign up for social media accounts; once they turn 14, no consent is needed. While the law is on the books, experts suggest more stringent age checks could improve compliance.
Netherlands
The Netherlands has no specific minimum age for social media usage but introduced a nationwide ban on mobile devices in classrooms from January 2024. The move aims to reduce distractions, with exceptions for medical or educational needs.
Germany
In Germany, minors aged 13 to 16 need parental consent to use social media. Despite these requirements, child protection advocates call for stricter enforcement, citing loopholes in the current system.
France
France passed a law in 2023 requiring parental consent for minors under 15 to create accounts on social platforms. However, technical hurdles have delayed its enforcement.
A panel commissioned by President Emmanuel Macron also proposed banning cell phones for children under 11 and internet-enabled devices for those under 13.
Norway
The Norwegian government recently proposed raising the age of consent for social media terms from 13 to 15, and it is exploring legislation to impose an absolute minimum age for social media use. With statistics showing that half of Norway’s nine-year-olds already use social media, the push for regulation has gained urgency.
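From an engineering standpoint, these divergent national rules amount to a per-jurisdiction configuration problem. The Python sketch below is purely illustrative and not any platform’s actual implementation: the country codes, the `AgeRule` type, and the `signup_decision` function are hypothetical names, and the thresholds are taken from the country summaries above.

```python
# Illustrative only: the divergent rules above reduced to a per-jurisdiction
# lookup. Real compliance systems also need age verification, consent
# capture, and record-keeping; none of that is modeled here.
from dataclasses import dataclass

@dataclass(frozen=True)
class AgeRule:
    minimum_age: int   # below this age, sign-up is blocked outright
    consent_age: int   # below this age (at/above minimum_age), parental consent is required

# Hypothetical table; thresholds taken from the country summaries above.
# 0 means the article specifies no outright floor for that jurisdiction.
AGE_RULES: dict[str, AgeRule] = {
    "AU": AgeRule(minimum_age=16, consent_age=16),  # under-16 ban once enforced
    "BE": AgeRule(minimum_age=0,  consent_age=13),
    "IT": AgeRule(minimum_age=0,  consent_age=14),
    "DE": AgeRule(minimum_age=13, consent_age=16),  # ages 13-16 need parental consent
    "FR": AgeRule(minimum_age=0,  consent_age=15),
    "US": AgeRule(minimum_age=13, consent_age=13),  # platforms' self-imposed COPPA floor
}

def signup_decision(country: str, age: int) -> str:
    """Return 'allow', 'consent_required', or 'block' for a sign-up attempt."""
    rule = AGE_RULES.get(country)
    if rule is None:
        return "allow"  # no rule on file for this jurisdiction
    if age < rule.minimum_age:
        return "block"
    if age < rule.consent_age:
        return "consent_required"
    return "allow"

print(signup_decision("FR", 14))  # consent_required
print(signup_decision("AU", 15))  # block
print(signup_decision("DE", 17))  # allow
```

Even a table this simplified makes the compliance burden visible: Australia’s outright ban, the EU’s consent bands, and the U.S. platforms’ self-imposed floor each demand a different sign-up flow.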
What you should know
- Social media plays a role in shaping children’s perceptions of the world around them. Platforms often present unrealistic standards of beauty, success, and happiness, which can lead to comparison and feelings of inadequacy.
- Influencers, celebrities, and content creators often promote an idealized lifestyle that may not be achievable or healthy. This can distort a child’s understanding of what is normal and acceptable, influencing their self-image and goals.
- As the digital landscape continues to evolve, the pressure on parents, policymakers, and tech companies to create safe and regulated online environments for children becomes increasingly urgent.