Tue 26 Aug 2025

European Commission issues guidelines on the protection of children online

Key compliance insights for digital platforms

Online safety for children and young people has become a critical concern for families, regulators, and platforms. In July 2025, the European Commission published detailed Guidelines under Article 28 of the Digital Services Act (DSA), focused specifically on protecting children online. The DSA's requirements extend to non-EU businesses, including those in the UK, if their services are offered to users within the EU.

What are the new guidelines?

Although the Guidelines are non-binding, the European Commission has made clear that it intends to use them as a key benchmark when assessing compliance with the DSA.

The Guidelines provide a set of recommendations to help online platforms reduce risks such as grooming, exposure to harmful content, cyberbullying, and addictive or otherwise problematic use. They promote a risk-based approach, meaning platforms should tailor protections to their size, purpose, and audience.

Key measures recommended include:

  • Private accounts by default: Children’s accounts should be set to private automatically to keep their personal data and content hidden from strangers.
  • Adjust recommender systems: Platforms are advised to adjust their content recommendation algorithms to prevent minors from encountering harmful or addictive material, including by prioritising explicit signals from minors, rather than behavioural data, when personalising feeds.
  • Protections against cyberbullying: Children should be able to block or mute other users and must not be added to groups without their explicit consent.
  • Preventing content misuse: The Guidelines suggest disabling features that allow downloading or screenshotting children’s posts to avoid unwanted sharing of sensitive content.
  • Limiting addictive features: Features such as streaks, autoplay, read receipts, and push notifications should be disabled by default, together with safeguards around AI chatbots integrated into platforms.
  • Protecting children from exploitative commercial practices: Platforms must ensure that children are not targeted with manipulative commercial tactics that encourage addictive behaviours or unwarranted spending, such as loot boxes or virtual currencies.
  • Reporting tools and parental controls: Platforms should provide accessible tools for children to report problems and give feedback, alongside a minimum set of parental controls.

Age verification

To enforce age-appropriate access to platforms, the Guidelines encourage the use of reliable age verification methods to prevent children from accessing adult content and other age-restricted services. The EU's upcoming Digital Identity Wallets are expected to become a standard means of age verification, allowing users to prove their age to access age-restricted online services without disclosing unnecessary personal data. For services presenting lower risks, the Guidelines also recommend age estimation techniques as a proportionate way of applying appropriate safeguards for children.

What does this mean for online platforms?

For online service providers, these Guidelines signal a clear expectation that platforms should adopt safety and privacy tools to protect children online. Platforms should therefore:

  • Conduct regular risk assessments focused on children accessing and using the platform.
  • Implement measures proportionate to identified risks.
  • Provide clear and accessible controls for both children and their guardians.
  • Maintain transparency about their procedures.

Additional measures may include investing in staff training focused on child protection issues, ensuring that the relevant teams are well equipped to identify and respond to potential risks. Platforms may also wish to collaborate with child safety experts to stay abreast of emerging threats and best practices.

The bigger picture on children's safety online

The publication of these Guidelines follows consultations involving experts, stakeholders, and young people themselves. The European Commission’s risk-based and rights-focused approach reflects the growing recognition across Europe that children require tailored protections online, without unnecessary restrictions on their freedom.

Governments across the globe, including the UK, are strengthening regulations to hold platforms accountable for children's safety online, responding to concerns over excessive screen time, harmful content, and manipulative commercial tactics. In the UK, for example, a surge in VPN use was reported shortly after new age verification rules under the Online Safety Act came into force.

Platform providers seeking advice and assistance on how to deal with this evolving area should contact our Commercial Team. 


This article has been co-authored by Sasha Fothergill.
