
EU’s Digital Shield: New rules to keep minors safe online

Authors:

Mateea Codreanu, Partner Mușat & Asociații

Ilinca Zamă, Junior Associate Mușat & Asociații

 

On 10 October 2025, the European Commission published new Guidelines on the online protection of minors (the "Guidelines"), pursuant to the Digital Services Act (i.e., Regulation (EU) 2022/2065), aiming to ensure a safe online experience for children and young people.

In the sections below, we detail the main principles and measures, which, as a rule, target all providers of online platforms accessible to minors, focusing on practical aspects and having regard also to relevant market practices and to the opportunities generated by these Guidelines.

Overview

The Guidelines establish a set of general rules and principles regarding the privacy, safety and security of minors online and provide several key recommendations which will serve as a reference point in assessing whether online platforms that allow minors to use them meet the necessary standards. However, given the voluntary nature of the Guidelines, their implementation requires collaboration between the European Commission, the competent authorities of the Member States, as well as the online platforms falling within their scope of application.

The development of the Guidelines followed a comprehensive process, which allowed all interested parties to provide feedback on the proposed scope and approach of the Guidelines, as well as on good practices and recommendations on measures to mitigate the risks that minors may encounter online. This process included, amongst others: (a) feedback gathered through a call for evidence; (b) expert consultation workshops; (c) a targeted public consultation aimed at gathering feedback from all stakeholders; and (d) meetings of the European Board for Digital Services within its working group on the protection of minors.

Main principles and tools

We note that the Guidelines lay down a set of interconnected principles that online platforms should observe when deciding which mitigation measures to apply. According to the Guidelines, providers should only choose proportionate and appropriate measures, based on a prior case-by-case analysis, especially when dealing with competing rights of individuals. The protection of children's rights, as well as their best interests, should be paramount. Thus, online platforms should ensure and implement privacy, safety and security by design, including age-appropriate designs of their services, in line with the developmental, cognitive and emotional needs of minors.

The European Commission has focused on mitigation measures related to service design (e.g., age verification systems; registration requirements; default account settings; online interface design; recommender systems and search features; commercial practices; moderation). Furthermore, the Commission recommends improving reporting, feedback and complaint mechanisms so as to facilitate minors', guardians' and other users' access to such support tools. The Guidelines also provide examples of good governance practices which online platforms should implement to ensure a high level of protection.

Platforms accessible to minors are encouraged to implement effective age assurance methods and technologies and to conduct a prior assessment to determine whether implementing age assurance measures is appropriate and proportionate to the legitimate goal of protecting minors' safety online.
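By way of illustration only, the sketch below shows how a provider might record the outcome of such a proportionality assessment and select an age-assurance approach accordingly; the risk tiers, method names and fixed mapping are assumptions made for this example, not categories prescribed by the Guidelines.

```python
from enum import Enum


class Risk(Enum):
    """Hypothetical risk tiers resulting from the provider's own assessment."""
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


def select_age_assurance(assessed_risk: Risk) -> str:
    """Illustrative mapping from assessed risk to an age-assurance approach.

    The Guidelines call for a case-by-case proportionality analysis; this
    fixed mapping is only a simplified placeholder for that analysis.
    """
    if assessed_risk is Risk.HIGH:
        return "age verification"                 # e.g. verified proof of age
    if assessed_risk is Risk.MEDIUM:
        return "age estimation with safeguards"   # e.g. estimation plus fallback checks
    return "self-declaration with safeguards"
```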

Another important tool refers to designing default settings in a way that confers the highest level of protection on minors' accounts by ensuring safe and age-appropriate settings, including, without limitation: (a) setting minors' accounts to private by default, so that they only interact with accounts they have previously accepted; (b) preventing other accounts from downloading or taking screenshots of content posted by minors; (c) disabling by default features that contribute to excessive use (e.g., read receipts, autoplay, push notifications, streaks, number of likes or reactions).
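As a purely illustrative sketch, the defaults listed above could be encoded in a platform's account-provisioning logic along the following lines; all field and function names are hypothetical, and the values simply mirror the recommendations quoted above.

```python
from dataclasses import dataclass


@dataclass
class AccountDefaults:
    """Hypothetical bundle of per-account default settings."""
    profile_private: bool            # only previously accepted accounts may interact
    allow_content_download: bool     # other accounts may download the user's content
    allow_screenshots: bool          # other accounts may screenshot the user's content
    autoplay: bool                   # features associated with excessive use
    push_notifications: bool
    read_receipts: bool
    streaks: bool
    show_reaction_counts: bool


def defaults_for(is_minor: bool) -> AccountDefaults:
    """Return the most protective defaults when the account holder is a minor."""
    if is_minor:
        return AccountDefaults(
            profile_private=True,
            allow_content_download=False,
            allow_screenshots=False,
            autoplay=False,
            push_notifications=False,
            read_receipts=False,
            streaks=False,
            show_reaction_counts=False,
        )
    # Adult accounts keep the platform's ordinary defaults (illustrative values).
    return AccountDefaults(
        profile_private=False,
        allow_content_download=True,
        allow_screenshots=True,
        autoplay=True,
        push_notifications=True,
        read_receipts=True,
        streaks=True,
        show_reaction_counts=True,
    )
```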

The Guidelines also set out recommendations on online interface design and organization and on time management tools, requiring providers to ensure that minors are not exposed to persuasive design features aimed predominantly at engagement, since such features could lead to excessive use of the platform.

According to the European Commission, AI features, such as AI chatbots and filters, should only be made available on online platforms accessible to minors after the provider has assessed the risks those AI features may pose to minors' privacy, safety and security and, where they are made available, they should be easy to turn off.
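A minimal sketch of such a gate is set out below, assuming hypothetical flags maintained by the provider; the point is simply that the feature is withheld until a risk assessment has been completed with an acceptable outcome, and that the user's opt-out always prevails.

```python
def ai_feature_available(risk_assessment_completed: bool,
                         residual_risk_acceptable: bool,
                         user_has_turned_off: bool) -> bool:
    """Hypothetical gate for exposing an AI feature (e.g. a chatbot or filter)
    to an account held by a minor."""
    if not (risk_assessment_completed and residual_risk_acceptable):
        return False                 # feature is not offered to minors at all
    return not user_has_turned_off   # an easy, persistent opt-out takes precedence


# Example: assessment done, residual risk acceptable, user opted out -> feature hidden.
assert ai_feature_available(True, True, user_has_turned_off=True) is False
```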

Online platforms accessible to minors should also implement measures to protect minors from economically exploitative practices, especially given children's lack of commercial literacy. Thus, as per the Guidelines, platforms should implement effective mechanisms to prevent minors' exposure to commercial practices that may be manipulative or lead to unwanted spending or addictive behaviours, such as certain virtual currencies or loot boxes. Commercial communications should meet the transparency requirements set forth by the Guidelines, namely they should be clearly visible, child-friendly, age-appropriate, accessible and clearly identified as a form of advertising (e.g., through the use of an icon or a similar sign).
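By way of example only, such transparency could translate into metadata attached to every piece of sponsored content served to a minor; the field names, wording and icon identifier below are assumptions made for this sketch, not requirements quoted from the Guidelines.

```python
def commercial_label(minor_audience: bool) -> dict:
    """Illustrative labelling metadata for a sponsored item shown to users,
    with a more explicit, child-friendly label when the audience is a minor."""
    return {
        "is_advertising": True,                      # always disclosed as advertising
        "icon": "ad-badge",                          # visible marker, e.g. an icon or similar sign
        "label_text": "This is an ad" if minor_audience else "Sponsored",
        "child_friendly_explainer": minor_audience,  # age-appropriate explanation offered
        "prominent": True,                           # clearly visible, not hidden in menus
    }
```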

The Guidelines lay down additional moderation responsibilities for online platforms accessible to minors, as an important tool for identifying and removing harmful or illegal content and behaviours. For this purpose, providers should put in place clear moderation policies and procedures, subject to periodic reviews, and should create dedicated content moderation teams. The success of moderation systems is inextricably connected to the way user reporting, feedback and complaints tools are designed. Thus, online platforms should implement effective, visible and child-friendly mechanisms which facilitate minors', guardians' and other users' access to such support tools.

Implementation starts NOW

The implementation of the envisaged protection measures requires changes to online platforms' governance practices. Thus, the Guidelines provide certain recommendations to providers in this regard, such as: updating their terms and conditions and terms of use to include the measures taken to ensure minors' privacy, safety and security; presenting such information in a child-friendly, age-appropriate, easy-to-understand and easily accessible manner; and regularly monitoring and evaluating the effectiveness of such measures.

Following the publication of the Guidelines, the European Commission has already sent information requests to major online platforms to verify and understand the measures taken to protect minors within their services, including with respect to age verification systems and the implementation of measures aimed at preventing minors from accessing illegal products or harmful content.

For online platforms, these new EU Guidelines are a business opportunity, not just a compliance exercise. Platforms that adopt 'safety by design' (private accounts by default, no manipulative features such as 'streaks', and clear protection from exploitative advertising) gain a significant advantage: they not only reduce their exposure to DSA fines, but also earn the trust of parents. In the competitive online world, this trust is the ultimate currency. By showing a genuine commitment to child safety, a company transforms itself into a trusted gateway and secures the loyalty of its most important future users.