
Safeguarding Minors Online: The EU's Commitment to Child Digital Safety
How the European Commission's BIK+ strategy and Digital Services Act are building a safer, more empowering, and age-appropriate online world for children.
A simplified guide to the European Commission's guidelines for online platforms, ensuring a high level of privacy, safety, and security for minors.
Introduction
Online platforms offer incredible opportunities for minors to learn, connect, and explore. However, they also come with significant risks, from harmful content and cyberbullying to exploitation and excessive use. The European Commission's guidelines, effective July 14, 2025, aim to provide a clear framework for platforms to protect children. This guide simplifies those recommendations into actionable steps.
Key Principles
- Proportionality & Appropriateness
- Children's Rights Protection
- Privacy, Safety & Security by Design
- Age-Appropriate Design
1. Understanding and Managing Risks
Platforms need to regularly check for and understand the privacy, safety, and security risks minors face when using their services. This includes looking at how likely it is that minors will use the service, the potential harm from content, behavior, or interactions, and what's already being done to prevent these harms.
Actions platforms should take:
- Conduct Regular Risk Checks: Perform a detailed risk assessment at least once a year, or whenever significant changes are made to the platform.
- Use the "5Cs" Framework: Use the "5Cs" (Content, Conduct, Contact, Consumer, Cross-cutting risks) to categorize and analyze potential harms, considering how different platform features contribute to these risks.
- Prioritize Children's Best Interests: Ensure that all risk assessments and mitigation strategies prioritize the well-being of children.
- Involve Minors and Experts: Get feedback from children, parents/guardians, and independent experts during the risk assessment process.
- Be Proactive: If there's a good reason to believe a feature or practice might harm children, take steps to prevent or reduce that harm immediately, even before full evidence is available.
- Document and Share Findings: Keep clear records of risk reviews and publish the non-sensitive outcomes for transparency.
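To make the "5Cs" categorisation concrete, the sketch below shows, in Python, one way a platform team might record entries in a risk register. The `RiskCategory` and `RiskEntry` names and the likelihood-times-severity score are illustrative assumptions, not part of the Commission's guidelines.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class RiskCategory(Enum):
    """The '5Cs' used to categorise potential harms to minors."""
    CONTENT = "content"              # e.g. violent or age-inappropriate material
    CONDUCT = "conduct"              # e.g. cyberbullying by or towards minors
    CONTACT = "contact"              # e.g. unsolicited contact by adults
    CONSUMER = "consumer"            # e.g. manipulative commercial practices
    CROSS_CUTTING = "cross-cutting"  # e.g. privacy or excessive-use risks

@dataclass
class RiskEntry:
    """One line of a yearly (or change-triggered) risk review."""
    feature: str                     # platform feature under review
    category: RiskCategory
    likelihood: int                  # 1 (rare) .. 5 (almost certain)
    severity: int                    # 1 (minimal) .. 5 (severe)
    mitigations: list[str] = field(default_factory=list)
    reviewed_on: date = field(default_factory=date.today)

    @property
    def priority(self) -> int:
        """Simple likelihood x severity score used to rank mitigation work."""
        return self.likelihood * self.severity

# Example: a direct-messaging feature assessed for contact risk.
dm_risk = RiskEntry(
    feature="direct messages",
    category=RiskCategory.CONTACT,
    likelihood=4,
    severity=5,
    mitigations=["DMs only from previously accepted accounts (default)"],
)
print(dm_risk.priority)  # 20 -> treat as high priority
```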
2. Designing the Service Safely
Platforms must build safety and privacy into their services from the start, considering the age and developmental stage of minors. This covers everything from how users sign up and manage their accounts to how content is recommended and what types of advertisements are shown.
Age Checks (Age Assurance)
- Implement Strong Age Verification: For high-risk content (like gambling or pornography), use reliable age verification methods, not just asking users for their age.
- Consider Age Estimation for Other Risks: For medium-level risks, age estimation tools can be used if they are accurate and independently reviewed.
- Offer Choices: Provide users with more than one way to verify their age to ensure accessibility.
- Protect Privacy in Age Checks: Ensure age verification methods do not collect more personal data than necessary or track users. The upcoming EU Digital Identity Wallet can be a secure option for this.
- Provide an Appeals Process: Allow users to challenge an incorrect age assessment.
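As a rough illustration of how age assurance might be tiered by risk, the following Python sketch maps risk levels to acceptable methods and records an appeal. The `RiskLevel` enum and the method names are hypothetical; a real deployment would integrate with certified verification providers or the EU Digital Identity Wallet.

```python
from enum import Enum

class RiskLevel(Enum):
    HIGH = "high"      # e.g. pornography, gambling
    MEDIUM = "medium"  # e.g. features with age-sensitive defaults
    LOW = "low"

def choose_age_assurance(risk: RiskLevel) -> list[str]:
    """Return acceptable age-assurance methods for a given risk level.

    Offering more than one method keeps the check accessible;
    self-declaration alone is never enough for high-risk content.
    """
    if risk is RiskLevel.HIGH:
        # Reliable verification, e.g. a privacy-preserving credential check.
        return ["eu_digital_identity_wallet", "verified_id_document"]
    if risk is RiskLevel.MEDIUM:
        # Accurate, independently reviewed estimation is acceptable here.
        return ["age_estimation", "eu_digital_identity_wallet"]
    return ["self_declaration"]

def handle_appeal(user_id: str, claimed_age: int) -> dict:
    """Minimal appeal record: users must be able to challenge a wrong result."""
    return {"user": user_id, "claimed_age": claimed_age, "status": "pending_review"}
```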
User Registration
- Integrate Age Checks at Sign-Up: Use the registration process as an opportunity to perform necessary age assurance.
- Default to High Privacy for Unregistered Users: If no registration is required, assume users might be minors and apply the highest privacy and safety settings by default.
- Explain Clearly: Inform users about the benefits and risks of registration in an easy-to-understand way.
- No Enticement for Underage Users: Do not encourage or make it easy for children under the minimum age to sign up.
- Easy Account Deletion: Make it simple for minors to log out and delete their accounts.
Account Settings
| Setting Area | Default Action for Minors |
| --- | --- |
| Interactions (likes, comments, DMs) | Only with previously accepted accounts |
| Content download / screenshots | Not allowed for the minor's content or information |
| Content / profile visibility | Visible only to previously accepted accounts |
| Activity visibility | Hidden (e.g., "liking" content, "following" others) |
| Geolocation, microphone, camera access | Turned off (and switched off automatically after each session if enabled) |
| Tracking features | Tracking that is not strictly necessary turned off |
| Autoplay videos / live streams | Turned off |
| Push notifications | Turned off by default; always off during core sleep hours |
| Features encouraging excessive use | Turned off (e.g., "likes" count, "streaks", "is typing") |
| Filters (body image) | Turned off |
- Nudge Towards Safety: Do not encourage minors to change settings to lower privacy or safety; present options neutrally.
- Easy Reversion: Make it easy for minors to revert to default settings (e.g., with a single click).
- Warn About Changes: Clearly explain potential risks when minors change their default settings.
- Restrict Access to Features: Consider removing certain settings or features entirely from minors' accounts, or making some default settings unchangeable for certain age groups, based on risk assessment.
- Prevent Unwanted Contact: Ensure minors cannot be easily found or contacted by unaccepted accounts, and their personal contact details are never shared without explicit permission.
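One way to express the defaults in the table above is as a single settings object with a one-click reset, as in the Python sketch below. The `MinorAccountSettings` fields are illustrative stand-ins for a platform's real configuration model.

```python
from dataclasses import dataclass, asdict

@dataclass
class MinorAccountSettings:
    """Safe-by-default settings for a minor's account (see the table above)."""
    interactions_accepted_only: bool = True    # likes, comments, DMs
    allow_downloads_of_content: bool = False   # downloads / screenshots
    profile_visible_to_accepted_only: bool = True
    activity_visible: bool = False             # "liking", "following"
    geolocation_enabled: bool = False
    tracking_enabled: bool = False
    autoplay_enabled: bool = False
    push_notifications_enabled: bool = False
    streaks_and_like_counts_enabled: bool = False
    body_image_filters_enabled: bool = False

def reset_to_defaults(current: MinorAccountSettings) -> MinorAccountSettings:
    """One-click reversion: discard any changes and restore the safe defaults."""
    return MinorAccountSettings()

def changed_fields(current: MinorAccountSettings) -> list[str]:
    """List settings the minor has moved away from the safe default, so the
    interface can show a neutral warning about the associated risks."""
    defaults = asdict(MinorAccountSettings())
    return [name for name, value in asdict(current).items() if value != defaults[name]]
```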
Online Interface Design
- Age-Appropriate Experience: Design the platform's look and feel to be suitable for minors' developmental stages.
- Avoid "Addictive" Design: Remove or limit features that encourage excessive use, like infinite scrolling, automatic video playback, or fake notifications.
- Time Management Tools: Provide clear, easy-to-use tools for minors to manage their screen time.
- Child-Friendly Tools: Ensure all settings, tools, and reporting mechanisms are easy for all minors, including those with disabilities, to find, understand, and use.
- Careful AI Integration: If AI features (like chatbots) are included, they should not be automatically activated or encourage use. Assess their risks and make them easy to turn off, with clear warnings that interactions are with AI and may be inaccurate. AI features should not be used to nudge minors towards commercial content.
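The following sketch illustrates how an optional AI feature could stay off by default, show a clear disclosure when enabled, and remain easy to switch off. The `AiChatFeature` class and the disclosure text are assumptions for illustration only.

```python
from dataclasses import dataclass

AI_DISCLOSURE = (
    "You are chatting with an AI assistant, not a person. "
    "Its answers may be inaccurate. You can turn it off at any time in Settings."
)

@dataclass
class AiChatFeature:
    """Optional AI assistant: off by default and never self-activating."""
    enabled: bool = False

    def enable(self) -> str:
        """Only an explicit user action enables the feature; the disclosure
        is shown every time it is switched on."""
        self.enabled = True
        return AI_DISCLOSURE

    def disable(self) -> None:
        """Turning the feature off must be as easy as turning it on."""
        self.enabled = False
```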
Content Recommendation and Search
- Regular Testing: Continuously test and adjust recommendation systems to improve minors' safety and privacy, with input from minors and experts.
- Limit Data Collection: Do not use behavioral data collected outside the platform for recommendations. Limit extensive tracking of on-platform activity for recommendations to avoid continuous monitoring.
- Prioritize Explicit Preferences: Focus on explicit user signals (like stated interests or direct feedback: "Show me less/more") rather than just implicit engagement (like time spent viewing content) for recommendations.
- Filter Harmful Content: Implement measures to prevent minors from repeatedly seeing harmful content (e.g., promoting unrealistic beauty standards, self-harm, discrimination).
- Prevent Illegal Content: Ensure systems do not facilitate the spread of illegal content or criminal offenses involving minors.
- Safe Search: Prioritize verified accounts and contacts in search results for minors. Block search terms known to trigger harmful content and redirect users to support resources for such queries.
- User Control: Allow minors to reset their recommended feeds completely.
- Non-Profiling Option: Provide an option for minors to choose a recommendation system that is not based on profiling, especially for Very Large Online Platforms (VLOPs).
- Direct Impact of Feedback: Ensure user reports and feedback directly influence recommendation systems, leading to removal of reported content/contacts and reduced visibility of similar material.
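The sketch below illustrates how explicit "Show me less/more" feedback, a full feed reset, and a non-profiling (chronological) option might be wired into a recommender. The `MinorFeed` class and its item format are hypothetical simplifications of a real ranking system.

```python
from collections import defaultdict

class MinorFeed:
    """Sketch of a feed that prefers explicit signals over engagement time."""

    def __init__(self) -> None:
        # topic -> weight adjustment derived from explicit feedback only
        self.topic_weights: dict[str, float] = defaultdict(float)
        self.profiling_enabled = True  # minors can opt out entirely

    def show_me_less(self, topic: str) -> None:
        """'Show me less' immediately reduces visibility of similar material."""
        self.topic_weights[topic] -= 1.0

    def show_me_more(self, topic: str) -> None:
        self.topic_weights[topic] += 1.0

    def reset(self) -> None:
        """Let the minor wipe the recommended feed completely."""
        self.topic_weights.clear()

    def rank(self, items: list[dict]) -> list[dict]:
        """items are dicts like {'id': ..., 'topic': ..., 'timestamp': ...}."""
        if not self.profiling_enabled:
            # Non-profiling option: plain reverse-chronological order.
            return sorted(items, key=lambda i: i["timestamp"], reverse=True)
        return sorted(items, key=lambda i: self.topic_weights[i["topic"]], reverse=True)
```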
Commercial Practices
- Protect Against Exploitation: Do not take advantage of minors' limited understanding of commercial practices.
- Prevent Harmful Advertising: Avoid exposing minors to harmful, unethical, or unlawful advertisements, and regularly review protective measures.
- Limit Exposure: Control the volume and frequency of commercial content to prevent excessive spending or addictive behaviors.
- No AI Commercial Nudging: Do not use AI systems to influence or nudge children for commercial purposes.
- Clear Disclosures: Ensure all commercial communications are clearly visible, child-friendly, and consistently marked (e.g., with an icon).
- No Hidden Ads: Prevent hidden or disguised advertising from both the platform and its users.
- Transparent Transactions: Avoid misleading financial mechanisms like certain virtual currencies that obscure real money value.
- No Forced Purchases: If a service is presented as "free," do not require in-app purchases to access core functionality. Price all in-app purchases in national currency.
- Prevent Addictive Spending: Avoid features like paid "loot boxes" or gambling-like elements that can lead to excessive spending or addictive behaviors.
- Guardian Oversight: Consider tools for guardians to manage spending limits or require consent for financial commitments by minors (see the sketch after this list).
- Age-Appropriate Transactions: Review policies to ensure younger age groups are not exposed to or allowed to engage in economic transactions if they cannot understand spending.
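The following sketch shows one possible guardian spending gate: purchases priced in a national currency, a configurable limit, and explicit guardian approval. `SpendingPolicy` and `can_purchase` are illustrative names, not a reference implementation.

```python
from dataclasses import dataclass

@dataclass
class SpendingPolicy:
    """Guardian-configured limits for a minor's in-app purchases."""
    currency: str = "EUR"               # purchases are always priced in a national currency
    monthly_limit: float = 0.0          # 0 means no spending without explicit consent
    require_consent_per_purchase: bool = True

def can_purchase(policy: SpendingPolicy, spent_this_month: float,
                 price: float, guardian_approved: bool) -> bool:
    """A purchase goes through only if it stays within the limit and,
    where required, a guardian has explicitly approved it."""
    if policy.require_consent_per_purchase and not guardian_approved:
        return False
    return spent_this_month + price <= policy.monthly_limit

# Example: a 4.99 EUR purchase against a 10 EUR monthly limit, with approval.
policy = SpendingPolicy(monthly_limit=10.0)
print(can_purchase(policy, spent_this_month=6.0, price=4.99, guardian_approved=True))  # False: would exceed the limit
```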
Content Moderation
- Clear Rules: Clearly define what content and behavior are harmful to minors and make these rules transparent to users.
- Effective Policies: Establish and enforce policies for detecting and moderating harmful content and behavior.
- Prioritize Minor Safety: Focus moderation efforts on content most likely to harm minors, and prioritize reports made by minors.
- Human Oversight: Ensure human review is available for reported content or accounts that might pose risks to minors.
- Trained Teams: Ensure content moderation teams are well-trained, adequately resourced, and available 24/7.
- Technical Solutions: Implement effective technical solutions (like hash matching and AI classifiers) to detect and prevent harmful content; a minimal sketch of the hash-matching flow follows this list.
- Cross-Platform Collaboration: Work with other platforms, regulators, and civil society to detect illegal and policy-violating content and prevent its spread.
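To illustrate the hash-matching idea mentioned above, the sketch below blocks re-uploads of files whose digests match a set of already-confirmed harmful material. It uses exact SHA-256 matching for simplicity; production systems rely on perceptual hashing and shared industry hash databases, so treat this only as a sketch of the flow.

```python
import hashlib

# Digests of files already confirmed as violating, supplied by moderators or
# shared industry databases. Real deployments use perceptual hashes that
# survive re-encoding; exact SHA-256 keeps this sketch short.
KNOWN_HARMFUL_HASHES: set[str] = set()

def matches_known_harmful(file_bytes: bytes) -> bool:
    """Check an upload against digests of already-confirmed harmful material."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HARMFUL_HASHES

def moderate_upload(file_bytes: bytes) -> str:
    """Block known material outright; route new material onward for review."""
    if matches_known_harmful(file_bytes):
        return "blocked"            # known material never reaches other users
    return "queued_for_review"      # new material still goes to classifiers and human review
```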
3. Supporting Users and Guardians
Platforms need to provide clear ways for minors to report issues and get help, and clear ways for guardians to oversee their children's online activities in a respectful and supportive manner.
User Reporting and Feedback
- Easy Reporting: Make reporting, feedback, and complaint tools effective, visible, and child-friendly.
- Comprehensive Reporting Options: Allow minors to report any content, activity, or user that makes them uncomfortable or violates rules, including suspected underage accounts.
- Feedback for Recommendations: Enable minors to provide feedback on content they see (e.g., "Show me less/more") to influence recommendations.
- Confidentiality: Ensure reports are confidential and anonymous by default, with an option to remove anonymity.
- Prompt Responses: Prioritize reports concerning minors' privacy, safety, and security, providing immediate confirmation of receipt and clear explanations of the process and outcomes.
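One way to prioritise reports concerning minors while still confirming receipt immediately is a simple priority queue, as in the sketch below. The `Report` structure and `submit_report` helper are assumptions for illustration.

```python
import heapq
from dataclasses import dataclass, field
from itertools import count

@dataclass(order=True)
class Report:
    priority: int                                  # 0 = minor-safety, handled first
    ticket: int                                    # tie-breaker preserving arrival order
    reporter_is_minor: bool = field(compare=False)
    concerns_minor_safety: bool = field(compare=False)
    description: str = field(compare=False)

_ticket_counter = count()

def submit_report(queue: list, description: str,
                  reporter_is_minor: bool, concerns_minor_safety: bool) -> dict:
    """File a report, acknowledge it immediately, and push it onto a priority
    queue where reports made by or about minors are handled first."""
    priority = 0 if (reporter_is_minor or concerns_minor_safety) else 1
    heapq.heappush(queue, Report(priority, next(_ticket_counter),
                                 reporter_is_minor, concerns_minor_safety, description))
    # Immediate confirmation of receipt; reports stay confidential by default.
    return {"status": "received", "anonymous": True}

queue: list[Report] = []
submit_report(queue, "Stranger asking for my address", reporter_is_minor=True,
              concerns_minor_safety=True)
submit_report(queue, "Spam link in comments", reporter_is_minor=False,
              concerns_minor_safety=False)
print(heapq.heappop(queue).description)  # the minor-safety report comes out first
```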
User Support Measures
- Accessible Help: Provide clear, easily identifiable support tools for minors to seek help, including block and mute buttons.
- Direct Connection to Helplines: Connect minors directly to appropriate support services like national Safer Internet Centres and child helplines.
- Warnings and Redirection: Display warning messages and links to support lines when minors interact with potentially harmful or illegal content.
- Blocking and Muting: Offer anonymous blocking and muting options for any user or account, ensuring no information is visible to blocked accounts (see the sketch after this list).
- Comment Control: Allow minors to restrict who can comment on their content.
- Group Invitations: Ensure minors explicitly accept invitations to groups before joining.
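A minimal sketch of anonymous blocking and muting, where the blocked account is never notified and loses visibility of the minor's profile and content, could look like this; the `BlockList` class is hypothetical.

```python
class BlockList:
    """Sketch of anonymous blocking: the blocked account is never notified
    and can no longer see or contact the minor."""

    def __init__(self) -> None:
        self.blocked: set[str] = set()
        self.muted: set[str] = set()

    def block(self, account_id: str) -> None:
        self.blocked.add(account_id)   # no notification is sent to the blocked account

    def mute(self, account_id: str) -> None:
        self.muted.add(account_id)     # muted accounts stay unaware and simply go silent

    def can_view_profile(self, viewer_id: str) -> bool:
        """Blocked accounts see no profile, posts, or activity."""
        return viewer_id not in self.blocked

    def should_show_content_from(self, author_id: str) -> bool:
        """Content from blocked or muted accounts is filtered out of the minor's feed."""
        return author_id not in self.blocked and author_id not in self.muted
```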
Tools for Guardians
- Complementary, Not Sole Solution: Guardian tools should enhance safety measures, not replace them. Platforms must protect minors even if guardians are absent or disengaged.
- Age-Appropriate and Empowering: Tools should be designed to promote communication and autonomy, not just control, and respect minors' privacy.
- Easy to Use: Make tools simple for guardians to access and activate, ideally without needing to create a separate account.
- Transparency to Minors: Clearly notify minors when guardian tools are activated and provide real-time signs if any monitoring is occurring.
- Compatibility: Ensure tools work across different devices and operating systems and are compatible with future "one-stop-shop" tools for guardians.
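The sketch below shows how activating guardian tools could always notify the minor and display a real-time indicator while any monitoring is on. The `GuardianLink` structure is an illustrative assumption.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class GuardianLink:
    """Sketch of guardian-tool activation that stays transparent to the minor."""
    guardian_id: str
    minor_id: str
    features: list[str] = field(default_factory=list)  # e.g. ["screen_time", "spending_limits"]
    active: bool = False

    def activate(self) -> str:
        """Activation always triggers a clear notification to the minor."""
        self.active = True
        enabled = ", ".join(self.features) or "none selected"
        return f"Guardian tools are now active on your account: {enabled}."

    def monitoring_banner(self) -> Optional[str]:
        """A real-time indicator shown to the minor while any monitoring feature is on."""
        return "Guardian monitoring is on" if self.active and self.features else None
```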
4. Good Governance and Transparency
Platforms need to have strong internal processes, clear rules, and open communication about how they protect minors online. This includes internal policies, dedicated teams, continuous monitoring, and transparent reporting.
Internal Governance
- Dedicated Team/Person: Appoint a specific person or team responsible for minors' privacy, safety, and security, with enough resources and direct access to senior management.
- Child-Centric Culture: Foster a company culture that prioritizes children's rights and actively involves children in platform design.
- Staff Training: Provide comprehensive training for all staff involved in minor protection, development, moderation, and reporting.
- Regular Compliance Checks: Implement procedures to regularly monitor compliance with these guidelines.
- Data Collection on Harms: Systematically collect data on risks and harms to minors on the platform.
- Share Best Practices: Collaborate with other platforms, regulators, and civil society to share effective protection strategies.
Terms and Conditions
- Clear and Child-Friendly Language: Explain terms and conditions in clear, plain, and intelligible language, especially for minors.
- Include Safety Information: Explicitly detail community guidelines, harmful content, protection measures, and complaint processes.
- Easy Access: Make terms and conditions easy to find and search within the platform.
- Log and Publish Changes: Keep a public record of all changes to the terms and conditions.
Monitoring and Evaluation
- Continuous Assessment: Regularly monitor and evaluate the effectiveness of all platform elements related to minors' privacy, safety, and security.
- Involve Stakeholders: Continuously consult with minors, guardians, experts, and civil society on design and evaluation, incorporating their feedback.
- Adapt and Improve: Adjust the platform's design and features based on evaluation results, new research, and changes in user behavior or risks.
Transparency
- Accessible Information: Make all relevant information about how the platform works and how minors are protected easily accessible to minors and guardians.
- Key Information to Disclose: This includes details on age assurance, recommender systems, reporting processes, AI tools, registration, guardian tools, content moderation, and terms and conditions.
- Child-Friendly Communication: Present all information and warnings in child-friendly, age-appropriate, simple, succinct, and engaging ways, using graphics and videos where helpful.
- Gradual Information Delivery: Provide information incrementally over time to improve retention.