Meta breaches EU law over child safety on social media
Meta has been found in breach of the EU's main law regulating online platforms, and risks a hefty fine.
Published on April 29, 2026

Team IO+ selects and features the most important news stories on innovation and technology, carefully curated by our editors.
Meta has done little to prevent children under 13 from accessing Instagram and Facebook. That is the preliminary finding of the European Commission's investigation into the American tech giant, which has thus been found in breach of the Digital Services Act (DSA), the bloc's main law regulating online platforms.
This decision follows a formal investigation launched in May 2024 into the operations of Instagram and Facebook. Regulators conclude that Meta’s current systems do not effectively identify or mitigate the risks associated with children accessing these services. Despite Meta’s stated policies requiring users to be at least 13 years old, the Commission found these restrictions are easily bypassed. Data suggests that between 10% and 12% of children under the age of 13 are currently active on these platforms.
"The DSA requires platforms to enforce their own rules: terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users – including children," said Henna Virkkunen, Executive Vice-President for Tech Sovereignty, Security and Democracy, in a statement.
Meta's systemic failures
Currently, Meta relies heavily on user self-declaration for age verification. Users simply enter a birth date, a process that regulators deem ineffective because it lacks robust verification controls. Furthermore, the reporting mechanisms designed to flag underage accounts are overly complex and hidden from the average user. The Commission noted that a user must navigate through up to seven clicks to report a minor on the platform. This friction discourages users from using the safety tools Meta claims to provide.
While Meta asserts it has invested in over 50 tools and policies to protect young users, the Commission finds these efforts insufficient under the current regulatory framework. The gap between Meta’s self-reported compliance and the reality of underage access is a central point of the dispute. Regulators are demanding a shift away from easily bypassable systems toward more rigorous, privacy-preserving methods. This scrutiny forces a re-evaluation of how social media companies manage the entry points to their digital environments, and demands that safety measures be functional rather than performative.
To address failures in age verification, the European Commission has recently introduced a technical blueprint for a standardized age-verification application. This tool utilizes zero-knowledge proofs, a cryptographic method that allows users to prove they meet an age requirement without sharing their actual birth date or identity documents.
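To make the idea concrete, here is a minimal, purely illustrative sketch of the underlying cryptographic pattern: a Schnorr-style proof of knowledge (made non-interactive via the Fiat-Shamir heuristic). In this toy model, a credential secret would be issued only after a one-time age check; the holder can then repeatedly prove possession of that credential without revealing it. This is not the Commission's actual blueprint, and the group parameters below are deliberately simple for demonstration; real deployments use standardized elliptic-curve groups.

```python
import hashlib
import secrets

# Toy parameters for illustration only -- NOT secure for real use.
P = 2**127 - 1   # a Mersenne prime, used as the group modulus
G = 3            # generator for the demo group

def keygen():
    """The 'credential': a secret x and its public image y = G^x mod P."""
    x = secrets.randbelow(P - 1)
    return x, pow(G, x, P)

def prove(x, y):
    """Prove knowledge of x (where y = G^x mod P) without revealing x."""
    r = secrets.randbelow(P - 1)
    t = pow(G, r, P)                                   # commitment
    c = int.from_bytes(
        hashlib.sha256(f"{G}|{y}|{t}".encode()).digest(), "big"
    )                                                  # Fiat-Shamir challenge
    s = (r + c * x) % (P - 1)                          # response
    return t, s

def verify(y, t, s):
    """Check g^s == t * y^c without ever seeing the secret x."""
    c = int.from_bytes(
        hashlib.sha256(f"{G}|{y}|{t}".encode()).digest(), "big"
    )
    return pow(G, s, P) == (t * pow(y, c, P)) % P

x, y = keygen()              # issued after a one-time age check, in this sketch
t, s = prove(x, y)
print(verify(y, t, s))       # prints True
```

The design point this illustrates is the same one the Commission's blueprint relies on: the verifier learns only that the prover holds a valid credential, never the birth date or identity document behind it.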
What is the DSA about?
The DSA serves as the legal backbone for this enforcement action. It establishes a uniform set of rules across the European Union to ensure a safe and transparent online environment. Under the DSA, platforms with more than 45 million monthly active users in the EU are classified as Very Large Online Platforms (VLOPs). These entities, including Meta’s Instagram and Facebook, face the highest level of regulatory scrutiny and must adhere to strict risk management obligations.
The act prohibits targeted advertising for minors and bans the use of dark patterns that manipulate user behavior. It also requires platforms to perform annual risk assessments to identify potential harms to fundamental rights and public discourse. The European Commission directly enforces these rules for VLOPs to maintain strategic autonomy and protect European citizens from systemic digital threats.
Setting a precedent
The financial implications for Meta are substantial. The Commission stated that it will continue its investigation. If it issues a final non-compliance decision, Meta could face fines of up to 6% of its total worldwide annual turnover. This represents a multi-billion-dollar risk for the social media giant.
Meta now has the opportunity to review the investigation files and submit a formal written response to the preliminary findings. The company will likely propose new remedies and technical changes to avoid the maximum penalties. Following Meta’s response, the Commission will consult with the European Board for Digital Services before reaching a final verdict. This case will set a precedent for how the DSA is applied to other global tech firms.
