EU Says Meta Failed to Protect Under-13s on Facebook and Instagram
The European Commission says Meta's age checks are ineffective and that, by the EU's findings, 10% to 12% of children under 13 use Facebook and Instagram.
- On Wednesday, the European Commission charged Meta with breaching the Digital Services Act for failing to prevent minors under 13 from accessing Facebook and Instagram.
- The charges follow a two-year investigation by the European Commission, which found that Meta's measures to enforce age restrictions 'do not seem to be effective' at identifying or removing underage users.
- Evidence suggests roughly 10-12% of children under 13 use the platforms, contradicting Meta's 'incomplete and arbitrary risk assessment,' while reporting tools are 'difficult to use and not effective.'
- If the charges are confirmed, the European Commission may issue a non-compliance decision resulting in a fine of up to 6% of Meta's worldwide turnover, though the company can reply to the findings.
- EU tech chief Henna Virkkunen stated that terms must be the 'basis for concrete action' to protect children, as the Commission develops an age-verification app that is 'technically' ready.
218 Articles
Politicians across the EU are calling for a social media ban for young people and want to raise the minimum age to 14. Anyone registering with TikTok, Facebook, Instagram and the like would have to verify their age, and the EU Commission has now unveiled its own app for that purpose. But is it safe?
Facebook and Instagram must do more to block under-13s, the EU warns, accusing the platforms of breaking its rules.
Brussels again threatens US tech giants with billions in fines. The EU's app for secure age verification is ready to launch despite difficulties.
Coverage Details
Bias Distribution
- 38% of the sources lean Left