Is social media really doing enough to protect children online—or just claiming it is? That’s the question now hanging over Meta after European regulators accused it of breaking major EU digital rules with Facebook and Instagram.
In Brussels, the European Commission says the platforms are failing to stop children under 13 from accessing their services.
Under the Digital Services Act, tech giants are required to actively reduce harmful content and protect minors—but officials say Meta isn’t going far enough.
Their preliminary findings are blunt: age checks are weak, enforcement is inconsistent, and too many underage users are still slipping through.
Child Safety Rules Tighten
Regulators estimate that around 10–12% of children under 13 in Europe may still be using these platforms.
As EU tech chief Henna Virkkunen put it, terms of service can’t just sit on paper anymore: “They must be the basis for concrete action to protect users, including children.”
Meta disagrees, calling the figures outdated and arguing that age verification is an “industry-wide challenge” that needs broader solutions.

The company says it already removes underage accounts and is preparing new measures.
But here’s the bigger issue: should platforms built for connection be responsible for drawing the age line more strictly than parents or schools?
For now, no fines are final—but the message from Brussels is clear: rules aren’t just guidelines anymore.
