Meta Accused of Failing to Keep Children Off Instagram and Facebook in Europe

The Meta Store in Burlingame, Calif., Jan. 7, 2025. European Union regulators said the company did not have effective controls to check a user’s self-declared date of birth, in violation of an online safety law. (Jason Henry/The New York Times)


LONDON — Meta has failed to implement required safeguards to keep children younger than 13 off Instagram and Facebook, in violation of a European Union online safety law, officials said Wednesday.

Meta does not have an adequate system to identify and remove the accounts of children who flout the social media giant’s age limits, the European Commission, the executive branch of the EU, said in a preliminary ruling. Without changes, Meta could face fines and other penalties.

European regulators are aggressively cracking down on social media companies over child safety. Snap and TikTok have also been targeted by regulators in Brussels, while governments in Spain, France and Denmark are among those considering new rules to prevent young people from using social media at all.

Regulators said Meta appears to be violating the Digital Services Act, a law passed in 2022 to force social media companies to police their platforms more aggressively. The company was found to lack effective controls to check the accuracy of a person’s self-declared date of birth when setting up an account, making it easy to sidestep rules meant to keep children under 13 off the social media sites.

Regulators said Meta’s tool for reporting minors is “difficult to use and not effective,” with up to seven steps required just to access the necessary form. After a minor is reported for being under 13, the company often does not follow up and the user can keep using the service without any kind of review, regulators said.

Across the EU, evidence suggests roughly 10% to 12% of children under 13 are accessing Instagram and Facebook, according to regulators.

“Instagram and Facebook are doing very little to prevent children below this age from accessing their services,” Henna Virkkunen, the commission’s executive vice president for tech sovereignty, security and democracy, said in a statement. “Terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users — including children.”

The EU, as well as several individual countries in the 27-nation bloc, is exploring new online age-verification tools to keep young people from accessing certain content.

Meta said it disagreed with the commission’s findings, calling age verification an “industrywide challenge.”

“We’re clear that Instagram and Facebook are intended for people aged 13 and older and we have measures in place to detect and remove accounts from anyone under that age,” the company said in a statement. “We continue to invest in technologies to find and remove underage users and will have more to share next week about additional measures rolling out soon.”

Europe has for more than a decade been the world’s strictest regulator of the tech industry over issues of privacy, anticompetitive business practices and illicit online content. Authorities have pressed ahead with investigations of American companies even as the Trump administration has threatened retaliation.

The EU is also investigating Meta on other issues, including whether Facebook and Instagram have an addictive design, as well as a case looking into its recommender systems.

In the United States, Meta and other social media companies are also facing growing scrutiny over child safety. In March, Meta and YouTube were found guilty by a California jury of harming the mental health of a young user through addictive designs and other features.

The European investigation into Meta’s age-verification tools started in 2024. After the preliminary charges were brought Wednesday, the company has an opportunity to provide regulators with a response. A final decision on potential penalties can take over a year.

The commission can issue a fine of up to 6% of Meta’s worldwide revenue, though such a large penalty would be extremely rare. The two sides can also reach a settlement to resolve the case.

This article originally appeared in The New York Times.

By Adam Satariano/Jason Henry
c. 2026 The New York Times Company
