How to regulate a behemoth: social media laws face formidable foe in Meta

Meta has run afoul of EU online safety laws. The European Commission has announced that it has found Instagram and Facebook, both owned by Meta, in breach of the Digital Services Act (DSA) for “failing to diligently identify, assess and mitigate the risks of minors under 13 years old accessing their services.”

This is not a revelation, given Meta’s recent track record on privacy, user safety and biometrics. That record includes U.S. court decisions finding Meta and YouTube liable for negligence in their platform design and inadequate in their efforts to protect kids from predators; a furor over Meta’s plan to release facial recognition-enabled smart glasses; and the introduction of a new, mandatory employee monitoring tool that tracks the keystrokes and mouse movements of company workers to harvest data for training AI models.

Moreover, data from Australia reveals that 9 in 10 of the largest social media platforms are “still not routinely confirming self-declared ages for new accounts.”

European Commission to Meta: do better on your own policies

Now, the European Commission states Meta’s measures to enforce the minimum age of 13 “do not adequately prevent minors under the age of 13 from accessing their services nor promptly identify and remove them, if they already gained access.” In other words, users can still simply enter a false birth date to gain access.
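The weakness of self-declared ages is easy to see in a minimal sketch. The function names and cutoff below are hypothetical, but they capture the gate the Commission is criticizing: nothing ties the declared birth date to the real user, so lying about the year defeats the check entirely.

```python
from datetime import date

MIN_AGE = 13  # the minimum age Meta's own terms set

def age_from_birth_date(birth: date, today: date) -> int:
    """Age in whole years implied by a (self-declared) birth date."""
    had_birthday = (today.month, today.day) >= (birth.month, birth.day)
    return today.year - birth.year - (0 if had_birthday else 1)

def naive_gate(declared_birth: date, today: date) -> bool:
    """Admit the user if the declared date implies age >= MIN_AGE.
    Nothing verifies the declaration, so it only stops honest children."""
    return age_from_birth_date(declared_birth, today) >= MIN_AGE

# A 9-year-old who declares truthfully is blocked...
print(naive_gate(date(2016, 5, 1), date(2026, 2, 1)))  # False
# ...but the same child declaring a year of 2000 sails through.
print(naive_gate(date(2000, 5, 1), date(2026, 2, 1)))  # True
```

This is why the DSA guidelines quoted below distinguish mere age declaration from age estimation and age verification, which check the claim against something external.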

Per a release, “DSA guidelines identify age estimation and age verification as an appropriate and proportionate way of ensuring a high level of privacy, safety and security for minors. To be effective, all age-assurance technologies must be accurate, reliable, robust, non-intrusive, and non-discriminatory.” In the Commission’s eyes, Meta has failed to meet the standard.

It also criticizes Meta’s reporting system, calling it “difficult to apply and not effective,” and notes that the violations build on “an incomplete and arbitrary risk assessment.”

“Meta’s assessment contradicts large bodies of evidence from all over the European Union indicating that roughly 10-12 percent of children under 13 are accessing Instagram and/or Facebook. Moreover, Meta seems to have disregarded readily available scientific evidence indicating that younger children are more vulnerable to potential harms caused by services like Facebook and Instagram,” the Commission states.

Henna Virkkunen, executive vice-president for tech sovereignty, security and democracy, points out that Meta’s own conditions acknowledge that their services are not intended for users under 13. “Yet, our preliminary findings reveal that Instagram and Facebook are doing very little to prevent children below this age from accessing their services. The DSA requires platforms to enforce their own rules: terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users – including children.”

The Commission now believes Instagram and Facebook must modify their risk assessment methodology, strengthen their safety measures, and “effectively counter and mitigate risks that minors under the age of 13 could experience on the platforms.”

Facebook and Instagram can now respond and attempt to resolve the issues. If the Commission is unsatisfied, a non-compliance decision could trigger a fine “proportionate to the infringement which shall in no case exceed 6 percent of the total worldwide annual turnover of the provider.”

Herein lies the problem. Meta’s turnover in 2025 was about $201 billion, meaning the maximum fine would be around $12 billion. Meta CEO Mark Zuckerberg is estimated to have a personal net worth of $252 billion, a significant chunk of it tied to Meta shares. Even so, the question becomes: how effective are fines for companies and people that can simply absorb them? What will it take to make Meta fall in line?
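The arithmetic behind that ceiling is simple. The function below is an illustrative sketch, not an official formula; the DSA caps non-compliance fines at 6 percent of total worldwide annual turnover, and the $201 billion figure is the article's rough 2025 number.

```python
def max_dsa_fine(worldwide_annual_turnover: float) -> float:
    """Upper bound on a DSA non-compliance fine:
    6 percent of total worldwide annual turnover."""
    return 0.06 * worldwide_annual_turnover

# With roughly $201B in turnover, the cap is about $12.06B.
print(f"${max_dsa_fine(201e9) / 1e9:.2f} billion")
```

Against annual turnover of that size, even the maximum fine amounts to roughly three weeks of revenue.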

UK to have age restrictions on social media by end of year

The UK has passed the Children’s Wellbeing and Schools Bill, giving ministers the power to introduce age restrictions on social media platforms. The bill had been kicked back and forth between the House of Commons and the House of Lords, but received Royal Assent on April 29.

The Bill calls for “all regulated user-to-user services to use highly-effective age assurance measures to prevent children under the age of 16 from becoming or being users.”

That includes large social media platforms like Instagram and Facebook.

UK junior education minister Olivia Bailey has left no doubt about the government’s resolve, informing the House of Commons that “we are clear that under any outcome we will impose some form of age or functionality restrictions for children under 16.”

Platforms must file a progress report after three months, and Bailey has signaled “our intention to quickly produce a response following the consultation.”

“The government has declared repeatedly that it is a question of how we act, not if. But to put beyond any doubt, we are placing a clear statutory requirement that the secretary of state must, rather than may, act following the consultation. This brings forward regulations without preempting the consultation’s outcomes and does not ignore the tens of thousands of parents and children who have already engaged with us.”

“Let us be clear. The status quo cannot continue.”

Canada seeing large push for social media laws   

In Canada, age assurance legislation for social media is being implemented on the provincial level, as Manitoba Premier Wab Kinew moves to “take action on things that are really harming our kids. These are forces that contribute to anxiety and depression, these are forces that lead young women and girls being trafficked and these are forces that lead to too many of our precious children taking their own lives. I’m talking about social media.”

Per Global News, Kinew gave no details about a potential age threshold or how a provincial government could have jurisdiction over social platforms. But it may not matter, as members of the federal Liberal party, which recently gained a majority, voted earlier this month to set 16 as the minimum user age for social media under an eventual law.

Global quotes Quebec MP Rachel Bendayan, who declared she was “astonished by how many youth she personally spoke with who support the idea.”

“I was very surprised to see so many teenagers and people within the age group I was targeting tell me they were in favour of this resolution, in part because they felt they have no choice but to be on social media. So it’s not a ban for a ban’s sake. It’s something that would change the way society operates at the moment.”

Nonetheless, some commentators are not buying into the idea that a so-called ban is the right approach. Law professor and longtime digital rights observer Michael Geist writes on his blog that, “by focusing legislative attention on who is permitted to use social media rather than on how the platforms operate, an age-based ban functions as a pressure-relief valve for legislators and a gift to the companies, since it allows them to maintain existing practices while shifting the regulatory conversation to age-gating mechanisms that the platforms themselves will administer.”

“Strip away the political theatre and what remains is this: the ban will not keep most kids off the platforms,” Geist states, pointing to the situation in Australia. “It will not measurably reduce the harms (the Australian regulator has yet to find evidence that it has), and it will impose privacy and free expression costs on every Canadian who wants to use ordinary social media services while leaving the underlying platform problems untouched, because the legislation does not actually require platforms to change anything about their products beyond who is allowed to log in.”

Social media regulation has appeal globally 

Geist astutely notes that provincial level legislation will cause fragmentation. But his core argument points again to the core problem: Meta and its peers appear to have no interest in complying with regulations. The idea that they would do so if the law targeted a different aspect of their operations ignores what is becoming plainly clear: social media is built to be what it is.

Coming to understand how social media harms people is not a call to enact legislation to fix the large platforms, any more than understanding the harms of tobacco is a call to legislate healthy vapes. The problem is baked into the product; even if a law states cigarettes can’t have arsenic in them, they’re still going to be bad for you.

The movement to legislate age assurance requirements on social media platforms can also be judged by the breadth of its appeal. This is a global issue, being driven by groundswell support from parents everywhere – and from many kids.

According to a recent report from eSafety Australia on attitudes of children and parents to social media age restrictions before implementation, although kids expressed a complex relationship with social media, “parents and children generally supported the intention behind the age restrictions, recognising that social media could cause harms and that more protection is needed to keep children safe online.”

Rwanda pursuing age minimum for social media 

The government of Rwanda is considering introducing a social media law prohibiting kids under 16 from using social media platforms. A New Times report states Minister of ICT and Innovation, Paula Ingabire, has confirmed that “relevant institutions are working together to draft the legislation, aimed at curbing cyber-related crimes and shielding minors from harmful online content.”

“These are the measures we have seen being implemented in other countries, but they must be adapted through collaboration with internet service providers, parents, social media companies, and children themselves, so that they understand they are not permitted to own accounts on such platforms,” Ingabire states.

“We are still a country striving to expand in terms of technology, but we are doing so in a way that minimises the negative impacts associated with it.”

Inman Grant facing death threats over social media law

Age assurance laws for social media have proven to be highly contentious, prompting callouts from some of the platforms’ billionaire owners. Rage needs a target, and for some, that has been Australia’s eSafety Commissioner, Julie Inman Grant, whom X owner Elon Musk called a “censorship commissar,” setting off an avalanche of abuse for Inman Grant and her family, who have been doxed and received death threats, according to a report from SBS News.

“There are protections – and I support them – provided to elected members of parliament, but there aren’t the same protections provided to regulators like myself,” Inman Grant states. “I’m kind of a new case, because I guess there aren’t that many regulators around the world that have been issued a dog whistle from Elon Musk.”

“It comes with a cost, but what they don’t realize is: the more they target me, the more I dig in.”
