The European Union has stringent rules regulating the digital space, including what children should be able to see, but there is increasing concern that more needs doing.
Inspired by Australia’s social media ban for under-16s, Brussels is analysing whether to set bloc-wide limits on minors’ access to platforms — with 25 of 27 EU countries coming out Friday in support of at least studying such a measure.
Europe’s biggest weapon for ensuring platforms tackle illegal content and keep children safe online is the Digital Services Act, which has sparked censorship claims from the US tech sector and retaliation threats from President Donald Trump.
Now, as part of “investigative actions” under the DSA, the European Commission has sent a request for information to Snapchat about what steps it is taking to prevent access for children under 13.
The commission has also asked Apple’s App Store and the Google Play marketplace to provide details on measures taken to prevent children downloading illegal or harmful apps — for example, those with gambling services or sexual content.
The EU wants to know in particular how Apple and Google stop children downloading tools to create non-consensual sexualised content — so-called “nudify apps” — as well as how they apply apps’ age ratings.
“Privacy, security and safety have to be ensured, and this is not always the case, and that’s why the commission is tightening the enforcement of our rules,” tech chief Henna Virkkunen said before EU ministers met in Denmark.
A request for information can lead to probes and even fines, but does not in itself suggest the law has been broken, nor is it a move towards punishment.
Multiple probes
Regarding Snapchat, Brussels wants to know how the messaging app stops users from buying drugs and vapes.
A Snapchat spokesperson said the company was “deeply committed” to ensuring safety on its platform and would provide the information requested.
Snapchat said the company had already “built privacy and safety features” to reduce “risks and potential harms”.
Brussels also wants YouTube — owned by Google parent Alphabet — to provide details on its recommender system, “following reporting of harmful content being disseminated to minors”, the commission said.
Google said it had “robust controls for parents”, and “security and protections for younger users”, adding it would keep expanding its efforts.
Separately, the EU is investigating Meta’s Facebook and Instagram, as well as TikTok, over fears they are not doing enough to combat the addictive nature of their platforms for children.
Pressing need
In a parallel push on child protection, EU telecoms ministers discussed age verification on social media and what steps they can take to make the online world safer for minors.
European Commission chief Ursula von der Leyen personally supports such a move, and Brussels is setting up an experts’ panel to assess what steps could be taken at the EU level.
Twenty-five of the EU’s 27 countries alongside Norway and Iceland signed a declaration backing von der Leyen’s plans to study a potential bloc-wide digital majority age, and on the “pressing need” to shield minors online.
Belgium and Estonia did not sign the statement. A Belgian diplomat said the country was committed to protecting children online but wanted to keep an open mind about what tools to use.
Estonia was more outspoken, saying it prioritised “digital education and critical thinking over access bans”.
Denmark is planning to introduce a ban on social media for children under the age of 15, which France has also sought to do.