‘Privacy Sweep’ finds EU online safety measures stagnating over past decade

An annual audit of online services by the Global Privacy Enforcement Network (GPEN) reveals little progress in online safety for children over the past decade.

The French data protection agency, CNIL, has posted a summary of the results of the November 2025 audit, for which 27 international data protection authorities examined the privacy practices of websites and mobile applications as applied to minors. The audit considered platforms’ “mechanisms and practices relating to the collection of users’ personal information, as well as those relating to transparency, age assurance, and to limiting data collection.”

The audit, “also known as a ‘sweep,’ examined nearly 900 websites and applications used by children during the week of November 3-7, 2025,” CNIL states. “It was coordinated by the Office of the Privacy Commissioner of Canada, the Information Commissioner of the United Kingdom, and the Data Protection Authority of the Bailiwick of Guernsey.” Some of the sites in question are children’s sites, while others are more general but “widely used by” kids.

More data collection, porous safety measures

Overall, the news is bad. There is more collection and sharing of data than recorded in a 2015 audit, which serves as a comparison point. And more privacy policies include statements indicating data could be shared with third parties. “To access all features, more than half (59 percent) of websites and mobile applications required the collection of an email address, 50 percent usernames, and 46 percent geolocation.”

“Authorities have also observed a more frequent use of age verification mechanisms to restrict children’s access to or interaction with online services,” CNIL states, “as well as the ease with which these mechanisms can be circumvented.” Auditors were able to bypass age verification measures for 72 percent of the websites and mobile applications examined, “most often when the mechanism relies on a simple declaration.”

“This situation is particularly concerning with regard to websites and applications offering inappropriate content or data processing that poses a high risk to children.”

Other key indicators that “largely mirror those of 2015” reveal that 71 percent of websites and mobile apps “did not contain child-friendly information on protective measures or in their privacy policy,” and 36 percent “did not offer an easily accessible way to delete an account.”

Finally, “only 35 percent of websites and mobile applications identified as having high-risk data handling features and designs for children contained privacy information, such as a pop-up window, urging a child to ask their parents for permission to continue using the website or application.”

EU app, existing age assurance tools receive thumbs down from researcher

On LinkedIn, Nathalie Launay offers a view on the broad question of age assurance in Europe. The digital identity researcher identifies “three legitimate questions to answer” regarding the European Commission’s stance on privacy-preserving age assurance technology, specifically referencing the “Draft Compromise Amendments on the Draft Opinion on the CULT draft report on the impact of social media and the online environment on young people.”

Launay questions whether existing age assurance solutions “based on age estimation or inference” comply with the GDPR’s privacy requirements. She questions the same of the specifications for the EC’s standalone white label age assurance app. And she wonders whether the implementing acts and specifications for the EU Digital Identity Wallet (EUDI Wallet) rollout comply with eIDAS 2.0 and the GDPR.

Lo and behold, she has answers to all three questions. They amount to an attack on the existing age assurance sector through a selective accounting of arguments. Launay references the open letter on age verification signed by a few hundred academics as evidence that available age assurance tech violates the GDPR. She misreads Australia’s stance on age estimation, suggesting that the recent guidance issued by eSafety to help platforms comply is in response to concerns about the effectiveness of age estimation.

She does cite a more relevant case: the pending fine that the Spanish data protection agency, the AEPD, has issued to UK provider Yoti for violating the GDPR.

Further arguments tear into the EU’s white label app, culminating in the declaration that “no sovereignty nor privacy by design approach seems to have been considered to design the age verification app with EU funding.” (Ironically, the AEPD’s digital wallet was chosen as a pilot technology for developing the app.)

Ditto for the EUDI Wallet, it would seem. Launay’s closing argument, with regard to question three, is that “the proposals of amendments for EUDI Wallet implementing acts would create worse the privacy protection and sovereignty compared to the previous implementing acts and is now referencing officially and in details the same standards and protocols defined by or with GAFA strong support (DC API and ISO 18013-7) as already specified for the EU age verification app for a quick deployment.”

EUDI Wallet presents opportunities to drive revenue

A piece in Connect on Tech also digs into the EUDI Wallet program, which mandates that all EU member states must create a digital wallet available to their citizens by the end of 2026.

The thrust is that the EU Wallet will impact just about everything, and the policy driving it is less of a restriction than one typically associates with regulations; it “represents a rare exception in this regard, as it will very likely lead to a significant and measurable increase in revenue for numerous service providers.”

“Since all very large online platforms (VLOPs), nearly all public bodies that require identification, and numerous private companies are required to adopt the wallet, it can be assumed that it will become widely established across the EU in a short period of time. This will open up numerous opportunities.”

Among these, “social networks and video sharing platforms will have access to a harmonized and established age gating and identification tool which they can leverage to fully comply with their DSA (Digital Services Act) age gating obligations.”

EUDI Wallet is ‘gold standard’ for privacy preserving age assurance: AEPD

The AEPD, Spain’s data protection agency, has posted what amounts to a rebuttal to Nathalie Launay, in a breakdown of Article 28 of the DSA and the European Commission’s “Guidelines on measures to ensure a high level of privacy, safety and security for minors online,” which aim to support providers. To wit:

“If a platform starts collecting IDs or scanning faces under the pretext of protecting minors without the necessary justification of proportionality, considering the risks that its service represents for minors, and without the appropriate privacy safeguards, it runs the risk of violating the fundamental rights and freedoms of all its users and Article 28.3 of the DSA.”

“As already mentioned, Article 28(3) of the DSA explicitly states that platforms are not obliged to process additional personal data solely to determine whether someone is a minor. Instead of requiring users’ identities or processing children’s personal data, the Guidelines encourage age assurance solutions that allow users to prove they are above an age threshold without revealing any other information.”

Furthermore, “the EU Digital Identity Wallet, expected to be available to all citizens by 2026, is designed to be the ‘gold standard’ for privacy-preserving age assurance. In the interim, the Commission is promoting a standalone EU age verification solution so that there is a provisional harmonized solution in the different Member States that, eventually, will be easily integrated into the EU Digital Wallet.”

And finally, “age assurance is not a ‘set it and forget it’ task. Section 8 of the Guidelines mandates that platforms appoint a dedicated safety team with direct access to senior management and conduct regular Child Rights Impact Assessments (CRIAs) to evaluate how design changes affect younger users. The EDPB statement also establishes that age assurance should operate under a governance framework, ensuring that all processes and systems are designed, implemented, revised, documented, assessed, used, maintained, tested or audited in a way that meets data protection regulations and other legal requirements.”
