IWF call to extend the EU law that protects children online

Why is the law important? 

The temporary derogation is essential because it provides legal certainty for companies working to detect and report child sexual abuse on their platforms. Without it, some companies may feel they cannot safely continue these efforts due to the risk of conflicting with EU privacy rules.

We have already seen the real-world consequences of legal uncertainty. In late 2020, when companies were unsure whether detecting abuse was permitted under EU law, reports of child sexual abuse material from EU-based accounts to the National Center for Missing and Exploited Children (NCMEC) dropped by 58% in just 18 weeks.

Databases of known child sexual abuse material do not appear spontaneously. They are built because previously unknown material is first detected, verified and confirmed by authorised bodies such as the Internet Watch Foundation. This process depends on technology companies deploying a layered set of tools, including hash-matching, AI-based classifiers, and human moderation, that work together to identify both known and previously unknown CSAM.
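The hash-matching layer described above can be illustrated with a minimal sketch. The hash values and function names here are hypothetical, and real deployments use perceptual hashing (such as PhotoDNA) so that near-duplicates also match, whereas a plain cryptographic hash like the one below only catches exact copies:

```python
import hashlib

# Hypothetical set of known-bad hashes. In practice, authorised bodies
# such as the IWF distribute vetted hash lists to companies.
known_hashes = {
    hashlib.sha256(b"known-bad-example-1").hexdigest(),
    hashlib.sha256(b"known-bad-example-2").hexdigest(),
}

def is_known_material(file_bytes: bytes) -> bool:
    """Return True if the file's hash appears in the known-hash set."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes

# An exact copy of catalogued material matches; novel material does not,
# which is why hash-matching alone cannot detect previously unseen content.
print(is_known_material(b"known-bad-example-1"))
print(is_known_material(b"brand-new-content"))
```

This also shows why the other layers (classifiers and human moderation) matter: any content not already in the database, including newly generated material, passes a pure hash check untouched.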

Fewer reports translate into fewer opportunities for authorities to identify victims and intervene. Legal certainty is not just a technicality; it is critical to the proactive work done by companies to protect children online. 

Since the EU Parliament last debated these issues, the digital landscape has changed dramatically. Generative AI technologies now make it possible to create child sexual abuse material that has never existed before, content that is inherently unknown and cannot be detected through traditional hash-matching systems alone. Unlike previously circulating material, this content cannot simply be flagged by comparing it to existing databases.

This shift fundamentally alters the detection challenge. Companies now face the urgent need to identify even greater volumes of previously unseen abuse. Legal uncertainty risks slowing or halting these efforts.

Asking companies to deactivate systems developed over more than a decade to keep users safe would mean that child sexual abuse continues unchecked and children remain unseen. A drop in reports would not reflect a drop in abuse.

Last year was a record year for the number of URLs actioned by the IWF. The prevalence and volume of CSAM online are enormous.
