This is an HTML version of an attachment to the Freedom of Information request 'Exchange with NGO Thorn'.

Fighting Child Sexual 
Abuse in the EU
Thorn’s position on a long-term regulation for the 
detection, removal and reporting of CSAM online

Detection, Removal and Reporting of CSAM Online   | 2
Thorn is a US-based nonprofit organization that builds technology to defend children from sexual abuse and online exploitation. We work globally to accelerate law enforcement’s ability to identify child victims; to equip industry with the tools they need to detect, report, and remove child sexual abuse material from their platforms; and to lead cutting-edge research and programming to build resilience in youth online. We are an independent organization that sits at the unique nexus between child advocacy, law enforcement, and technology. We are neither a tech company nor a government or law enforcement entity; we are a third-party, mission-centered non-profit organization. Thorn was founded by Demi Moore and Ashton Kutcher in 2012, with over 5,000 donors investing in our mission and supporting our work.

At Thorn, we believe in the power and potential of governments, law-makers, NGOs, and tech companies working together to eliminate child sexual abuse material (CSAM) online. This goal cannot be achieved by any one of these entities alone. We are grateful for the EU’s leadership and dedication to this issue in the context of the EU strategy for a more effective fight against child abuse. Its efforts to shape a regulatory framework allowing the use of effective and secure technology for the purpose of combatting child sexual abuse are an essential building block of a global ecosystem to protect children online.

In this paper we are sharing Thorn’s perspective on:

•  The urgency of fighting child sexual abuse online: why the fight against CSAM is timelier than ever and what hurdles we are facing today.

•  The recently adopted temporary derogation from certain provisions of Directive 2002/58/EC, both in terms of aspects Thorn would like to uphold or clarify in the context of the future long-term legal framework.

•  Our priorities for shaping a long-term framework for the detection, removal, and reporting of CSAM online.

We look forward to engaging with EU policymakers, leading national officials, and third parties committed to finding effective solutions to end this crime.
EU Transparency Registry Number: 854246640306-96

Fighting child sexual abuse online: needed today more than ever

A comprehensive approach is needed to establish an effective global ecosystem for child safety, both on and offline, to eradicate online child sexual abuse. The need for action is pressing. Child sexual abuse material found on content-hosting platforms has grown rapidly and continues to spread across the open web. In debates around technological solutions, we must be very clear about what this issue truly represents: child sexual abuse material (CSAM) is the documentation of the rape, torture, and sexual abuse of a child, many as young as infants and toddlers. Each image and video is a crime scene and must be treated as such. Additionally, the online recirculation of each image and video revictimizes the child depicted and can lead to lifelong trauma years after a child is recovered from an abusive environment. According to research by the Canadian Centre for Child Protection, conducted in May 2021, 89% of child victims have had suicidal thoughts, 60% have attempted suicide, and 30% of child victims have been recognized by people who had seen their abuse material.

The volume of this material is overwhelming. Over the past 15 years, there has been a 15,000% increase in files reported to authorities in the US alone, according to the National Center for Missing and Exploited Children (NCMEC). Since 2019, Thorn’s Safer community has identified over 183,000 CSAM files for removal and reporting to NCMEC. This increase represents both the real and alarming rise in CSAM dissemination online and the improved ability and willingness of tech companies to be proactive in detecting, reporting, and removing this content from their platforms.

[Chart: increase in files reported to authorities in the US alone, 2016–2021]

This is truly a global effort. The nature of online child sexual abuse is that it knows no borders, and the global ecosystem working to combat this crime has been deliberately designed with that ground truth in mind. A recent German crime statistic by the Federal Criminal Police Office on violence against children showed that social media drove a fivefold increase in self-generated CSAM in 2020 compared to 2018. Such material tends to go viral domestically and internationally. Therefore, any legislative or regulatory efforts made at the EU level not only have the potential to set a global precedent on policy in this space, but also have immediate ripple effects for the existing stakeholders around the world - for NGOs, for law enforcement, for companies, and most importantly for children.

Thorn stresses the need to create long-term legal certainty for actors involved in the detection, reporting, and removal of CSAM. We also want to increase awareness about this issue and help develop the tools to tackle it. We welcome the proposals of the European Commission to aid in aligning Member States and to eliminate fragmentation in the legislation on this crucial matter.

Interim regulation regarding the use of technologies by providers of online communications to continue voluntary detection, reporting, and removal of CSAM:

We are relieved that the agreement reached will lead to the restoration and preservation of online child sexual abuse detection in the EU. We thank the policymakers across the EU and its Member States who worked hard to push this agreement forward. This creates the legal basis for online service providers who may have interrupted the search and detection of CSAM because of the temporary legal uncertainty.

Europe must aim for a workable definition and regulation of grooming practices in order to foster technological development, which is a cornerstone in the work against child sexual abuse.

Our perspective on the compromise reached:

Appropriate definitions on hashing, classifiers, and anti-grooming technology

•  We welcome the balanced definitions reached that create legal certainty, yet also allow for the needed flexibility to reflect future technological advancements and innovation. We hope this is maintained in the long-term legislation.

•  Effective grooming detection is a crucial first step in preventing child sexual abuse from happening. We will never end the spread of online child sexual abuse material if we are unable to create and deploy preventative tools and measures. It is of paramount importance for anti-grooming technology to fall within the scope of CSAM regulation.

Considerations and open questions

•  Thorn welcomes the transparency requirements included in the interim regulation. We would, however, caution that this transparency must not be used to the detriment of relevant detection, reporting, and removal algorithms in a way that can be misused by potential offenders.

•  Thorn believes that mandatory reporting standards for providers of online communications on the amount of harmful material identified, reported, and removed are key to enhancing the understanding of the breadth of the issue and to developing even more effective solutions to tackle it.

•  Mandatory reporting standards must be feasible for all providers who want to participate in the fight against the sexual abuse of children. In the end, requirements must not lead to a situation where only the very big players with large administrative capacity can participate. The scope of the reporting standards should therefore be scalable and reflect the size and capacity of a company.

How legal certainty for companies using technology to proactively detect child sexual abuse will help fight CSAM effectively in the future:

As legislation progresses related to this issue, we must not lose sight of the children at the centre of this crime, nor permit child sexual abuse detection methods to be disrupted or curtailed. We therefore believe it is imperative for the EU to set up a long-term regulation that effectively tackles this epidemic. With regard to the policy options being considered in the EU Commission’s impact assessment on a Regulation on the detection, removal and reporting of child sexual abuse online, and establishing the EU Centre to prevent and counter child sexual abuse, Thorn supports a legal framework which establishes a clear legal basis under which relevant providers of online communication services are allowed to implement voluntary detection of CSAM on their services, including both previously known and new material and text-based threats. Thorn also supports a legal framework which establishes a clear basis for mandatory reporting of CSAM detected by relevant providers of online communication services.

Grounding solutions in protecting children’s rights and the privacy of child victims

•  The protection of children’s rights and the privacy of child victims must be pursued with the same vigor as the human rights and privacy rights of general adult users. Targeted and surgical detection methods used solely to combat child sexual abuse and grooming represent the most comprehensive, innovative and effective solutions to protect child victims’ privacy. The utilization of these protection methods must be preserved in the long-term legislation, through the preservation of hashing technology, classifiers, and the ability to apply anti-grooming technology to text analysis. These methods also need to be expanded and future-proofed to account for future technological developments.

•  Child sexual abuse detection technology is designed to detect this kind of abuse and this kind of abuse only. It was specifically designed this way because privacy has always been at the center of this work. Child advocates have worked for years with platforms to balance the protection of children’s rights and privacy with the privacy of their larger user base. We can and have had both, and that partnership and balance must continue.

Allow for innovation and future-proofing

•  Threats against children online evolve rapidly, and so must the technological interventions to combat them. Tech companies, NGOs, and law enforcement must retain the ability to be nimble and effective in the face of an increasingly urgent and constantly changing online landscape for children.
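As background on the hashing technology referenced above: known-material detection compares a fingerprint of an uploaded file against a list of fingerprints of already-identified CSAM, so nothing is learned about files that are not already on the list. The sketch below is our own minimal illustration, not Thorn’s or any vendor’s actual implementation; real systems use perceptual hashes (such as Microsoft’s PhotoDNA) that tolerate re-encoding and minor alterations, whereas the cryptographic SHA-256 used here for simplicity matches exact bytes only, and the hash list shown is hypothetical.

```python
# Minimal illustration of hash-list matching (hypothetical hash list).
# Real deployments use perceptual hashing so that re-encoded copies of a
# known file still match; a cryptographic hash matches exact bytes only.
import hashlib

# Hypothetical list of hashes of known illegal files, as might be supplied
# by a clearinghouse. (The entry below is simply the SHA-256 of b"test",
# used as a harmless stand-in.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

def is_known_file(data: bytes) -> bool:
    """True if the file's hash appears on the known-bad list.

    Note the privacy property described in the text: the check reveals
    nothing about a file unless its hash is already on the list.
    """
    return sha256_hex(data) in KNOWN_HASHES

print(is_known_file(b"test"))   # stand-in entry: matches
print(is_known_file(b"other"))  # unknown file: no match
```

This is why such detection is described as targeted and surgical: content that is not on the list is never flagged or inspected further.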

•  The nature of technology is that it changes and improves with time, and any legislative framework must reflect that reality. It must also be technologically neutral and provide the necessary space for companies and child protection technologists to innovate. Without these elements, perpetrators will be able to utilize the most sophisticated technology to abuse children while the child protection ecosystem remains years behind.

Provide legal certainty for companies to proactively detect child sexual abuse

•  Thorn recommends that any legislative framework to tackle CSAM must first and foremost provide legal certainty for service providers and internet platforms to detect new and known child sexual abuse images, videos, livestreams, and grooming – utilizing specialized, surgically designed tools. This proactive detection is critical to recovering child victims in immediate harm, removing viral material that revictimizes survivors each time a file is reshared, and preventing abuse from occurring in the first place.

•  We must encourage service providers of all sizes to continue to proactively implement these safeguards on their platforms, and not enact barriers to their adoption. For example, smaller companies who choose to proactively detect child sexual abuse should be able to do so in a way that is not overburdened with legal requirements that can only be met by larger platforms.

•  Further, we know that abusers use multiple platforms – often smaller ones – to abuse children and/or share CSAM, and therefore creating high barriers of entry for detection is not only burdensome, it is dangerous.

•  Encryption can be an important tool for secure communication. However, encryption is not intended to support the trading of child sexual abuse material. Thorn believes that solutions are necessary to detect CSAM in privacy-forward environments, and that those solutions should be industry-led in order to be trusted and effective. These solutions must balance both privacy and child safety.

Promote transparency

•  Thorn believes that tech companies must be more transparent about the steps they are taking to combat child sexual abuse on their platforms. We believe this transparency will help foster a cross-sector understanding of how service providers and internet platforms utilize child sexual abuse detection methods, and help the ecosystem learn how to work better together to improve those efforts and make progress.

•  Specifically, we believe companies should be transparent about how these detection methods are deployed and the results of their use in practice. At the very least, service providers and internet platforms should make public 1) the number of child sexual abuse reports they make to hotlines, both from proactive detection and from user reports, and 2) if they utilize proactive detection methods, which detection methods they use and on which parts of their platforms.

•  It is important to recognize that not all companies are at the same stage of maturity, and expectations should be viewed with that in mind. That said, more transparency about what can be done at varying stages of a company’s growth will help create common knowledge and elevate industry’s response to this crime as a whole.

•  While transparency is incredibly important, we must simultaneously recognize that tech-savvy, well-informed offenders are at the center of this crime. We must balance the need for transparency with this alarming reality and ensure that sensitive algorithms and techniques cannot be reverse-engineered or deliberately evaded.

Considerations for a possible European Centre to Combat Child Sexual Abuse

•  It is crucial that any new legislative solutions and/or instruments in the EU build upon, fit seamlessly into, and enhance the existing ecosystem of global actors working to protect children online. The creation of a European Centre to Combat Child Sexual Exploitation (EU Centre) has immense potential to increase global collaboration, especially as it pertains to accelerating victim identification. Because this crime knows no borders, any EU-centered efforts to safely and responsibly share intelligence to help identify victims could dramatically improve the collective global response to this crime. Thorn is ready to proactively collaborate with the European Commission and to support it with expertise and insights to establish a workable Centre.

•  It is important that an EU Centre does not duplicate or disrupt any existing reporting protocols or processes currently in place, as the deduplication of child sexual abuse reports and intelligence is already a challenge for the global ecosystem. If a company is already reporting to a reputable organization in the global child protection ecosystem, it should not be required to report to additional centres. Adding any additional reporting processes will impose another layer that could make it more difficult to streamline the system and could lead to delays in finding missing and exploited children.

•  It will be important for an EU Centre to take into account the existing information flows to and from global law enforcement authorities, NGOs, and service providers that work under different legal frameworks, and to ensure that the Centre’s information flows interoperate seamlessly and in full transparency with the current, global system. To illustrate this point – it is entirely plausible that an offender in Germany sends an image of child sexual abuse to an offender in Australia using an internet platform headquartered in the United States, while the child victim is actually located in the Philippines. Each country and entity named in this overly simplified scenario must be able to communicate quickly and efficiently to recover the child in danger, suppress the illegal imagery, and bring the offenders to justice.

•  Due to the complexity surrounding these elements, we recommend that the relevant parties working in this global ecosystem be consulted, as part of an Advisory Board, with regard to the development of policies and activities of a potential EU Centre.

•  Additionally, a dedicated EU Centre could help create consistency in practices and policies to combat child sexual abuse across the EU. Important examples of this could be the formalization of best practices for law enforcement and survivor organizations across Member States, or the ability to ensure, through opinions and guidelines, that all legislation proposed at the EU level remains consistent and reflective of the current child protection landscape. Best practices from the US, Canada or New Zealand could provide helpful insights for the establishment of the Centre.