Each year, 17 million voluntary reports of child sexual abuse material (CSAM) and grooming of children are made to authorities, containing almost 3 million pictures and conversations originating from the European Union. In 2019, online grooming tripled on certain social networks, according to the NSPCC, and in spring 2020, Europol reported a surge in the online distribution of CSAM driven by the COVID-19 crisis.

To prevent children from going missing, all forms of sexual abuse and grooming, online and offline, need to be better detected, reported, and prosecuted. All actors, including private entities such as payment and internet service providers, should be involved in this effort.

We are therefore pleased to endorse two statements, signed by a total of 100 organisations, welcoming Apple’s efforts to protect children by including privacy-respecting technology in the next update of its operating system. This technology would enable the company to detect known child sexual abuse images on users’ devices before they are uploaded to iCloud.

“We applaud Apple’s decision not to ignore this issue, but rather to confront it.”

Joint Statement signed by 92 organisations


A recent legal gap in the European Union showed the danger of poorly calibrated privacy laws: when a new e-privacy law entered into force at the end of 2020, it stopped companies from voluntarily detecting and reporting CSAM and grooming on their platforms. This led to a 51% decrease in reports of child sexual abuse emanating from the EU and put thousands of children at risk of abuse, exploitation, revictimisation, going missing, and trafficking. Thankfully, this legal gap was closed in July 2021, when the European Parliament approved a law allowing tech companies to voluntarily detect and report CSAM and grooming for the next three years.

“Privacy for internet users is a fundamental right that must be supported, including for victims and survivors of abuse. The issue does not need to be a binary choice between privacy and safety: we believe Apple can demonstrate that it is possible to utilise detection technology in a way that detects illegal abuse images and maintains user privacy.”

Joint Statement signed by 32 organisations


As Apple has recently decided to pause the rollout of its child sexual abuse material detection features, we call on the company to remain on course and to establish a clear timeline for implementing these vital steps to prevent the repeated sexual exploitation of children.

For more information, we invite you to read: