A recent report by Thorn, a technology nonprofit dedicated to child protection, reveals a troubling increase in online risks faced by children. The “Emerging Online Trends in Child Sexual Abuse 2023” report highlights that minors are increasingly involved in the sharing of sexual images, either consensually or coercively, while also experiencing a surge in risky online interactions with adults.
John Starr, VP of Strategic Impact at Thorn, emphasizes that child sexual abuse content is becoming more easily accessible on everyday digital platforms, and harmful interactions between youth and adults are no longer confined to the dark corners of the internet.
This report aligns with findings from other child safety organizations, such as the National Center for Missing and Exploited Children (NCMEC), which has witnessed a staggering 329% increase in reported child sexual abuse material (CSAM) files over the past five years. The rise in CSAM reports can be attributed to several factors, including the deployment of detection tools like Thorn’s Safer product, increasingly brazen tactics by online predators, including the use of technologies such as chatbots, and the growing prevalence of self-generated CSAM (SG-CSAM). SG-CSAM alone saw a 9% increase from 2021 to 2022, according to the Internet Watch Foundation.
Technology plays a pivotal role in combating this escalating issue, particularly through hashing and matching. This programmatic approach enables tech companies to detect and disrupt the spread of CSAM at scale. By converting files into unique numerical hash values, platforms can compare those values against lists of hashes of known CSAM, allowing illicit content to be identified, blocked, or removed.
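The hash-and-match workflow described above can be sketched in a few lines. This is a simplified illustration using a cryptographic hash (SHA-256); production systems such as Safer typically also use perceptual hashing so that visually similar variants of an image still match. The function names and the placeholder hash list below are hypothetical, not part of any real product's API.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Convert a file's bytes into a fixed-length hash value (SHA-256 for illustration)."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical list of hashes of known illicit files.
# In practice this would come from a vetted industry hash-sharing database.
known_hashes = {file_hash(b"example-known-file-bytes")}

def is_known_match(data: bytes) -> bool:
    """Compare an uploaded file's hash against the known hash list."""
    return file_hash(data) in known_hashes
```

A platform would run every upload through a check like `is_known_match` and block or report files whose hashes appear on the list, without ever needing to store or view the original known content, only its hashes.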
Thorn’s Safer tool, which hashed over 42.1 billion images and videos in 2022, has made significant strides in this area, helping customer platforms identify more than two million pieces of CSAM. Collaborative efforts among tech companies and NGOs, along with the continued expansion of known CSAM hash lists, are crucial steps toward eliminating CSAM from the internet and ensuring online safety for children.