Discord, launched in 2015 as a chat platform for online gamers, has evolved into a global hub with 150 million users, hosting communities devoted to topics as varied as crypto trading, YouTube gossip, and K-pop. But the platform has a darker side: it has also served as a space for adults to groom and exploit children. NBC News identified 35 cases over the past six years in which adults were prosecuted for kidnapping, grooming, or sexual assault linked to communications on Discord.
Twenty-two of those cases occurred during the COVID-19 pandemic, an alarming concentration. Even so, the cases identified likely represent only a fraction of the problem, according to Stephen Sauer, director of the tipline at the Canadian Centre for Child Protection. The cases span a range of scenarios, from teens abducted and assaulted after being groomed on Discord to adults charged with transmitting child sexual abuse material (CSAM) or extorting minors through the platform.
Discord’s decentralized structure, multimedia tools, and growing popularity, particularly among young users, make it attractive for exploitation. Reports of CSAM on Discord spiked 474% from 2021 to 2022, according to the National Center for Missing & Exploited Children (NCMEC). Discord’s cooperation with law enforcement and tiplines has been praised for yielding valuable information, but concerns have been raised about its declining responsiveness to complaints: its average response time grew from three days to nearly five in 2022. Discord’s struggle with child exploitation echoes challenges faced across the tech industry, but its particular features and user base make it an appealing space for offenders. As Discord grapples with these issues, striking a balance between user privacy and child safety will be imperative.