Steven Anderegg, a 42-year-old man from Holmen, Wisconsin, has been arrested on charges of creating, distributing, and possessing child sexual abuse material (CSAM) produced with generative artificial intelligence (GenAI). Anderegg allegedly used the Stable Diffusion text-to-image model to generate sexually explicit images of minors, many depicting nude or partially clothed children engaged in sexual acts. Evidence recovered from Anderegg’s devices indicated that he created the CSAM intentionally and that he communicated with a 15-year-old boy, to whom he allegedly sent explicit material via Instagram.
Anderegg was identified through a “CyberTip” from the National Center for Missing and Exploited Children (NCMEC), triggered when Instagram reported his account for distributing such images. A federal grand jury in the Western District of Wisconsin indicted Anderegg on charges of producing, distributing, and possessing obscene visual depictions of minors engaged in sexually explicit conduct, as well as transferring obscene material to a minor under 16 years old. If convicted, Anderegg faces up to 70 years in prison, with a mandatory minimum sentence of five years.
Deputy Attorney General Lisa Monaco emphasized the Justice Department’s unwavering commitment to protecting children from exploitation, regardless of how the abusive material is created. Monaco stressed that the department will pursue individuals who produce and distribute CSAM, and that CSAM generated by AI remains subject to prosecution. The case underscores how seriously authorities treat the use of technology to create increasingly realistic and harmful images of children, and their resolve to hold perpetrators accountable.