
Robert W. Kastenmeier United States Courthouse, Madison, Wisconsin. Credit: Photo by Carol M. Highsmith.
A U.S. district court opinion from last month could shape federal obscenity jurisprudence in the age of AI-generated child sexual abuse material (CSAM). By reiterating the constitutional right to private possession of obscene material, the decision is a timely reminder of the First Amendment limits on government efforts to punish speech that is harmful to children. At the same time, by allowing prosecutors to proceed on the defendant's other charges, the court's opinion shows that the government has ample tools to bring to trial those accused of using AI in the sexual exploitation and abuse of children.
Last February, I published a paper analyzing the legal and policy dimensions of AI-generated CSAM. As my paper explains, obscenity and CSAM are two distinct categories of unprotected speech. With generative AI now able to produce highly realistic images, I predicted that federal prosecutors would begin to rely more on a historically little-used statute, the federal child obscenity law, 18 U.S.C. § 1466A. Unlike the federal CSAM laws, which apply only to material depicting actual, identifiable minors, the child obscenity law does not require that the minor depicted "actually exist." Prosecutors can thereby avoid the potentially difficult task of determining (and proving to a jury) whether a photorealistic image depicts a real child.
Last May, a federal grand jury in Wisconsin indicted Steven Anderegg, a man who allegedly used Stable Diffusion to create obscene images of minors and sent them to a teenage boy on Instagram (prompting Meta to report his account to the CyberTipline). Anderegg was charged with three counts under Section 1466A for the production, distribution, and possession of child obscenity. He was also charged with one count, under a different statute, of transferring obscene material to a minor.
(There was no claim that any of the images depicted real children, which distinguishes this case from most of the other federal criminal cases I have seen involving AI-generated CSAM, which tend to involve AI-modified images of real children.)
Anderegg moved to dismiss each of the four counts. In last month's opinion, the court mostly denied those motions. However, the court dismissed the possession count, finding Section 1466A unconstitutional as applied to Anderegg's private possession of obscene "virtual" CSAM.
The Supreme Court held in Stanley v. Georgia, 394 U.S. 557 (1969), that the First Amendment protects the right to possess obscene material in one's own home. More recently, the Court held that the First Amendment also protects "virtual" CSAM that does not involve real children. Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002). Under that line of cases, Anderegg argued that the First Amendment protected his private possession of obscene AI-generated CSAM.
The court agreed with Anderegg and rejected the government's counterarguments. The government had argued that the case resembles Osborne more than Stanley, that Stanley is limited to obscene material depicting adults, and that Congress has a compelling interest in banning the possession of obscene "virtual" CSAM. The court rejected these arguments as foreclosed by Free Speech Coalition, where the government had made essentially the same arguments and lost (as I noted in my paper). Osborne, the court said, was inapposite because this case does not involve real children. Rather, it is like Stanley, "resting on the importance of freedom of thought and the sanctity of the home."
Finally, the government tried to distinguish Stanley on the ground that (unlike the state law at issue there) Section 1466A contains an interstate or foreign commerce jurisdictional hook, and Anderegg allegedly possessed the images on a foreign-made laptop. The court replied that this was not a meaningful distinction: "[I]f jurisdictional elements were sufficient to overcome Stanley, Stanley would be a dead letter."
That's exactly what I said a year ago: that if creators of CG-CSAM keep the material to themselves and never share it, they may be protected by the constitutional right to private possession of obscene matter, and that there must be some limit to Section 1466A's jurisdictional hook. It is gratifying to see the court agree with my analysis and reaffirm Stanley's continued vitality.
It was not surprising to see the government try to relitigate an argument that failed in Free Speech Coalition. At the time of that decision, Justice Clarence Thomas predicted that technological advances might one day require the ruling to be reconsidered. Twenty years later, Thomas, the only justice from that decision still on the Court, apparently believes that time has come thanks to the advent of AI tools that generate highly photorealistic images. The district court's answer: not so fast.
It was not all good news for Anderegg, though. While the court agreed to dismiss the possession count against him, it declined to extend Stanley to the production of obscene AI-CSAM. As the court noted, Stanley focused on possession, did not mention production, and in the 55 years since, the Supreme Court has not seen fit to recognize protection for the production of obscenity. The court likewise declined to dismiss the Section 1466A distribution count or the count for transferring obscene images to a minor.
If purely private possession of obscene AI-CSAM is constitutionally protected under current case law, but production is not, then using AI models (even locally hosted ones) to generate child obscenity in the privacy of one's own home is not wholly insulated from criminal prosecution. Nor is sending it to anyone else, particularly minors. The court's ruling shows that even with Stanley's limit on possession charges, the laws on the books give the government ample options for prosecuting AI-CSAM without offending the First Amendment.
Despite escaping dismissal of three of the four counts, the government is appealing the unfavorable ruling on the possession count to the Seventh Circuit (where the case is docketed as No. 25-1354). To my knowledge, this is the first criminal case involving generative AI, the CSAM and child obscenity statutes, and the First Amendment to reach a federal court of appeals. It will be one to watch.