AI Generated CSAM – What to Know
What is AI-Generated CSAM?
Synthetic child sexual abuse material (CSAM) may not involve a camera or physical contact, but it depicts a child in a sexual context and is therefore subject to regulation. Artificial intelligence is used to create realistic images or videos of children in sexual situations. Sometimes these materials depict entirely fabricated children; other times they use the face or likeness of a real child, placing that child into sexualized content without their knowledge or consent. When the face or likeness of a real child is used, there is a higher likelihood that charges are imminent.
Can I be Charged with Synthetic CSAM?
It is possible to be charged criminally for possessing AI-generated child sexual abuse material. Litigation over the legality of charging such activity criminally is ongoing, and only time will tell how the Supreme Court will rule as recent cases move through the system. News reports continue to describe AI-generated CSAM as on the rise and "a growing concern," which will in turn result in additional federal and state criminal charges for AI-generated CSAM.
Regulation of AI-created intimate images, often referred to as "deepfakes," has been broadly adopted by most states and the federal government. In the state of Georgia, there have been several attempts to change the legislation, including a recent letter written by Jon Ossoff. There have also been recent prosecutions, including a man in Kentucky who was sentenced to five years for using AI to create child sexual abuse material. Prosecutions at the federal level for CSAM are on the rise as well.
If you learn that you are under investigation for offenses related to CSAM, contact us immediately. We are here to help and have the experience to fight these cases.