Just one day ago, on April 8, 2026, a federal court in Ohio made history. James Strahler II, a 37-year-old man from Columbus, became the first person in the United States federally convicted of creating and distributing deepfake pornography.
He pleaded guilty to cyberstalking, producing obscene visual representations of child sexual abuse, and – most significantly – “publication of digital forgeries” under the landmark Take It Down Act of 2025.
According to the criminal complaint, Strahler targeted at least 10 victims, including children and former partners. He used AI tools to generate explicit, non-consensual images and videos, then weaponized them through harassing texts, voicemails, and obscene photos sent to victims and their families. In one case, a woman in Hilliard, Ohio, and her mother received rape threats alongside the fabricated content. U.S. Attorney Dominick S. Gerace II stated: “We will not tolerate the abhorrent practice of posting and publicizing A.I.-generated intimate images of real individuals without consent.”
This case is not an isolated incident. It is a stark snapshot of a rapidly growing underground deepfake porn business that turns ordinary people — overwhelmingly women and girls — into unwilling performers for profit, revenge, or sheer malice.
The Scale of the Deepfake Porn Business
Deepfake technology has democratized abuse. What once required expensive Hollywood-level editing now takes minutes on free or low-cost AI apps. The numbers are staggering:
- 96–98% of all deepfakes online are non-consensual intimate images (NCII), with 99–100% of victims being female.
- Deepfake files are projected to reach 8 million annually by the end of 2025 – a massive jump from roughly 500,000 in 2023.
- The global deepfake AI market is exploding, with estimates placing it at hundreds of millions today and billions by 2030, driven in large part by demand for explicit content.
The “business model” is disturbingly simple and profitable. Underground forums and dedicated websites offer:
- Subscription tiers for access to massive libraries of celebrity and “custom” deepfakes.
- Pay-per-request services where anyone can upload a face and receive fabricated porn in hours for as little as a few dollars.
- Monetized apps and “nudify” tools that have racked up hundreds of millions of downloads.
This isn’t fringe activity – it has become industrialized. Victims report seeing their faces and bodies sold alongside real pornography, shared across social platforms, and used for extortion or public shaming. High-school students, celebrities, ex-partners, and even family members have all become targets.
Why This Matters: The Human Cost
The psychological toll is devastating. Victims describe living in constant fear — wondering who has seen the images, whether colleagues or family received them, and if the harassment will ever stop. Unlike traditional revenge porn, deepfakes are nearly impossible for the untrained eye to distinguish from real imagery, which makes them harder for victims to disprove and harder to get taken down.
Strahler’s case involved minors and direct threats, escalating it to federal child sexual abuse material charges. But the vast majority of deepfake porn targets adult women — conduct that, until now, fell outside explicit federal criminal statutes. The Take It Down Act changes that by specifically criminalizing the knowing publication of both real intimate images and AI-generated “digital forgeries.”
Signed into law in May 2025, the Act requires covered online platforms to remove non-consensual intimate content within 48 hours of a valid notice. It also creates federal criminal penalties: up to two years in prison for offenses involving adult victims, and up to three years when minors are involved.
A Turning Point — But the Fight Continues
The Ohio conviction is a powerful signal. For the first time, federal prosecutors have used the “publication of digital forgeries” charge successfully. It sends a clear message: creators and distributors of deepfake porn can no longer hide behind “it’s just AI” or claims of free speech.
Yet laws alone won’t stop the flood. The technology evolves faster than enforcement. New generative AI models make fakes more realistic every month, and anonymous marketplaces continue to thrive.
At vali.now, we believe technology should empower people — not weaponize their identities. That’s why we built tools specifically designed to defeat deepfakes in real time. Our Image Integrity and Live Video Deepfake Detection products help businesses and individuals verify authenticity during video calls, onboarding, or high-stakes interactions. Whether you’re protecting your team from deepfake fraud or simply want peace of mind that what you’re seeing is real, our solutions put control back in your hands.
What You Can Do Today
- Report immediately: Use platform tools and contact law enforcement. The Take It Down Act gives victims a clear path.
- Limit your digital footprint: Review your social media privacy settings and be cautious about sharing high-quality photos of your face.
- Stay vigilant: If something looks suspicious in a video or image, verify it with trusted detection tools.
- Support victims: Organizations like the Cyber Civil Rights Initiative offer resources for those targeted by image-based abuse.
The Strahler conviction is a victory, but the deepfake porn business won’t disappear overnight. Awareness, strong legislation, and practical technology are the only way forward.
Stay informed. Stay protected.
