Gus Bilirakis among bipartisan coalition calling for end to deepfake porn
Stock image via Adobe.

'Detestable people are extorting children with technology, creating these images and uploading them online for the world to see.'

U.S. Rep. Gus Bilirakis is among several House Republicans vowing to criminalize the posting of artificial intelligence-generated pornography, a practice in which fake explicit images of a person can be created from just a few photos.

It’s not the first time the issue has been tackled. But this year it seems to be gaining more traction, with support from President Donald Trump, First Lady Melania Trump and a bipartisan coalition in both the House and Senate.

Bilirakis is supporting the TAKE IT DOWN Act, a bill sponsored in the Senate by U.S. Sen. Ted Cruz of Texas and cleared by unanimous consent late last year. It’s awaiting action in the House. Cosponsors in the Senate include nine Democrats, 10 Republicans and one independent.

The measure would establish criminal penalties for publishing nonconsensual intimate imagery, including deepfakes generated by AI. It would also compel tech companies and platforms hosting the content to remove it or face enforcement from the Federal Trade Commission.

Speaking Wednesday at a subcommittee hearing, Bilirakis said it was “hard to imagine” that AI could be used to create fake, intimate images, particularly of children.

“But today, detestable people are extorting children with technology, creating these images and uploading them online for the world to see,” he said. “And it doesn’t stop with deepfakes either, criminals are masking themselves as friends or romantic partners to solicit intimate authentic images of children and ransoming these images for a quick buck. The pain these children experience is horrid, and it breaks my heart.”

The bipartisan support appears to be spilling into the House, with U.S. Rep. Darren Soto saying during the Wednesday hearing that his party also supports the measure, “and we stand with the families today and the excruciating stories they tell us.”

It’s a rare issue that seems to be evading the stark polarization that has marked Capitol Hill for nearly two decades. Democrats support the measure despite buy-in from the hyperconservative Heritage Foundation, which authored the Democrat-loathed Project 2025. Republicans back the bill despite the party’s typical disdain for heavy-handed tech regulation.

Cruz, along with Democratic U.S. Sen. Amy Klobuchar of Minnesota, introduced the TAKE IT DOWN Act last year. Beyond buy-in from Heritage, the President and the First Lady, the bill has support from Meta, which runs the social media platforms Facebook and Instagram, and from Snap, another social media platform, both of which have backed efforts to block AI-generated porn, according to POLITICO.

This isn’t the first time Bilirakis has taken on Big Tech to protect kids. In 2022, he drafted legislation that would strip federal liability protections from tech companies and direct a study on how social media platforms can improve coordination with law enforcement to address illegal content and crimes committed on their platforms.

At the time, Bilirakis said congressional leaders “have an obligation to do a better job protecting children from online predators.”

And while the latest legislation doesn’t solely target those seeking to abuse or exploit children, it does establish a blueprint for eliminating what has become a growing problem.

A survey from the Cyber Civil Rights Initiative, conducted for its End Revenge Porn campaign, found that nearly a quarter of respondents had been victims of revenge porn, a practice that can involve real or AI-generated images.

Researchers found that 93% of victims reported significant emotional distress as a result of the victimization. Further, 90% of victims were women, 68% of whom were between the ages of 18 and 30; 27% were between 18 and 22.

The practice has lasting repercussions for victims, too. The survey found that 40% of victims feared losing a current or future partner if that partner learned of the revenge porn posting, and more than half worried that their current or future children might find the content. A quarter of victims said they had to create new email addresses because of harassing, abusive or obscene messages. And 55% said they worried the postings would harm their professional reputations, even well into the future.

Still, civil rights groups worry about First Amendment implications. The Center for Democracy & Technology, along with other civil society partners, wrote in a letter last month that changes are needed to the legislation to “protect users’ privacy and free expression rights,” though the letter also noted the need to protect victims.

Even so, the issue seems to be gaining more traction than in past years, with House Energy and Commerce Chair Brett Guthrie of Kentucky saying Wednesday the TAKE IT DOWN Act would be advanced “quickly,” according to POLITICO. And more than 100 groups, from tech giants like IBM to sporting organizations like Major League Baseball, have signed on in support.

Janelle Irwin Taylor

Janelle Irwin Taylor has been a professional journalist covering local news and politics in Tampa Bay since 2003. Most recently, Janelle reported for the Tampa Bay Business Journal. She formerly served as senior reporter for WMNF News. Janelle has a lust for politics and policy. When she’s not bringing you the day’s news, you might find Janelle enjoying nature with her husband, children and two dogs. You can reach Janelle at Janelle@floridapolitics.com.



