United States Senator Ted Cruz has led a bipartisan effort against deepfake pornography after a DFW-area girl was allegedly victimized by such material.

In June, Cruz introduced the Take It Down Act with Minnesota Senator Amy Klobuchar. The legislation unanimously passed the Senate on December 3 and has been sent to the House of Representatives for consideration. No bill summary was available on the congressional website.

Cruz described the bill on X this way: “This important legislation will require social media companies and other sites to remove non-consensual intimate imagery and criminalize the person who published the hurtful images.”

“After the bill passed, I spoke with Elliston Berry of Texas and Francesca Mani of New Jersey — two of the driving forces behind this legislative victory. They are teenage girls who are victims of AI-deepfakes — they told their stories and have worked hard to address this problem and protect victims of deepfake revenge porn,” continued Cruz.

In an attached video, Elliston Berry of Fort Worth can be heard cheering as Cruz announces the bill’s passage. Later, she thanked Texas’ junior senator. “It is so rewarding knowing that somebody heard us and is really taking us seriously,” she said.


Berry and Mani, both high school-aged, came to national attention because their likenesses were used in deepfake pornography.

Berry previously described her case to Fox News’s Maria Bartiromo, telling the anchor that she woke up one morning to find that a photo of her had been manipulated using AI to give her a nude appearance. She told Bartiromo that a classmate created the image, which quickly circulated among her classmates.

A deepfake image is a digitally altered or entirely generated visual created using artificial intelligence, typically deep learning models. These images are designed to convincingly mimic real people, objects, or scenes, often blending or swapping elements to create realistic but false representations. Deepfakes can be used for entertainment, parody, art, or malicious purposes.

There have been numerous previous attempts to outlaw deepfakes in federal elections, but these efforts have not come to fruition.

In some cases, there are extant legal remedies for someone suffering from a deepfake image.

“A libel suit can be filed if a deepfake damages someone’s reputation by portraying him or her in a false light. If a deepfake creator doesn’t own the images in the video, there’s potential liability for copyright infringement. There are also state laws regarding privacy and publicity that bar the unauthorized use of someone’s name or likeness,” Ken Paulson, Director of the Free Speech Center at Middle Tennessee State University, wrote on the center’s website.

Paulson also noted that in some cases, laws against harassment provide a remedy for persistent deepfake usage.

Cruz’s bill would make it a criminal offense to create and publish deepfake imagery to “abuse, humiliate, harass, or degrade [a] minor” or “arouse or gratify the sexual desire of any person.” If the victim is not a minor but an adult, similar penalties apply if the victim suffers “harm, including psychological, financial, or reputational harm, to the identifiable individual.”

If the Cruz-Klobuchar bill passes both chambers, it will be sent to President Joe Biden’s desk for a signature or veto.