Sen. Ted Cruz held a field hearing in Dallas on Wednesday to address big tech companies' role in the spread of salacious deepfakes.

The hearing, titled “Take It Down: Ending Big Tech’s Complicity In Revenge Porn,” focused on proposed legislation to tackle the growing nefarious use of artificial intelligence.

Cruz (R-TX) joined a bipartisan group of senators earlier in the month in Washington, D.C., to introduce the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks (TAKE IT DOWN) Act.

The TAKE IT DOWN Act would make publishing non-consensual intimate imagery (NCII), including deepfake pornography, a federal crime punishable by jail time, with heightened penalties if the victim is a child.

The bill would also cover images created by AI and apply to cases where the original image was consensual but the depicted individual did not consent to its publication. This bipartisan legislation would also make it a crime to threaten to publish such images online.

Cruz highlighted that this act would be the first to incorporate a notice and takedown requirement for social media platforms and other websites that allow users to post NCII. The bill has received support from over 40 organizations across the political spectrum, including unions and law enforcement.

“New generative AI tools have made creating realistic yet fake explicit images and videos of real people easier than ever,” Cruz said during the U.S. Senate Committee on Commerce, Science, & Transportation’s field hearing. “Due to advances in technology, now anyone can become a victim.”

He noted that there has been a boom in the creation of explicit images, particularly in high schools, where teenage girls have been the largest group of victims.

“The Take It Down Act ensures that social media prioritizes reports from victims, and if they don’t comply, it empowers the Federal Trade Commission to pursue enforcement actions. It is one of my top priorities that this bill be placed on the next committee markup so that it can be passed out of committee and it can be received by the Senate as soon as possible,” continued Cruz.

The field hearing in Dallas featured testimony from victims and advocates.

The two victims who testified were both 15-year-old girls targeted by AI-generated sexually exploitative images posted on Snapchat.

Elliston Berry, an Aledo ISD student from Parker County, had her world flipped upside down after a classmate used AI to create fake nude images of her. The incident occurred in October 2023, when a friend contacted Berry about the photos.

Berry was only 14 years old at the time and “feared her future was ruined,” she shared. “To this day, the number of people that have these images or have seen them is still a mystery.”

She said it took eight months for the photos to be taken off Snapchat.

“Every day I live in fear these photos will resurface, or someone can easily recreate them,” Berry said. “As a victim of AI deep fakes, it has created a tremendous amount of pain, and this is why I come here and share my testimony. My intent is to give victims a voice they never had and turn this horrible situation into something good.”