Momentum is growing to expand laws against the use of artificial intelligence to generate child pornography.

With new “nudification” tools on the dark web and rising “sexploitation” cases involving adolescents, law enforcement agencies and non-profit organizations alike have been sounding the alarm about the growing problem of AI technology being used for such nefarious purposes.

The Texas Senate Committee on Criminal Justice convened on Thursday to learn more about the problem and discuss ways to stop it. Developing a legislative response to child predators using AI to harm and exploit children is one of the interim legislative charges issued by Lt. Gov. Dan Patrick ahead of the next session.

While Texas lawmakers have already expanded the definition of child pornography through HB 2700 to include sexually explicit AI imagery created from actual photos of minors, the resounding message panel speakers delivered to state senators was that the statute does not go far enough.

“I’ve prosecuted child pornography for 17 years, and my first experience with an AI problem happened when a grandfather cut the face of his granddaughter off of a picture and crudely cropped it into an animated picture of a child,” Tarrant County Assistant District Attorney Lori Varnell told the committee. “And that’s not illegal under our current statute. Not illegal because the images of the child’s genital area were not of a child.”

Although content-based regulation could raise First Amendment issues, some of those who addressed the senators, including Varnell, suggested it might be the only way to close the legal loopholes that currently allow people who create and use AI-generated nudes for reprehensible purposes to go free.

“One of the areas that we haven’t talked about so far is sextortion and sexual coercion,” explained Brent Dupree, the director of law enforcement for the Office of the Attorney General of Texas.

“Imagine a child getting a photograph of themselves … originally a benign photo, where a bad actor has stripped the clothes off of them, has sent that to them, and said, ‘If you don’t send me money, you don’t perform sexual acts, or you don’t actually produce real content for me, then I’m going to spread these around your school or at your work,'” he continued.

“That’s a problem that we are currently seeing with adult and child victims alike. But I think it’s also a problem that is going to be compounded with the production of and the promotion of AI capabilities. Bad actors are always going to reach out in new methods and new ways,” he added.

Yet another area of consideration is cases in which the perpetrator is a minor.

Anna McAdams explained to the committee how, last October, a 15-year-old classmate of her 14-year-old daughter at Aledo High School used photos from Instagram and AI technology to generate nudes of the girl and her friends. The fake nudes were then distributed to the rest of the student body through various Snapchat accounts.

“I want to point out that he didn’t just take her face and put it on some random new person; it was actually their bodies, so the realistic part of that is horrific,” she said.

McAdams described the horror, despair, and helplessness the victims felt at being unable to do anything about it, even after the perpetrator was identified. Ultimately, the teenager was given probation, but his identity was never revealed to the victims, and he was allowed to continue attending school without further repercussions.

“The school board was ill-equipped; there’s nothing in the school code of conduct to approach this, and so even law enforcement didn’t really know what to do with this child. … It’s not just adults who do this,” she said. “There’s not really any law in place for if you’re a minor and you’re doing these kinds of crimes.”