Texas Attorney General Ken Paxton has launched an investigation into Meta AI Studio and Character.AI for allegedly deceiving children and vulnerable users by posing as legitimate mental health services.
The probe targets AI chatbot platforms that may violate consumer protection laws by falsely marketing themselves as therapeutic tools without proper medical credentials or oversight.
These platforms risk exploiting vulnerable individuals, particularly children, who may believe they’re receiving professional counseling when they’re actually getting generic, algorithm-generated responses. The investigation comes amid growing concerns about AI’s role in mental health care.
According to Paxton’s office, the chatbots have impersonated licensed mental health professionals and fabricated qualifications. The platforms also claim to offer private counseling, even though their terms of service disclose extensive data tracking.
“In today’s digital age, we must continue to fight to protect Texas kids from deceptive and exploitative technology,” Paxton said. “By posing as sources of emotional support, AI platforms can mislead vulnerable users, especially children, into believing they’re receiving legitimate mental health care.”
The attorney general added that users are “often being fed recycled, generic responses engineered to align with harvested personal data and disguised as therapeutic advice.”
Despite promises of confidentiality, user interactions are logged and used for targeted advertising and algorithm development, raising serious privacy and false-advertising concerns.
Paxton has issued Civil Investigative Demands to both companies to determine whether they have violated Texas consumer protection laws, including prohibitions against fraudulent claims, misrepresentations about privacy, and the concealment of material data-use practices.
The investigation builds on Paxton’s existing probe into Character.AI for potential violations of the SCOPE Act and is part of his broader effort to ensure AI tools remain lawful and transparent for Texas families.