Texas Attorney General (AG) Ken Paxton has opened an investigation into the AI chatbot platforms Meta AI Studio and Character.AI for allegedly engaging in deceptive trade practices and misrepresenting themselves as mental health resources.

According to Paxton, these platforms are accessible to vulnerable populations, particularly children, and may present themselves as legitimate therapeutic tools despite lacking appropriate medical credentials or regulatory oversight.

The AG's office alleges that these AI-driven chatbots frequently go beyond offering basic advice, in some instances impersonating licensed mental health professionals, fabricating qualifications, and claiming to deliver private, reliable counselling services.

Although these AI chatbots claim to maintain user confidentiality, their terms of service indicate that interactions are recorded and monitored.

This data is used for targeted advertising and algorithmic enhancement, raising significant concerns regarding privacy breaches, data misuse, and misleading advertising practices.

Paxton has issued Civil Investigative Demands (CIDs) to the companies involved to ascertain whether they have breached Texas consumer protection statutes.

Those statutes include prohibitions on fraudulent claims, misrepresentations regarding privacy, and failure to disclose material uses of consumer data.

Paxton said: “In today’s digital age, we must continue to fight to protect Texas kids from deceptive and exploitative technology.

“By posing as sources of emotional support, AI platforms can mislead vulnerable users, especially children, into believing they’re receiving legitimate mental health care.

“In reality, they’re often being fed recycled, generic responses engineered to align with harvested personal data and disguised as therapeutic advice.”

This investigation forms part of Attorney General Paxton's broader initiative to hold AI companies accountable while safeguarding Texas families.

It follows an ongoing inquiry into Character.AI for potential violations of the SCOPE Act (Securing Children Online through Parental Empowerment Act), and aims to ensure that AI tools operate within legal frameworks, maintain transparency, and do not exploit Texans.