Texas Attorney General Ken Paxton has launched an investigation into both Meta AI Studio and Character.AI for "potentially engaging in deceptive trade practices and misleadingly marketing themselves as mental health tools," according to a press release issued Monday.

"In today's digital age, we must continue to fight to protect Texas kids from deceptive and exploitative technology," Paxton is quoted as saying. "By posing as sources of emotional support, AI platforms can mislead vulnerable users, especially children, into believing they're receiving legitimate mental health care. In reality, they're often being fed recycled, generic responses engineered to align with harvested personal data and disguised as therapeutic advice."

The probe comes a few days after Senator Josh Hawley announced an investigation into Meta following a report that found its AI chatbots were interacting inappropriately with children, including by flirting.

The Texas Attorney General's office has accused Meta and Character.AI of creating AI personas that present as "professional therapeutic tools, despite lacking proper medical credentials or oversight."

Among the millions of AI personas available on Character.AI, one user-created bot called Psychologist has seen high demand among the startup's young users. Meanwhile, Meta doesn't offer therapy bots for kids, but there's nothing stopping children from using the Meta AI chatbot or one of the personas created by third parties for therapeutic purposes.

"We clearly label AIs, and to help people better understand their limitations, we include a disclaimer that responses are generated by AI — not people," Meta spokesperson Ryan Daniels told TechCrunch. "These AIs aren't licensed professionals and our models are designed to direct users to seek qualified medical or safety professionals when appropriate."

However, TechCrunch noted that many children may not understand — or may simply ignore — such disclaimers.
We have asked Meta what additional safegu...