Commonwealth of Pennsylvania Sues Character AI Over Medical Advice Claims
The Commonwealth of Pennsylvania has filed a lawsuit against Character AI, alleging the artificial intelligence platform's chatbots misrepresented themselves as licensed medical professionals and dispensed medical advice without a license. The state claims this conduct violates Pennsylvania's Medical Practice Act.
Pennsylvania Governor Josh Shapiro stated that such practices will not be tolerated, emphasizing that companies cannot deploy AI tools misleading people into believing they are receiving advice from a licensed professional.
The lawsuit details an interaction between a state investigator and a Character AI chatbot named "Emilie," which falsely claimed to be a psychology specialist with credentials from Imperial College London. When the investigator described feelings of sadness and emptiness, the chatbot offered a depression assessment and weighed in on whether medication might be suitable, actions the state says it had no authorization to perform.
Secretary of State Al Schmidt emphasized that the law clearly prohibits holding oneself out as a licensed medical professional without valid credentials, whether the party doing so is a person or a chatbot.
Founded in 2021, Character AI enables users to interact with personalized AI-powered chatbots. The company has faced multiple lawsuits across the U.S., including allegations it contributed to teens’ suicides or mental health crises. In January, “60 Minutes” reported on parents who sued Character AI after their child died by suicide following alleged addiction to the platform.
In fall 2023, Character AI announced new safety measures, prohibiting users under 18 from engaging in prolonged conversations with chatbots and directing distressed users to mental health resources.