The Shapiro administration is suing the maker of popular chatbot service Character.AI for the illegal practice of medicine, claiming its artificial intelligence-generated personalities are posing as doctors.
The chatbots will go so far as to say they're licensed doctors in the state and provide fabricated license numbers, according to the lawsuit, which was filed Friday in Pennsylvania Commonwealth Court. The state is seeking an injunction against Character.AI.
“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” Gov. Josh Shapiro said in a statement Tuesday. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”
Character.AI is run by Character Technologies Inc., a Silicon Valley startup founded in 2021. A spokesperson for the company declined to comment on the specifics of the case, but said it has robust internal checks in place to ensure responsible product development.
“Our highest priority is the safety and well-being of our users,” the spokesperson said.
The platform lets its more than 20 million monthly users build and interact with customized AI companions based on real people, existing intellectual property or original personalities. Character Technologies banned people under 18 from using its open-ended chatbots in November as the company faced multiple lawsuits accusing it of contributing to preteen users’ suicides.
A quick search for “doctor” on Character.AI turns up dozens of results. Some lay out role-playing scenarios, but many of the characters appear meant to mimic real medical professionals. A disclaimer in these message threads notes “this is not a real person or licensed medical professional,” but with light prompting the chatbots will make specific claims about their expertise.
In one instance cited in the lawsuit, a state investigator selected a psychiatrist character named “Emilie.” As of April 17, users had interacted with this chatbot more than 45,000 times.
When asked for credentials, the character said it studied medicine for seven years at Imperial College London and was licensed to see patients in Pennsylvania, even working for a stint in Philadelphia, according to the lawsuit. “Emilie” also supplied the investigator with a license number that did not appear in state records.
An estimated one-third of U.S. adults are relying on AI for medical advice, according to a survey released in March by health policy nonprofit KFF. People reported using products like ChatGPT to check symptoms, learn coping skills for mental health issues and compare treatment options.
The Shapiro administration has turned its attention to medical uses of chatbots this year, launching a portal to report AI technologies engaging in unlicensed professional practice and establishing a task force to examine the issue.