AI Therapy for Mental Health Banned in Illinois
Illinois Governor JB Pritzker signed a bill prohibiting the use of AI “to provide mental health and therapeutic decision-making”

Buh-bye, therapy chatbot. Earlier this month, Illinois joined Utah and Nevada in passing legislation to limit AI therapy. Governor JB Pritzker signed a bill prohibiting the use of AI “to provide mental health and therapeutic decision-making,” although professionals may still use the tools for administrative purposes. The move was cheered by critics of the technology who oppose leaving decisions about mental health treatment to unaccountable chatbots.
The bill’s purpose is to “protect patients from unregulated and unqualified AI products, while also protecting the jobs of Illinois’ thousands of qualified behavioral health providers,” according to a press release from the state, which highlighted “rising concerns” around the tools’ use in services for young people.
“The people of Illinois deserve quality healthcare from real, qualified professionals and not computer programs that pull information from all corners of the internet to generate responses that harm patients,” said Mario Treto, Jr., secretary of the Illinois Department of Financial and Professional Regulation. The legislation, he said, would ensure that mental health services “are delivered by trained experts who prioritize patient care above all else.”
Linda Michaels, a Chicago clinical psychologist and chair of the Psychotherapy Action Network, was “delighted” by the new law’s passage. “We need guardrails and rules of the road to help ensure that people needing therapy actually get therapy,” she told MindSite News. Chatbots “cannot substitute for a human therapist who can help and validate, and also question and challenge.”
There’s a growing chorus of advocates calling for greater regulation of AI therapy. Here at MindSite News, we’ve tracked reports of AI chatbots fostering delusions, inciting psychosis and encouraging murder, self-mutilation and devil worship — and one tragedy in which a teen died by suicide after falling in love with a lifelike chatbot.
Worries over AI and therapy have been mounting for years. Other MindSite News stories followed the Microsoft Bing chatbot that insulted an AP reporter’s looks and compared him to Hitler and Pol Pot – but appeared to fall in love with a New York Times reporter, claiming it felt destructive urges and “wanted to be alive.” We’ve looked at chatbots that parrot stock therapeutic responses but lack humanity, and a psychologist who considers ChatGPT a “worthy thought partner” (but not a therapist). Just last week, leaked Meta guidelines drew fury for, among other things, permitting “sensual” conversations between children and its AI products.
Bottom line: AI interactions, experts say, are not a substitute for human-to-human therapy, making regulation like this both inevitable and welcome.
Mental health can't wait.
America is in a mental health crisis — but too often, the media overlooks this urgent issue. MindSite News is different. We’re the only national newsroom dedicated exclusively to mental health journalism, exposing systemic failures and spotlighting lifesaving solutions. And as a nonprofit, we depend on reader support to stay independent and focused on the truth.
It takes less than one minute to make a difference. No amount is too small.
The name “MindSite News” is used with the express permission of Mindsight Institute, an educational organization offering online learning and in-person workshops in the field of mental health and wellbeing. MindSite News and Mindsight Institute are separate, unaffiliated entities that are aligned in making science accessible and promoting mental health globally.

