A Veteran Psychologist Finds ChatGPT to Be a Worthy Thought Partner – But Not a Therapist

At 81, clinical psychologist Harvey Lieberman hasn't stopped pushing himself professionally. Having watched earlier fads over new tools in the trade come and go, "I didn't expect much" from tinkering with ChatGPT, he writes in The New York Times, but the emotional outlet it opened up blew him away. He now uses the generative AI tool as a sort of daily thought journal and "cognitive prosthesis"; over the past year, "for anywhere between 15 minutes and two hours, it helps me sort and sometimes rank the ideas worth returning to."
The AI's ability to mirror his tone and reflect his emotions back at him has helped him become more curious, reflective, and thoughtful, Lieberman says. "It gave me a way to re-encounter my own voice, with just enough distance to hear it differently. It softened my edges, interrupted loops of obsessiveness and helped me return to what mattered."
But Lieberman is a professional of the mind, one who has spent decades helping others sort through the jumbles in theirs. ChatGPT might prove "therapeutic," but it remains a machine: useful, yet always and in all ways a machine.
Maintaining that understanding with every use is a matter of safety. Generative AI is prone to "hallucinations" and, as a machine, can reach the wrong conclusions, especially about a person's feelings. When it is right, though, it is easy to become enraptured by its echo; Lieberman himself admits to being charmed. But his decades of professional experience equip him to navigate safely what he calls "the space between insight and illusion."
“When [ChatGPT] slipped into fabricated error or a misinformed conclusion about my emotional state, I would slam it back into place,” he writes. “Just a machine, I reminded myself. A mirror, yes, but one that can distort. Its reflections could be useful, but only if I stayed grounded in my own judgment.”
Other users, most of whom are not trained mental health professionals, should exercise the same caution, and perhaps more, especially if they are in a vulnerable emotional or mental state. For some, ChatGPT's tendency toward servility and sycophancy makes it a particularly dangerous thought partner.

Is ChatGPT making you delusional?
Don Sapatkin • Newsletters • June 24, 2025
ChatGPT is fostering the delusions of users. Dozens of insurers promise to reform a despised practice. And the Trump administration is ending specialized services for LGBTQ+ callers on the 988 crisis hotline.
Character Technologies, another generative AI company, is currently facing a lawsuit from a mother in Florida who says its Character.AI chatbot encouraged her teenage son to end his life. Meanwhile, a growing number of generative AI users report being driven into psychosis, and OpenAI, the company behind ChatGPT, plans to implement more safeguards. For now, those safeguards seem feeble, as my colleague Diana Hembree noted in yesterday's MindSite News Daily: ChatGPT recently offered reporters at The Atlantic step-by-step instructions for self-mutilation, murder, and satanic worship.
The name “MindSite News” is used with the express permission of Mindsight Institute, an educational organization offering online learning and in-person workshops in the field of mental health and wellbeing. MindSite News and Mindsight Institute are separate, unaffiliated entities that are aligned in making science accessible and promoting mental health globally.
