1 in 8 Youth Turn to AI for Mental Health. ‘It Just Tells Me What I Want to Hear’ 

Youth with the deepest emotional needs are using chatbots for therapy the most – but many are unhappy with the outcome.


Image of a chatbot emerging from a smartphone
Ole.CNX/Shutterstock

Part of a new, ongoing series: AI and Mental Health.

Young people are flocking to use artificial intelligence tools, and among the heaviest users are youth with the greatest emotional needs – those who have been bullied or cyberbullied, who feel put down by their parents or who have experienced violence in their home. That’s according to a new survey of 1,340 teens and young adults aged 13 to 24.

The heavy AI users – dubbed “emotionally entangled superusers” by Surgo Health, the AI-powered research company that organized the study – make up just 9% of all youth using AI. But it’s the group that is becoming most dependent on the technology and may be the most poorly served by it.

Cover of Surgo Health report on AI and youth mental health

Youth who report being bullied are 3.6 times more likely to be classified in the “emotionally entangled” group. Those coping with discrimination, limited family or community support, or financial strain are also more likely to rely on AI for emotional or social support. Black youth were three times more likely than white youth – 18% vs. 6% – to use AI daily or weekly for emotional support.

“The people who have the most barriers are turning to a system that wasn’t necessarily built to help them but is accessible,” said Hannah Kemp, Surgo’s chief solution officer. “Most young people have access to some generative AI tool on their phone, on their computer, at school. It’s much easier to access ChatGPT to talk about a mental health struggle than figuring out, ‘How do I get help? Does my health insurance cover this? I’d have to talk to my parents about finding a therapist and I don’t want to burden them.’”

By contrast, the report finds, “youth with strong family, school, and community support tend to use AI pragmatically and with fewer negative mental health tradeoffs.”


In general, almost half of the young people surveyed said they use generative AI platforms like ChatGPT and Claude on a daily or weekly basis, and a third use them less frequently. 

One in five young people don’t use AI tools at all. About a third – dubbed “thriving light touch pragmatists” – use the technology but maintain a “healthy, arms-length relationship.” Other small groups, each making up 10% or less, either use AI optimistically – to learn or to try to improve their station – or with skepticism or fear, worried about how AI will affect their futures and the world’s.

Desperately seeking bot support 

About 12% of all users say they were experiencing mental health struggles and were using AI to ease their pain and isolation or otherwise address their need for help. 

Kemp says that a key problem for young people who are seeking support from chatbots is that a large majority of them – 69% – are turning to all-purpose AI programs like ChatGPT or Claude that are not designed to be mental health applications.

“These general-purpose tools are not meant to be substitutes for human care,” Kemp said. “They’re not vetted, they’re not trained in that way and they’re telling people what they want to hear. That’s a real problem.”

Adele Wang, Surgo’s associate director of research and data science, said one user who took the survey told the Surgo team: “It didn’t challenge me. It just tells me what I want to hear. It tells me things that I’ve told it, and it just repeats it back to me. It’s not helping me to think through these emotions in an effective way.”

Even when people consulted an AI chatbot because they were experiencing mental health struggles and felt they needed support, the chatbot failed 41% of the time to suggest that they seek professional help from human clinicians, the survey found. 


“That’s a huge missed opportunity that tech companies and regulators really need to pay attention to,” Kemp said. “They’re being used for mental healthcare and yet there’s no kind of bridge to pushing people into the formal care system who need it.”

The survey also found that about half of young people using AI were displeased – “neutral to negative,” in Kemp’s words – with the mental health support they received from bots. That was especially true for people who were using AI in place of professional care rather than as a supplement to help they might get from a therapist or counselor.

A young advocate reflects on AI

Had AI chatbots been around early in the COVID-19 pandemic when Saanvi Arora was attending high school in San Jose, her friends and peers probably would have used them, she says. It was a deeply stressful time. She was 15 when she lost one of her closest friends to suicide, and the experience propelled her into mental health activism.

Would access to AI platforms have helped back then? Arora’s not so sure.

Saanvi Arora is a UC Berkeley student from San Jose who got active in mental health advocacy after losing a friend to suicide during the pandemic. Photo: DailyCal

Today, at 21, she’s a computer science and legal studies double major at UC Berkeley who still finds time to write model legislation and to lobby state and federal officials to boost funding for mental health. But even as she has immersed herself in that work, and won recognition for her efforts, she has seen the signs of slippage.

Cutbacks in mental health services and funding, especially over the past year, have had “dire and significant impacts on young people’s morale,” she says, and have made it far more difficult for people to access the care they need.

“This phenomenon of young people turning to chatbots is almost a natural result of that. If you don’t have the ability to seek health care traditionally, like in a clinical setting,” she says, “it’s reasonable to turn to chatbots.”

Her worry, however, is that a growing number of young people are turning to chatbots because the barriers to seeking professional care from human beings are too great. It’s simply too expensive, takes too much time, requires clearing too many hurdles.

“When you’re in a mental health crisis or are feeling like you want or need to talk to someone, it’s easy to turn to a chatbot because they’re immediate,” Arora says – much quicker and easier than scheduling an appointment with an “actual therapist.”

‘Instant, judgement-free support’

Many people also like the anonymity of talking to a chatbot and appreciate being able to engage with one “as a source of instant, judgement-free support,” the report noted.

Arora thinks they also like not having to worry that if they say the wrong thing, 911 might be called and the police might be summoned. But perhaps the biggest motivator, she adds, “is the lack of an alternative and just in the moment needing support and having that available.”

Surgo Health leaders hope the report will help ensure that AI complements, rather than replaces, human connection and offline clinical support and will push AI developers to take responsibility for how their platforms influence youth mental health, particularly during moments of distress.

“The most effective responses to youth mental health in the AI era won’t come from technology alone,” Sema Sgaier, CEO and co-founder of Surgo Health, said in a statement. “They will come from investing in the social and emotional environments that shape how young people use these tools, alongside responsible design and governance.”

About the Studies

Two studies were released. The larger one was based on online and telephone interviews, conducted in English and Spanish in October and November 2025, with 1,340 people ages 13 to 24. The second zeroed in on a subset – the 12% of young people who used AI primarily to access support for their mental health challenges.

The survey and studies were led by Surgo Health, a public benefit corporation that is building an AI-powered platform that “reveals the why behind people’s behaviors.”

An advisory board composed of young people helped Surgo Health create the questions and design the studies. Two nonprofit groups, Young Futures and the JED Foundation, collaborated in the research.



Author

Rob Waters, the founding editor of MindSite News, is an award-winning health and mental health journalist. He was a contributing writer to Health Affairs and has worked as a staff reporter or editor at Bloomberg News, Time Inc. Health and Psychotherapy Networker. His articles have appeared in the Washington Post, Kaiser Health News, STAT, TheAtlantic.com, Mother Jones and many other outlets. He was a 2005 recipient of the Rosalynn Carter Fellowship for Mental Health Journalism.
