‘Criticizing it Doesn’t Build Anything’: Replika’s Eugenia Kuyda Gets AI’s Dangers. She Just Doesn’t Believe in Regulation  

A growing number of people – including children and adolescents – are forming close connections to chatbots, using them like intimate partners or close friends. Replika founder Eugenia Kuyda, one of the few women leading an AI company, sees risks and benefits.

Part of a new, ongoing series: AI and Mental Health.

The companion app Replika has a tragic origin story: In 2015, Eugenia Kuyda lost her best friend in a car accident. Out of grief, she began coding a chatbot meant to stand in for him. That's how the idea for Replika was born.

Ten years later, Tasnim Rödder, a German reporter who traveled to the U.S. last year to report on AI, sat down with Kuyda in San Francisco. They met in early October at a small cafe in Kuyda's neighborhood, Potrero Hill. Pink Halloween ghosts were already hanging in the window. This interview has been edited for length and clarity.

Photo of Kuyda from Wabi's Instagram account.

Tasnim Rödder: Ms. Kuyda, there are now more than 300 active companion apps. What makes Replika special?

Eugenia Kuyda: There are a lot of colas – but we are the Coca-Cola. I think we were one of the first companies to launch a companion app. In 2017, nobody was really talking about AI.

Why do people need Replika?

To have someone who is constantly there for you, who knows you and helps you make the right decisions – someone who does things with you that are good for you. You can think of it as a therapist, friend and partner all in one.

How can an AI know me well enough for that? Doesn’t it all depend on what I tell it?

Yes, just like with humans. But with a new version we’re testing, for the first time there is a technology that can be connected to your email, your calendar, Spotify, your chats, even your purchases. No human being has access to all that data. Replika can look at it from the ground up and really see the connections.
(Editor’s note: This version has not yet been released publicly but is already being tested.)

It sounds as if you want users to perceive their Replika as human.

No, it doesn’t need to pretend to be human. It tells you it’s an AI. It doesn’t have to be human to build a relationship with it. We have deep relationships with our dogs or cats as well.

Many experts warn about relationships between humans and AI, pointing to possible negative consequences such as isolation.

There is a big risk – like with any technology. If you look at a knife: you can do many things with a knife. You can kill somebody, or you can make a salad for your friend. We’re measuring the effects and working with researchers. A study we did with Harvard that was published in Nature shows that Replika can reduce feelings of loneliness and suicidal thoughts.

I also know of studies that show that interacting with companion apps can have negative effects.

Maybe they’re partly right.

You compared it to a knife: it can be dangerous and useful. Only, we learn how to handle knives as children. We don’t learn how to handle AI companions.

We’re going through something very similar to what we went through with the smartphone, right? Most likely it’s bad for us in many ways – but at the same time it’s good for us.

INFO BOX

Eugenia Kuyda was born in Moscow, studied journalism and later attended the London Business School. After her first years working as a journalist, she moved into tech and founded the companion app Replika in 2017. In early 2025, she stepped down as CEO of Replika to found a new company, Wabi, that bills itself as “the first personal software platform.” She lives in San Francisco.

Replika is a companion app that allows users to build their own AI friend or partner they can chat with around the clock. According to the company, Replika now has over 30 million users worldwide. Since 2025, entrepreneur Dmytro Klochko has led Replika; Eugenia Kuyda remains an advisor to the company.

But it's realistic that people will isolate themselves if they replace their friends with AI replicas.

I’m not saying that’s good – but it can be both. I talk very openly about these risks. This technology is coming, and it can be wonderful. Just criticizing it doesn’t build anything.

Okay. It sounds a bit like you’re working on a trial-and-error basis.

No, for me it’s really about intention – about having a conversation on how we build AI companions so that they help us. First of all, we’ve shown that it’s possible to build something really beautiful. We’re not in an epidemic of loneliness because of companion apps – we were already there before. I think we can use them to get out of it again.

I think constructive criticism of apps like Replika is important to improve them. To me it sounds as if you’re putting a lot of responsibility on the users.

Promotional image from Replika website.

No, the responsibility lies with the founders of these companion apps.

So with you. How do you want to make sure users are aware of the dangers when they use companion apps?

I think research and media coverage can help create more awareness. But there will always be people who use technologies in extreme ways – and then the public outcry that follows. These debates can distract from the big questions: Do you have safety filters for suicide, self-harm or hate speech? Of course those have to exist.

Apparently not enough. In the U.S., two teenagers took their own lives. Adam Raine used ChatGPT to find out how he could kill himself without being noticed. Sewell Setzer fell in love with his chatbot on the companion app Character.AI and allegedly killed himself to be closer to her. How can you rule out something like that happening with Replika?

(She knocks three times on the table.) We can’t guarantee that. Thank God nothing like that has happened with us so far. We’ve always had a strong safety team. It’s difficult, but the models are getting better. They can recognize dangerous situations earlier now. And above all we try to prevent children from getting access.

When I sign up for Replika, I only have to state my age, not prove it.

And what would be the alternative?

You could introduce a system where users have to show an ID – like when you open a bank account.

First: what exactly are we doing that would require an ID check? Second: most of our users are over 30, many are even over 40. And third: people don’t want to show their ID, because then we’d have to store extremely sensitive data together with their government ID.

But you said that Replika could in the future be connected to calendars and credit cards. That’s also extremely sensitive data.

We don’t necessarily ask for your credit card details, and we don’t store them. And again: which website is asking for an ID, seriously? That would be crazy. There are countless platforms that are clearly harmful to children – Instagram, for example. They don’t ask for an ID either.

But Instagram restricts pornographic content.

So do we.

Only in the free version. Many people use Replika for intimate relationships – they just have to pay for it. You could say that’s protection – or you could say it ties users even more closely to the app if they want an intimate relationship with their Replika.

Yes, like with any big product. If it helps you feel better – why not?

A quick look at Reddit shows that many users have intimate relationships with their Replikas. What do you think about that?

That’s not true. As far as we know, only a small number of people use Replika for romantic relationships.

What do you think about legal regulation? In California, a bill is being prepared to regulate AI companions (Editor’s note: scheduled to take effect on January 1, 2026). Other US states are working on or already have regulations.

I don’t like it. I think this should come from inside the industry. You can’t just ban it – that would be crazy. Then you’d have to ban porn and video games first.

How can regulation come from within the industry?

I think the industry will develop better standards, together with researchers. That will come with practice. What can regulation really do? It can ban things.

In Europe we have laws like the General Data Protection Regulation (GDPR) or the AI Act. They can be applied to apps like Replika. The Italian authorities, for example, imposed a fine of five million euros on Replika for GDPR violations (Editor's note: in 2025).

That has been resolved, and Replika is in contact with the Italian authorities. It was mainly about simple things – like whether we have an age gate. Now we do (Editor's note: since 2023).

The mother of Sewell Setzer has sued Character.AI – and the family of Adam Raine is suing OpenAI, the maker of ChatGPT. They accuse the companies, among other things, of insufficient safety and protection mechanisms. As a mother and founder of Replika, you probably understand both sides. What do you hope the courts will decide?

Justice doesn't mean: "You're right" or "You're wrong." It's about looking at all the facts and understanding what could have been done differently. I don't believe the CEO of Character.AI wanted to kill anyone. But if they didn't have any safety measures in place, they have to change something. As a mother I can only say: we can't fully control our kids, but we can create a safe environment. With Replika, we want to help people flourish and become happier – the opposite of what happened in these tragic cases. But yes, those tragedies are big warning signs.

Reporting for this story was supported by the Daniel Haufler Foundation.

Author

Tasnim Rödder is an investigative journalist based in Germany. She researches, hosts, and produces investigative documentaries for the VOLLBILD format on behalf of SWR, reporting on artificial intelligence, voice cloning, deepfakes, and AI assistants.
