OpenAI launches parental safety controls for ChatGPT after a teen’s death

After Southern California teenager Adam Raine died by suicide this spring, his parents filed a lawsuit blaming OpenAI’s “deliberate design choices” for the death, citing numerous instances where ChatGPT provided him with specific advice on how to kill himself.
Now, OpenAI has launched parental controls that parents can link to their teenagers’ accounts. The update, rolled out worldwide yesterday, enables parents, and in some cases law enforcement, to be notified if a user between 13 and 18 talks to the chatbot about suicide or self-harm.
The update also changes the content experience for teens using ChatGPT. If a teen types a prompt related to self-harm or suicidal ideation, it will be forwarded to a team of human reviewers who decide whether to notify parents – though notifications are expected to arrive hours after a concerning conversation has been flagged.
If OpenAI’s teams determine a child is in danger and parents can’t be reached, the company may also contact law enforcement. Parents can also limit usage hours and turn off voice mode and image tools.
“Once parents and teens connect their accounts, the teen account will automatically get additional content protections including reduced graphic content, viral challenges, sexual, romantic or violent roleplay, and extreme beauty ideals, to help keep their experience age-appropriate,” OpenAI said in a blog post announcing the launch.
The blog post also said that OpenAI has been working with Common Sense Media, a nonprofit that reviews tech products and advocates for improved child safety, as well as the attorneys general of California and Delaware, and that it expects to refine and expand the new controls over time.
“These parental controls are a good starting point for parents in managing their teen’s ChatGPT use,” Robbie Torney, Common Sense Media’s senior director for AI programs, said in the blog post.