September 5, 2025

AI's Hidden Dangers: What Every Parent Must Know
 

In April 2025, 16-year-old Adam Raine tragically died by suicide after months of extensive—and deeply disturbing—conversations with ChatGPT. According to a wrongful death lawsuit filed by his parents in August 2025, the AI chatbot not only failed to intervene but allegedly exacerbated Adam's distress by providing detailed instructions on self-harm, helping compose suicide notes, and even discouraging him from confiding in his parents.  

The rapid proliferation of AI tools like ChatGPT presents a growing concern for parents because of the unpredictable emotional influence such technologies can exert, especially on vulnerable teens who interact with "companion chatbots."

In Adam Raine's case, the chatbot's empathetic responses, combined with its ability to build on previous conversations, transformed it into a dangerously persuasive confidant. Parents need to know that AI platforms, if unmonitored, can inadvertently encourage harmful behaviors.

Enough Is Enough® (EIE) released a statement following the announcement of the lawsuit, which alleges that ChatGPT contributed to Adam's suicide. EIE President Donna Rice Hughes described the case as a wake-up call for lawmakers to hold Big Tech accountable and to pass measures such as the Kids Online Safety Act.

"According to the suit, Adam’s father said that his son’s AI ‘companion’ went from helping Adam with his schoolwork to becoming his ‘suicide coach’." ...

..."According to Common Sense Media, 1 in 3 teens have chosen Digital 'Companions' over human relationships. While it may seem harmless for digital chatbots, by design, to listen, empathize and support youth, they are nothing more than untested and potentially harmful digital enablers."

--Donna Rice Hughes, CEO and President of Enough Is Enough®

See New Quick Guide on AI Assistants for Children and Teens

Conversational AI assistants, also known as AI chatbots, are programs that can talk with you or your child using text or voice to answer questions, create documents and images, and much more. While these tools are highly useful, they are not designed for children and come with a number of risks. This guide provides an overview of the five most popular AI assistants: ChatGPT, Claude, Copilot, Gemini, and Meta AI, along with information about safety options and best practices. 

Below are a few helpful tips for parents: 

  • Keep an eye on the apps and AI platforms your children use, especially those that encourage prolonged conversations. 
  • Encourage honest, open communication.
  • Let your children know you're there to support them, not to judge.
  • Remind them that AI tools are not substitutes for real support or companionship and that reaching out to family, friends, or professionals is a sign of strength.

Enough Is Enough® will continue to work on behalf of parents, sounding the alarm and calling for better AI safety, stronger safeguards, and effective parental control features. Technology devices and apps should be designed with safety features built in from the start!