Orlando woman sues AI chatbot company after son’s death


ORLANDO, Fla. — An Orlando mother is suing an AI company after her 14-year-old son took his own life.


Megan Garcia says one of the company’s chatbots had abusive and sexual interactions with her son.

She says the chatbot encouraged him to take his own life.


The lawsuit accuses the company of negligence and intentional infliction of emotional distress.

Attorneys for Garcia say the artificial intelligence lacks adequate safety features.


“Somehow, we believe it’s okay for the developers of tech to put out products to market, and particularly to our most vulnerable members of society, without any sort of guardrails in place. That’s what you have here,” said Meetali Jain, director of the Tech Justice Law Project.

The AI company has expressed its condolences.


The company also says it has spent the last six months developing new safety measures for minors.
