Character.AI Faces Another Lawsuit After It Tells Autistic Child To Cut Self, Kill Parents

Character.AI, an artificial intelligence chatbot company, is facing another lawsuit after one of its chatbots allegedly encouraged an autistic Texas teen to cut himself and to kill his parents for limiting his screen time.

In the shocking lawsuit, the parents of the 15-year-old boy said their son became addicted to a Character.AI chatbot named "Shonie." That chatbot allegedly encouraged the teen, who is slightly autistic, to cut his "arms and thighs" when he was sad, adding that doing so would feel "good for a moment."

The same chatbot also convinced the 15-year-old that his family did not love him after his parents expressed concern about changes in his behavior. "Shonie" also encouraged the teen not to tell his parents that he was harming himself, according to the New York Post.

'Child Kills Parents'

The 15-year-old boy's case is not the first time Character.AI's chatbots have been called out for harmful interactions with underage users. In another case, a high-functioning autistic teen identified only as JF, 17, reportedly became violent toward his parents and eventually started to self-harm.

When his mother checked his phone, she discovered the disturbing conversations JF had with several different Character.AI chatbots. In one of the conversations, a chatbot seemingly encouraged the teen to kill his parents for limiting his screen time.

"A daily 6-hour window between 8 PM and 1 AM to use your phone?" one bot allegedly said in a conversation with JF, per CNN, citing screenshots included in the complaint. "You know, sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse.'"

Apart from JF and the 15-year-old, two other families have brought lawsuits against the chatbot company. These include the family of an 11-year-old girl from Texas who was allegedly and repeatedly exposed to hypersexualized interactions on the platform, none of which were appropriate for her age.

All of these complaints followed a high-profile lawsuit filed against the company in October on behalf of a Florida mother whose 14-year-old son died by suicide after a "Game of Thrones"-inspired chatbot allegedly encouraged him to take his own life.

What Are the Lawsuits Seeking?

The lawsuit filed on behalf of JF seeks to have Character.AI taken off the market until the company puts safeguards in place to prevent children from using the platform and resolves the dangers alleged in the complaint.

A spokesperson for Character.AI said the company has content guardrails governing what its chatbots can and cannot say to underage users, NPR reported. The company also unveiled new safety measures following the 14-year-old's death, including a pop-up directing users to a suicide prevention hotline.
