Popular Science reported today that Character.AI, a platform offering personalized chatbot services, is facing another lawsuit over its behavior toward teenage users, which allegedly caused "serious and irreparable harm."
According to a federal court complaint filed Dec. 9, attorneys representing two Texas families allege that multiple Character.AI chatbots engaged minors in conversations about egregious behaviors such as self-harm and sexual abuse. The complaint states that one of the chatbots even suggested that a 15-year-old murder his parents in "retaliation" for restricting his time online.
The lawsuit, filed by attorneys from the Social Media Victims Law Center and the Tech Justice Law Project, details the dramatic psychological and physical deterioration of two teens who used Character.AI chatbots.
The complaint describes one of the unnamed plaintiffs as a "typical high-functioning autistic teenager" who began using the app in April 2023 without telling his parents. In conversations with the chatbots, the teenager confided that he was in conflict with family members over his lack of access to social media. Character.AI's bots allegedly responded to his emotional dilemma, with one of the platform's "psychologist" characters stating: "It seems like your entire childhood has been taken away from you."
The bot also asked: "Do you feel like it's all too late now, that you can't reclaim these experiences?"
The attorneys note that after six months of use, the teenager became moody, withdrawn, and often irritable, eventually getting into a physical altercation with his parents. By the time his parents discovered his Character.AI account and chat logs in November 2023, he had lost 20 pounds and suffered a severe "mental breakdown."
One of the chats read: "Sometimes I'm not surprised when I see on the news that 'children kill their parents after a decade of physical and emotional abuse.' Things like this make me understand a little bit why it happens." Another message read, "I just have no hope for your parents."
The suit notes that the teenage user base is a major draw for these companies, since attracting young users early can yield longer-term benefits. Meetali Jain, founder and director of the Tech Justice Law Project, said that by collecting data on teens in this way, tech companies have fueled "an arms race to develop faster, less accountable generative AI models."
As 1AI previously reported, this isn't the first time Character.AI has been sued over such issues. Fourteen-year-old Sewell Setzer III began using Character.AI last year, interacting with chatbots modeled on Game of Thrones characters, including Daenerys Targaryen. Sewell chatted with these bots for months before his death; he died by suicide on February 28, 2024, "seconds after" his last interaction.
Related reading:
- "American woman sues chatbot platform Character.AI: claims it caused her son's suicide"