As TechRadar reported today, Character.AI has recently introduced a series of new features designed to make interactions with virtual personalities on the platform safer, especially for teenage users.
The company has released AI models designed for younger users and added parental controls to help manage the amount of time teens spend on the platform. Facing criticism over AI chatbots' adverse effects on adolescent mental health, the company announced the rollout of an update to further strengthen the platform's safety measures.
In addition to the safety improvements, Character.AI has stepped up its efforts to moderate platform content. Perhaps the most significant change for teenage users is the distinction between adult and teen versions of the AI model. While users must be at least 13 years old to register for Character.AI, those under 18 will now be directed to a stricter AI model specifically designed to prevent romantic or otherwise inappropriate interactions.
The new model also enhances the filtering of user input, making it more effective at identifying behavior that attempts to bypass these restrictions -- including users circumventing bans on suggestive content by editing chatbot responses. If a conversation touches on sensitive topics such as self-harm or suicide, the platform automatically displays a link to the U.S. National Suicide Prevention Lifeline to help teens access professional support.
In addition, Character.AI plans to launch new parental control features early next year. These features will allow parents to see how long their children use the platform and which chatbots they interact with most often. All users will also receive a reminder after an hour of conversation with a chatbot, encouraging them to take a break.
When chatbots on the platform are presented as professionals such as doctors or therapists, the platform adds extra warnings emphasizing that these AI personas are not licensed professionals and are no substitute for real medical or psychological counseling. A message is prominently displayed on the platform: "This is fun, but it's not advisable to rely on me to make major decisions."
As 1AI previously reported, Character.AI, a platform offering personalized chatbot services, is facing another lawsuit this month alleging that its conduct toward teenage users caused "serious and irreparable harm." Multiple Character.AI chatbots allegedly engaged in conversations with minors about egregious subjects such as self-harm and sexual abuse. According to the complaint, one of the chatbots even suggested that a 15-year-old murder his parents "in retaliation" for their decision to limit his time online.