March 24, 2025 - New collaborative research from OpenAI and the Massachusetts Institute of Technology (MIT) suggests that heavy use of chatbots such as ChatGPT may be associated with increased loneliness and reduced time spent socializing. The study found that participants who spent more time each day communicating with ChatGPT by text or voice tended to show higher levels of emotional dependence on the chatbot, more frequent problematic use, and stronger feelings of loneliness.
1AI notes that the results of the study have not yet been peer-reviewed, but they have raised concerns about the potential emotional impact of chatbots. Since ChatGPT's launch in late 2022, generative AI has rapidly gained popularity, and people's use of chatbots has expanded from programming help to psychotherapy-like conversations. As developers introduce ever more sophisticated models and voice features that more closely resemble human communication, the potential for users to form "parasocial relationships" with chatbots is growing.
In recent years, concerns have grown about the emotional toll chatbots can take on users, especially younger users and those with mental health issues. Last year, Character Technologies Inc. was sued after one of its chatbots allegedly encouraged suicidal thoughts in conversations with minors, including a 14-year-old who took his own life.
The study was designed to provide insight into how people interact with ChatGPT and how it affects them, said Agarwal, who leads OpenAI's Trustworthy AI team and co-authored the study: "One of our goals is to help people understand what their usage patterns mean and to drive responsible design through these studies."
The research team followed nearly 1,000 participants with varying levels of ChatGPT experience for one month. Participants were randomly assigned to use either the text-only version or one of two voice options for at least five minutes a day. Some were asked to chat freely without topic restrictions, while others were assigned personal or non-personal conversation topics.
The results showed that users who tend to form stronger emotional attachments in their relationships, and who place more trust in chatbots, were more likely to feel lonely and to become emotionally dependent on ChatGPT. The researchers also found that the voice features did not lead to worse outcomes. In a second study, the researchers used automated software to analyze 3 million user conversations with ChatGPT and surveyed users about how they interacted with the chatbot. The results showed that very few people actually used ChatGPT for emotional conversations.
This field of research is still in its infancy, and it is not yet clear whether chatbots cause people to feel lonelier, or whether people who are already prone to loneliness and emotional dependence are simply more drawn to chatbots. Study co-author Cathy Mengying Fang, a graduate student at MIT, noted that the study neither controlled for length of chatbot use as the main variable nor included a control group that did not use chatbots, so it cannot be concluded that chatbot use necessarily leads to negative outcomes.
The researchers hope this work will spur further study of human-AI interaction. Co-author Pat Pataranutaporn, a postdoctoral researcher at MIT, said, "Studying AI is interesting in itself, but what's really critical is understanding its impact on humans when it's deployed at scale."