2025-05-27
Heavy ChatGPT users tend to be more lonely, suggests research
高频使用 ChatGPT 的用户往往更孤独
Heavy users of ChatGPT tend to be lonelier, more emotionally dependent on the AI tool and have fewer offline social relationships, new research suggests.
最新研究指出,频繁使用 ChatGPT 的用户不仅更容易感到孤独,还可能在情感上愈发依赖这一 AI 工具,同时其线下社交圈也相对更为狭窄。
Only a small number of users engage emotionally with ChatGPT, but those who do are among the heaviest users, according to a pair of studies from OpenAI and the MIT Media Lab.
根据 OpenAI 与麻省理工学院媒体实验室联合开展的两项研究,尽管只有少数用户会与 ChatGPT 建立起情感联系,但这类用户恰恰是使用频率最高的群体。
The researchers wrote that the users who engaged in the most emotionally expressive personal conversations with the chatbots tended to experience higher loneliness – though it isn't clear if this is caused by the chatbot or because lonely people are seeking emotional bonds.
研究人员在报告中指出,那些与聊天机器人进行深度情感交流的用户,往往孤独感更为强烈。不过,目前尚无法确定这是聊天机器人本身所导致的结果,还是孤独人群在主动寻求情感慰藉。
While the researchers have stressed that the studies are preliminary, they raise pressing questions about how AI chatbot tools – which, according to OpenAI, are used by more than 400 million people a week – are influencing people's offline lives.
尽管研究人员强调这些研究尚处于初步阶段,但它们无疑提出了一个亟待解答的问题:鉴于 OpenAI 透露,每周有超过4亿人使用人工智能聊天机器人工具,那么这些工具究竟是如何影响人们的线下生活的呢?
The researchers, who plan to submit both studies to peer-reviewed journals, found that participants who "bonded" with ChatGPT – typically in the top 10% for time spent with the tool – were more likely than others to be lonely, and to rely on it more.
研究人员计划将这两项研究提交至同行评审期刊。他们发现,与 ChatGPT 建立起“情感纽带”的参与者(即使用该工具时间最长的前10%用户)相较于其他用户,更容易陷入孤独,且对聊天机器人的依赖程度也更高。
The researchers established a complex picture in terms of the impact. Voice-based chatbots initially appeared to help mitigate loneliness compared with text-based chatbots, but this advantage started to slip the more someone used them.
在探讨这些工具对用户的影响时,研究人员勾勒出了一幅复杂的图景。相较于基于文本的聊天机器人,基于语音的聊天机器人在初期似乎更有助于缓解用户的孤独感。然而,随着使用频率的增加,这一优势逐渐减弱。
After using the chatbot for four weeks, female study participants were slightly less likely to socialise with people than their male counterparts. Participants who interacted with ChatGPT's voice mode set to a gender that was not their own reported significantly higher levels of loneliness and more emotional dependency on the chatbot at the end of the experiment.
在使用聊天机器人四周后,女性研究参与者与他人社交的可能性略低于男性参与者。实验结束时,那些以与自己性别不同的语音模式与 ChatGPT 互动的参与者,其孤独感显著更强,对聊天机器人的情感依赖也明显更深。
In the first study, the researchers analysed real-world data from close to 40m interactions with ChatGPT, and then asked the 4,076 users who had those interactions how they felt.
在第一项研究中,研究人员对近4000万次与 ChatGPT 的互动数据进行了深入分析,并随后询问了这4076名参与互动的用户他们的真实感受。
For the second study, the Media Lab recruited almost 1,000 people to take part in a four-week trial examining how participants interacted with ChatGPT for a minimum of five minutes each day. Participants then completed a questionnaire to measure their feelings of loneliness, levels of social engagement, and emotional dependence on the bot.
而在第二项研究中,媒体实验室招募了近1000名志愿者,进行了一项为期四周的试验。该试验旨在考察参与者每天至少与 ChatGPT 互动五分钟的情况。试验结束后,参与者需填写一份问卷,以评估他们的孤独感、社交参与度以及对聊天机器人的情感依赖程度。
The findings echo earlier research; for example, in 2023 MIT Media Lab researchers found that chatbots tended to mirror the emotional sentiment of a user's messages – happier messages led to happier responses.
这些发现与早期的研究成果不谋而合。例如,2023年麻省理工学院媒体实验室的研究就曾指出,聊天机器人往往会模仿用户消息中的情感倾向——即用户发送更快乐的消息时,聊天机器人也会给出更积极的回应。
Dr Andrew Rogoyski, a director at the Surrey Institute for People-Centred Artificial Intelligence, said that because people were hard-wired to think of a machine behaving in human-like ways as a human, AI chatbots could be "dangerous", and far more research was needed to understand their social and emotional impacts.
萨里以人为本人工智能研究所主任安德鲁·罗戈伊斯基博士对此表示忧虑。他认为,由于人类天生会将表现得像人类一样的机器视为同类,因此人工智能聊天机器人可能潜藏着巨大的风险。他强调,为了深入了解这些工具的社会和情感影响,还需要开展更多的研究。
"In my opinion, we are doing open-brain surgery on humans, poking around with our basic emotional wiring with no idea of the long-term consequences. We've seen some of the downsides of social media – this is potentially much more far-reaching," he said.
他警告称:“在我看来,我们目前的行为无异于对人类进行一场无知的‘开颅手术’,随意摆弄着人们的基本情感线路,却对可能产生的长期后果一无所知。我们已经目睹了社交媒体所带来的种种弊端,而人工智能聊天机器人的潜在影响可能更为深远且复杂。”
Dr Theodore Cosco, a researcher at the University of Oxford, said the research raised "valid concerns about heavy chatbot usage", though he noted it "opens the door to exciting and encouraging possibilities".
牛津大学研究员西奥多·科斯科博士则认为,这项研究虽然提出了对高频使用聊天机器人的合理担忧,但同时也为探索人工智能的积极应用提供了新的视角。
"The idea that AI systems can offer meaningful support — particularly for those who may otherwise feel isolated — is worth exploring. However, we must be thoughtful and intentional in how we integrate these tools into everyday life."
他强调:“人工智能系统能否为那些可能感到孤立的人提供有意义的支持,这一想法值得探索。然而,在将这些工具融入日常生活的过程中,我们必须保持谨慎和深思熟虑的态度。”
Dr Doris Dippold, who researches intercultural communication at the University of Surrey, said it would be important to establish what caused emotional dependence on chatbots. "Are they caused by the fact that chatting to a bot ties users to a laptop or a phone and therefore removes them from authentic social interaction? Or is it the social interaction, courtesy of ChatGPT or another digital companion, which makes people crave more?"
萨里大学专注于跨文化交际研究的多丽丝·迪波尔德博士则指出,确定人们为何会对聊天机器人产生情感依赖至关重要。“是因为与聊天机器人聊天让用户被束缚在电子设备前,从而远离了真实的社交互动吗?还是因为 ChatGPT 或其他数字伴侣所提供的社交体验让人们欲罢不能呢?”

