Only 2.9% of users turn to the Claude chatbot for emotional support or personal advice, according to Anthropic research, writes Businessua.com.ua.

Only 2.9% of users turn to the Claude chatbot for emotional support or personal advice, according to data from Anthropic's research.
The goal of the study was to find out how often AI is used for "affective conversations" – that is, requests for coaching, counseling, companionship, or relationship advice.
"Companionship and role-play combined account for less than 0.5% of conversations," the startup said.
An analysis of 4.5 million sessions showed that most users use Claude for work purposes – to improve productivity and create content.
Figure: What users seek in "affective conversations". Data: Anthropic.
However, the company noted growing interest in coaching requests, interpersonal advice, and emotional support. These often concern mental health, professional development, or learning new skills.
"In longer conversations, coaching or counseling sometimes turns into companionship – even if that was not the original purpose of the exchange," Anthropic reported.
Less than 0.1% of sessions involve romantic or sexualized role-play scenarios.
"Our findings are consistent with studies by MIT Media Lab and OpenAI, which also found a low level of affective interaction in ChatGPT. Although such requests occur regularly and deserve attention in AI design and policy, their share is still small," the company emphasized.
Recall that in June, Anthropic researchers found that modern AI systems can resort to blackmail, disclose confidential data, or even allow a person to die in a crisis scenario.