Researchers present SocialSim, a framework that simulates emotional support conversations by modelling social disclosure and social awareness. The framework is used to generate SSConv, a large synthetic dataset whose quality exceeds that of crowdsourced data. A chatbot trained on SSConv achieves leading performance in both automated and human evaluations, offering a scalable route to accessible emotional care.
The increasing prevalence of mental health challenges necessitates innovative approaches to providing accessible emotional support. Researchers are now exploring the potential of artificial intelligence to simulate empathetic conversations, but creating realistic and effective dialogue remains a significant hurdle, largely because acquiring large, high-quality datasets of human interactions is expensive. A team led by Zhuang Chen of Central South University, alongside colleagues from Northwest Minzu University, Tsinghua University, The Chinese University of Hong Kong, and Lingxin AI, addresses this challenge with a novel framework detailed in their paper, “SocialSim: Towards Socialized Simulation of Emotional Support Conversation”. The researchers present SocialSim, a system designed to generate synthetic emotional support conversations by modelling key elements of social interaction, specifically social disclosure from the person seeking help and social awareness within the supportive response. This work culminates in SSConv, a large-scale synthetic corpus, and a chatbot trained on this data that performs competitively against systems trained on human-generated data.
Researchers have developed SocialSim, a novel framework for generating synthetic emotional support conversations (ESCs) that addresses limitations of current dialogue augmentation techniques. Traditional methods of building ESC corpora rely heavily on crowdsourcing, a process that is both expensive and difficult to scale. SocialSim proposes an alternative centred on simulating two core aspects of social interaction: social disclosure, the seeker's gradual revealing of personal information, and social awareness, the supporter's ability to understand the other person's emotional state.
The framework constructs a ‘persona bank’, a repository of diverse help-seeking scenarios, allowing the ‘seeker’, the individual initiating the conversation, to exhibit more authentic and varied emotional states. This contrasts with existing systems, which are often limited to repetitive or generic inputs. On the ‘supporter’ side, SocialSim enhances social awareness by integrating cognitive reasoning into response generation, allowing the system to formulate logical and supportive replies tailored to the seeker’s expressed emotions rather than simple affirmations. The researchers validated the framework by creating SSConv, a large-scale synthetic ESC corpus, and showed that its quality surpasses that of data obtained through traditional crowdsourcing, a level of realism that crowdsourced collection has struggled to deliver at scale.
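The paper's implementation details are not reproduced in this article, but the described architecture, a persona bank driving a seeker who discloses personal information across turns, paired with a supporter that performs an explicit inference step before replying, can be illustrated with a minimal, hypothetical sketch. Everything below (the PersonaBank class, seeker_turn, supporter_turn, and the template-based “reasoning” step) is an assumption for illustration only, not the authors' code; in the actual framework these roles are presumably filled by language models.

```python
# Hypothetical sketch of a two-sided ESC simulation loop in the spirit of SocialSim.
# All class and function names here are illustrative assumptions.
import random
from dataclasses import dataclass, field


@dataclass
class Persona:
    situation: str          # the help-seeking scenario
    emotion: str            # the seeker's dominant emotional state
    disclosures: list[str]  # personal details revealed gradually (social disclosure)


@dataclass
class PersonaBank:
    personas: list[Persona] = field(default_factory=list)

    def sample(self) -> Persona:
        # Draw a diverse help-seeking scenario from the bank.
        return random.choice(self.personas)


def seeker_turn(persona: Persona, turn: int) -> str:
    # Social disclosure: reveal one more personal detail each turn.
    detail = persona.disclosures[min(turn, len(persona.disclosures) - 1)]
    return f"I'm feeling {persona.emotion}. {detail}"


def supporter_turn(seeker_utterance: str) -> str:
    # Social awareness: a cognitive-reasoning step infers the seeker's state
    # before the reply is produced (a trivial rule stands in for an LLM reasoner).
    inferred_state = "distressed" if "feeling" in seeker_utterance else "neutral"
    return (f"It sounds like you're {inferred_state}. That must be hard. "
            "Can you tell me more about what happened?")


def simulate_conversation(bank: PersonaBank, num_turns: int = 3) -> list[tuple[str, str]]:
    # Alternate seeker and supporter turns to produce one synthetic dialogue.
    persona = bank.sample()
    dialogue = []
    for turn in range(num_turns):
        seeker = seeker_turn(persona, turn)
        supporter = supporter_turn(seeker)
        dialogue.append((seeker, supporter))
    return dialogue


if __name__ == "__main__":
    bank = PersonaBank([Persona(
        situation="recent breakup",
        emotion="lonely",
        disclosures=["My partner and I split up last week.",
                     "We were together for five years.",
                     "I haven't told my family yet."],
    )])
    for seeker, supporter in simulate_conversation(bank):
        print("Seeker:   ", seeker)
        print("Supporter:", supporter)
```

The structural point the sketch tries to capture is the separation of the two social mechanisms: disclosure lives on the seeker side and is paced across turns, while awareness lives on the supporter side as an explicit inference step taken before each response is generated.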
Evaluation of a chatbot trained on SSConv shows state-of-the-art performance on both automated metrics and human evaluations. The chatbot consistently demonstrates a strong ability to understand and validate the seeker’s feelings, fostering a safe and supportive environment. Analysis of one example conversation shows the supporter repeatedly reflecting and validating the seeker’s emotions, particularly after the seeker discloses a relationship breakdown, highlighting the framework’s potential for real-world application in mental health support.
However, evaluation also reveals a tendency towards purely reflective support, lacking proactive problem-solving or exploration of coping mechanisms. While the conversation excels in safety, understanding, and creating a supportive atmosphere, it receives a moderate score for informativeness and specificity, partly due to the initial lack of detailed description from the seeker. The supporter’s responses, while empathetic, occasionally exhibit a robotic quality, suggesting a need for greater conversational variability to fully emulate human interaction and provide truly comprehensive support.
Future work will focus on addressing these limitations by incorporating mechanisms for proactive guidance and suggestions for coping strategies. The team also plans to expand the persona bank to include a wider range of emotional states and life experiences, further enhancing the realism and effectiveness of the simulated conversations. Ultimately, this research aims to create a scalable and accessible resource for individuals seeking emotional support, leveraging artificial intelligence to promote mental well-being.
👉 More information
🗞 SocialSim: Towards Socialized Simulation of Emotional Support Conversation
🧠 DOI: https://doi.org/10.48550/arXiv.2506.16756
