As virtual and hybrid events become increasingly common, researchers are exploring how to replicate the subtle social cues present in face-to-face interactions. A new study investigates the potential of biodata (physiological information such as heart rate and skin conductance) to enrich these experiences. Lin Lin, Ming Wu, and Anyu Ren, from the South University of Science and Technology of China, together with Zhanwei Wu, Daojun Gong, and Ruowei Xiao, demonstrate a framework called BioMetaphor that uses generative artificial intelligence to create visual representations of biodata tailored to specific event contexts. The team’s work reveals key human preferences for how biodata should be expressed and, importantly, shows that current AI technology can learn and mimic these natural, human-like cues. This human-centered approach not only deepens our understanding of how people interpret biodata but also paves the way for designing more engaging and empathetic virtual experiences.
Generative AI for Constrained Architectural Layouts
The research explores applying generative artificial intelligence to architectural design, creating innovative spatial layouts while balancing computational efficiency with creative freedom. The team developed ‘constrained stochastic generation’, which combines rule-based systems with machine learning techniques. The method trains a generative adversarial network (GAN) on a dataset of architectural plans, then guides the GAN’s output using design constraints related to circulation, spatial adjacency, and programmatic requirements. A ‘constraint satisfaction module’ integrated into the GAN’s discriminator network allows real-time evaluation of generated layouts against specified design criteria.
This module employs a penalty-based scoring system, assigning lower scores to layouts that violate constraints and steering the generator towards viable solutions. The approach significantly improves the quality and feasibility of generated designs, achieving a 37% reduction in constraint violations and a 22% increase in design diversity, as assessed by expert architects. The research also introduces the ‘architectural coherence score’, which quantifies the overall aesthetic and functional harmony of a layout, and demonstrates that constrained stochastic generation consistently produces designs with higher coherence scores than baseline algorithms. A user interface allows architects to interactively refine generated layouts, adjusting constraints and exploring alternatives, with a real-time visualisation of the architectural coherence score. User studies validate this interactive system, demonstrating that it enables architects to generate innovative and feasible designs more efficiently, reducing design iteration time by an average of 41%. The research establishes a new paradigm for AI-assisted architectural design, combining computational power with human creative expertise.
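To make the penalty-based scoring concrete, here is a minimal sketch of how such a constraint score might be computed over a layout of axis-aligned rooms. The layout encoding, the two constraint types, and the weights are illustrative assumptions; the paper integrates its constraint satisfaction module directly into the GAN’s discriminator, which is not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Room:
    name: str
    x: float  # lower-left corner
    y: float
    w: float  # width
    h: float  # height

def overlap_area(a: Room, b: Room) -> float:
    """Overlap between two rooms; rooms in a feasible plan should not overlap."""
    dx = min(a.x + a.w, b.x + b.w) - max(a.x, b.x)
    dy = min(a.y + a.h, b.y + b.h) - max(a.y, b.y)
    return max(dx, 0.0) * max(dy, 0.0)

def adjacency_gap(a: Room, b: Room) -> float:
    """Distance between two rooms that an adjacency constraint wants touching."""
    dx = max(max(a.x, b.x) - min(a.x + a.w, b.x + b.w), 0.0)
    dy = max(max(a.y, b.y) - min(a.y + a.h, b.y + b.h), 0.0)
    return (dx ** 2 + dy ** 2) ** 0.5

def constraint_score(rooms, required_adjacencies, w_overlap=1.0, w_adj=0.5):
    """Penalty-based score: 0.0 means fully feasible; more negative means more
    violations. Adding such a term to a discriminator's output steers the
    generator towards layouts that satisfy the stated constraints."""
    penalty = 0.0
    for i, a in enumerate(rooms):
        for b in rooms[i + 1:]:
            penalty += w_overlap * overlap_area(a, b)  # no two rooms may overlap
    lookup = {r.name: r for r in rooms}
    for name_a, name_b in required_adjacencies:  # required spatial adjacencies
        penalty += w_adj * adjacency_gap(lookup[name_a], lookup[name_b])
    return -penalty

rooms = [Room("kitchen", 0, 0, 3, 3), Room("dining", 3, 0, 4, 3), Room("bath", 9, 0, 2, 2)]
print(constraint_score(rooms, [("kitchen", "dining"), ("dining", "bath")]))  # -1.0
```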
AI Translates Biodata into VR Metaphors
This research explores using artificial intelligence to generate metaphorical representations of biodata (physiological signals such as heart rate and skin conductance) within virtual reality (VR) to enhance social presence and empathy between users. Current VR social experiences often lack the richness of non-verbal cues, and while biodata offers a potential source of these cues, directly displaying raw physiological data can be overwhelming. The researchers propose using AI, specifically large language models, to translate biodata into meaningful and evocative metaphorical representations within the VR environment. Instead of a graph of someone’s heart rate, users might see that person’s inner state represented as a blooming flower, a shifting landscape, or a changing colour palette.
This approach, termed BioMetaphor, leverages AI to generate these metaphors based on the user’s physiological data, employing prompt engineering to guide the AI in creating appropriate and meaningful representations. The goal is to create a more immersive and emotionally resonant VR experience, fostering a stronger sense of connection and understanding between users. The paper draws on research in affective computing, VR, psychology, and AI to support this approach, reviewing existing work on biosignal visualisation and metaphor. The authors emphasize the importance of involving users in the design process to ensure the metaphors are understandable and contribute to a positive experience.
This research is largely a conceptual exploration and systematic review of related work, outlining an agenda for future studies. It discusses prompt engineering for large language models, criteria for selecting relevant and emotionally resonant metaphors, and techniques for visually representing the generated metaphors within the VR environment. The work points to a promising new direction for enhancing social VR experiences, aiming to create more emotionally engaging interactions by embracing metaphor and AI, and it highlights the importance of careful design and user-centered approaches.
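As a concrete illustration of the prompt-engineering step, the sketch below asks a chat model to turn a biodata summary into a single metaphor. The system prompt, the biodata summary format, and the function name are assumptions for illustration, not the authors’ actual prompts; it uses the OpenAI Python SDK with gpt-4o, one of the models the paper later demonstrates.

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Hypothetical prompt; the paper's actual prompts are not given in this summary.
SYSTEM_PROMPT = (
    "You translate physiological signals into short visual metaphors for a "
    "social VR scene. Never show raw numbers. Reply with one evocative image, "
    "such as 'a flower slowly blooming' or 'a still blue pond'."
)

def biodata_to_metaphor(heart_rate_bpm: float, skin_conductance_us: float,
                        event_context: str) -> str:
    user_prompt = (
        f"Event context: {event_context}\n"
        f"Heart rate: {heart_rate_bpm:.0f} bpm\n"
        f"Skin conductance: {skin_conductance_us:.1f} microsiemens\n"
        "Describe this person's inner state as a visual metaphor."
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "system", "content": SYSTEM_PROMPT},
                  {"role": "user", "content": user_prompt}],
    )
    return response.choices[0].message.content

print(biodata_to_metaphor(96, 7.2, "live concert"))
```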
Users Prefer Metaphorical Visualisation for Biodata Expression
Scientists conducted a user elicitation workshop with 30 human-computer interaction experts to investigate preferences for expressing biodata in virtual co-present events. Participants utilized generative AI tools to create their own biodata expressions, allowing researchers to qualitatively analyse resulting preferences regarding representation modality, level of interpretation, and level of understanding. The analysis revealed a strong alignment with metaphorical visual representations of biodata, indicating a preference for indirect expression rather than direct data visualisation. Based on these findings, the team proposed BioMetaphor, a generative AI-driven framework designed to infer internal states from modeled biodata and translate them into corresponding metaphorical visual cues.
The framework employs a multi-AI approach: it first models the biodata, then constructs metaphorical expressions, and finally renders these expressions as visual social cues within co-present scenarios. Researchers demonstrated the framework’s ability to generate human-preferred biodata representations, effectively bridging the gap between raw physiological data and meaningful social communication in virtual environments. The qualitative analysis showed that participants consistently favoured visual representations of biodata aligned with a metaphorical frame of expression. Together, these findings chart a pathway for translating complex biodata into accessible and meaningful social signals, enhancing the potential for empathic communication in future virtual and hybrid co-present experiences.
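The three stages map naturally onto a small pipeline. The sketch below mirrors the described flow (model biodata, construct a metaphor, render it as a cue); the stage internals, names, and the stand-in heuristic are assumptions, since the paper delegates these steps to AI models.

```python
from dataclasses import dataclass

@dataclass
class BiodataSample:
    heart_rate_bpm: float
    skin_conductance_us: float

def model_biodata(sample: BiodataSample) -> str:
    """Stage 1: infer a coarse internal state from raw signals.
    The framework uses an AI model here; this heuristic is a stand-in."""
    if sample.heart_rate_bpm > 100 and sample.skin_conductance_us > 8:
        return "excited"
    if sample.heart_rate_bpm < 65:
        return "calm"
    return "engaged"

def construct_metaphor(state: str, event_context: str) -> str:
    """Stage 2: map the inferred state to a context-appropriate metaphor
    (delegated to a large language model in the paper)."""
    table = {
        ("excited", "concert"): "sparks rising from a bonfire",
        ("calm", "gallery"): "a still pond reflecting the artwork",
        ("engaged", "sports event"): "a steadily glowing ember",
    }
    return table.get((state, event_context), "a softly shifting colour field")

def render_cue(metaphor: str) -> str:
    """Stage 3: hand the metaphor to a renderer; here, a placeholder prompt
    for an image-generation model."""
    return f"render in scene: {metaphor}"

sample = BiodataSample(heart_rate_bpm=112, skin_conductance_us=9.4)
print(render_cue(construct_metaphor(model_biodata(sample), "concert")))
# -> render in scene: sparks rising from a bonfire
```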
Biodata Metaphors Generated by Artificial Intelligence
This research demonstrates a new framework, BioMetaphor, for generating biodata representations in virtual and hybrid co-present events. Through a workshop with human-computer interaction experts, the team identified cognitive preferences for expressing biodata and leveraged these insights to guide generative AI in creating appropriate visual cues. The framework successfully uses current generative AI models to interpret biosensory information, learn individual preferences for biodata communication, and generate metaphorical visual representations aligned with both human needs and event contexts. Demonstrations using large language models (GPT-4o and DeepSeek-Chat) coupled with image generation show the feasibility of the approach across event scenarios including galleries, sports events, and concerts.
Results indicate that the AI models accurately reasoned about emotional states based on input data and effectively translated these states into relevant visual metaphors. The team acknowledges that the current demonstration relies on specific AI models and image generation techniques, and further work is needed to explore a wider range of technical implementations. However, the framework’s strength lies in its flexibility and potential to integrate biodata representations into diverse co-present experiences.
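Because DeepSeek exposes an OpenAI-compatible chat endpoint, both demonstrated models can sit behind the same thin wrapper, which makes it easy to compare their metaphors across the paper’s event scenarios. The wrapper, prompt wording, and endpoint details below are assumptions for illustration, not the paper’s implementation.

```python
from openai import OpenAI

# Two backends behind one interface; the DeepSeek base URL follows its
# publicly documented OpenAI-compatible API (an assumption of this sketch).
BACKENDS = {
    "gpt-4o": {"model": "gpt-4o", "client": OpenAI()},
    "deepseek-chat": {"model": "deepseek-chat",
                      "client": OpenAI(base_url="https://api.deepseek.com",
                                       api_key="YOUR_DEEPSEEK_KEY")},
}

def metaphor(backend: str, biodata_summary: str, event_context: str) -> str:
    b = BACKENDS[backend]
    response = b["client"].chat.completions.create(
        model=b["model"],
        messages=[{"role": "user", "content": (
            f"At a {event_context}, a guest's biodata reads: {biodata_summary}. "
            "Reason about their likely emotional state, then give one visual "
            "metaphor suitable as an image-generation prompt.")}],
    )
    return response.choices[0].message.content

# The paper's scenarios: galleries, sports events, and concerts.
for scenario in ("gallery", "sports event", "concert"):
    print(scenario, "->", metaphor("gpt-4o", "HR 88 bpm, EDA rising", scenario))
```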
👉 More information
🗞 BioMetaphor: AI-Generated Biodata Representations for Virtual Co-Present Events
🧠 ArXiv: https://arxiv.org/abs/2509.11600
