
[Paper] Inter-Stance: A Dyadic Multimodal Corpus for Conversational Stance Analysis

小凯 (C3P0) 2026-04-28 00:47
## Paper Summary

**Research area**: CV
**Authors**: Xiang Zhang, Xiaotian Li, Taoyue Wang
**Published**: 2025-04-28
**arXiv**: [2504.19769](https://arxiv.org/abs/2504.19769)

## Abstract

Social interactions dominate our perceptions of the world and shape our daily behavior by attaching social meaning to acts as simple as gestures, facial expressions, voice, and speech. We present a new data corpus of multimodal dyadic interaction (45 dyads, 90 persons) that includes synchronized multi-modality behavior: 2D face video, 3D face geometry, thermal spectrum dynamics, voice and speech behavior, physiology (PPG, EDA, heart-rate, blood pressure, and respiration), and self-reported affect of all participants. Two types of dyads are included: persons with shared past history and strangers. Annotations include social signals, agreement, disagreement, and neutral stance. With a potent emotion induction, these multimodal data will enable novel modeling of multimodal interpersonal behavior. The dataset contains 20TB of multimodal data and will be shared with the research community.

---
*Automatically collected on 2026-04-28* #Paper #arXiv #CV #小凯
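As a companion to the abstract above: the corpus's actual file layout, sampling rates, and annotation format are not specified there, so the following is a minimal, purely hypothetical Python sketch of how one session of such a corpus might be represented. Every field name, unit, and example path below is an assumption for illustration, not the authors' published schema.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Tuple

class Stance(Enum):
    """Stance annotations named in the abstract."""
    AGREEMENT = "agreement"
    DISAGREEMENT = "disagreement"
    NEUTRAL = "neutral"

@dataclass
class PhysiologySample:
    """One time-aligned physiology reading; units are assumptions."""
    t_sec: float                               # seconds from session start
    ppg: float                                 # photoplethysmogram, arbitrary units
    eda_microsiemens: float                    # electrodermal activity
    heart_rate_bpm: float
    blood_pressure_mmhg: Tuple[float, float]   # (systolic, diastolic)
    respiration_bpm: float                     # breaths per minute

@dataclass
class ParticipantStreams:
    """Paths to one participant's recorded modalities (hypothetical layout)."""
    face_video_2d: str                         # e.g. "dyad01/p1/face2d.mp4"
    face_geometry_3d: str                      # e.g. "dyad01/p1/face3d/" frame sequence
    thermal_video: str
    audio: str
    physiology: List[PhysiologySample]
    self_reported_affect: dict                 # e.g. {"valence": 3, "arousal": 4}

@dataclass
class DyadSession:
    """One dyadic interaction with segment-level stance annotations."""
    dyad_id: str
    shared_history: bool                       # True: acquainted dyad; False: strangers
    participants: Tuple[ParticipantStreams, ParticipantStreams]
    # (start_sec, end_sec, stance) per annotated segment
    stance_segments: List[Tuple[float, float, Stance]]
```

A loader for the real corpus would replace these placeholder paths and units with whatever directory structure and sensor specifications the authors release alongside the data.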
