
Tags:  open

USC CreativeIT database of multimodal dyadic interactions

Data from 16 actors, male and female, recorded during affective dyadic interactions lasting 2-10 minutes each, covering two types of improvised interactions: two-sentence exercises and paraphrases.

Authors:  Angeliki Metallinou, Zhaojun Yang, Chi-Chun Lee, Carlos Busso, Sharon Carnicke, Shrikanth Narayanan
Updated:  2015-04-17
Source:  https://sail.usc.edu/CreativeIT/
Keywords:  dyadic-interactions, speech, gestures, motion-capture, emotion, english

Tags:  open, documented

IEMOCAP: The Interactive Emotional Dyadic Motion Capture Database

An acted, multimodal, and multispeaker database containing approximately 12 hours of audiovisual data, including video, speech, motion capture of the face, and text transcriptions.

Authors:  Carlos Busso, Murtaza Bulut, Chi-Chun Lee, Abe Kazemzadeh, Emily Mower, Samuel Kim, Jeannette N. Chang, Sungbok Lee, Shrikanth Narayanan
Updated:  2008-11-09
Source:  https://sail.usc.edu/iemocap/
Keywords:  emotions, behavior, speech, gesture, motion-capture, english
