
Status:  open, documented

IEMOCAP: The Interactive Emotional Dyadic Motion Capture Database

An acted, multimodal, and multi-speaker database containing approximately 12 hours of audiovisual data, including video, speech, facial motion capture, and text transcriptions.

Authors:  Carlos Busso, Murtaza Bulut, Chi-Chun Lee, Abe Kazemzadeh, Emily Mower, Samuel Kim, Jeannette N. Chang, Sungbok Lee, Shrikanth Narayanan
Updated:  2008-11-09
Source:  https://sail.usc.edu/iemocap/
Keywords:  emotions, behavior, speech, gesture, motion-capture, english
