IEMOCAP: The Interactive Emotional Dyadic Motion Capture Database

Authors: Carlos Busso, Murtaza Bulut, Chi-Chun Lee, Abe Kazemzadeh, Emily Mower, Samuel Kim, Jeannette N. Chang, Sungbok Lee, Shrikanth Narayanan
Updated: Sun 09 November 2008
Source: https://sail.usc.edu/iemocap/
Type: multimodal-database
Languages: english
Keywords: emotions, behavior, speech, gesture, motion-capture, english
Open Access: yes
License: https://sail.usc.edu/iemocap/Data_Release_Form_IEMOCAP.pdf
Documentation: https://sail.usc.edu/iemocap/iemocap_info.htm
Publications: Busso et al. (2008)
Citation: Busso, C., Bulut, M., Lee, C. C., Kazemzadeh, A., Mower, E., Kim, S., Chang, J. N., Lee, S., & Narayanan, S. S. (2008). IEMOCAP: Interactive emotional dyadic motion capture database. Language Resources and Evaluation, 42(4), 335-359.
Summary:

The IEMOCAP database consists of dyadic sessions in which actors perform improvisations or scripted scenarios specifically selected to elicit emotional expression. The database is annotated by multiple annotators with categorical labels, such as anger, happiness, sadness, and neutrality, as well as dimensional labels, namely valence, activation, and dominance. The detailed motion-capture information, the interactive setting used to elicit authentic emotions, and the size of the database make this corpus a valuable addition to the existing resources for studying and modeling multimodal, expressive human communication.
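
To make the dual label schema concrete, here is a minimal Python sketch of how one annotated utterance could be represented, combining a categorical emotion label with the three dimensional ratings. The record structure, field names, and example values are illustrative assumptions, not the corpus's actual release format; consult the documentation link above for the real file layout.

```python
from dataclasses import dataclass

# Hypothetical record for one annotated IEMOCAP utterance.
# Field names are illustrative, not the official release schema.
@dataclass
class UtteranceAnnotation:
    utterance_id: str   # session/turn identifier (format assumed)
    category: str       # categorical label, e.g. "anger", "happiness", "sadness", "neutral"
    valence: float      # dimensional ratings; the release reportedly uses a 1-5 scale
    activation: float
    dominance: float

# Example usage with made-up values:
ann = UtteranceAnnotation(
    utterance_id="Ses01F_impro01_F000",
    category="neutral",
    valence=2.5,
    activation=2.5,
    dominance=2.5,
)
print(ann.category, (ann.valence, ann.activation, ann.dominance))
```

Keeping the categorical and dimensional annotations in one record mirrors how the corpus is typically used: studies select either the category field for classification tasks or the valence/activation/dominance fields for regression, without reshaping the data.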