
Tags: open

Pavlovia

Pavlovia is a place for the wide community of researchers in the behavioural sciences to run, share, and explore experiments online.

Authors:  Jonathan W. Peirce
Updated:  2021-01-24
Source:  https://pavlovia.org/
Keywords:  behavior, experiment, repository, psychology

Tags: open, documented

BITTSy: Behavioral Infant & Toddler Testing System

BITTSy is capable of running key infant behavioral testing paradigms, including the Headturn Preference Procedure (HPP), Preferential Looking, and Visual Fixation/Habituation, through the same interface.

Authors:  Rochelle Newman, Emily Shroads, Elizabeth Johnson, Ruth Tincoff, Kris Onishi, Giovanna Morini
Updated:  2020-09-04
Source:  http://langdev.umd.edu/bittsy/
Keywords:  child-development, software, experiment, behavior, English

Tags: open, documented

jsPsych

jsPsych is a JavaScript framework for creating behavioral experiments that run in a web browser.

Authors:  Joshua de Leeuw
Updated:  2015-12-24
Source:  https://www.jspsych.org/7.1/
Keywords:  experiment, psychology, behavior, data, programming
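
As an illustration of what a jsPsych experiment looks like, here is a minimal sketch of a single-trial experiment against the jsPsych 7.x API documented at the source above. The plugin choice and the assumption that the library is already loaded are part of the example, not of this entry.

    // Minimal jsPsych 7.x experiment: one trial that shows text and waits for a keypress.
    // Assumes the jspsych core and the html-keyboard-response plugin are loaded first
    // (e.g. via <script> tags from a CDN), which define initJsPsych and
    // jsPsychHtmlKeyboardResponse as globals.
    const jsPsych = initJsPsych({
      on_finish: () => jsPsych.data.displayData(), // dump collected data when the timeline ends
    });

    const greeting = {
      type: jsPsychHtmlKeyboardResponse,
      stimulus: '<p>Press any key to begin.</p>',
    };

    jsPsych.run([greeting]); // run the timeline (here, a single trial)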

Tags: open

Alberta Language Environment Questionnaire

The ALEQ consists of questions about family demographics, language use among family members in the home, and other aspects of an ESL child's language environment.

Authors:  Johanne Paradis
Updated:  2011-12-24
Source:  https://sites.google.com/ualberta.ca/chesl/questionnaires
Keywords:  language-development, environment, language-exposure, behavior, speech-pathology, multilingual

Tags: open

Alberta Language Development Questionnaire

The ALDeQ consists of questions for parents concerning the early and current development of an ESL child's first language. The purpose is to understand whether there may be evidence of delay or difficulties in the first language.

Authors:  Johanne Paradis, Kristyn Emmerzael, Tamara Sorenson Duncan
Updated:  2010-01-25
Source:  https://sites.google.com/ualberta.ca/chesl/questionnaires
Keywords:  language-development, behavior, language-delay, multilingual, speech-pathology

Tags: open, documented

IEMOCAP: The Interactive Emotional Dyadic Motion Capture Database

An acted, multimodal, multi-speaker database containing approximately 12 hours of audiovisual data, including video, speech, facial motion capture, and text transcriptions.

Authors:  Carlos Busso, Murtaza Bulut, Chi-Chun Lee, Abe Kazemzadeh, Emily Mower, Samuel Kim, Jeannette N. Chang, Sungbok Lee, Shrikanth Narayanan
Updated:  2008-11-09
Source:  https://sail.usc.edu/iemocap/
Keywords:  emotions, behavior, speech, gesture, motion-capture, English
