EATMINT database

The EATMINT database contains multi-modal, multi-user recordings of affect and social behaviors in a collaborative setting. The following signals were recorded for 30 dyads (i.e., 60 participants); a short analysis sketch follows the list:

  • Physiological signals: electrocardiogram, electrodermal activity (GSR), blood volume pulse, respiration and skin temperature;
  • Behaviors: eye movements (eye-tracking), facial expressions, and software action logs;
  • Discourse: speech signals and transcripts.
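
Such dyadic recordings are often analyzed in terms of coupling between the two members of a dyad, as in the reference below. Purely as an illustrative sketch (not the database's actual file format or the exact method of the cited paper), the Python snippet below computes a simple windowed Pearson-correlation coupling index between two synchronized electrodermal activity traces; the sampling rate, window length, and synthetic signals are placeholder assumptions.

    import numpy as np

    def coupling_index(sig_a, sig_b, fs=32.0, win_s=10.0, step_s=5.0):
        """Windowed Pearson correlation between two synchronized 1-D signals.

        sig_a, sig_b : signals of the two dyad members (same length)
        fs           : sampling rate in Hz (placeholder value)
        win_s, step_s: window length and hop in seconds (placeholder values)
        Returns one correlation value per window.
        """
        win, step = int(win_s * fs), int(step_s * fs)
        corrs = []
        for start in range(0, len(sig_a) - win + 1, step):
            a = sig_a[start:start + win]
            b = sig_b[start:start + win]
            # Guard against flat windows, which would make the correlation undefined
            if a.std() == 0 or b.std() == 0:
                corrs.append(0.0)
            else:
                corrs.append(float(np.corrcoef(a, b)[0, 1]))
        return np.array(corrs)

    # Synthetic stand-ins for two participants' electrodermal activity traces
    rng = np.random.default_rng(0)
    t = np.arange(0, 60, 1 / 32.0)  # 60 s at an assumed 32 Hz
    eda_a = np.sin(0.1 * t) + 0.1 * rng.standard_normal(t.size)
    eda_b = np.sin(0.1 * t + 0.3) + 0.1 * rng.standard_normal(t.size)
    print(coupling_index(eda_a, eda_b))

With real data, the two synthetic traces would be replaced by the synchronized signals of the two members of a dyad, and the resulting per-window correlations could be related to the annotations described below.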

Each interaction is annotated in terms of affect, collaboration, and social aspects (e.g., conflict, grounding).

The dataset is publicly available, and we encourage researchers to use it for affective state estimation and social signal processing. To access the data, simply request an account using the link at the top.

If you publish results obtained from this dataset, please cite:

Chanel, G., Bétrancourt, M., Pun, T., Cereghetti, D., & Molinari, G. (2013). Assessment of computer-supported collaborative processes using interpersonal physiological and eye-movement coupling. In Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII 2013) (pp. 116–122). Geneva, Switzerland: IEEE. doi:10.1109/ACII.2013.26