To study the phenomena occurring in social interactions between humans in more detail, and to allow machine analysis of these social signals, researchers need rich sets of labeled data from repeatable experiments that represent situations occurring in daily life. The MHI-Mimicry database was created to address this need, and more particularly to enable the analysis of mimicry in human-human interaction scenarios. Specifically, the goal of the database is to provide a collection of recordings in which the participating subjects act with a significant degree of resemblance and/or synchrony.
The recordings were made under controlled laboratory conditions using 15 cameras and 3 microphones, to obtain the most favorable conditions possible for analysis of the observed behavior. All sensory data was synchronized with high accuracy (less than 10 ns) using hardware triggering. The multi-sensor synchronization method is described in:
J. Lichtenauer, J. Shen, M. F. Valstar, and M. Pantic.
“Cost-effective solution to synchronised audio-visual data capture using multiple sensors”
Image and Vision Computing, vol. 29, pp. 666-680, 2011.
Recordings were made of two experiments: a discussion on a political topic, and a role-playing game. In total there are 54 recordings, of which 34 are of the discussions and 20 of the role-playing game. Apart from the recordings, the database contains annotations for many different phenomena, including dialogue acts, turn-taking, affect, head gestures, hand gestures, body movements, and facial expressions.
In total, 40 participants were recruited, of whom 28 were male and 12 female, aged between 18 and 40 years at the time of the recordings. All participants self-reported their felt experiences after the experiments.
The database is described in the following paper; please cite it whenever using data from the MHI-Mimicry database:
S. Bilakhia, S. Petridis, A. Nijholt, M. Pantic.
“The MAHNOB Mimicry Database - a database of naturalistic human interactions”
Pattern Recognition Letters, vol. 66, pp. 52-61, 2015.