(Multi)³ socio-affective behaviours

Teachers: 
Denis Lalanne
Fabien Ringeval
Project status: 
Open

Context

Affective and personality computing aim to automatically recognise various emotional states and personality traits from multimodal cues, such as facial expressions, gestures, speech prosody and physiology. This makes it possible to develop systems that support more natural and user-friendly human-machine interaction by adapting the behaviour of the machine to the user's idiosyncrasies. Besides being highly multimodal, emotional states and personality traits also depend strongly on the culture of the individuals and may be correlated with their language. However, such multimodal, multicultural and multilingual influences on socio-affective behaviours have so far not been studied with automatic recognition systems, because no database covers all of these criteria.

The goal of the project is to extend and improve the framework developed for recording the RECOLA database, in order to collect a new corpus of multimodal (audio, visual and physiological), multicultural and multilingual (French, Italian, German) data of naturalistic socio-affective interactions. Annotation of the data will be performed with the existing ANNEMO software. Finally, automatic processing and analysis of the data will be performed using state-of-the-art techniques for both feature extraction and machine learning.
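
To give an idea of this last step, the short sketch below illustrates one possible processing chain: simple frame-level acoustic features are extracted from an audio signal and a standard regressor is trained to predict a continuous arousal trace. The feature set, the choice of scikit-learn's SVR and the synthetic placeholder data are assumptions made for the example only, not part of the project specification.

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    def frame_features(signal, sr=16000, win=0.040, hop=0.040):
        """Cut a mono signal into frames and compute two simple
        low-level descriptors per frame: log energy and zero-crossing rate."""
        n, h = int(win * sr), int(hop * sr)
        feats = []
        for start in range(0, len(signal) - n, h):
            frame = signal[start:start + n]
            energy = np.log(np.sum(frame ** 2) + 1e-10)
            zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0
            feats.append([energy, zcr])
        return np.array(feats)

    # Placeholder data standing in for one recorded session and its
    # continuous arousal annotation (one value per frame); both are fake.
    rng = np.random.default_rng(0)
    audio = rng.standard_normal(16000 * 10)      # 10 s of synthetic "audio"
    X = frame_features(audio)                    # frame-level features
    y = rng.uniform(-1.0, 1.0, size=len(X))      # fake arousal labels in [-1, 1]

    # Train on the first half of the session, predict on the second half.
    split = len(X) // 2
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0))
    model.fit(X[:split], y[:split])
    pred = model.predict(X[split:])
    print("Pearson r on held-out frames:", np.corrcoef(pred, y[split:])[0, 1])

In the actual project, the placeholder arrays would be replaced by the recorded signals and the ANNEMO annotation traces, and the hand-crafted descriptors by a state-of-the-art feature set.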

The ideal candidate will have a strong background in informatics (data acquisition, Java, Matlab), a strong interest in multimodal data processing, and high motivation.

Goals

  1. Get familiar with multimodal data acquisition constraints
  2. Adapt the interaction scenario to improve the quality of the recordings
  3. Recruit participants and collect data (Department of Psychology)
  4. Annotate and analyse the data (see the sketch after this list)
  5. Apply state-of-the-art automatic recognition techniques
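
As an illustration of goal 4, the sketch below shows one simple way to check the agreement between several continuous annotation traces before averaging them into a gold standard. In practice the traces would come from the ANNEMO annotations; the synthetic arrays and the use of mean pairwise Pearson correlation are assumptions made for the example only.

    import numpy as np

    def mean_pairwise_correlation(traces):
        """Mean pairwise Pearson correlation between raters' traces
        (rows = raters, columns = time frames)."""
        r = np.corrcoef(traces)                       # rater-by-rater matrix
        upper = np.triu_indices(traces.shape[0], k=1)
        return r[upper].mean()

    # Placeholder traces standing in for several raters annotating one session.
    rng = np.random.default_rng(1)
    trend = np.sin(np.linspace(0, 6 * np.pi, 500))    # shared underlying trend
    raters = np.stack([trend + 0.3 * rng.standard_normal(500) for _ in range(6)])

    print("Mean pairwise correlation:", round(mean_pairwise_correlation(raters), 3))
    gold_standard = raters.mean(axis=0)               # simple average as a gold standard

Averaging the individual traces is only one possible way of building a gold standard; whatever scheme is chosen, a quick agreement check like this helps spot unreliable annotations before analysis.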