Multimodal Interaction

Human-IST Institute research projects in multimodal interaction.

  • iKnowU – Exploring the Potential of Multimodal AR Smart Glasses for the Decoding and Rehabilitation of Face Processing in Clinical Populations

    Our ability to identify faces and decode facial expressions of emotion is essential for everyday social interactions. Recent advances in wearable and augmented reality technologies provide a unique opportunity to create perceptual and cognitive prostheses. This project aims to develop smart glasses that help visually or cognitively impaired individuals with face and emotion recognition. Using collected information about the user's relatives, friends and colleagues, the system will automatically detect their presence in the camera's field of view and recognize their identity and emotional state. This information will then be reported to the user via audio, tactile and/or visual feedback (a minimal sketch of this loop appears below the project details).

    Human-IST Collaborators: Denis Lalanne

    External Collaborators: Roberto Caldara, Simon Ruffieux, Nicolas Ruffieux

    Start date: July 2016
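
    To make the envisioned pipeline concrete, here is a minimal sketch in Python of the recognition-and-feedback loop described above. It is an illustration under stated assumptions, not the iKnowU implementation: the face_recognition and pyttsx3 libraries stand in for the glasses' actual vision and audio components, the names and photo paths are hypothetical, and emotion decoding (which would require a separate classifier) is omitted.

        # Sketch only: enrollment data and file names below are hypothetical.
        import face_recognition
        import pyttsx3

        # Enrollment: one reference photo per known person in the user's circle.
        known = {
            "Alice": face_recognition.face_encodings(
                face_recognition.load_image_file("alice.jpg"))[0],
            "Bob": face_recognition.face_encodings(
                face_recognition.load_image_file("bob.jpg"))[0],
        }

        def identify(frame):
            """Return the names of enrolled people visible in a camera frame."""
            names = []
            for encoding in face_recognition.face_encodings(frame):
                matches = face_recognition.compare_faces(list(known.values()), encoding)
                names += [name for name, hit in zip(known, matches) if hit]
            return names

        def report(names):
            """Audio feedback channel: announce who is currently in view."""
            engine = pyttsx3.init()
            for name in names:
                engine.say(f"{name} is in front of you")
            engine.runAndWait()

        # Stand-in for one frame from the glasses' camera.
        frame = face_recognition.load_image_file("camera_frame.jpg")
        report(identify(frame))

    In a deployed system this loop would run continuously on the camera stream, and the same report step could drive tactile or visual feedback channels instead of speech.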

  • Autonomous Vehicles and Us: Designing Pedestrian-AV Interaction

    Autonomous vehicles currently lack dedicated interfaces that allow them to communicate their intentions clearly to other road users. In this project, started in collaboration with CarPostal, we aim to develop novel interfaces to mediate the interaction between autonomous vehicles and pedestrians. The interfaces are designed using a participatory approach with test users. CarPostal also provides access to its “SmartShuttles”, helping us understand the various interactions that take place in real settings and users’ acceptance of new interfaces.

    Human-IST Collaborators: Florian Evéquoz, Himanshu Verma, Denis Lalanne

    External Collaborators: Grace Eden (HES-SO), CarPostal

    Start date: September 2017