Projects

Robots that adapt to us

Are you tired of typing? Emoting with your fingers? We believe that robots should adapt to our human methods of communication, and not the other way around. The following robotics projects help us move away from screens and bring technology closer to humanity.

  • Gaze and Filled Pause Detection for Smooth Human-Robot Conversations. Miriam Bilac, Marine Chamoux, Angelica Lim. (Humanoids 2017)
  • How does the robot feel? Annotation of emotional expressions generated by a humanoid robot with affective quantifiers. Mina Marmpena, Angelica Lim, Torbjørn S. Dahl. (BAILAR 2017)
  • Habit detection within a long-term interaction with a social robot: an exploratory study. Claire Rivoire, Angelica Lim. (DAA 2016)
  • Robot Musical Accompaniment: Integrating Audio and Visual Cues for Real-time Synchronization with a Human Flutist. Angelica Lim, Takeshi Mizumoto, Louis-Kenzo Cahier, Takuma Otsuka, Toru Takahashi, Kazunori Komatani, Tetsuya Ogata, Hiroshi G. Okuno. (IROS 2010)

Perceiving what humans do, feel and think

Robots still have difficulty navigating our human world. Using the sensor data available to them, and perhaps a little learning, could they look at us and understand what we're doing? Feeling? Thinking? For example, if you walked in front of a robot, could it tell whether you did it on purpose or by accident?

  • UE-HRI: A New Dataset for the Study of User Engagement in Spontaneous Human-Robot Interactions. Atef Ben Youssef, Miriam Bilac, Slim Essid, Chloé Clavel, Marine Chamoux, Angelica Lim. (ICMI 2017)
  • The MEI Robot: Towards Using Motherese to Develop Multimodal Emotional Intelligence. Angelica Lim, Hiroshi G. Okuno. (TAMD 2014)

Understanding the human mind

It has been said that what we cannot build, we do not fully understand. We implement models of the brain and of learning, drawn from psychology, neuroscience, and developmental science, with the goal of discovering new ways for robots to learn and adapt.

  • A Recipe for Empathy: Integrating the Mirror System, Insula, Somatosensory Cortex and Motherese. Angelica Lim, Hiroshi G. Okuno. (J. Social Robotics 2014)
  • Developing Robot Emotions through Interaction with Caregivers. Angelica Lim, Hiroshi G. Okuno. (Synthesizing Human Emotion in Intelligent Systems and Robotics 2014)
  • Towards expressive musical robots: A cross-modal framework for emotional gesture, voice and music. Angelica Lim, Tetsuya Ogata, Hiroshi G. Okuno. (EURASIP 2012)