Robots that adapt to us
Are you tired of typing? Of emoting with your fingers? We believe that robots should adapt to our human methods of communication, not the other way around. The following robotics projects help us move away from screens and bring technology closer to humanity.
- Generating Robotic Emotional Body Language of Targeted Valence and Arousal with Conditional Variational Autoencoders. Mina Marmpena, Fernando Garcia, Angelica Lim. HRI 2020 LBR [Paper][Video]
- Investigating Positive Psychology Principles in Affective Robotics. Jimin Rhim, Anthony Cheung, David Pham, Subin Bae, Zhitian Zhang, Trista Townsend, Angelica Lim. ACII 2019 [Video]
- Generating robotic emotional body language with variational autoencoders. Mina Marmpena, Angelica Lim, Torbjørn S. Dahl, Nikolas Hemion. ACII 2019 [Paper][Video]
- How does the robot feel? Annotation of emotional expressions generated by a humanoid robot with affective quantifiers. Mina Marmpena, Angelica Lim, and Torbjørn S. Dahl. BAILAR 2017
- Habit detection within a long-term interaction with a social robot: an exploratory study. Claire Rivoire, Angelica Lim. DAA 2016
- Robot Musical Accompaniment: Integrating Audio and Visual Cues for Real-time Synchronization with a Human Flutist. Angelica Lim, Takeshi Mizumoto, Louis-Kenzo Cahier, Takuma Otsuka, Toru Takahashi, Kazunori Komatani, Tetsuya Ogata, Hiroshi G. Okuno. IROS 2010
Perceiving what humans do, feel and think
Robots still have difficulty navigating our human world. Using the sensor data available to them and perhaps a little learning, could they look at us and understand what we're doing? Feeling? Thinking? For example, if you walked in front of a robot, could it tell whether you did so on purpose or by accident?
- The OMG-Empathy Dataset: Evaluating the Impact of Affective Behavior in Storytelling. Pablo Barros, Nikhil Churamani, Angelica Lim and Stefan Wermter. ACII 2019 [Paper]
- Towards an EmoCog Model for Multimodal Empathy Prediction. Bita Azari*, Zhitian Zhang*, Angelica Lim (*equal contribution). Workshop on Empathy Modelling in Real-World Scenarios: Lessons Learned from the OMG Empathy Prediction Challenge, FG 2019 [Poster] [Paper]
- Commodifying Pointing in HRI: Simple and Fast Pointing Gesture Detection from RGB-D Images. Bita Azari, Angelica Lim, Richard Vaughan. CRV 2019 [Paper]
- Gaze and Filled Pause Detection for Smooth Human-Robot Conversations. Miriam Bilac, Marine Chamoux, Angelica Lim. Humanoids 2017
- UE-HRI: A New Dataset for the Study of User Engagement in Spontaneous Human-Robot Interactions. Atef Ben Youssef, Miriam Bilac, Slim Essid, Chloé Clavel, Marine Chamoux, Angelica Lim. ICMI 2017
- The MEI Robot: Towards Using Motherese to Develop Multimodal Emotional Intelligence. Angelica Lim, Hiroshi G. Okuno. TAMD 2014
Understanding the human mind
It has been said that what we cannot build, we do not fully understand. We implement models of the brain and of learning drawn from psychology, neuroscience, and developmental science, with the goal of discovering new ways for robots to learn and adapt.
- Grounding Adult Voices to Motherese and Feelings, Angelica Lim. Poster, SAS 2019
- A Recipe for Empathy: Integrating the Mirror System, Insula, Somatosensory Cortex and Motherese. Angelica Lim, Hiroshi G. Okuno. J. Social Robotics 2014
- Developing Robot Emotions through Interaction with Caregivers. Angelica Lim, Hiroshi G. Okuno. Synthesizing Human Emotion in Intelligent Systems and Robotics 2014
- Towards expressive musical robots: A cross-modal framework for emotional gesture, voice and music. Angelica Lim, Tetsuya Ogata, Hiroshi G. Okuno. EURASIP 2012