Perceiving what humans do, feel and think

Robots still have difficulty navigating our human world. Using the sensor data available to them, could a robot look at us and understand what we're doing? Feeling? Thinking?

  • Towards Inclusive HRI: Using Sim2Real to Address Underrepresentation in Emotion Expression Recognition. Saba Akhyani, Mehryar Abbasi Boroujeni, Mo Chen, Angelica Lim. IROS 2022 [Paper][Github][Project]

  • Optimal Control Inspired Probabilistic Model-Based Human Navigational Intent Inference. Pedram Agand, Mahdi Taherahmadi, Angelica Lim, Mo Chen. ICRA 2022 [Paper]

  • A Sim2Real Approach to Augment Low-Resource Data for Dynamic Emotion Expression Recognition. Saba Akhyani, Mehryar Abbasi, Mo Chen, Angelica Lim. Workshop on 3D Vision and Robotics, CVPR 2021 [Paper]

  • The Many Faces of Anger: A Multicultural Video Dataset of Negative Emotions. Roya Javadi, Angelica Lim. FG 2021 [Paper][Dataset]

  • Investigating the Role of Culture on Negative Emotion Expressions in the Wild. Emma Hughson*, Roya Javadi*, James Thompson, Angelica Lim. Frontiers in Integrative Neuroscience, December 2021 [Paper][Dataset]

  • A Multimodal and Hybrid Framework for Human Navigational Intent Inference. Zhitian Zhang, Jimin Rhim, Angelica Lim, Mo Chen. IROS 2021 [Paper]

  • The SFU-Store-Nav 3D Virtual Human Platform for Human-Aware Robotics. Bronwyn Biro, Zhitian Zhang, Mo Chen, Angelica Lim. Workshop on Long-term Human Motion Prediction, ICRA 2021 [Paper][Video][Github][Dataset]

  • Developing a Data-Driven Categorical Taxonomy of Emotional Expressions in Real World Human Robot Interactions. Ghazal Saheb Jam, Jimin Rhim, Angelica Lim. HRI 2021 LBR [Paper][Video][Github] 🏆 Best Late Breaking Report Award (1/109 papers)

  • SFU-store-nav: A multimodal dataset for indoor human navigation. Zhitian Zhang, Jimin Rhim, Mahdi TaherAhmadi, Kefan Yang, Angelica Lim, Mo Chen. Data in Brief, December 2020 [Paper][Dataset]

  • Investigating the Role of Culture on Negative Emotional Expressions in the Wild (Abstract). Emma Hughson*, Roya Javadi*, Angelica Lim. Workshop on Affective Shared Perception, ICDL-EPIROB 2020 [Video][Abstract]

  • Joyful or Nervous? A Dataset of Awkward, Embarrassed and Uncomfortable Smiles (Abstract). Justin Heer, Jie Yan, Angelica Lim. Workshop on Affective Shared Perception, ICDL-EPIROB 2020 [Video][Abstract][Dataset][Code]

  • The OMG-Empathy Dataset: Evaluating the Impact of Affective Behavior in Storytelling. Pablo Barros, Nikhil Churamani, Angelica Lim and Stefan Wermter. ACII 2019 [Paper]

  • Towards an EmoCog Model for Multimodal Empathy Prediction. Bita Azari*, Zhitian Zhang*, Angelica Lim. Workshop on Empathy Modelling in Real-World Scenarios: Lessons Learned from the OMG Empathy Prediction Challenge, FG 2019 [Poster][Paper]

  • Commodifying Pointing in HRI: Simple and Fast Pointing Gesture Detection from RGB-D Images. Bita Azari, Angelica Lim, Richard Vaughan. CRV 2019 [Paper]

  • Gaze and Filled Pause Detection for Smooth Human-Robot Conversations. Miriam Bilac, Marine Chamoux, Angelica Lim. Humanoids 2017

  • UE-HRI: A New Dataset for the Study of User Engagement in Spontaneous Human-Robot Interactions. Atef Ben Youssef, Miriam Bilac, Slim Essid, Chloé Clavel, Marine Chamoux, Angelica Lim. ICMI 2017 [Dataset]

  • The MEI Robot: Towards Using Motherese to Develop Multimodal Emotional Intelligence. Angelica Lim, Hiroshi G. Okuno. TAMD 2014

* equal contribution

Robots that adapt to us

Are you tired of typing? Of emoting with your fingers? We believe that robots should adapt to our human ways of communicating, not the other way around. The following robotics projects help us move away from screens and bring technology closer to humanity.

  • Gesture2Vec: Clustering Gestures using Representation Learning Methods for Co-speech Gesture Generation. Payam Jome Yazdian, Mo Chen, Angelica Lim. IROS 2022. 🏆 Best Paper Award on Cognitive Robotics [Paper][Presentation][Demo][Project]

  • Perceptual Effects of Ambient Sound on An Artificial Agent's Rate of Speech. Akihiro Matsufuji and Angelica Lim. HRI 2021 LBR [Paper][Video]

  • How a Robot Should Speak Depends on Social, Environmental, Cognitive, Emotional, and Cultural Contexts. Akihiro Matsufuji and Angelica Lim. Sound in Human-Robot Interaction Workshop at HRI 2021 [Paper][Video]

  • Generating Robotic Emotional Body Language of Targeted Valence and Arousal with Conditional Variational Autoencoders. Mina Marmpena, Fernando Garcia, Angelica Lim. HRI 2020 LBR [Paper][Video]

  • Investigating Positive Psychology Principles in Affective Robotics. Jimin Rhim, Anthony Cheung, David Pham, Subin Bae, Zhitian Zhang, Trista Townsend, Angelica Lim. ACII 2019 [Video][Paper]

  • Generating Robotic Emotional Body Language with Variational Autoencoders. Mina Marmpena, Angelica Lim, Torbjørn S. Dahl, Nikolas Hemion. ACII 2019 [Paper][Video]

  • How does the robot feel? Annotation of emotional expressions generated by a humanoid robot with affective quantifiers. Mina Marmpena, Angelica Lim, and Torbjørn S. Dahl. BAILAR 2017

  • Habit detection within a long-term interaction with a social robot: an exploratory study. Claire Rivoire, Angelica Lim. DAA 2016

  • Robot Musical Accompaniment: Integrating Audio and Visual Cues for Real-time Synchronization with a Human Flutist. Angelica Lim, Takeshi Mizumoto, Louis-Kenzo Cahier, Takuma Otsuka, Toru Takahashi, Kazunori Komatani, Tetsuya Ogata, Hiroshi G. Okuno. IROS 2010

Understanding the human mind

It has been said that what we cannot build, we do not fully understand. We implement models of the brain and of learning drawn from psychology, neuroscience, and developmental science, with the goal of discovering new ways for robots to learn and adapt.

  • A Deeper Look at Autonomous Vehicle Ethics: An Integrative Ethical Decision-Making Framework to Explain Moral Pluralism. Jimin Rhim, Ji-Hyun Lee, Mo Chen, Angelica Lim. Frontiers in Robotics and AI 2021 [Paper]

  • Grounding Adult Voices to Motherese and Feelings, Angelica Lim. Poster, SAS 2019

  • A Recipe for Empathy: Integrating the Mirror System, Insula, Somatosensory Cortex and Motherese. Angelica Lim, Hiroshi G. Okuno. J. Social Robotics 2014

  • Developing Robot Emotions through Interaction with Caregivers. Angelica Lim, Hiroshi G. Okuno. Synthesizing Human Emotion in Intelligent Systems and Robotics 2014

  • Towards expressive musical robots: A cross-modal framework for emotional gesture, voice and music. Angelica Lim, Tetsuya Ogata, Hiroshi G. Okuno. EURASIP 2012