Projects
A full list of publications can be found here.
Perceiving what humans do, feel and think
Robots still have difficulty navigating our human world. Using the sensor data available to them, could a robot look at us and understand what we're doing? Feeling? Thinking?
React to This! How Humans Challenge Interactive Agents using Nonverbal Behaviors. Chuxuan Zhang, Bermet Burkanova, Lawrence H. Kim, Lauren Yip, Ugo Cupcic, Stéphane Lallée, Angelica Lim. IROS 2024 [Paper][Project]
Predicting Long-Term Human Behaviors in Discrete Representations via Physics-Guided Diffusion. Zhitian Zhang, Anjian Li, Angelica Lim, Mo Chen. IROS 2024 [Paper][Project]
Contextual Emotion Recognition using Large Vision Language Models. Yasaman Etesam, Özge Nilay Yalçın, Chuxuan Zhang, Angelica Lim. IROS 2024 [Paper][Project]
Emotional Theory of Mind: Bridging Fast Visual Processing with Slow Linguistic Reasoning. Yasaman Etesam, Özge Nilay Yalçın, Chuxuan Zhang, Angelica Lim. ACII 2024 (<40% acceptance rate) [Paper][Project]
Contextual Emotion Estimation from Image Captions. Vera Yang, Archita Srivastava, Yasaman Etesam, Chuxuan Zhang, Angelica Lim. ACII 2023 (39.3% acceptance rate) [Paper][Project]
BERSting at the Screams: Recognition of Shouted and Distressed Speech from Smartphone Recordings. Paige Tuttösí and Angelica Lim. International Workshop on Affective Computing for Mental Wellbeing (42% acceptance rate), ACII 2023 [Paper]
Towards Inclusive HRI: Using Sim2Real to Address Underrepresentation in Emotion Expression Recognition. Saba Akhyani, Mehryar Abbasi Boroujeni, Mo Chen, Angelica Lim. IROS 2022 [Paper][Github][Project]
Optimal Control Inspired Probabilistic Model-Based Human Navigational Intent Inference. Pedram Agand, Mahdi Taherahmadi, Angelica Lim, Mo Chen. ICRA 2022 [Paper]
A Sim2Real Approach to Augment Low-Resource Data for Dynamic Emotion Expression Recognition. Saba Akhyani, Mehryar Abbasi, Mo Chen, Angelica Lim. Workshop on 3D Vision and Robotics, CVPR 2021 [Paper]
The Many Faces of Anger: A Multicultural Video Dataset of Negative Emotions. Roya Javadi, Angelica Lim. FG 2021 [Paper][Dataset]
Investigating the Role of Culture on Negative Emotion Expressions in the Wild. Emma Hughson*, Roya Javadi*, James Thompson, Angelica Lim. Frontiers in Integrative Neuroscience, December 2021 [Paper][Dataset]
A Multimodal and Hybrid Framework for Human Navigational Intent Inference. Zhitian Zhang, Jimin Rhim, Angelica Lim, Mo Chen. IROS 2021 [Paper]
The SFU-Store-Nav 3D Virtual Human Platform for Human-Aware Robotics. Bronwyn Biro, Zhitian Zhang, Mo Chen, Angelica Lim. Workshop on Long-term Human Motion Prediction, ICRA 2021 [Paper][Video][Github][Dataset]
Developing a Data-Driven Categorical Taxonomy of Emotional Expressions in Real World Human Robot Interactions. Ghazal Saheb Jam, Jimin Rhim, Angelica Lim. HRI 2021 LBR [Paper][Video][Github] 🏆 Best Late Breaking Report Award (1/109 papers)
SFU-store-nav: A multimodal dataset for indoor human navigation. Zhitian Zhang, Jimin Rhim, Mahdi TaherAhmadi, Kefan Yang, Angelica Lim, Mo Chen. Data in Brief, December 2020 [Paper][Dataset]
Investigating the Role of Culture on Negative Emotional Expressions in the Wild (Abstract). Emma Hughson*, Roya Javadi*, Angelica Lim. Workshop on Affective Shared Perception, ICDL-EPIROB 2020 [Video][Abstract]
Joyful or Nervous? A Dataset of Awkward, Embarrassed and Uncomfortable Smiles (Abstract). Justin Heer, Jie Yan, Angelica Lim. Workshop on Affective Shared Perception, ICDL-EPIROB 2020 [Video][Abstract][Dataset][Code]
The OMG-Empathy Dataset: Evaluating the Impact of Affective Behavior in Storytelling. Pablo Barros, Nikhil Churamani, Angelica Lim and Stefan Wermter. ACII 2019 [Paper]
Towards an EmoCog Model for Multimodal Empathy Prediction. Bita Azari*, Zhitian Zhang*, Angelica Lim. Workshop on Empathy Modelling in Real-World Scenarios: Lessons Learned from the OMG Empathy Prediction Challenge, FG 2019 [Poster] [Paper]
Commodifying Pointing in HRI: Simple and Fast Pointing Gesture Detection from RGB-D Images. Bita Azari, Angelica Lim, Richard Vaughan. CRV 2019 [Paper]
Gaze and Filled Pause Detection for Smooth Human-Robot Conversations. Miriam Bilac, Marine Chamoux, Angelica Lim. Humanoids 2017
UE-HRI: A New Dataset for the Study of User Engagement in Spontaneous Human-Robot Interactions. Atef Ben Youssef, Miriam Bilac, Slim Essid, Chloé Clavel, Marine Chamoux, Angelica Lim. ICMI 2017 [Dataset]
The MEI Robot: Towards Using Motherese to Develop Multimodal Emotional Intelligence. Angelica Lim, Hiroshi G. Okuno. TAMD 2014
* equal contribution
Robots that adapt to us
Are you tired of typing? Emoting with your fingers? We believe that robots should adapt to our human methods of communication, and not the other way around. The following robotics projects help us move away from screens and bring technology closer to humanity.
Adapting to Frequent Human Direction Changes in Autonomous Frontal Following Robots. Sahar Leisiazar, Seyed Roozbeh Razavi Rohani, Edward J. Park, Angelica Lim, Mo Chen. Robotics and Automation Letters (RA-L) [Paper][Code][Project]
Mmm whatcha say? Uncovering distal and proximal context effects in first and second-language word perception using psychophysical reverse correlation. Paige Tuttösí, Henny Yeung, Yue Wang, Fenqi Wang, Guillaume Denis, Jean-Julien Aucouturier, Angelica Lim. Interspeech 2024 [Paper][Project]
EmoStyle: One-Shot Facial Expression Editing Using Continuous Emotion Parameters. Bita Azari and Angelica Lim. WACV 2024 [Paper][Project]
An MCTS-DRL Based Obstacle and Occlusion Avoidance Methodology in Robotic Follow-Ahead Applications. Sahar Leisiazar, Edward J. Park, Angelica Lim, Mo Chen. IROS 2023 [Paper][Project]
Read the Room: Adapting a Robot's Voice to Ambient and Social Contexts. Paige Tuttösí, Emma Hughson, Akihiro Matsufuji, Chuxuan Zhang, Angelica Lim. IROS 2023 [Paper][Project]
Gesture2Vec: Clustering Gestures using Representation Learning Methods for Co-speech Gesture Generation. Payam Jome Yazdian, Mo Chen, Angelica Lim. IROS 2022. 🏆 Best Paper Award on Cognitive Robotics [Paper][Presentation][Demo][Project]
Read the Room: Adapting a Robot's Voice to Ambient and Social Contexts. Emma Hughson, Paige Tuttösí, Akihiro Matsufuji, Angelica Lim. Workshop on Sound for Robots: Understanding and Harnessing Sound for Robotic Systems, ICRA 2022 [Paper]
Perceptual Effects of Ambient Sound on An Artificial Agent's Rate of Speech. Akihiro Matsufuji and Angelica Lim. HRI 2021 LBR [Paper][Video]
How a Robot Should Speak Depends on Social, Environmental, Cognitive, Emotional, and Cultural Contexts. Akihiro Matsufuji and Angelica Lim. Sound in Human-Robot Interaction Workshop at HRI 2021 [Paper][Video]
Generating Robotic Emotional Body Language of Targeted Valence and Arousal with Conditional Variational Autoencoders. Mina Marmpena, Fernando Garcia, Angelica Lim. HRI 2020 LBR [Paper][Video]
Investigating Positive Psychology Principles in Affective Robotics. Jimin Rhim, Anthony Cheung, David Pham, Subin Bae, Zhitian Zhang, Trista Townsend, Angelica Lim. ACII 2019 [Video][Paper]
Generating Robotic Emotional Body Language with Variational Autoencoders. Mina Marmpena, Angelica Lim, Torbjørn S. Dahl and Nikolas Hemion. ACII 2019 [Paper][Video]
How does the robot feel? Annotation of emotional expressions generated by a humanoid robot with affective quantifiers. Mina Marmpena, Angelica Lim, and Torbjørn S. Dahl. BAILAR 2017
Habit detection within a long-term interaction with a social robot: an exploratory study. Claire Rivoire, Angelica Lim. DAA 2016
Robot Musical Accompaniment: Integrating Audio and Visual Cues for Real-time Synchronization with a Human Flutist. Angelica Lim, Takeshi Mizumoto, Louis-Kenzo Cahier, Takuma Otsuka, Toru Takahashi, Kazunori Komatani, Tetsuya Ogata, Hiroshi G. Okuno. IROS 2010
Understanding the human mind
It has been said that what we cannot build, we do not fully understand. We implement models of the brain and learning from psychology, neuroscience and developmental science, with the goal of discovering new ways for robots to learn and adapt.
Systematic Review of Social Robots for Health and Wellbeing: A Personal Healthcare Journey Lens. Moojan Ghafurian, Shruti Chandra, Rebecca Hutchinson, Angelica Lim, Ishan Baliyan, Jimin Rhim, Garima Gupta, Alexander M. Aroyo, Samira Rasouli, Kerstin Dautenhahn. ACM Transactions on Human-Robot Interaction, October 2024 [Paper]
Co-design of a Digital App "WhatMatters" to Support Person-Centred Care: A Critical Reflection. Yi Peng (Ellen) Guo, Mariko Sakamoto, Karen Lok Yi Wong, Jim Mann, Annette Berndt, Jenifer Boger, Leanne Currie, Caylee Raber, Eva Egeberg, Chelsea Burke, Garima Sood, Angelica Lim, Sasha Yao, Alison Phinney, Lillian Hung. International Journal of Geriatric Psychiatry [Paper]
Speech Adaptation in Interactive Scenarios: Talking with Adults, Babies, & Robots. Angelica Lim, Paige Tuttösí, Henny Yeung, Yue Wang. Workshop on Limits and Benefits of Information-Theoretic Perspectives in Spoken Communication, Interspeech 2023 [Abstract]
A Deeper Look at Autonomous Vehicle Ethics: An Integrative Ethical Decision-Making Framework to Explain Moral Pluralism. Jimin Rhim, Ji-Hyun Lee, Mo Chen, Angelica Lim. Frontiers in Robotics and AI 2021 [Paper]
Grounding Adult Voices to Motherese and Feelings, Angelica Lim. Poster, SAS 2019
A Recipe for Empathy: Integrating the Mirror System, Insula, Somatosensory Cortex and Motherese. Angelica Lim, Hiroshi G. Okuno. J. Social Robotics 2014
Developing Robot Emotions through Interaction with Caregivers. Angelica Lim, Hiroshi G. Okuno. Synthesizing Human Emotion in Intelligent Systems and Robotics 2014
Towards expressive musical robots: A cross-modal framework for emotional gesture, voice and music. Angelica Lim, Tetsuya Ogata, Hiroshi G. Okuno. EURASIP 2012
More publications.