Angelica Lim
Cited by
UE-HRI: a new dataset for the study of user engagement in spontaneous human-robot interactions
A Ben-Youssef, C Clavel, S Essid, M Bilac, M Chamoux, A Lim
Proceedings of the 19th ACM international conference on multimodal …, 2017
A recipe for empathy: Integrating the mirror system, insula, somatosensory cortex and motherese
A Lim, HG Okuno
International Journal of Social Robotics 7, 35-49, 2015
The MEI robot: Towards using motherese to develop multimodal emotional intelligence
A Lim, HG Okuno
IEEE Transactions on Autonomous Mental Development 6 (2), 126-138, 2014
Robot musical accompaniment: integrating audio and visual cues for real-time synchronization with a human flutist
A Lim, T Mizumoto, LK Cahier, T Otsuka, T Takahashi, K Komatani, ...
2010 IEEE/RSJ International Conference on Intelligent Robots and Systems …, 2010
Towards expressive musical robots: a cross-modal framework for emotional gesture, voice and music
A Lim, T Ogata, HG Okuno
EURASIP Journal on Audio, Speech, and Music Processing 2012, 1-12, 2012
Musical robots and interactive multimodal systems: An introduction
J Solis, K Ng
Musical Robots and Interactive Multimodal Systems, 1-12, 2011
How does the robot feel? Perception of valence and arousal in emotional body language
M Marmpena, A Lim, TS Dahl
Paladyn, Journal of Behavioral Robotics 9 (1), 168-182, 2018
Converting emotional voice to motion for robot telepresence
A Lim, T Ogata, HG Okuno
2011 11th IEEE-RAS International Conference on Humanoid Robots, 472-479, 2011
A deeper look at autonomous vehicle ethics: an integrative ethical decision-making framework to explain moral pluralism
J Rhim, JH Lee, M Chen, A Lim
Frontiers in Robotics and AI 8, 632394, 2021
Commodifying pointing in HRI: simple and fast pointing gesture detection from RGB-D images
B Azari, A Lim, R Vaughan
2019 16th Conference on Computer and Robot Vision (CRV), 174-180, 2019
Generating robotic emotional body language with variational autoencoders
M Marmpena, A Lim, TS Dahl, N Hemion
2019 8th international conference on affective computing and intelligent …, 2019
Gesture2Vec: Clustering gestures using representation learning methods for co-speech gesture generation
PJ Yazdian, M Chen, A Lim
2022 IEEE/RSJ International Conference on Intelligent Robots and Systems …, 2022
Inclusive HRI: equity and diversity in design, application, methods, and community
MMA De Graaf, G Perugia, E Fosch-Villaronga, A Lim, F Broz, ES Short, ...
2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI …, 2022
Integration of flutist gesture recognition and beat tracking for human-robot ensemble
T Mizumoto, A Lim, T Otsuka, K Nakadai, T Takahashi, T Ogata, ...
Proc of IEEE/RSJ-2010 Workshop on Robots and Musical Expression 11, 2010
Gaze and filled pause detection for smooth human-robot conversations
M Bilac, M Chamoux, A Lim
2017 IEEE-RAS 17th International Conference on Humanoid Robotics (Humanoids …, 2017
Habit detection within a long-term interaction with a social robot: an exploratory study
C Rivoire, A Lim
Proceedings of the International Workshop on Social Learning and Multimodal …, 2016
The OMG-Empathy dataset: Evaluating the impact of affective behavior in storytelling
P Barros, N Churamani, A Lim, S Wermter
2019 8th International Conference on Affective Computing and Intelligent …, 2019
Using speech data to recognize emotion in human gait
A Lim, HG Okuno
Human Behavior Understanding: Third International Workshop, HBU 2012 …, 2012
Human navigational intent inference with probabilistic and optimal approaches
P Agand, M Taherahmadi, A Lim, M Chen
2022 International Conference on Robotics and Automation (ICRA), 8562-8568, 2022
Developing a data-driven categorical taxonomy of emotional expressions in real world human robot interactions
G Saheb Jam, J Rhim, A Lim
Companion of the 2021 ACM/IEEE international conference on human-robot …, 2021