Angelica Lim
Robot musical accompaniment: integrating audio and visual cues for real-time synchronization with a human flutist
A Lim, T Mizumoto, LK Cahier, T Otsuka, T Takahashi, K Komatani, ...
2010 IEEE/RSJ International Conference on Intelligent Robots and Systems …, 2010
Cited by 39 (2010)
UE-HRI: a new dataset for the study of user engagement in spontaneous human-robot interactions
A Ben-Youssef, C Clavel, S Essid, M Bilac, M Chamoux, A Lim
Proceedings of the 19th ACM international conference on multimodal …, 2017
Cited by 32 (2017)
The MEI Robot: Towards Using Motherese to Develop Multimodal Emotional Intelligence
A Lim, HG Okuno
IEEE Transactions on Autonomous Mental Development 6 (2), 126-138, 2014
Cited by 31 (2014)
Towards expressive musical robots: a cross-modal framework for emotional gesture, voice and music
A Lim, T Ogata, HG Okuno
EURASIP Journal on Audio, Speech, and Music Processing 2012 (1), 3, 2012
Cited by 28 (2012)
A Recipe for Empathy: Integrating the Mirror System, Insula, Somatosensory Cortex and Motherese
A Lim, HG Okuno
International Journal of Social Robotics 7 (1), 35-49, 2015
Cited by 26 (2015)
Converting emotional voice to motion for robot telepresence
A Lim, T Ogata, HG Okuno
2011 11th IEEE-RAS International Conference on Humanoid Robots, 472-479, 2011
Cited by 18 (2011)
Integration of flutist gesture recognition and beat tracking for human-robot ensemble
T Mizumoto, A Lim, T Otsuka, K Nakadai, T Takahashi, T Ogata, ...
Proc of IEEE/RSJ-2010 Workshop on Robots and Musical Expression, 159-171, 2010
Cited by 14 (2010)
Using speech data to recognize emotion in human gait
A Lim, HG Okuno
International Workshop on Human Behavior Understanding, 52-64, 2012
Cited by 11 (2012)
How does the robot feel? perception of valence and arousal in emotional body language
M Marmpena, A Lim, TS Dahl
Paladyn, Journal of Behavioral Robotics 9 (1), 168-182, 2018
Cited by 9 (2018)
Habit detection within a long-term interaction with a social robot: an exploratory study
C Rivoire, A Lim
Proceedings of the International Workshop on Social Learning and Multimodal …, 2016
Cited by 7 (2016)
A musical robot that synchronizes with a coplayer using non-verbal cues
A Lim, T Mizumoto, T Ogata, HG Okuno
Advanced Robotics 26 (3-4), 363-381, 2012
Cited by 7 (2012)
HRI 2018 Workshop: Social Robots in the Wild
R Mead, DH Grollman, A Lim, C Yeung, A Stout, WB Knox
Companion of the 2018 ACM/IEEE International Conference on Human-Robot …, 2018
Cited by 6 (2018)
A multimodal tempo and beat-tracking system based on audiovisual information from live guitar performances
T Itohara, T Otsuka, T Mizumoto, A Lim, T Ogata, HG Okuno
EURASIP Journal on Audio, Speech, and Music Processing 2012 (1), 6, 2012
Cited by 6 (2012)
Audio-visual musical instrument recognition
A Lim, K Nakamura, K Nakadai, T Ogata, HG Okuno
Proceedings of the 73rd National Convention of IPSJ (Information Processing Society of Japan) 5, 9, 2011
Cited by 6 (2011)
Developing robot emotions through interaction with caregivers
A Lim, HG Okuno
Handbook of Research on Synthesizing Human Emotion in Intelligent Systems …, 2015
Cited by 5 (2015)
Robot musical accompaniment: real-time synchronization using visual cue recognition
A Lim, T Mizumoto, T Otsuka, T Takahashi, K Komatani, T Ogata, ...
Proceedings of (IROS) IEEE/RSJ International Conference on Intelligent …, 2010
Cited by 4 (2010)
Generating robotic emotional body language with variational autoencoders
M Marmpena, A Lim, TS Dahl, N Hemion
2019 8th International Conference on Affective Computing and Intelligent …, 2019
Cited by 2 (2019)
The OMG-Empathy Dataset: Evaluating the Impact of Affective Behavior in Storytelling
P Barros, N Churamani, A Lim, S Wermter
2019 8th International Conference on Affective Computing and Intelligent …, 2019
Cited by 2 (2019)
Making a robot dance to diverse musical genre in noisy environments
JL Oliveira, K Nakamura, T Langlois, F Gouyon, K Nakadai, A Lim, ...
2014 IEEE/RSJ International Conference on Intelligent Robots and Systems …, 2014
Cited by 2 (2014)
The DESIRE Model: Cross-modal emotion analysis and expression for robots
A Lim, T Ogata, HG Okuno
Proceedings of the 74th National Convention of IPSJ (Information Processing Society of Japan) 5, 4, 2012
Cited by 2 (2012)