Patricia Schwarz, Sebastian Spanknebel, Diana Immel, Rene Hurlemann, Andreas Hein, Sandra Hellmers
Frontiers in Robotics and AI
Realistic reproduction of human facial expressions is essential for natural interactions between humans and humanoid robots. This work presents a data-driven framework for transferring human facial expressions to a humanoid robot and a virtual avatar, aiming to enhance emotional expressiveness and to assess its applicability in psychiatric training scenarios. The proposed approach enables cross-domain facial expression mapping while accounting for the mechanical constraints of robotic actuation. A user study (n = 40) evaluated emotion recognition across three stimulus categories: human faces (H), unconstrained virtual avatars (A), and humanoid robots with limited facial actuation (R). Participants identified emotions from static images and from dynamic expression sequences, presented with and without speech. Perceived realism and uncanny valley effects were assessed using an eight-item questionnaire rated on a 7-point Likert scale. Results indicate that human-to-robot facial expression transfer is feasible but constrained by limited mechanical expressivity. Highly expressive emotions such as surprise (H: 87.5%; A: 57.5%; R: 65%) and fear (H: 45%; A: 27.5%; R: 57.5%) achieved moderate recognition rates, whereas subtler emotions such as anger (H: 65%; A: 40%; R: 12.5%) and disgust (H: 60%; A: 10%; R: 22.5%) were poorly recognized on the robot. Dynamic expressions combined with speech significantly improved recognition. These findings demonstrate the feasibility of transferring human facial expressions to humanoid robots while highlighting the current limitations of robotic facial actuation. The proposed framework provides a promising basis for emotionally realistic patient simulation and training applications in mental healthcare.