Assessing Emotion Mitigation through Robot Facial Expressions for Human-Robot Interaction / D'Arco, L.; Rossi, A.; Rossi, S. - 3932:(2025), pp. 46-51. (2024 Workshop on Advanced AI Methods and Interfaces for Human-Centered Assistive and Rehabilitation Robotics, AIxIA F4MR WS 2024, Italy, 2024).
Assessing Emotion Mitigation through Robot Facial Expressions for Human-Robot Interaction
D'Arco L.; Rossi A.; Rossi S.
2025
Abstract
Affective responses are among the primary and clearest signals agents use to communicate their internal state. These internal states can reflect a positive or negative acceptance of a robotic agent's behavior during a human-robot interaction (HRI). In these scenarios, it is fundamental for robots to be able to interpret people's emotional responses and to adjust their behaviors accordingly, both to appease them and to provoke an emotional change in them. This research investigates the impact of robot facial expressions on human emotional experiences within HRI, focusing specifically on whether a robot's expressions can amplify or mitigate users' emotional responses when viewing emotion-eliciting videos. To evaluate participants' emotional states, an AI-based multimodal emotion recognition approach was employed, combining analysis of facial expressions and physiological signals, complemented by a self-assessment questionnaire. Findings indicate that participants responded more positively when the robot's facial expressions aligned with the emotional tone of the videos, suggesting that emotion-coherent displays could enhance user experience and strengthen engagement. These results underscore the potential for expressive social robots to influence human emotions effectively, offering promising applications in therapy, education, and entertainment. By incorporating emotional facial expressions, socially assistive robots could foster behavior change and emotional engagement in HRI, broadening their role in supporting human emotional well-being.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


