Advancing intuitive human-machine interaction with human-like social capabilities for education in schools





This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 765955


ESR1 – Social context effects on expressive behaviour of embodied systems

Early-Stage Researcher: Rebecca Stower


Hi, my name is Rebecca and I’m from Brisbane, Australia. In 2016 I received my Bachelor of Psychological Science (Hons I), and I am currently completing my PhD under the supervision of Professor Arvid Kappas at Jacobs University as part of the ANIMATAS project. My background is in social and developmental psychology, with a specific interest in human-robot and human-computer interaction. As part of my PhD I therefore hope to investigate the role of emotion in the psychology of human-robot interaction. I am looking forward to working with the rest of the ANIMATAS fellows over the course of the project!

Main host institution: JacobsUni

Supervisor: Arvid Kappas (JacobsUni), in association with Catherine Pelachaud (UPMC)

Secondment institution: UPMC

Objectives: This ESR project aims to model the moderating and generative role of social context on nonverbal expressive displays. It will consider explicit and implicit social context, the social relationship between the agent and its social context (e.g., moderator, peer), and cultural differences, both in the mapping of internal affective states to expression and in empathic responding to the social environment (e.g., an interaction partner). Two types of theories describe the social modulation of expressions: 1) display-rule theories (e.g., Ekman’s neuro-cultural theory) and 2) behavioural ecology theories (e.g., Fridlund). The project shall be relevant to both frameworks. Specifically, a joint framework shall be developed that can modulate expressive behaviour in real time, taking into account emotional states, task context, and social context (who is present and what the sender’s relationship to them is). Expression will be implemented via the Greta virtual agent platform, which allows modelling of embodied conversational agents (ECAs). ECAs can display a large variety of socio-emotional behaviours, react to users’ behaviours and actions, be synchronized, and mimic and adapt their behaviour to that of users. Emotional state will be based on an established framework, such as FAtiMA.
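To make the display-rule idea concrete, the following is a minimal illustrative sketch (not the project's actual implementation, and independent of Greta or FAtiMA): an internal felt emotion is attenuated or amplified before expression depending on who is present. All names, roles, and coefficients here are invented for illustration.

```python
from dataclasses import dataclass


@dataclass
class SocialContext:
    """Hypothetical description of the social situation of the sender."""
    partner_role: str      # e.g. "peer", "moderator"
    partner_present: bool  # whether anyone is observing the sender


# Hypothetical display-rule table: a multiplier applied to expression
# intensity for a given (emotion, partner role) pair.
# 1.0 = express fully, 0.0 = suppress entirely.
DISPLAY_RULES = {
    ("anger", "moderator"): 0.2,  # mask anger toward an authority figure
    ("anger", "peer"): 0.6,
    ("joy", "moderator"): 0.8,
    ("joy", "peer"): 1.0,
}


def displayed_intensity(emotion: str, felt_intensity: float,
                        context: SocialContext) -> float:
    """Map an internal affective state to a displayed expression intensity."""
    if not context.partner_present:
        # No audience: display-rule theories predict little or no modulation.
        return felt_intensity
    rule = DISPLAY_RULES.get((emotion, context.partner_role), 1.0)
    # Clamp to the valid intensity range [0, 1].
    return max(0.0, min(1.0, felt_intensity * rule))
```

For example, strongly felt anger (0.9) in front of a moderator would be displayed at only 0.18, while the same anger felt alone would be expressed at full intensity. A behavioural-ecology account would instead treat the display itself as a social signal rather than a modulated readout of a felt state; a joint framework, as envisaged in this project, would need to accommodate both views.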


Expected results: Completed PhD dissertation; software and algorithms for modelling socially context-sensitive facial behaviour; peer-reviewed publications in international journals and conferences.

For further information, contact: Arvid Kappas

Jacobs PhD policies:



ANIMATAS – MSCA – ITN – 2017 – 765955