Program (CET):

16:00 – 16:10      Introduction by the organizers

16:10 – 16:35      Invited Speaker: M. Chetouani – Quantifying and Exploiting Implicit Human Communication Changes (20m talk + 5m quest.)

16:35 – 16:43      Soumaya Sabry – People Identity Learning in HRI through an Incremental Multimodal Approach (5m video + 3m quest.)

16:43 – 17:08      Invited Speaker: G. Sandini – Developing Familiarity with Robots (20m talk + 5m quest.)

17:08 – 17:16      Fatemeh Ziaeetabar – Representation and Recognition of Manipulation actions in a Human Robot Interaction (5m video + 3m quest.)

17:16 – 17:30      Break

17:30 – 17:55      Invited Speaker: A. D’Ausilio – The Sensorimotor Basis of Human Communication (20m talk + 5m quest.)

17:55 – 18:03      Matej Hoffmann – Should a small robot have a small personal space? Investigating personal spatial zones and proxemic behavior in human-robot interaction (5m video + 3m quest.)

18:03 – 18:28      Invited Speaker: M. Staffa – From Humanized Robots to “Robotized” Humans (20m talk + 5m quest.)

18:28 – 18:36      Mohamadreza Faridghasemnia – Towards Abstract Relational Learning in Human Robot Interaction (5m video + 3m quest.)

18:36 – 19:30      Debate/Panel

Keynote speaker: Mohamed Chetouani

Title: Quantifying and Exploiting Implicit Human Communication Changes

Humans exploit verbal and non-verbal communication channels to convey information and intentions and to express emotions. In this talk, we will focus on the non-verbal cues employed to communicate goals such as engagement, showing, and teaching. These cues take various forms, and are often multimodal, but they share the fact that humans exaggerate them. We will discuss machine learning methods designed to quantify such non-verbal cues. In interactive human-robot learning, these cues affect the learning process, and we will show how to develop models able to exploit them.

Keynote speaker: Giulio Sandini

Title: Developing Familiarity with Robots

Despite the rapid advancement of robot technology and the general impression that personal robots will soon enter our everyday lives to support activities of daily living, the impact of social robots is still very limited, and humans’ personal interaction with robots rarely goes beyond a temporary, novelty-driven period.

During the presentation I will discuss what I consider to be some of the ingredients needed to endow the robot (as an embodied agent) with the ability to build a personalized long-term relationship with humans and to develop the kind of positive familiarity necessary to establish mutually predictive assistance tailored to individual situations and users.

Keynote speaker: Alessandro D’Ausilio

Title: The Sensorimotor Basis of Human Communication

Humans are innately social creatures, but cognitive neuroscience, traditionally focused on individual brains, is only beginning to investigate social cognition through the lens of realistic interpersonal interaction. However, the quantitative investigation of dynamic sensorimotor communication among interacting individuals in goal-directed ecological tasks is particularly challenging. The presentation will start from neurophysiological studies describing the basic mechanisms of inter-individual sensorimotor communication. I will then move on to a discussion of current attempts to quantify the sensorimotor information flow among interacting participants through verbal and non-verbal channels.

Keynote speaker: Mariacarla Staffa

Title: From Humanized Robots to “Robotized” Humans

New trends in Human-Robot communication research suggest the need to overturn the classic vision in which the human user is at the center of the design and to also account for the robot’s needs, in terms of the information it requires to infer the human’s internal state and, in turn, achieve a deeper understanding of her. While, on the one hand, robots should be designed in a humanized way so that humans can ascribe particular functionalities to them, with the aim of enhancing acceptability and understanding, on the other hand the need arises for the human to facilitate the robot’s understanding of the interaction by “robotizing” her own body by means of non-invasive wearable devices. This talk offers a new vision of bi-directional interaction design, in which the human actively participates in facilitating the understanding of the communication by granting access to her abilities and emotions, with the aim of achieving a sort of theory of mind between humans and robots based not only on externally observable behaviors but also on objective measures gathered from bio-signal feedback.