Intelligent and Interactive Systems

===== Current Projects =====
  
**ELSA** - Effective Learning of Social Affordances for Human-Robot Interaction (ANR/FWF AAPG, 2022-2026): Affordances are action opportunities directly perceived by an agent to interact with its environment. The concept is gaining interest in robotics, where it offers a rich description of objects and the environment, focusing on potential interactions rather than physical properties alone. In this project, we extend this notion to social affordances. The goal is for robots to autonomously learn not only the physical effects of interactive actions with humans, but also the human reactions they produce (emotion, speech, movement). For instance, pointing and gazing in the same direction make humans orient towards the pointed direction, while pointing and looking at the finger make humans look at the finger. Moreover, scratching the robot's chin makes some, but not all, humans smile. The project will investigate how learning human-general and human-specific social affordances can enrich a robot's action repertoire for human-aware task planning and efficient human-robot interaction.
  
<html>
research/projects.txt · Last modified: 2024/02/19 12:24 by Antonio Rodriguez-Sanchez