
Hierarchies for embodied action perception

Ognibene D. (first author); 2013

Abstract

During social interactions, humans are capable of initiating and responding to rich and complex social actions despite having incomplete world knowledge, and physical, perceptual and computational constraints. This capability relies on action perception mechanisms that exploit regularities in observed goal-oriented behaviours to generate robust predictions and reduce the workload of sensing systems. To achieve this essential capability, we argue that the following three factors are fundamental. First, human knowledge is frequently hierarchically structured, both in the perceptual and execution domains. Second, human perception is an active process driven by current task requirements and context; this is particularly important when the perceptual input is complex (e.g. human motion) and the agent has to operate under embodiment constraints. Third, learning is at the heart of action perception mechanisms, underlying the agent’s ability to add new behaviours to its repertoire. Based on these factors, we review multiple instantiations of a hierarchically-organised biologically-inspired framework for embodied action perception, demonstrating its flexibility in addressing the rich computational contexts of action perception and learning in robotic platforms.
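The framework reviewed in the chapter rests on paired inverse and forward models for action recognition (see the Forward Model and Inverse Model keywords below). As an illustrative aside, the minimal sketch that follows shows one common way such a scheme can be realised: several candidate behaviours each predict the next observed state, and the behaviour whose predictions best track the demonstration is taken as the recognised action. The class names, the toy behaviours and the simple error-based confidence update are assumptions for illustration, not the chapter's exact algorithm.

```python
import numpy as np

# Illustrative sketch (not the chapter's implementation): each candidate
# behaviour carries a forward model predicting the next observed state.
# Confidence accumulates as a function of prediction error against the
# observed trajectory; the most confident behaviour is the recognised action.

class CandidateBehaviour:
    def __init__(self, name, forward_model):
        self.name = name
        self.forward_model = forward_model  # maps current state -> predicted next state
        self.confidence = 0.0

    def update(self, state, next_state):
        predicted = self.forward_model(state)
        error = np.linalg.norm(predicted - next_state)
        self.confidence += np.exp(-error)  # lower prediction error -> higher confidence


def recognise(observed_trajectory, behaviours):
    """Return the behaviour whose forward predictions best match the observation."""
    for state, next_state in zip(observed_trajectory, observed_trajectory[1:]):
        for behaviour in behaviours:
            behaviour.update(state, next_state)
    return max(behaviours, key=lambda b: b.confidence)


# Hypothetical example: two candidate behaviours on a 1-D hand position.
reach = CandidateBehaviour("reach", lambda s: s + 0.1)
withdraw = CandidateBehaviour("withdraw", lambda s: s - 0.1)
trajectory = [np.array([x]) for x in np.arange(0.0, 1.0, 0.1)]
print(recognise(trajectory, [reach, withdraw]).name)  # -> "reach"
```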
Publication type: Book chapter
Keywords: Forward Model; Inverse Model; Parse Tree; Minimum Description Length; Grip Aperture; Intention Recognition; Robotics; Action Perception; Cognitive Robots; Social Robotics
Language: English
Book title: Computational and Robotic Models of the Hierarchical Organization of Behavior
Editors: Baldassarre G., Mirolli M.
Year: 2013
ISBN: 978-3-642-39874-2
Publisher: Springer
Pages: 237-254
Citation: Ognibene, D., Wu, Y., Lee, K., Demiris, Y. (2013). Hierarchies for embodied action perception. In G. Baldassarre, M. Mirolli (Eds.), Computational and Robotic Models of the Hierarchical Organization of Behavior (pp. 237-254). Springer. https://doi.org/10.1007/978-3-642-39875-9_5
Access: Open access
Files in this record:
  • ognibene2013hierarchies.pdf (open access): Submitted Version (Pre-print), Adobe PDF, 2.27 MB

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/302026
Citations
  • Scopus: 5
  • Web of Science: not available