
Petit, M., Lallee, S., Boucher, J., Pointeau, G., Cheminade, P., Ognibene, D., et al. (2013). The Coordinating Role of Language in Real-Time Multimodal Learning of Cooperative Tasks. IEEE Transactions on Autonomous Mental Development, 5(1), 3-17. doi: 10.1109/TAMD.2012.2209880.

The Coordinating Role of Language in Real-Time Multimodal Learning of Cooperative Tasks

Ognibene D;
2013

Abstract

One of the defining characteristics of human cognition is our outstanding capacity to cooperate. A central requirement for cooperation is the ability to establish a "shared plan", which defines the interlaced actions of the two cooperating agents, in real time, and even to negotiate this shared plan during its execution. In the current research we identify the requirements for cooperation, extending our earlier work in this area. These requirements include the ability to negotiate a shared plan using spoken language, to learn new component actions within that plan based on visual observation and kinesthetic demonstration, and finally to coordinate all of these functions in real time. We present a cognitive system that implements these requirements, and demonstrate the system's ability to allow a Nao humanoid robot to learn a nontrivial cooperative task in real time. We further provide a concrete demonstration of how the real-time learning capability can be easily deployed on a different platform, in this case the iCub humanoid. The results are considered in the context of how the development of language in the human infant provides a powerful lever in the development of cooperative plans from lower-level sensorimotor capabilities.
Publication type: Journal article - Scientific article
Keywords: Cooperation; humanoid robot; shared plans; situated and social learning; spoken language interaction
Language: English
Year: 2013
Volume: 5
Issue: 1
Pages: 3-17
6249732
Rights: reserved
Files in this record:
stamped_Petit-IEEE-TAMD-finalaccepted.pdf (963.64 kB, Adobe PDF) - Archive administrators only; a copy may be requested.
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/301957
Citations
  • Scopus: 35
  • Web of Science (ISI): 34