
Cauli N.; Vicente P.; Kim J.; Damas B.; Bernardino A.; Cavallo F.; Santos-Victor J. (2018). Autonomous table-cleaning from kinesthetic demonstrations using deep learning. In: Proceedings of the Joint 8th IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob 2018), Waseda University Ono Auditorium, Japan, pp. 26-32. [10.1109/DEVLRN.2018.8761013]

Autonomous table-cleaning from kinesthetic demonstrations using deep learning

Cauli N.; Kim J.; Cavallo F.

2018

Abstract

We address the problem of teaching a robot how to autonomously perform table-cleaning tasks in a robust way. In particular, we focus on wiping and sweeping a table with a tool (e.g., a sponge). For the training phase, we use a set of kinesthetic demonstrations performed over a table. The recorded 2D table-space trajectories, together with the images acquired by the robot, are used to train a deep convolutional network that automatically learns the parameters of a Gaussian Mixture Model representing the hand movement. After the learning stage, the network is fed with the current image showing the location and shape of the dirt or stain to clean. The robot then performs cleaning arm movements, obtained through Gaussian Mixture Regression using the mixture parameters provided by the network. Invariance to the robot posture is achieved by applying a plane-projective transformation to the images before feeding them to the neural network; robustness to illumination changes and other disturbances is increased by training on an augmented data set. This improves the generalization properties of the network, enabling, for instance, its use with the left arm after being trained on trajectories acquired with the right arm. The system was tested on the iCub robot, generating a cleaning behaviour similar to that of human demonstrators.
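As an illustration of the regression step mentioned in the abstract, the following is a minimal Gaussian Mixture Regression sketch: a joint GMM over a scalar phase variable t and a 2D table-space position is conditioned on t to produce a point of the cleaning trajectory. This is a generic GMR formulation under assumed parameter shapes, not the authors' implementation; the variable names are illustrative.

```python
import numpy as np

def gmr(t_query, priors, means, covs):
    """Gaussian Mixture Regression: condition a joint GMM over (t, x)
    on the scalar input t to predict the output x (here, a 2D hand
    position along the cleaning stroke).

    priors: (K,) mixing weights
    means:  (K, 1 + D) component means over [t, x]
    covs:   (K, 1 + D, 1 + D) component covariances
    """
    K, dim = means.shape
    D = dim - 1
    # Responsibility of each component for the query input t
    h = np.empty(K)
    for k in range(K):
        mu_t, var_t = means[k, 0], covs[k, 0, 0]
        h[k] = priors[k] * np.exp(-0.5 * (t_query - mu_t) ** 2 / var_t) \
               / np.sqrt(2.0 * np.pi * var_t)
    h /= h.sum()
    # Conditional mean of x given t for each component, mixed by h
    x = np.zeros(D)
    for k in range(K):
        mu_t, mu_x = means[k, 0], means[k, 1:]
        cov_xt = covs[k, 1:, 0]
        x += h[k] * (mu_x + cov_xt / covs[k, 0, 0] * (t_query - mu_t))
    return x
```

In the paper's pipeline the GMM parameters are predicted by the convolutional network from the input image; here they would simply be passed in as arrays.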
2018
IEEE 8th International Conference on Development and Learning and Epigenetic Robotics
Joint 8th IEEE International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018
Waseda University Ono Auditorium, Japan
2018
Cauli N.; Vicente P.; Kim J.; Damas B.; Bernardino A.; Cavallo F.; Santos-Victor J.
Files in this record:

File: 2018 - Autonomous table-cleaning from kinesthetic demonstrations using deep learning.pdf
Access: open access
Type: Final refereed version (Postprint, Accepted manuscript)
Licence: Open Access
Size: 2.84 MB
Format: Adobe PDF

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1255037
Citations
  • Scopus 9
  • Web of Science 5