3-D Human Action Recognition by Shape Analysis of Motion Trajectories on Riemannian Manifold / M. Devanne; H. Wannous; S. Berretti; P. Pala; M. Daoudi; A. Del Bimbo. In: IEEE Transactions on Cybernetics, ISSN 2168-2267, vol. 45 (2015), pp. 1340-1352. DOI: 10.1109/TCYB.2014.2350774

3-D Human Action Recognition by Shape Analysis of Motion Trajectories on Riemannian Manifold

Maxime Devanne; Stefano Berretti; Pietro Pala; Alberto Del Bimbo
2015

Abstract

Recognizing human actions in 3D video sequences is an important open problem that is currently at the heart of many research domains, including surveillance, natural interfaces, and rehabilitation. However, designing action recognition models that are both accurate and efficient is challenging due to the variability of human pose, clothing, and appearance. In this paper, we propose a new framework that extracts a compact representation of a human action captured by a depth sensor and enables accurate action recognition. The proposed solution builds on fitting a human skeleton model to the acquired data, so that the 3D coordinates of the joints and their change over time are represented as a trajectory in a suitable action space. Thanks to this 3D joint-based framework, the proposed solution captures both the shape and the dynamics of the human body simultaneously. Action recognition is then formulated as the problem of computing the similarity between the shapes of trajectories on a Riemannian manifold. Classification is finally performed on this manifold with a k-nearest-neighbor (kNN) classifier, taking advantage of the Riemannian geometry of the open-curve shape space. Experiments are carried out on four representative benchmarks to demonstrate the potential of the proposed solution in terms of the accuracy/latency trade-off for low-latency action recognition. Comparative results with state-of-the-art methods are reported.
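As a rough illustration of the pipeline the abstract describes, the sketch below represents each action as a trajectory of stacked 3-D joint coordinates, maps it to a square-root velocity representation (assuming the open-curve shape space is the elastic, square-root-velocity framework commonly used for shape analysis of curves), measures the geodesic distance between normalized representations, and classifies with kNN. It assumes all sequences have been resampled to a common number of frames, omits the elastic reparameterization (temporal alignment) step of the full method, and uses hypothetical function names and data layout; it is not the authors' implementation.

import numpy as np

def srvf(curve, eps=1e-8):
    # curve: (T, d) array, one d-dimensional point per frame
    # (here d = 3 * num_joints, the stacked skeleton joint coordinates).
    vel = np.gradient(curve, axis=0)                  # frame-to-frame velocity
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    q = vel / np.sqrt(np.maximum(speed, eps))         # q(t) = v(t) / sqrt(|v(t)|)
    return q / np.linalg.norm(q)                      # scale-invariant: project onto the unit sphere

def shape_distance(q1, q2):
    # Geodesic (arc-length) distance on the unit sphere between two
    # normalized SRVFs of the same shape (T, d); temporal alignment omitted.
    inner = np.clip(np.sum(q1 * q2), -1.0, 1.0)
    return np.arccos(inner)

def knn_classify(query, gallery, labels, k=1):
    # kNN on the manifold: vote among the k nearest training trajectories.
    d = np.array([shape_distance(query, g) for g in gallery])
    nearest = labels[np.argsort(d)[:k]]
    values, counts = np.unique(nearest, return_counts=True)
    return values[np.argmax(counts)]

# Hypothetical usage: each action is a (T, 3*K) matrix of joint coordinates,
# resampled to the same number of frames T before computing SRVFs.
# train_q = [srvf(a) for a in train_actions]
# pred = knn_classify(srvf(test_action), train_q, np.array(train_labels), k=1)

In the full method, the distance would additionally be minimized over reparameterizations of one curve (e.g., by dynamic programming), which makes the comparison robust to differences in execution speed; the spherical distance above is only the non-elastic approximation.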
Year: 2015
Volume: 45
Pages: 1340-1352
Authors: M. Devanne; H. Wannous; S. Berretti; P. Pala; M. Daoudi; A. Del Bimbo
Files in this item:

tcyb15.pdf (closed access)
  Description: Main article
  Type: Publisher's PDF (Version of record)
  License: All rights reserved
  Size: 1.75 MB
  Format: Adobe PDF

IEEESMC2014.pdf (open access)
  Type: Final refereed version (Postprint, Accepted manuscript)
  License: Open Access
  Size: 4.07 MB
  Format: Adobe PDF

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/891133
Citations
  • PMC: ND
  • Scopus: 263
  • Web of Science: 221