Revista Cubana de Ciencias Informáticas

On-line version ISSN 2227-1899

Abstract

AGUIRRE CARRAZANA, Guillermo; LAMAR-LEON, Javier and PLASENCIA CALANA, Yenisel. Metric Learning to improve the persistent homology-based gait recognition. Rev cuba cienc informat [online]. 2018, vol.12, n.3, pp.17-31. ISSN 2227-1899.

Gait recognition is an important biometric technique for video surveillance tasks, owing to the advantage that it can be applied at a distance. In this paper, we present a persistent homology-based method to extract topological features from the body silhouettes of a gait sequence. This approach has been used before in several papers by the second author for human identification, gender classification, carried object detection, and monitoring human activities at a distance. As in that previous work, we apply persistent homology to extract topological features from the lowest fourth of the body silhouette, which reduces the negative effect of variations unrelated to gait in the upper body. The novelty of this paper is the introduction of metric learning, based on Linear Discriminant Analysis, to learn a Mahalanobis distance metric for robust gait recognition. The learned metric pulls objects of the same class closer together while pushing objects of different classes apart. We evaluate our approach on the CASIA-B dataset and show the effectiveness of the proposed method compared with other state-of-the-art methods.

Keywords: TDA; gait recognition; persistent homology; linear discriminant analysis; metric learning.
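The abstract suggests a simple recognition pipeline: extract one topological feature vector per gait sequence, use Linear Discriminant Analysis to learn a discriminative (Mahalanobis-like) metric, and match probe sequences to the gallery by nearest neighbour in the learned space. The snippet below is a minimal sketch of that metric-learning step only, assuming scikit-learn and using random placeholder vectors in place of the persistent-homology features computed in the paper; all dimensions and names are illustrative, not the authors' implementation.

# Sketch of the LDA-based metric-learning step described in the abstract.
# Placeholder feature vectors stand in for the persistent-homology features
# extracted from the lower fourth of the body silhouettes.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Hypothetical gallery: 50 subjects, 4 sequences each, 64-dim topological features.
n_subjects, seqs_per_subject, dim = 50, 4, 64
X_train = rng.normal(size=(n_subjects * seqs_per_subject, dim))
y_train = np.repeat(np.arange(n_subjects), seqs_per_subject)

# Fit LDA: its projection W induces a Mahalanobis metric
# d(x, y)^2 = (x - y)^T W W^T (x - y), which draws same-class samples
# together and pushes different classes apart.
lda = LinearDiscriminantAnalysis(n_components=min(n_subjects - 1, dim))
Z_train = lda.fit_transform(X_train, y_train)

# Recognition: nearest neighbour in the learned metric space.
knn = KNeighborsClassifier(n_neighbors=1).fit(Z_train, y_train)

X_probe = rng.normal(size=(5, dim))   # unseen probe sequences
print(knn.predict(lda.transform(X_probe)))

Nearest-neighbour matching in the LDA-projected space is equivalent to comparing the original features under the learned Mahalanobis distance, which is the property the abstract relies on for robust gait recognition.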


 

All the contents of this journal, except where otherwise noted, are licensed under a Creative Commons Attribution License.