Revista Cubana de Ciencias Informáticas

On-line version ISSN 2227-1899

Abstract

ACOSTA MENDOZA, Niusvel. A New Multi-graph Transformation Method for Frequent Approximate Subgraph Mining. Rev cuba cienc informat [online]. 2018, vol.12, n.3, pp.1-16. ISSN 2227-1899.

Frequent approximate subgraph (FAS) mining has been successfully applied in several science domains, because in many applications approximate approaches have achieved better results than exact ones. However, there are real applications based on multi-graphs where traditional FAS miners cannot be applied, because they were not designed to deal with this type of graph. Only one method based on graph transformation, which allows traditional simple-graph FAS miners to be used on multi-graph problems, has been reported, but it has a high computational cost. This paper aims to accelerate the mining process; thus, a more efficient method is proposed for transforming multi-graphs into simple graphs and vice versa without losing topological or semantic information, which allows traditional FAS mining algorithms to be applied and the mined patterns to be returned to the multi-graph space. Finally, we analyze the performance of the proposed method on synthetic multi-graph collections, and we show the effectiveness of the proposal in image classification tasks where images are represented as multi-graphs.

Keywords: approximate mining; frequent approximate subgraphs; graph-based classification; multi-graph mining.
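
The abstract does not detail the transformation itself. One standard lossless way to encode a multi-graph as a simple graph is edge subdivision: each labeled edge, including parallel ones, is replaced by a fresh dummy vertex that carries the edge label. The Python sketch below illustrates this general idea and its inversion; it is an assumption for illustration only, not necessarily the method proposed in the paper, and the names to_simple, to_multi, and the sample edge list are hypothetical.

# Illustrative sketch only; the paper's actual transformation may differ.
# A multi-graph given as a list of labeled edges; parallel edges allowed.
# Assumes no self-loops and that dummy names (e0, e1, ...) do not collide
# with original vertex names.
multi_edges = [
    ("a", "b", "red"),
    ("a", "b", "blue"),   # parallel edge with a different label
    ("b", "c", "red"),
]

def to_simple(edges):
    """Replace each labeled edge (u, v, label) with two simple edges
    u--e_i and e_i--v, where e_i is a fresh dummy vertex storing label."""
    simple_edges, dummy_labels = [], {}
    for i, (u, v, label) in enumerate(edges):
        dummy = f"e{i}"
        dummy_labels[dummy] = label
        simple_edges.append((u, dummy))
        simple_edges.append((dummy, v))
    return simple_edges, dummy_labels

def to_multi(simple_edges, dummy_labels):
    """Invert the transformation: collapse each dummy vertex back into a
    single labeled multi-graph edge between its two real endpoints."""
    endpoints = {d: [] for d in dummy_labels}
    for u, v in simple_edges:
        if u in dummy_labels:
            endpoints[u].append(v)
        else:
            endpoints[v].append(u)
    return [(e[0], e[1], dummy_labels[d]) for d, e in endpoints.items()]

# Round trip: no topological or label information is lost.
simple, labels = to_simple(multi_edges)
assert sorted(to_multi(simple, labels)) == sorted(multi_edges)

Because every dummy vertex has exactly two incident edges to distinct real vertices, the result contains no parallel edges, so any simple-graph FAS miner can process it, and mined patterns can be mapped back by the inverse step.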


All the contents of this journal, except where otherwise noted, are licensed under a Creative Commons Attribution License.