SciELO - Scientific Electronic Library Online

Revista Cubana de Ciencias Informáticas

On-line version ISSN 2227-1899

Abstract

ACOSTA MENDOZA, Niusvel. A New Multi-graph Transformation Method for Frequent Approximate Subgraph Mining. Rev cuba cienc informat [online]. 2018, vol.12, n.3, pp.1-16. ISSN 2227-1899.

Frequent approximate subgraph (FAS) mining has been successfully applied in several science domains because, in many applications, approximate approaches have achieved better results than exact ones. However, there are real applications based on multi-graphs where traditional FAS miners cannot be applied, because they were not designed to deal with this type of graph. Only one method based on graph transformation, which allows traditional simple-graph FAS miners to be used on multi-graph problems, has been reported, but it has a high computational cost. This paper aims to accelerate the mining process: a more efficient method is proposed for transforming multi-graphs into simple graphs and vice versa without losing topological or semantic information, which allows traditional FAS mining algorithms to be applied and the mined patterns to be returned to the multi-graph space. Finally, we analyze the performance of the proposed method on synthetic multi-graph collections, and we additionally show the effectiveness of the proposal in image classification tasks where images are represented as multi-graphs.
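The abstract does not spell out the transformation itself. A common construction for mapping a multi-graph to a simple graph, which may be in the spirit of the method described here, replaces every (possibly parallel) labeled edge with an intermediate labeled vertex connected to the edge's two endpoints; the inverse mapping collapses those edge-vertices back into multi-edges. The sketch below illustrates that idea only; the function names and data layout are illustrative assumptions, not the paper's actual algorithm.

```python
def multigraph_to_simple(vertices, edges):
    """Transform a labeled multi-graph into a simple graph.

    vertices: {vertex_id: label}
    edges:    list of (u, v, edge_label), parallel edges allowed.
    Each edge becomes a new labeled vertex linked to both endpoints,
    so no topological or label information is lost.
    """
    simple_vertices = dict(vertices)
    simple_edges = []                       # list keeps a deterministic order
    next_id = max(vertices) + 1
    for u, v, label in edges:
        simple_vertices[next_id] = label    # the edge is now a vertex
        simple_edges.append((u, next_id))
        simple_edges.append((next_id, v))
        next_id += 1
    return simple_vertices, simple_edges

def simple_to_multigraph(simple_vertices, simple_edges, original_ids):
    """Inverse mapping: every vertex not in original_ids is an edge-vertex;
    collapse it back into a labeled multi-edge between its two neighbors."""
    adjacency = {}
    for a, b in simple_edges:
        adjacency.setdefault(a, []).append(b)
        adjacency.setdefault(b, []).append(a)
    edges = []
    for vid, label in simple_vertices.items():
        if vid in original_ids:
            continue
        u, v = adjacency[vid]               # exactly two neighbors by construction
        edges.append((u, v, label))
    vertices = {vid: simple_vertices[vid] for vid in original_ids}
    return vertices, edges
```

Because the round trip is lossless, patterns mined in the simple-graph representation can be mapped back to multi-graph patterns, which is the property the paper's method relies on.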

Keywords: approximate mining; frequent approximate subgraphs; graph-based classification; multi-graph mining.


Creative Commons License: All content of this journal, except where otherwise noted, is licensed under a Creative Commons License.