<?xml version="1.0" encoding="ISO-8859-1"?><article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<front>
<journal-meta>
<journal-id>2227-1899</journal-id>
<journal-title><![CDATA[Revista Cubana de Ciencias Informáticas]]></journal-title>
<abbrev-journal-title><![CDATA[Rev cuba cienc informat]]></abbrev-journal-title>
<issn>2227-1899</issn>
<publisher>
<publisher-name><![CDATA[Editorial Ediciones Futuro]]></publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id>S2227-18992018000400002</article-id>
<title-group>
<article-title xml:lang="en"><![CDATA[Unsupervised Segmentation of Agricultural Crops in UAV RGB Images]]></article-title>
<article-title xml:lang="es"><![CDATA[Segmentación no supervisada de Cultivos Agrícolas en imágenes de Vehículos Aéreos No tripulados.]]></article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Garea Llano]]></surname>
<given-names><![CDATA[Eduardo]]></given-names>
</name>
<xref ref-type="aff" rid="A01"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Osorio Roig]]></surname>
<given-names><![CDATA[Dailé]]></given-names>
</name>
<xref ref-type="aff" rid="A01"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Chacón Cabrera]]></surname>
<given-names><![CDATA[Yasser]]></given-names>
</name>
<xref ref-type="aff" rid="A01"/>
</contrib>
</contrib-group>
<aff id="A01">
<institution><![CDATA[División de Investigaciones CENATAV, DATYS]]></institution>
<addr-line><![CDATA[ ]]></addr-line>
</aff>
<pub-date pub-type="pub">
<day>00</day>
<month>12</month>
<year>2018</year>
</pub-date>
<pub-date pub-type="epub">
<day>00</day>
<month>12</month>
<year>2018</year>
</pub-date>
<volume>12</volume>
<numero>4</numero>
<fpage>17</fpage>
<lpage>28</lpage>
<copyright-statement/>
<copyright-year/>
<self-uri xlink:href="http://scielo.sld.cu/scielo.php?script=sci_arttext&amp;pid=S2227-18992018000400002&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://scielo.sld.cu/scielo.php?script=sci_abstract&amp;pid=S2227-18992018000400002&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://scielo.sld.cu/scielo.php?script=sci_pdf&amp;pid=S2227-18992018000400002&amp;lng=en&amp;nrm=iso"></self-uri><abstract abstract-type="short" xml:lang="en"><p><![CDATA[Crop inventory is a precision agricultural task that allows for the planning and estimation of yields per hectare cultivated. The use of Unmanned Aerial Vehicles (UAV) has gained a great boom in the development of these applications given its low cost and fast possibilities of obtaining quality images. This paper presents a method for unsupervised segmentation of agricultural UAV RGB color images. We propose the combination of a set of texture features under a segmentation framework, based on the active contour without edges model with level set representation and a connected component filtering strategy. The experiments show that it can be applied for the segmentation of agricultural crops, with an average segmentation quality of 90%. It exceeds in efficacy other methods of supervised segmentation of the state of the art. It was demonstrated the robustness of the approach for images taken with UAVs of low performance which makes cheaper its application with low costs by agricultural producers.]]></p></abstract>
<abstract abstract-type="short" xml:lang="en"><p><![CDATA[El inventario de cultivos es una tarea agrícola de precisión que permite planificar y estimar los rendimientos por hectárea cultivada. El uso de vehículos aéreos no tripulados ha ganado un gran auge en el desarrollo de estas aplicaciones debido a su bajo costo y rápidas posibilidades de obtener imágenes de calidad. Este artículo presenta un método para la segmentación no supervisada de imágenes a color tomadas por estos vehículos. Proponemos la combinación de un conjunto de características de textura en un marco de segmentación, basado en el contorno activo sin modelo de bordes con representación de nivel y una estrategia de filtrado de componentes conexas. Los experimentos muestran que la propuesta puede ser aplicada para la segmentación de cultivos agrícolas, con una calidad de segmentación media del 90%, la misma supera en eficacia a otros métodos de segmentación supervisada del estado del arte. Se demostró la robustez del enfoque para imágenes tomadas con vehículos aéreos de bajo costo que hace más barata su aplicación por parte de los productores agrícolas.]]></p></abstract>
<kwd-group>
<kwd lng="en"><![CDATA[crop inventory]]></kwd>
<kwd lng="en"><![CDATA[texture analysis]]></kwd>
<kwd lng="en"><![CDATA[UAV]]></kwd>
<kwd lng="en"><![CDATA[unsupervised segmentation]]></kwd>
<kwd lng="es"><![CDATA[análisis de textura]]></kwd>
<kwd lng="es"><![CDATA[inventario de cultivos]]></kwd>
<kwd lng="es"><![CDATA[segmentación no supervisada]]></kwd>
<kwd lng="es"><![CDATA[UAV]]></kwd>
</kwd-group>
</article-meta>
</front><body><![CDATA[ <p align="right"><font face="Verdana, Arial, Helvetica, sans-serif" size="2"><B>ART&Iacute;CULO ORIGINAL</B></font></p>     <p>&nbsp;</p>     <p><font size="4"><strong><font face="Verdana, Arial, Helvetica, sans-serif">Unsupervised Segmentation of Agricultural Crops in UAV RGB Images</font></strong></font></p>     <p>&nbsp;</p>     <p><font size="3"><strong><font face="Verdana, Arial, Helvetica, sans-serif">Segmentaci&oacute;n no supervisada de Cultivos Agr&iacute;colas en im&aacute;genes de Veh&iacute;culos A&eacute;reos No tripulados.</font></strong></font></p>     <p>&nbsp;</p>     <p>&nbsp;</p>     <P><font size="2"><strong><font face="Verdana, Arial, Helvetica, sans-serif">Eduardo Garea Llano<strong><sup>1*</sup></strong></font></strong></font>, <font size="2"><strong><font face="Verdana, Arial, Helvetica, sans-serif">Dail&eacute; Osorio Roig<strong><sup>1</sup></strong> , Yasser Chac&oacute;n Cabrera</font><font face="Verdana, Arial, Helvetica, sans-serif"><strong><sup>1</sup></strong></font></strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><sup>1</sup>Divisi&oacute;n de Investigaciones CENATAV, DATYS, 7ma A No. 21406 e/ 214 y 216, Siboney, Playa, La Habana {egarea, dosorio, ychacon}@cenatav.co.cu</font> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">    <br> </font></p>     ]]></body>
<body><![CDATA[<P><font face="Verdana, Arial, Helvetica, sans-serif"><span class="class"><font size="2">*Autor para la correspondencia: </font></span><font size="2">egarea@cenatav.co.cu </font></font>     <p>&nbsp;</p>     <p>&nbsp;</p> <hr>     <P><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>ABSTRACT</b></font>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Crop inventory is a precision agricultural task that allows for the  planning and estimation of yields per hectare cultivated. The use of Unmanned  Aerial Vehicles (UAV) has gained a great boom in the development of these  applications given its low cost and fast possibilities of obtaining quality  images. This paper presents a method for unsupervised segmentation of  agricultural UAV RGB color images. We propose the combination of a set of texture  features under a segmentation framework, based on the active contour without  edges model with level set representation and a connected component filtering  strategy. The experiments show that it can be applied for the segmentation of  agricultural crops, with an average segmentation quality of 90%. It exceeds in  efficacy other methods of supervised segmentation of the state of the art. It  was demonstrated the robustness of the approach for images taken with UAVs of  low performance which makes cheaper its application with low costs by  agricultural producers.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>Key words<span lang=EN-GB>:</span></b></font> <font size="2" face="Verdana, Arial, Helvetica, sans-serif">crop inventory, texture analysis, UAV, unsupervised segmentation.</font></p> <hr>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>RESUMEN</b> </font></p>     <p><font size="2"><em><font face="Verdana, Arial, Helvetica, sans-serif">El inventario de cultivos es una tarea agr&iacute;cola de  precisi&oacute;n que permite planificar y estimar los rendimientos por hect&aacute;rea  cultivada. El uso de veh&iacute;culos a&eacute;reos no tripulados ha ganado un gran auge en  el desarrollo de estas aplicaciones debido a su bajo costo y r&aacute;pidas  posibilidades de obtener im&aacute;genes de calidad. Este art&iacute;culo presenta un m&eacute;todo  para la segmentaci&oacute;n no supervisada de im&aacute;genes a color tomadas por estos  veh&iacute;culos. Proponemos la combinaci&oacute;n de un conjunto de caracter&iacute;sticas de  textura en un marco de segmentaci&oacute;n, basado en el contorno activo sin modelo de  bordes con representaci&oacute;n de nivel y una estrategia de filtrado de componentes  conexas. Los experimentos muestran que la propuesta puede ser aplicada para la  segmentaci&oacute;n de cultivos agr&iacute;colas, con una calidad de segmentaci&oacute;n media del  90%, la misma supera en eficacia a otros m&eacute;todos de segmentaci&oacute;n supervisada  del estado del arte. Se demostr&oacute; la robustez del enfoque para im&aacute;genes tomadas  con veh&iacute;culos a&eacute;reos de bajo costo que hace m&aacute;s barata su aplicaci&oacute;n por parte  de los productores agr&iacute;colas.</font></em></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>Palabras clave<span lang=EN-GB>: </span></b><em>an&aacute;lisis  de textura, inventario de cultivos, segmentaci&oacute;n no supervisada, UAV</em></font></p> <hr>     <p>&nbsp;</p>     ]]></body>
<body><![CDATA[<p>&nbsp;</p>     <p><font size="3" face="Verdana, Arial, Helvetica, sans-serif"><b>INTRODUCTION</b></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The use of Unmanned Aerial Vehicles (UAVs) has boomed in recent years because they are cheap and, at the same time, they offer high quality images. Nowadays, high-resolution aerial images are widely available due to the diffusion of advanced UAV technologies. These developments offer new opportunities for accurate land use analysis and change detection. UAVs have several advantages over satellites and piloted aircraft: they can be deployed quickly and repeatedly; they are cheaper and safer than piloted aircraft; they are flexible in terms of flight height and mission time; and they can obtain images at sub-decimeter resolution. However, because of the limited flight height and focal length of the camera, the footprint of each acquired image is small and a single image cannot always cover the entire target area, so more than one image needs to be combined into a UAV remote sensing mosaic. Another problem is that, although multispectral and hyperspectral sensors for UAVs are already available, these sensors are still relatively expensive, so their use makes projects more costly and limits their adoption by producers and farmers with lower incomes.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The applications of remote sensing in agriculture are multiple and varied. They include, among others, the identification, mapping and monitoring of crops, determination of biophysical variables, estimation of biomass and yield, detection of biotic and abiotic stresses, prediction of nitrogen content, recognition and monitoring of irrigated areas, characterization of the water needs of crops, and the study of environmental impacts. Different examples of their use in agriculture can be found in (Zhang, 2012; Atzberger, 2013). In the last three years, there have been advances in remote sensing using UAVs (Yinjiang, 2016; Jabal, 2017). Detection of water stress, pests, diseases and weeds, assessment of nutritional status, determination of biomass and yield, and soil characterization are some of the agricultural applications being developed.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Crop inventory is another application that allows the planning and estimation of yields per hectare cultivated. It is performed by classifying the different vegetation covers present in the study area.     <br>   Crops, like plants in general, differ in their chlorophyll levels. These differences are perceptible through reflectance in the near infrared band and can therefore be detected in remote sensing images. In the near infrared, the spectral reflectance curves of different crops allow their differentiation, so spectral analysis is well suited for building crop inventories from remote sensing images. </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">A healthy crop has strong absorption in the visible band of the spectrum and strong reflectance in the near infrared band. On this spectral basis, crop growth can be monitored by extracting remote sensing indicators of the crops. The normalized difference vegetation index (NDVI) is a commonly used monitoring indicator. However, these analyses are only possible when the images are captured by sensors able to work in different bands of the spectrum. When the UAV is equipped only with a camera that takes pictures in the visible spectrum, i.e. RGB images, this type of analysis is not possible. A viable alternative for segmenting the different vegetal covers in the scene is then the description of their textures.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The term texture is used to characterize the surface of a given object or phenomenon and is undoubtedly one of the main features used in digital image processing and pattern recognition (Unser, 1986). Texture is an important feature of every image and a very valuable source of information for image analysis and understanding. Texture is an innate property of virtually every surface, so it is possible to speak of the grain of wood, the weave of a textile, the pattern of vegetation in an agricultural field, etc.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">When segmenting the texture of an image, the interest is centered on obtaining a map of texture regions, or a map of edges between textures. The region-based texture segmentation approach tries to identify regions of the image that have similar texture patterns. Local pixel regions are joined into texture patterns based on the similarity of some texture property, and regions with different textures are the ones to be separated. Region-based methods have the advantage that the boundaries of the regions are always closed, so regions with different textures are always well separated. Their disadvantage lies in the data the methods need in advance, such as the number of different textures in the image, the similarity values between textures, or a threshold to separate them.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">In this work, we propose a method for unsupervised segmentation of agricultural crops in UAV RGB color images. We combine a set of texture features under a segmentation framework based on the active contour without edges model with level set representation and a connected component filtering strategy.</font></p>     ]]></body>
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The  remainder of this paper is organized as follows. In Section 2 we present the  proposed general scheme of our method. In section 3 we present and discuss the  experimental results. Finally, we present the main conclusions of this work.</font></p>     <p><font size="2"><strong><font face="Verdana, Arial, Helvetica, sans-serif">Method for  unsupervised segmentation of agricultural crops in UAV images</font></strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The proposed method is based on the combination of  segmentation processes using texture models, texture descriptors, feature  extraction/ selection, pixel grouping and post-segmentation. The steps of the  proposed methodology are as follows: (<a href="/img/revistas/rcci/v12n4/f0102418.jpg" target="_blank">figure 1</a>)</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">(1) Read the  input UAV image. It can be a simple image or a video. In the video case, the  selection of the best frames that correspond to the presence in the scene of  the interested crops is made. (2) Apply some pre-processing to the simple  image. Improvements in brightness-contrast, signal-to-noise ratio, rotating (dextrorotatory,  levorotatory, horizontal, vertical), and turning it to gray, obtaining  negative, edges, or other transformation. (3) Apply a combination of texture  descriptors in the image. This action can generate a multi-dimensional image in  the order of tens and hundreds. (4) Apply segmentation method, based on the  active contour without edges model with level set representation. (5) Optional:  Apply methods for feature selection and reduction (PCA, SFFS). (6)  Post-processing of the segmented image. It facilitates the digital filtering by  applying as a theoretical basis an analysis of the connected components (CC),  and the evaluation of the quality of the segmented image using supervised  metrics.</font></p>     <p><font size="2"><strong><font face="Verdana, Arial, Helvetica, sans-serif">Proposed combination of texture models</font></strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The  Statistical Model of First Order is one of the forms of texture representation  that generates simple statistical procedures to characterize the texture  quantitatively. The methods that define texture descriptors in this model work  directly with the gray levels (GL) presented in the image, i.e. they are used  without modification. These descriptors are computed from the intensity values  of the GL of the pixels belonging to a centered neighborhood at each point of  the image, or from the GLs histogram of the image. The typical parameters that  can be extracted from the GL distribution of the image as texture measurements  are (Ferdeghin,  1991): minimum, maximum, mean, standard deviation and variance. These  features prove to be very simple and efficient discriminating simple textures.  The great advantage they have is their simplicity of calculation. Their main  limitation is that they cannot express the spatial characteristics of the  texture. To perform the segmentation of the image based on the texture  analysis, it must be ensured that the calculated and selected texture features  reflect a change in the statistical values of the image. This is usually  achieved when the regions have a simple texture (Theodoridis, 2009). 
When the texture is complex,  statistical changes in spatial information can be contrasted using texture  descriptors based on spectral methods. </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The agricultural crops usually have orientations  along an axis and have an elongated shape describing the shape of the grooves  (<a href="#f02">figure 2B</a>). This fact has a significant influence on the texture  characteristics of the image captured by UAVs. </font></p>     <p align="center"><img src="/img/revistas/rcci/v12n4/f0202418.jpg" alt="f02" width="522" height="190"><a name="f02"></a></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">One of the spectral models that has obtained better results  in the description of complex spatial oriented textures are those based on  Gabor filters (Lu,  21017).    <br> A 2D-Gabor filter consists of a sinusoidal flat wave having a  frequency value and an orientation, and is modulated by a 2D-Gaussian envelope.  We propose the use of a 2D Gabor filter bank at different frequencies and  orientations to extract oriented texture features of the image. Each frequency  and its corresponding orientation, models a channel that returns an image  response. The input image I (x, y) is convolved with the 2D-Gabor g (x, y) filter to obtain the  image of the Gabor feature r (x, y) as shown in (1).</font></p>     ]]></body>
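<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">As a minimal illustrative sketch (not the authors' implementation), the following Python fragment shows how a per-pixel texture feature vector can be built by combining first-order statistics (local mean and standard deviation of the gray levels) with the responses of a 2D Gabor filter bank, as in (1). The window size, number of orientations, wavelengths and other filter parameters are assumed values chosen only for illustration.</font></p>     <pre>
# Sketch: per-pixel texture features combining first-order statistics
# with a 2D Gabor filter bank (parameter values are assumptions).
import cv2
import numpy as np

def texture_features(bgr_image, win=15):
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY).astype(np.float32)
    feats = []

    # First-order statistics over a local neighborhood: mean and standard deviation.
    mean = cv2.blur(gray, (win, win))
    sq_mean = cv2.blur(gray * gray, (win, win))
    std = np.sqrt(np.maximum(sq_mean - mean * mean, 0.0))
    feats += [mean, std]

    # Bank of 2D Gabor filters at several orientations and wavelengths:
    # each response r(x, y) is the convolution of the image with g(x, y), as in (1).
    for theta in np.arange(0.0, np.pi, np.pi / 4.0):   # 4 orientations
        for lambd in (8.0, 16.0):                      # 2 wavelengths (pixels)
            kernel = cv2.getGaborKernel((31, 31), sigma=4.0, theta=theta,
                                        lambd=lambd, gamma=0.5, psi=0.0)
            feats.append(cv2.filter2D(gray, cv2.CV_32F, kernel))

    # Stack into an (H, W, r) feature image: one r-dimensional vector per pixel.
    return np.dstack(feats)
</pre>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The resulting (H, W, r) stack is the kind of vector-valued input that the Multi-Phase ACWE segmentation described below receives.</font></p>     ]]></body>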
<body><![CDATA[<p align="center"><img src="/img/revistas/rcci/v12n4/fo0102418.jpg" alt="fo01" width="339" height="65"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Where, (x, y) E &Omega;; &Omega;: is the set of points of the image and g (x, y) is the 2D-Gabor family of functions.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><strong>Multi Phase ACWE for Unsupervised Vegetal  Cover Segmentation in UAV images</strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The active contour approach has been widely used as image  segmentation method and for texture segmentation (Savelonas,  2001). The level set theory has provided more  flexibility and convenience to its implementation. The Active Contour without  Edges (ACWE) (Chan, 2001) uses a level set  representation, so the results do not depend on the initial position of the  contour. In [16] a Multi-Phase ACWE model for vector valued images was  presented. The algorithm is based on the extension of the ACWE model and level  set proposed by Chan and Vese (Chan,  2001) conceived to segment an image u0: &Omega; &rarr; R. In this function, the  active contour C &sub;&Omega;  was represented by a level set using a Lipschitz function &phi; (x, y): &Omega; &rarr; R. With this function &phi;  (x, y), the image can be divided into two or multiple regions.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The  problem in our case is to segment a textured UAV image without any previous  knowledge about the position of the vegetation textures in the image. Then we  propose to use this algorithm (Vega, 2008). We will get as input a vector  composed by the combination of texture features of two theoretical sources  (first order statistics and a bank of 2D Gabor filters), responsible for  providing differentiating properties between the types of textures present in  the image and characterizing each of the vegetation cover present in the UAV  scene.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Starting from the Multi-Phase ACWE model (Vega, 2008). &nbsp;it is possible to use <em>m = log2n</em> level functions to represent  n classes, of an UAV image possessing any complex topological structure, such  as example of <a href="#f02">figure 2</a>. Function (2) can segment <em>n = 2m</em> texture regions using m level functions and a vector made up of r  texture features (first order statistics and a bank of 2D Gabor filters) that  are received at<em> <img src="/img/revistas/rcci/v12n4/fo0202418.jpg" alt="fo02" width="127" height="21"></em></font></p>     <p align="center"><img src="/img/revistas/rcci/v12n4/fo0302418.jpg" alt="fo03" width="490" height="76"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Where <img src="/img/revistas/rcci/v12n4/fo0402418.jpg" alt="fo04" width="89" height="27"> is the term that computes the contour length. The vector of averages is  defined as <img src="/img/revistas/rcci/v12n4/fo0502418.jpg" alt="fo02" width="164" height="27">, being <em>1&le; I &le; r</em> and <em>n </em>equal to the number of classes in  which the image will be segmented. 
The average of the pixels of class <em>j </em>in feature <em>i</em> are calculated by the following equation (3):</font></p>     <p align="center"><img src="/img/revistas/rcci/v12n4/fo0602418.jpg" alt="fo02" width="321" height="73"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Where the variable, <em>&chi;j (x, y)</em> is the so-called  characteristic function of the <em>j-th</em> class, which contains the information of the values that take the different &phi;  functions in each pixel for that particular class. All weights <em>&lambda;i</em> are going to be considered as 1,  since, it is not possible to assign different weights to any texture feature  because we work with images without previous information.</font></p>     ]]></body>
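<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">To make equation (3) concrete, the following sketch (a hypothetical helper for illustration, not the authors' code) computes the class averages for the case of m = 2 level functions, i.e. n = 4 classes: each class j is defined by the sign pattern of the two level functions, its characteristic function &chi;<sub>j</sub> selects the corresponding pixels, and the average of every texture feature i is taken over those pixels.</font></p>     <pre>
# Sketch of the class-average computation in equation (3), assuming m = 2
# level functions, hence n = 2^m = 4 classes (illustrative helper only).
import numpy as np

def class_averages(features, phi1, phi2):
    """features: (H, W, r) texture-feature image; phi1, phi2: (H, W) level functions.
    Returns c with shape (r, 4), where c[i, j] is the mean of feature i over class j."""
    r = features.shape[2]
    # Class index j in {0, 1, 2, 3} from the signs of the two level functions.
    labels = 2 * (phi1 >= 0).astype(int) + (phi2 >= 0).astype(int)
    c = np.zeros((r, 4))
    for j in range(4):
        chi_j = labels == j                    # characteristic function of class j
        if chi_j.any():
            c[:, j] = features[chi_j].mean(axis=0)
    return c
</pre>     ]]></body>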
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><strong><em>Connected component (CC) filtering: </em></strong>It is possible that, because of segmentation, noises belonging to the  one vegetal cover are within other covers. In order to solve this problem in  each iteration of the algorithm, the existing CC are calculated, taking in one  CC the values &ge;0 and in the other, the values &lt;0. Once the CCs are  calculated, it is determined which of them should be considered as noise to  eliminate them. It is considered noise, when thet CC whose size in pixels is  smaller than the parameter &quot;area&quot; of the CC filter &ldquo;ccFilterSize&rdquo; is  always integer and positive. The pixels of the CC being removed are added to  the CC containing it. </font></p>     <p>&nbsp;</p>     <p><font face="Verdana, Arial, Helvetica, sans-serif"><strong><font size="3">EXPERIMENTAL RESULTAS Y DISCUSSION </font></strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">In <a href="/img/revistas/rcci/v12n4/f0302418.jpg" target="_blank">figure 3</a>  we present the experimental design based on the flow diagram proposed in <a href="/img/revistas/rcci/v12n4/f0102418.jpg" target="_blank">figure 1</a>. The objective of the experiments is to verify the validity of the proposed  method.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">1)  Pre-processing of the video sequence: In this step, the video frames  corresponding to the presence in the scene of the studied crops are manually  extracted. As a result, the images that will be processed in the subsequent  steps are obtained. </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">2) Manual  segmentation of the regions with the study culture: By using tools developed  for this purpose, the regions containing the studied crops will be marked (by  the members of the research team) in each one of the images resulting from step  1, as a result for each image we will obtain an ideal segmentation, &quot;ground  truth&quot;. </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">3)  Application of the flow of the segmentation process using texture models. This  step corresponds to the proposed methodological scheme.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">4) The evaluation of the proposed method and its comparison with state of the art  methods will be developed by measuring its accuracy using the  &ldquo;<em>Coincidence Measure</em>&rdquo; (CM) proposed  in Vega,  2008. This measure takes into account the information that brings the ideal segmentation or &ldquo;ground truth&rdquo;. Then <em>0 &le; CM &le; 1</em>, if <em>CM=1</em> there is a perfect coincidence.    <br>   For the comparison of the performance of our proposal with state of the art  methods, we have selected two methods that are used in images taken in the  visible spectrum for the texture segmentation. (Haindl, 2005 y CALDERO,  2010) </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">For the  experiments, 21 video sequences were taken at different times and days at 100  feet (30.5 m) and 150 feet (45.7 m) on crop fields of the &ldquo;Artemisa&rdquo; province  located about 40km west from Havana, Cuba, where sugar cane, fruits, banana,  tomato, and potato crops are grown. Videos were taken by the UAV Phantom 1  quadcopter (<a href="#f02">figure 2A</a>) equipped PHANTOM. 
with RGB Sensor: Sony Exmor 1 / 2.3 &quot;, effective  pixels: 12.4 m (total pixels: 12.76 m), frames size: 1920 x1080. <a href="#f02">figure 2</a> shows  some examples of frames taken for the experiment.</font></p>     ]]></body>
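<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The connected component filtering step described in the method above can be sketched as follows (a minimal illustration under assumed values, not the authors' implementation): connected components are extracted separately from the regions where the level function is &ge;0 and &lt;0, and every component whose area in pixels is below the integer, positive &ldquo;ccFilterSize&rdquo; parameter is merged into the region that surrounds it.</font></p>     <pre>
# Sketch of connected component (CC) filtering: small CCs on either side of
# the contour are treated as noise and merged into the other region.
# The default area threshold (200 pixels) is an assumed value.
import numpy as np
from scipy import ndimage

def cc_filter(phi, cc_filter_size=200):
    """phi: level function over the image; returns the cleaned binary partition (phi >= 0)."""
    mask = phi >= 0
    for value in (True, False):                       # filter both regions in turn
        region = mask == value
        labels, n = ndimage.label(region)
        sizes = ndimage.sum(region, labels, index=np.arange(1, n + 1))
        for lab in np.flatnonzero(cc_filter_size > sizes) + 1:
            mask[labels == lab] = not value           # merge the noisy CC into the other region
    return mask
</pre>     ]]></body>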
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><a href="/img/revistas/rcci/v12n4/t0102418.jpg" target="_blank">Table 1</a> shows the results  obtained in the evaluation of segmentation accuracy by CM. The average CM for  all processed images reached 0.9. This result is higher than what was obtained by the  methods with which it was compared. From the results, it is possible to observe that  when there are more than 2 crops in the scene the quality index of the  segmentation begins to decrease, however this one stays above 0.85 as it is the  case of the scenes that present 3 or 4 different crops (<a href="/img/revistas/rcci/v12n4/f0402418.jpg" target="_blank">figure 4</a>). These results show that the  proposed method can be used in applications where the automatic and  unsupervised detection of agricultural crop types is required. </font></p>     <p>&nbsp;</p>     <p><font face="Verdana, Arial, Helvetica, sans-serif" size="3"><B>CONCLUSIONS</B></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">In this paper, we proposed a  method for automatic and unsupervised segmentation of agricultural crops in UAV RGB color images. We analyzed and demonstrated the pertinence of  combining two models of texture descriptors: The Statistical Model of First Order, which  guarantees the detection and differentiation of the simple textures of the  types of vegetation and the Spectral Model, which guarantees the detection of  the spatially oriented and complex textures characterizing the agricultural  crops. The proposed combination under a segmentation framework, based on the  ACWE model with level set representation and a connected component filtering  strategy shows that the proposed  approach is promising. The robustness of the approach for images taken with UAVs of low performance  as the Phantom 1 in terms of flight height and standard RGB sensors was  demonstrated which makes its application cheaper for farmers and producers of  low economic possibilities. The  future works will be oriented to determine other combinations of texture models  as well as its application in algorithms of supervised image classification for  the agricultural crop classification.</font></p>     <p>&nbsp;</p>     <p align="left"><font face="Verdana, Arial, Helvetica, sans-serif" size="3"><B>REFERENCIAS    BIBLIOGR&Aacute;FICAS</B></font>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Atzberger, C. Advances in remote sensing of  agriculture: Context description, existing operational monitoring systems and  major information needs. Remote Sensing, 5, 949-981. (2013) </font><!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Calderero, F. y F. Marques: Region  merging parameter dependency as information diversity to create sparse  hierarchies of partitions. Proceedings of 2010-IEEE 17th International  Conference on Image Processing, September 26-29, Hong Kong. (2010) </font><!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Chan, T.F. L.A. Vese: Active Contours Without  Edges. IEEE Transaction on Image Processing. 10(2):266-277, (2001).     </font></p>     ]]></body>
<body><![CDATA[<!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Ferdeghini, E., B. Pinamonti, E. Picano, F. Lattanzi, R. Bussani. Quantitative texture analysis in echocardiography; application to the diagnosis of myocarditis. Journal of Clinical Ultrasound, 19(5): 263-270. (1991).    </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Haindl, M.; S. Mike&scaron;: Colour texture segmentation using modelling approach. Lecture Notes in Computer Science, (3687):484&ndash;491, (2005)</font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Jabal TI, Nugroho TW, Wahono W, Sitti RM. Survey of Irrigation Area Using Micro Unmanned Aerial Vehicle (Micro-UAV) in Gumbasa Irrigation Area. Agricultural Socio-Economics Journal. AGRISE, Vol 17, No 1, (2017).    </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Lu R., Dennison E., Denison H., Cooper C., Taylor M. &amp; Bottema M.J. Texture analysis based on Gabor filters improves the estimate of bone fracture risk from DXA images. Computer Methods in Biomechanics and Biomedical Engineering: Imaging &amp; Visualization. Published online, http://dx.doi.org/10.1080/21681163.2016.1271726, (2017). </font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">PHANTOM. Quick Start Manual V1.7, 2013.09.25. Revision 2012-2013 DJI Innovations</font><!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Theodoridis, S. and Koutroumbas, K. Pattern Recognition. Academic Press. Fourth Edition. Book. ISBN: 978-1-59749-272-0. Printed in the United States of America. (2009).    </font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Savelonas M.A., D.K. Iakovidis, D.E. Maroulis, &quot;An LBP-Based Active Contour Algorithm for Unsupervised Texture Segmentation,&quot; pp. 279-282, 18th International Conference on Pattern Recognition (ICPR'06) Volume 2, (2006).     </font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Unser, M. Local linear transforms for texture measurements. Signal Processing, vol. 2, pp. 61-79. (1986).    </font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Vega, S.; Gil J.L. and Vera O.L. Active contour algorithm for texture segmentation using a texture feature set. ICPR2008. IEEE Computer Society. TuBCT8.32, (1-4), (2008): IEEE.     </font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Yinjiang J. Zhongbin S., Weizheng S., Jianqing Y., Zhenan X. UAV Technology and Its Application in Agriculture. Advanced Science and Technology Letters Vol.137 (SUComS 2016), pp.107-111. (2016).    </font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Zhang, C., Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: a review. Precision Agriculture, 13, 693-712. (2012).    </font></p>     ]]></body>
<body><![CDATA[<p name="_ENREF_1">&nbsp;</p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Recibido: 27/10/2017    <br> Aceptado: 21/09/2018</font></p>      ]]></body><back>
<ref-list>
<ref id="B1">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Atzberger]]></surname>
<given-names><![CDATA[C]]></given-names>
</name>
</person-group>
<source><![CDATA[Advances in remote sensing of agriculture: Context description, existing operational monitoring systems and major information needs]]></source>
<year>2013</year>
<volume>5</volume>
<page-range>949-981</page-range></nlm-citation>
</ref>
<ref id="B2">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Calderero]]></surname>
<given-names><![CDATA[F]]></given-names>
</name>
<name>
<surname><![CDATA[Marques]]></surname>
<given-names><![CDATA[F]]></given-names>
</name>
</person-group>
<source><![CDATA[Region merging parameter dependency as information diversity to create sparse hierarchies of partitions]]></source>
<year>2010</year>
<publisher-loc><![CDATA[Hong Kong]]></publisher-loc>
</nlm-citation>
</ref>
<ref id="B3">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Chan]]></surname>
<given-names><![CDATA[T.F]]></given-names>
</name>
<name>
<surname><![CDATA[Vese]]></surname>
<given-names><![CDATA[L.A]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Active Contours Without Edges]]></article-title>
<source><![CDATA[]]></source>
<year>2001</year>
<volume>10</volume>
<numero>2</numero>
<issue>2</issue>
<page-range>266-277</page-range></nlm-citation>
</ref>
<ref id="B4">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Ferdeghini]]></surname>
<given-names><![CDATA[E]]></given-names>
</name>
<name>
<surname><![CDATA[Pinamonti]]></surname>
<given-names><![CDATA[B]]></given-names>
</name>
<name>
<surname><![CDATA[Picano]]></surname>
<given-names><![CDATA[E]]></given-names>
</name>
<name>
<surname><![CDATA[Lattanzi]]></surname>
<given-names><![CDATA[F]]></given-names>
</name>
<name>
<surname><![CDATA[Bussani]]></surname>
<given-names><![CDATA[R]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Quantitative texture analysis in echocardiography]]></article-title>
<source><![CDATA[]]></source>
<year>1991</year>
<volume>19</volume>
<numero>5</numero>
<issue>5</issue>
<page-range>263-270</page-range><publisher-name><![CDATA[Journal of Clinical Ultrasound]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B5">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Haindl]]></surname>
<given-names><![CDATA[M]]></given-names>
</name>
<name>
<surname><![CDATA[Mikeš]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
</person-group>
<source><![CDATA[Colour texture segmentation using modelling approach]]></source>
<year>2005</year>
<volume>3687</volume>
<page-range>484-491</page-range><publisher-name><![CDATA[Lecture Notes in Computer Science]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B6">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Jabal]]></surname>
<given-names><![CDATA[TI]]></given-names>
</name>
<name>
<surname><![CDATA[Nugroho]]></surname>
<given-names><![CDATA[TW]]></given-names>
</name>
<name>
<surname><![CDATA[Wahono]]></surname>
<given-names><![CDATA[W]]></given-names>
</name>
<name>
<surname><![CDATA[Sitti]]></surname>
<given-names><![CDATA[RM]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Survey of Irrigation Area Using Micro Unmannes Aerial Vehicle (Micro- UAV) in Gumbasa irrigation Area]]></article-title>
<source><![CDATA[]]></source>
<year>2017</year>
<volume>17</volume>
<numero>1</numero>
<issue>1</issue>
</nlm-citation>
</ref>
<ref id="B7">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Lu]]></surname>
<given-names><![CDATA[R]]></given-names>
</name>
<name>
<surname><![CDATA[Dennison]]></surname>
<given-names><![CDATA[E]]></given-names>
</name>
<name>
<surname><![CDATA[Denison]]></surname>
<given-names><![CDATA[H]]></given-names>
</name>
<name>
<surname><![CDATA[Cooper]]></surname>
<given-names><![CDATA[C]]></given-names>
</name>
<name>
<surname><![CDATA[Taylor]]></surname>
<given-names><![CDATA[M]]></given-names>
</name>
<name>
<surname><![CDATA[Bottema]]></surname>
<given-names><![CDATA[M.J]]></given-names>
</name>
</person-group>
<source><![CDATA[Texture analysis based on Gabor filters improves the estimate of bone fracture risk from DXA images.]]></source>
<year>2017</year>
</nlm-citation>
</ref>
<ref id="B8">
<nlm-citation citation-type="">
<collab>PHANTOM</collab>
<source><![CDATA[Quick Start Manual V1.7]]></source>
<year>2013</year>
</nlm-citation>
</ref>
<ref id="B9">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Theodoridis]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
<name>
<surname><![CDATA[Koutroumbas]]></surname>
<given-names><![CDATA[K]]></given-names>
</name>
</person-group>
<source><![CDATA[Pattern Recognition]]></source>
<year>2009</year>
<edition>Fourth</edition>
<publisher-name><![CDATA[Academic Press]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B10">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Savelonas]]></surname>
<given-names><![CDATA[M.A]]></given-names>
</name>
<name>
<surname><![CDATA[Iakovidis]]></surname>
<given-names><![CDATA[D.K]]></given-names>
</name>
<name>
<surname><![CDATA[Maroulis]]></surname>
<given-names><![CDATA[D.E]]></given-names>
</name>
</person-group>
<source><![CDATA[An LBP-Based Active Contour Algorithm for Unsupervised Texture Segmentation]]></source>
<year>2006</year>
<volume>2</volume>
<page-range>279-282</page-range></nlm-citation>
</ref>
<ref id="B11">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Unser]]></surname>
<given-names><![CDATA[M]]></given-names>
</name>
</person-group>
<source><![CDATA[Local linear transforms for texture measurements.]]></source>
<year>1986</year>
<volume>2</volume>
<page-range>61-79</page-range><publisher-name><![CDATA[Signal Processing]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B12">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Vega]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
<name>
<surname><![CDATA[Gil]]></surname>
<given-names><![CDATA[J.L]]></given-names>
</name>
<name>
<surname><![CDATA[Vera]]></surname>
<given-names><![CDATA[O.L]]></given-names>
</name>
</person-group>
<source><![CDATA[Active contour algorithm for texture segmentation using a texture feature set.]]></source>
<year>2008</year>
<volume>32</volume>
<page-range>1-4</page-range><publisher-name><![CDATA[IEEE Computer Society]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B13">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Yinjiang]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
<name>
<surname><![CDATA[Zhongbin]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
<name>
<surname><![CDATA[Weizheng]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
<name>
<surname><![CDATA[Jianqing]]></surname>
<given-names><![CDATA[Y]]></given-names>
</name>
<name>
<surname><![CDATA[Zhenan]]></surname>
<given-names><![CDATA[X]]></given-names>
</name>
</person-group>
<source><![CDATA[UAV Technology and Its Application in Agriculture.]]></source>
<year>2016</year>
<volume>137</volume>
<page-range>107-111</page-range></nlm-citation>
</ref>
<ref id="B14">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Zhang]]></surname>
<given-names><![CDATA[C]]></given-names>
</name>
<name>
<surname><![CDATA[Kovacs]]></surname>
<given-names><![CDATA[J.M]]></given-names>
</name>
</person-group>
<source><![CDATA[The application of small unmanned aerial systems for precision agriculture: a review]]></source>
<year>2012</year>
<volume>13</volume>
<page-range>693-712</page-range></nlm-citation>
</ref>
</ref-list>
</back>
</article>
