<?xml version="1.0" encoding="ISO-8859-1"?><article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<front>
<journal-meta>
<journal-id>2227-1899</journal-id>
<journal-title><![CDATA[Revista Cubana de Ciencias Informáticas]]></journal-title>
<abbrev-journal-title><![CDATA[Rev cuba cienc informat]]></abbrev-journal-title>
<issn>2227-1899</issn>
<publisher>
<publisher-name><![CDATA[Editorial Ediciones Futuro]]></publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id>S2227-18992016000500019</article-id>
<title-group>
<article-title xml:lang="en"><![CDATA[Information analysis of pattern and randomness]]></article-title>
<article-title xml:lang="es"><![CDATA[Análisis de información en patrones y aleatoriedad]]></article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Estévez-Rams]]></surname>
<given-names><![CDATA[Ernesto]]></given-names>
</name>
<xref ref-type="aff" rid="A01"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Rodríguez Horta]]></surname>
<given-names><![CDATA[Edwin]]></given-names>
</name>
<xref ref-type="aff" rid="A01"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Aragón Fernández]]></surname>
<given-names><![CDATA[Beatriz]]></given-names>
</name>
<xref ref-type="aff" rid="A02"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Serrano Alfaro]]></surname>
<given-names><![CDATA[Pablo]]></given-names>
</name>
<xref ref-type="aff" rid="A01"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Lora Serrano]]></surname>
<given-names><![CDATA[Raimundo]]></given-names>
</name>
<xref ref-type="aff" rid="A03"/>
</contrib>
</contrib-group>
<aff id="A01">
<institution><![CDATA[Universidad de la Habana, Facultad de Física-Instituto de Ciencias y Tecnología de Materiales (IMRE)]]></institution>
<addr-line><![CDATA[ La Habana]]></addr-line>
<country>Cuba</country>
</aff>
<aff id="A02">
<institution><![CDATA[Universidad de las Ciencias Informáticas (UCI)]]></institution>
<addr-line><![CDATA[ La Habana]]></addr-line>
<country>Cuba</country>
</aff>
<aff id="A03">
<institution><![CDATA[Universidade Federal de Uberlandia]]></institution>
<addr-line><![CDATA[ Minas Gerais]]></addr-line>
<country>Brasil</country>
</aff>
<pub-date pub-type="pub">
<day>00</day>
<month>00</month>
<year>2016</year>
</pub-date>
<pub-date pub-type="epub">
<day>00</day>
<month>00</month>
<year>2016</year>
</pub-date>
<volume>10</volume>
<fpage>252</fpage>
<lpage>268</lpage>
<copyright-statement/>
<copyright-year/>
<self-uri xlink:href="http://scielo.sld.cu/scielo.php?script=sci_arttext&amp;pid=S2227-18992016000500019&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://scielo.sld.cu/scielo.php?script=sci_abstract&amp;pid=S2227-18992016000500019&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://scielo.sld.cu/scielo.php?script=sci_pdf&amp;pid=S2227-18992016000500019&amp;lng=en&amp;nrm=iso"></self-uri><abstract abstract-type="short" xml:lang="en"><p><![CDATA[ABSTRACT Complex patterns are ubiquitous in nature and their emergence is the subject of much research using a wide range of mathematical tools. On one side of complexity lie completely periodic systems, and on the other side random behavior, both trivially simple from a statistical point of view. A fingerprint of complexity is the existence of large spatio-temporal correlations in the system dynamics. In this contribution, we will review two threads in complexity analysis, both stemming from information theory: Lempel-Ziv analysis of complexity, and computational mechanics. We discuss the usefulness of both approaches through the analysis of several examples. A first system will be the spatio-temporal evolution of cellular automata, where the transfer of information can be quantified by Lempel-Ziv measures. A second example will be the random walk with bias and persistence; computational mechanics will prove adequate for assessing the amount of wandering versus the patterned movement of the walker. Finally, disorder and pattern forming in layered crystal structures will be analyzed. Wrapping up, some discussion on the general nature of the examples will be carried out, pointing to the appropriateness of the developed tools for studying the computational processing capabilities of complex systems.]]></p></abstract>
<abstract abstract-type="short" xml:lang="es"><p><![CDATA[RESUMEN Los patrones complejos son comunes en la naturaleza y su surgimiento es objeto de mucha investigación utilizando una amplia gama de herramientas matemáticas. A un lado de la complejidad se encuentra la repetición completamente periódica, y en el otro, lo totalmente aleatorio, ambos trivialmente simples desde el punto de vista estadístico. Una huella dactilar de la complejidad es la existencia de correlaciones temporales o espaciales de largo alcance en la dinámica de los sistemas. En esta contribución, revisaremos brevemente dos métodos de realizar el análisis de la complejidad, ambos derivados de la teoría de la información: a través de la aleatoriedad de Lempel-Ziv y utilizando la mecánica computacional. La utilidad de ambas aproximaciones será discutida a través del análisis de varios ejemplos. Un primer sistema será la evolución espacio-temporal de autómatas celulares, donde la transferencia de información será cuantificada utilizando Lempel-Ziv. Un segundo ejemplo será el del caminante aleatorio con sesgo y persistencia; la mecánica computacional demostrará ser apropiada para determinar la cantidad de deambular versus el movimiento predecible del caminante. Finalmente, el desorden y la formación de patrones en estructuras de capas serán analizados. Para terminar, se discute la naturaleza general del análisis, insistiendo en la utilidad de las herramientas presentadas para el estudio de las capacidades de procesamiento computacional de los sistemas complejos.]]></p></abstract>
<kwd-group>
<kwd lng="en"><![CDATA[complexity]]></kwd>
<kwd lng="en"><![CDATA[information]]></kwd>
<kwd lng="en"><![CDATA[Lempel-Ziv]]></kwd>
<kwd lng="en"><![CDATA[computational mechanics]]></kwd>
<kwd lng="es"><![CDATA[complejidad]]></kwd>
<kwd lng="es"><![CDATA[información]]></kwd>
<kwd lng="es"><![CDATA[Lempel-Ziv]]></kwd>
<kwd lng="es"><![CDATA[mecánica computacional]]></kwd>
</kwd-group>
</article-meta>
</front><body><![CDATA[ <p align="right"><font face="Verdana, Arial, Helvetica, sans-serif" size="2"><B>ART&Iacute;CULO  ORIGINAL</B></font></p>     <p>&nbsp;</p>     <p><font size="4"><strong><em><font face="Verdana, Arial, Helvetica, sans-serif">Information analysis of  pattern and randomness</font></em></strong></font></p>     <p>&nbsp;</p>     <p><font size="3"><strong><font face="Verdana, Arial, Helvetica, sans-serif">An&aacute;lisis de informaci&oacute;n  en patrones y aleatoriedad</font></strong></font></p>     <p>&nbsp;</p>     <p>&nbsp;</p>     <P><font size="2"><strong><font face="Verdana, Arial, Helvetica, sans-serif">Ernesto  Est&eacute;vez-Rams<strong><sup>1*</sup></strong>, Edwin  Rodr&iacute;guez Horta<strong><sup>1</sup></strong>, Beatriz  Arag&oacute;n Fern&aacute;ndez</font></strong><font face="Verdana, Arial, Helvetica, sans-serif"><strong><sup>2</sup>, Pablo  Serrano Alfaro<sup>1</sup>, Raimundo  Lora Serrano<strong><sup>3</sup></strong></strong></font></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><sup>1</sup>Facultad de F&iacute;sica-Instituto de Ciencias y  Tecnolog&iacute;a de Materiales(IMRE), Universidad de la Habana, San Lazaro y L. CP  10400. La Habana. Cuba.</font>    <br>   <font size="2" face="Verdana, Arial, Helvetica, sans-serif"><sup>2</sup>Universidad de las Ciencias Inform&aacute;ticas (UCI),  Carretera a San Antonio, Boyeros. La Habana. Cuba.</font>    ]]></body>
<body><![CDATA[<br> <font size="2" face="Verdana, Arial, Helvetica, sans-serif"><sup>3</sup>Universidade Federal de Uberlandia, AV. Joao Naves de Avila, 2121- Campus Santa Monica, CEP 38408-144, Minas Gerais, Brasil. <br> </font></p>     <P><font face="Verdana, Arial, Helvetica, sans-serif"><span class="class"><font size="2">*Corresponding author: </font></span></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif"> <a href="mailto:estevez@imre.oc.uh.cu">estevez@imre.oc.uh.cu</a></font>     <p>&nbsp;</p>     <p>&nbsp;</p> <hr>     <P><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>ABSTRACT</b> </font>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Complex patterns are ubiquitous in nature and their emergence is the subject of much research using a wide range of mathematical tools. On one side of complexity lie completely periodic systems, and on the other side random behavior, both trivially simple from a statistical point of view. A fingerprint of complexity is the existence of large spatio-temporal correlations in the system dynamics. In this contribution, we will review two threads in complexity analysis, both stemming from information theory: Lempel-Ziv analysis of complexity, and computational mechanics. We discuss the usefulness of both approaches through the analysis of several examples. A first system will be the spatio-temporal evolution of cellular automata, where the transfer of information can be quantified by Lempel-Ziv measures. A second example will be the random walk with bias and persistence; computational mechanics will prove adequate for assessing the amount of wandering versus the patterned movement of the walker. Finally, disorder and pattern forming in layered crystal structures will be analyzed. 
Wrapping up, some discussion on the general nature of the examples will be carried out, pointing to the appropriateness of the developed tools for studying the computational processing capabilities of complex systems.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>Key words<span lang=EN-GB>:</span></b> complexity, information, Lempel-Ziv, computational mechanics.</font></p> <hr>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>RESUMEN</b></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Los patrones complejos son comunes en la naturaleza y su surgimiento es objeto de mucha investigaci&oacute;n utilizando una amplia gama de herramientas matem&aacute;ticas. A un lado de la complejidad se encuentra la repetici&oacute;n completamente peri&oacute;dica, y en el otro, lo totalmente aleatorio, ambos trivialmente simples desde el punto de vista estad&iacute;stico. Una huella dactilar de la complejidad es la existencia de correlaciones temporales o espaciales de largo alcance en la din&aacute;mica de los sistemas. En esta contribuci&oacute;n, revisaremos brevemente dos m&eacute;todos de realizar el an&aacute;lisis de la complejidad, ambos derivados de la teor&iacute;a de la informaci&oacute;n: a trav&eacute;s de la aleatoriedad de Lempel-Ziv y utilizando la mec&aacute;nica computacional. La utilidad de ambas aproximaciones ser&aacute; discutida a trav&eacute;s del an&aacute;lisis de varios ejemplos. Un primer sistema ser&aacute; la evoluci&oacute;n espacio-temporal de aut&oacute;matas celulares, donde la transferencia de informaci&oacute;n ser&aacute; cuantificada utilizando Lempel-Ziv. Un segundo ejemplo ser&aacute; el del caminante aleatorio con sesgo y persistencia; la mec&aacute;nica computacional demostrar&aacute; ser apropiada para determinar la cantidad de deambular versus el movimiento predecible del caminante. 
Finalmente, el desorden y la formaci&oacute;n de patrones en estructuras de capas ser&aacute;n analizados. Para terminar, se discute la naturaleza general del an&aacute;lisis, insistiendo en la utilidad de las herramientas presentadas para el estudio de las capacidades de procesamiento computacional de los sistemas complejos.</font></p>     ]]></body>
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><b>Palabras clave<span lang=EN-GB>: </span></b></font><font size="2" face="Verdana, Arial, Helvetica, sans-serif">complejidad, informaci&oacute;n, Lempel-Ziv, mec&aacute;nica computacional.</font></p> <hr>     <p>&nbsp;</p>     <p>&nbsp;</p>     <p><font size="3" face="Verdana, Arial, Helvetica, sans-serif"><b>INTRODUCTION</b></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Trivial initial conditions can give rise to complex patterns, and this has been the subject of intensive study (see (Crutchfield, 2012) and references therein). Complexity arises in the modeling of large systems in broad areas of science such as physics, chemistry or biology. It is clear that periodic behavior is far from complex, as it can be modeled with few variables and its information is extremely redundant. However, it is also agreed that completely random processes are not complex either. In spite of its high information content, randomness is easily modeled, as a simple coin-toss experiment shows. Complexity lies between these two extremes, and a fingerprint of its occurrence is the presence of large spatio-temporal correlations (Wolfram, 1986). </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">A way of looking into complexity is to ask about the ability of a dynamical system to generate and store information. Viewed from this perspective, such systems can be seen as computational machines that generate symbols. 
The study of the system is then reduced to quantifying its computing capacity, which, in turn, can be relevant if one intends to tune the control parameters of the system to take advantage of its computing ability (Crutchfield, 2012).</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Kolmogorov, or algorithmic, complexity has been at the root of complexity analysis. Kolmogorov complexity characterizes a system by the length of the shortest algorithm, running on a Universal Turing Machine (UTM), capable of reproducing the system (Kolmogorov, 1965; Li, 1993). A periodic system will need a very short algorithm to be reproduced, while a completely random system can only be replicated by describing it to the smallest detail. Kolmogorov complexity is then not a true measure of complexity but of randomness; its absolute nature, up to a constant value, exhibits useful properties (in what follows we will use the more appropriate term Kolmogorov randomness). The main drawback of Kolmogorov randomness as a practical tool is its non-computability, because of the halting problem (Li, 1993). This limitation has driven researchers to define practical alternatives based on the compressibility of the mathematical description of the system. All these alternatives are merely upper bounds to the true Kolmogorov randomness of the system.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The analysis usually involves characterizing the system configurations by their compressibility, where some compression software such as gzip or bzip2 is used (Dubaq, 2001). The use of compression software to estimate Kolmogorov randomness has a number of issues, one being the necessarily finite size of the word dictionary (Weinberger, 1992). 
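As a rough illustration of this compression-based approach (a sketch only; zlib is used here as a stand-in for the gzip/bzip2 compressors cited above, and the compressed size is merely an upper-bound proxy for Kolmogorov randomness):

```python
import random
import zlib

def compressed_size(s: str) -> int:
    """Size in bytes of the zlib-compressed string, an upper bound
    (up to format overhead) on the string's Kolmogorov randomness."""
    return len(zlib.compress(s.encode(), 9))

random.seed(0)
periodic = "10" * 5000                                      # trivially redundant
noisy = "".join(random.choice("01") for _ in range(10000))  # coin-toss string

# The periodic string compresses to a few tens of bytes, while the random
# string cannot be squeezed much below one bit per symbol.
sizes = compressed_size(periodic), compressed_size(noisy)
```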
This leads to limitations in the size of the systems analyzed, depending precisely on the Kolmogorov randomness of the sequence, the very quantity to be estimated. </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Lempel-Ziv complexity (Lempel, 1976) (from now on, LZ76 complexity), closely related to Kolmogorov randomness, is a measure defined over a factorization of a character sequence. Data sequences from different sources have been analyzed by LZ76 complexity (Aboy, 2006; Chelani, 2011; Contantinescu, 2006; Liu, 2012; Rajkovic, 2003; Szczepanski, 2004; Talebinejad, 2011; Zhang, 2009). All analyses using LZ76 complexity are based on a theorem of Ziv (Ziv, 1978), which showed that the asymptotic value of the LZ76 complexity growth rate (LZ76 complexity normalized by n/log n, where n is the length of the sequence) is related to the entropy rate h (as defined by Shannon information theory) for an ergodic source. The entropy rate has a close relationship with Kolmogorov randomness (Calude, 2002), and measures the irreducible randomness of a system (Feldman, 2008).</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Consider an optimal computational machine capable of statistically reproducing the system dynamics. Optimality is understood as the simplest machine with the best predictive power. Such a machine is called an &#603;-machine, and its design, or its reconstruction from the available data, is the goal of computational mechanics (Crutchfield, 1992; Crutchfield, 2012).&nbsp; Complexity analysis then uses the &#603;-machine to discover the nature of patterns and to quantify them. It is rooted in information theory concepts, and has found applications in several areas (Varn, 2013; Ryabov, 2011; Haslinger, 2010).&nbsp; Its use in statistical mechanics has allowed defining and calculating magnitudes that complement thermodynamic quantities. 
One such magnitude is the statistical complexity C<sub>&mu;</sub>, defined as the Shannon entropy over the probability of the causal states. A causal state is a set of pasts that determine probabilistically equivalent futures (Shalizi, 2001).&nbsp; The entropy rate h<sub>&mu;</sub>, already mentioned when describing the LZ76 complexity, can also be calculated from the &#603;-machine description of the system (Crutchfield, 2003). Finally, the excess entropy E is defined as the mutual information between past and future, and can be interpreted as the amount of memory needed to make optimal predictions, without taking into account the irreducible randomness (Feldman, 1998). </font></p>     ]]></body>
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">In this contribution, we will be reviewing the use of LZ76 and computational mechanics in the study of complexity. We will do so by delving into various examples: studying the spatio-temporal evolution of cellular automata (CA), quantifying the amount of wandering and purposeful movement in a random walk with bias and persistence and, finally, the emergence of disorder and pattern in layer-structured crystals.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The paper is organized as follows. It first begins by mathematically introducing the various concepts already mentioned, which will also allow us to fix notation. This will be followed by the discussion of the cellular automata dynamics. The next section will deal with the biased persistent random walk model, and then the results for layer crystals will be presented. A discussion of the usefulness of the developed tools for studying the computational processing capabilities of complex systems will be made. Conclusions then follow. </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The results presented in this paper have been partially published separately by the authors in (Estevez, 2015; Rodriguez, 2016a; Rodriguez, 2016b). </font></p>     <p>&nbsp;</p>     <p><font face="Verdana, Arial, Helvetica, sans-serif"><strong><font size="3">MATHEMATICAL BACKGROUND </font></strong></font></p>     <p> <font size="2"><strong><font face="Verdana, Arial, Helvetica, sans-serif">A. 
Kolmogorov-based normalized information distance</font></strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The Kolmogorov randomness <strong>K(s)</strong> of a string <strong>s</strong> is the length of the shortest program <strong>s*</strong> that, when run on a Universal Turing Machine (UTM), gives the string as output:    <br> &nbsp;K(s) = |s*|.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Using a UTM makes the Kolmogorov randomness an absolute measure, up to a constant factor. It is clear that a constant string can be described by a very short program, while a random string, say out of a coin-toss experiment, will have no algorithmic way of being exactly predicted except by reproducing the string itself. The conditional Kolmogorov randomness <strong>K(s|p)</strong> can be introduced as the length of the shortest program that, knowing <strong>p</strong>, computes <strong>s</strong>. Also, the joint Kolmogorov randomness <strong>K(s,p)</strong> is the size of the smallest program that computes both strings <strong>s</strong> and <strong>p</strong>. Without going into details, in what follows the allowed programs will be prefix-free, where no program is a proper prefix of another program (Li, 1993).&nbsp; The halting problem makes Kolmogorov randomness non-computable, which turns out to be a huge limitation for its practical use.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">It can be shown that the following relation holds </font></p>     ]]></body>
<body><![CDATA[<p align="center"><font size="2" face="Verdana, Arial, Helvetica, sans-serif">K(s,p) <img src="/img/revistas/rcci/v10s1/fo0119517.jpg" alt="fo01" width="13" height="11"> K(s)+K(p|s*)  = K(p)+K(s|p*) ,&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; (1)</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">where <img src="/img/revistas/rcci/v10s1/fo0119517.jpg" alt="fo01" width="13" height="11"> denotes that equality is valid up to a constant value independent  of <strong>p</strong> and <strong>s</strong>.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Entropy density can be  estimated from</font></p>     <p align="center"><img src="/img/revistas/rcci/v10s1/fo0219517.jpg" alt="fo02" width="138" height="48"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The  entropy rate is defined by</font></p>     <p align="center"><img src="/img/revistas/rcci/v10s1/fo0319517.jpg" alt="fo03" width="161" height="47"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">where <strong>H[s(1,N)]</strong> is the  Shannon block entropy (Cover, 2006) of the string <img src="/img/revistas/rcci/v10s1/fo0419517.jpg" alt="fo04" width="84" height="20"></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">In  spite of the non-computability of the Kolmogorov randomness, the entropy  density can be computed, as we do not need the actual K but only its scaling  behavior.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The information about <strong>s</strong> contained in <strong>p</strong> is defined by</font></p>     <p align="center">&nbsp;I(s:p)=K(s)-K(s|p*),&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; (4)</p>     ]]></body>
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">which implies that <strong>I(s:p)=I(p:s)</strong> up to a constant.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Li et al. (Li, 2004) defined the normalized information distance (NID) between two sequences <strong>s</strong> and <strong>p</strong> by the relation:</font></p>     <p align="center"><img src="/img/revistas/rcci/v10s1/fo0519517.jpg" alt="fo05" width="213" height="41"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">where, without loss of generality, it is assumed that <strong>K(p) &gt; K(s)</strong>.&nbsp; NID is an information-based distance that quantifies how correlated two sequences are from the algorithmic perspective.&nbsp; If two sequences can be, to a large extent, derived one from the other by a small-sized algorithm, then the corresponding NID is small.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The problem with the use of equation (5) is that Kolmogorov randomness is non-computable; the practical alternative is to estimate <strong>d<sub>NID</sub></strong> from</font></p>     <p align="center"><img src="/img/revistas/rcci/v10s1/fo0619517.jpg" alt="fo06" width="261" height="38"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">where <strong>C(x)</strong> is the compressed size of the string <strong>x</strong>. 
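A minimal sketch of this estimate (illustrative only; zlib stands in here for the gzip/bzip2 compressors used in the cited works, with C(x) taken as the compressed size in bytes):

```python
import random
import zlib

def C(x: str) -> int:
    """Compressed size in bytes, standing in for Kolmogorov randomness."""
    return len(zlib.compress(x.encode(), 9))

def ncd(s: str, p: str) -> float:
    """Normalized compression distance, the computable estimate of the NID:
    d(s, p) = (C(sp) - min(C(s), C(p))) / max(C(s), C(p))."""
    cs, cp = C(s), C(p)
    return (C(s + p) - min(cs, cp)) / max(cs, cp)

random.seed(1)
a = "".join(random.choice("01") for _ in range(10000))
random.seed(2)
b = "".join(random.choice("01") for _ in range(10000))

# A string is algorithmically close to itself, and far from an
# independently generated random string.
close, far = ncd(a, a), ncd(a, b)
```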
Compression has been performed using available software such as gzip or bzip2 (Li, 2004; Emmert, 2010), with no significant difference between compressors.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">If s and p have the same length then, instead of using a compression algorithm, we can rewrite equation (6) in terms of the entropy density </font></p>     <p align="center"><img src="/img/revistas/rcci/v10s1/fo0719517.jpg" alt="fo07" width="258" height="41"></p>     <p><font size="2"><strong><font face="Verdana, Arial, Helvetica, sans-serif">h(x)</font></strong><font face="Verdana, Arial, Helvetica, sans-serif"> is, contrary to <strong>K(x)</strong>, a computable magnitude, and this is the approach we will be using.     ]]></body>
<body><![CDATA[<br> </font></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif"><strong>B. Lempel-Ziv factorization and complexity</strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Let us call s(i,j) the substring of <strong>s</strong> starting at position <strong>i</strong> and having length <strong>j</strong>. Define the operator </font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">s(i,j)&pi; = s(i, j-1)    <br> &pi; acts as a drop operator; consequently,    <br> s(i,j)&pi;<sup>k</sup> = s(i, j-k).</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The Lempel-Ziv factorization <strong>F(s)</strong> of the string <strong>s</strong> of length <strong>N</strong> is given by    <br> F(s) = s(1, l<sub>1</sub>)s(l<sub>1</sub>+1, l<sub>2</sub>)... s(l<sub>m-1</sub>+1, N),</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">where there are <strong>m</strong> factors such that each factor s(l<sub>k-1</sub>+1, l<sub>k</sub>) complies with</font></p>     <p><img src="/img/revistas/rcci/v10s1/fo0819517.jpg" alt="fo08" width="512" height="56"></p>     ]]></body>
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The partition <strong>F(s)</strong> is unique for every string (Lempel, 1976).</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">For example, the exhaustive history of the sequence u = 111010100011 is F(u) = 1.110.10100.011,    <br>   where each factor is delimited by a dot.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The LZ76 randomness <strong>C<sub>LZ</sub>(s)</strong> of the string <strong>s</strong> is the number of factors in the Lempel-Ziv factorization. In the example above, <strong>C<sub>LZ</sub>(u) = 4</strong>.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">In the limit of very large string length, <strong>C<sub>LZ</sub>(s)</strong> is bounded by (Lempel, 1976)</font></p>     <p><img src="/img/revistas/rcci/v10s1/fo0919517.jpg" alt="fo09" width="104" height="39"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">which allows defining a normalized LZ76 randomness as</font></p>     <p><img src="/img/revistas/rcci/v10s1/fo1019517.jpg" alt="fo10" width="142" height="32"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Ziv (Ziv, 1978) proved that, if s is generated by an ergodic source, then <img src="/img/revistas/rcci/v10s1/fo1119517.jpg" alt="fo11" width="153" height="21"> where <strong>h(s)</strong> is the entropy rate defined above in equation (3). This allows using <strong>c<sub>LZ</sub>(s)</strong> as an estimate of <strong>h(s)</strong> for <strong>N&gt;&gt;1</strong>. <strong>d<sub>NCD</sub></strong> can then be computed using the estimates of the entropy rate given by equation (3); we will denote such a distance by <strong>d<sub>LZ</sub></strong>.</font></p>     <p><font size="2"><strong><font face="Verdana, Arial, Helvetica, sans-serif">C. 
</font></strong><font face="Verdana, Arial, Helvetica, sans-serif"><strong>Computational mechanics: Causal states, statistical complexity, entropy rate and excess entropy</strong></font></font></p>     ]]></body>
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">We will be mostly following (Shalizi, 2001). Consider a process that produces as output a (bi-)infinite sequence of characters, drawn from a given alphabet &Sigma;. Take a particular realization of the process with output string s, which will be partitioned in two: <img src="/img/revistas/rcci/v10s1/fo1219517.jpg" alt="fo12" width="155" height="21">, the past, and <img src="/img/revistas/rcci/v10s1/fo1319517.jpg" alt="fo13" width="92" height="20">, the future. Assuming that strings are drawn from a distribution, possibly unknown, then two pasts <img src="/img/revistas/rcci/v10s1/fo1419517.jpg" alt="fo14" width="65" height="21"> that condition the same probability <img src="/img/revistas/rcci/v10s1/fo1519517.jpg" alt="fo15" width="157" height="19"> for all futures <img src="/img/revistas/rcci/v10s1/fo1619517.jpg" alt="fo16" width="20" height="16"> are said to belong to the same causal state C<sub>p</sub>. By construction, the set of causal states (denoted by {C<sub>p</sub>}, of size |{C<sub>p</sub>}|) uniquely determines the future of a sequence, which allows defining a function &epsilon; over <img src="/img/revistas/rcci/v10s1/fo1719517.jpg" alt="fo17" width="19" height="14">, relating <img src="/img/revistas/rcci/v10s1/fo1719517.jpg" alt="fo17" width="19" height="14"> to its causal state C<sub>p</sub>.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The statistical complexity is defined as the Shannon entropy over the causal states </font></p>     <p align="center"><img src="/img/revistas/rcci/v10s1/fo1819517.jpg" alt="fo18" width="313" height="32"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The logarithm is usually taken in base two, and the units are then bits. The set of causal states is related to the optimal memory required for prediction; more memory resources will not improve the predictive power of the process. 
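As an illustration (a sketch; the golden-mean process used here is a standard two-state textbook example, not one of the systems analyzed in this paper), the statistical complexity is just the Shannon entropy of the stationary causal-state distribution:

```python
import math

def statistical_complexity(state_probs):
    """Shannon entropy (in bits) over the causal-state probabilities."""
    return -sum(p * math.log2(p) for p in state_probs if p > 0)

# Golden-mean process: two causal states with stationary probabilities
# 2/3 and 1/3, giving H(2/3, 1/3) = about 0.918 bits.
c_mu = statistical_complexity([2 / 3, 1 / 3])
```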
Statistical complexity, being the Shannon entropy over the causal states, is therefore a measure of how much memory the system needs to optimally predict the future.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The entropy rate can be calculated as </font></p>     <p align="center"><img src="/img/revistas/rcci/v10s1/fo1919517.jpg" alt="fo19" width="430" height="34"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">We will be considering first-order Markov processes, where the excess entropy is given by</font></p>     <p align="center"><img src="/img/revistas/rcci/v10s1/fo2019517.jpg" alt="fo20" width="123" height="23"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Excess entropy is a measure of the memory resources needed once the irreducible randomness has been subtracted (Feldman, 2008).</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">When the size of the alphabet &Sigma; is finite, the number of causal states for a first-order Markov process is also finite, and the dynamics of the system can be optimally described by a finite state machine (FSM), which in this case will represent the &epsilon;-machine.&nbsp; </font></p>     ]]></body>
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The &epsilon;-machine FSM can be described by a digraph, where each node corresponds to a causal state and the directed transitions between nodes are labeled as s<sub>k</sub>|P(C<sub>m</sub>|C<sub>p</sub>), where s<sub>k</sub> is the symbol emitted while making a transition from C<sub>p</sub> to C<sub>m</sub>. The arriving state is uniquely determined by the emitted symbol, a property called unifiliarity.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The reader can refer to (Shalizi, 2001; Feldman, 1998) for further discussion.</font></p>     <p><font size="2"><strong><font face="Verdana, Arial, Helvetica, sans-serif">Information transfer in the spatio-temporal evolution of cellular automata</font></strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Our first systems will be discrete time and space cellular automata (CA), which have been the subject of intense research over the past decades ((Kari, 2005) and references therein). CA can go from periodic patterns to universal computing capabilities (Wolfram, 1984). This last behavior is remarkable, as a CA can be specified by a finite number of local rules acting over a finite number of states. In spite of the local nature of the rules, CA can achieve large spatio-temporal correlations (Wolfram, 1986).</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">We will define, for the purposes of this article, a one-dimensional CA as a triple (&Sigma;, s, &Phi;), where &Sigma; is, as defined before, a finite alphabet; s = s<sub>0</sub>s<sub>1</sub>...s<sub>N-1</sub> is a set of sites; and &Phi; is a local rule. 
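A minimal sketch of one synchronous update of an elementary CA, given its Wolfram rule number and assuming periodic boundaries (illustrative; the rule numbering follows the standard scheme stated below):

```python
def eca_step(cells, rule):
    """One synchronous update of an elementary CA (r = 1, binary alphabet)
    with periodic boundaries; `rule` is the Wolfram rule number (0-255)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        idx = 4 * left + 2 * center + right  # neighborhood as a 3-bit number
        out.append((rule >> idx) & 1)        # bit idx of the rule number
    return out

# Rule 110 acting on a single live cell: the pattern grows to the left.
state = [0, 0, 0, 1, 0, 0, 0]
state = eca_step(state, 110)
```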
If s<sup>t</sup> = s<sup>t</sup><sub>0</sub>s<sup>t</sup><sub>1</sub>s<sup>t</sup><sub>2</sub>...s<sup>t</sup><sub>N-1</sub> denotes a particular configuration of the site values at time t, then</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">&nbsp;s<sup>t+1</sup><sub>i</sub> = &Phi;[s<sup>t</sup><sub>i-r</sub>, s<sup>t</sup><sub>i-r+1</sub>,...,s<sup>t</sup><sub>i+r</sub>].</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">For elementary CA (ECA), r = 1, and a binary alphabet &Sigma; = {0, 1} is used.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">There are a total of 256 possible rules for ECA, which can be labeled by a number. To each rule &Phi;, a label R is assigned according to a scheme proposed by Wolfram that has become standard (Wolfram, 2002):</font></p>     <p>&nbsp;<font size="2" face="Verdana, Arial, Helvetica, sans-serif">R = &Phi;(0,0,0)2<sup>0</sup>+&Phi;(0,0,1)2<sup>1</sup>+&Phi;(0,1,0)2<sup>2</sup>+... +&Phi;(1,1,1)2<sup>7</sup>.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">ECA rules can be partitioned into equivalence classes as a result of mirror and reversion symmetries; the analysis of the rules can then be reduced to a representative member of each class. CA have been classified in a number of ways (Kari, 2005), the most cited being the original classification of Wolfram (Wolfram, 1984). Starting from an arbitrary random initial configuration, CA are classified as:</font></p>     ]]></body>
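The numbering scheme above amounts to an eight-entry lookup table: bit k of R is the output of &Phi; on the neighborhood whose binary value is k. A minimal sketch of the table construction and one synchronous update (function names are illustrative; periodic boundaries are assumed):

```python
def eca_rule(R):
    # Bit k of the Wolfram number R is the output of the local rule
    # on the neighborhood (left, center, right) with binary value k,
    # matching R = Phi(0,0,0)2^0 + Phi(0,0,1)2^1 + ... + Phi(1,1,1)2^7.
    return {(k >> 2 & 1, k >> 1 & 1, k & 1): (R >> k) & 1 for k in range(8)}

def step(config, rule):
    # One synchronous update with periodic boundary conditions (r = 1).
    N = len(config)
    return [rule[(config[(i - 1) % N], config[i], config[(i + 1) % N])]
            for i in range(N)]

rule110 = eca_rule(110)
config = [0, 0, 0, 1, 0, 0, 0]
print(step(config, rule110))  # -> [0, 0, 1, 1, 0, 0, 0]
```

Iterating `step` from a random initial configuration produces the spatio-temporal diagrams on which the Wolfram classes are usually illustrated.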
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">W1: configurations evolve to a homogeneous state;    <br>   W2: configurations evolve to periodic behavior;    <br>   W3: configurations evolve to aperiodic chaotic patterns;    <br>   W4: configurations evolve to configurations with complex patterns and long-lived, correlated, localized structures.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The Wolfram classification is vague; as a result, the assignment of each rule to a Wolfram class is ambiguous.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">In (Estevez, 2015), it was studied how ECA rules transfer information from the (random) initial configuration as they evolve. The d<sub>LZ</sub> between two consecutive configurations s<sup>t</sup> and s<sup>t+1</sup> was computed for successive values of time t. After dropping the first 2000 steps, the d<sub>LZ</sub> values were averaged (the average is denoted dp<sub>LZ</sub>) and plotted against the final entropy density (<a href="/img/revistas/rcci/v10s1/f0119517.jpg" target="_blank">Figure 1</a>). Three clusters were identified: one with d<sub>LZ</sub> values around 1, labeled dp3; a second cluster, dp2, made of rules belonging to W2; and a third cluster, with zero d<sub>LZ</sub>, made of W1 rules and labeled dp1. The group of rules in dp1 has a complete transfer of information as the configurations evolve, and these rules also show entropy densities near zero. The group of rules dp2 spans the whole range of entropy density values but shows dp<sub>LZ</sub> distances between 10<sup>-3</sup> and 10<sup>-1</sup>; these rules also show a trend of decreasing dp<sub>LZ</sub> with increasing entropy density. 
The third group of rules, dp3, loses, on average, all information from one time step to the next.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">In addition, the effect of changing the initial configuration was considered. Two initial sequences were taken, the second one with a single (random) site changed with respect to the first one. Then both ECA were left to evolve and the d<sub>LZ</sub> between them was calculated at each time step. This was done 1000 times for each rule and the results averaged. Different behaviors were discovered, ranging from almost no sensitivity to the perturbation of the initial condition to heavy dependence on it, as shown in <a href="/img/revistas/rcci/v10s1/f0219517.jpg" target="_blank">Figure 2</a>.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Rule 150, as well as rules 60, 90 and 105 (and their equivalents), behaves in a very interesting way. <a href="/img/revistas/rcci/v10s1/f0319517.jpg" target="_blank">Figure 3</a> shows the distance between the non-perturbed and perturbed evolutions for rule 150. The fractal nature of the behavior is clear and can be understood by looking into the difference map. The reiterative collapse of the d<sub>LZ</sub> curve to an almost zero value can be traced in the difference map to the apexes of the triangular regions. </font></p>     <p><font size="2"><strong><font face="Verdana, Arial, Helvetica, sans-serif">Analysis of a random walk viewed as a symbol-generating process</font></strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">We turn to a one-dimensional random walker (RW), which was studied by the authors in (Rodriguez, 2016). The walker is allowed to move to the right, or to the left, a unit length in a unit time. The probability that the walker chooses right will be a control parameter labeled r. 
p, on the other hand, is the probability that the walker keeps moving in the same direction as in the previous step. r is also known as the bias, and p as the persistence. Also, a probability l that the walker makes no move at a given step is allowed. The set of control parameters is then (r, p, l), which will be taken as fixed in time.</font></p>     ]]></body>
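The walk itself is easy to simulate. The sketch below assumes one plausible way of combining the three parameters, deciding latency first, then persistence, then bias; this is an illustration, not necessarily the exact scheme of (Rodriguez, 2016), and the function and parameter names are ours:

```python
import random

def walk(r, p, l, steps, seed=0):
    # Assumed update scheme (an illustration, not necessarily the exact
    # one of (Rodriguez, 2016)): with probability l the walker stays;
    # otherwise, with probability p it repeats its previous direction,
    # and with probability 1 - p it draws a fresh direction, choosing
    # right with probability r.
    rng = random.Random(seed)
    x, prev = 0, 1  # start at the origin, previous move to the right
    trajectory = [0]
    for _ in range(steps):
        if rng.random() < l:
            move = 0
        elif rng.random() < p:
            move = prev
        else:
            move = 1 if rng.random() < r else -1
        if move != 0:
            prev = move
        x += move
        trajectory.append(x)
    return trajectory

traj = walk(r=0.5, p=0.9, l=0.1, steps=1000)
print(traj[-1])  # net displacement after 1000 steps
```

Recording the per-step moves (+1, 0, -1) instead of positions yields exactly the symbol sequence over the three-letter alphabet analyzed in the text.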
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Our process outputs values from the set </font><img src="/img/revistas/rcci/v10s1/fo2119517.jpg" alt="fo21" width="90" height="22"><font size="2" face="Verdana, Arial, Helvetica, sans-serif">, the first symbol representing a move to the right, the second symbol no move, and the last symbol a move to the left. The control parameters (probabilistically) decide the next move based on the previous one, and therefore the dynamics of the system can be described by a first-order Markov process. The FSMs describing this process for different values of the control parameters are shown in <a href="/img/revistas/rcci/v10s1/f0419517.jpg" target="_blank">figure 4</a>. It was found that the most unpredictable dynamics happens at r = l = 1/3, with an entropy density of h = log<sub>2</sub>3 = 1.5849 bits/site.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">From the &epsilon;-machine it is straightforward to calculate the entropy density and the excess entropy as a function of the control parameters. Such diagrams allow assessing how much of the movement can be considered random (wandering), in contrast to movement following some pattern. This kind of information is not directly available through the usual statistical physics analysis. <a href="/img/revistas/rcci/v10s1/f0519517.jpg" target="_blank">Figure 5</a> shows such diagrams.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Excess entropy is a measure of patterned movement. As latency increases, the walker stays for longer runs in the same place (state <img src="/img/revistas/rcci/v10s1/fo2219517.jpg" alt="fo22" width="5" height="5">) and the patterned movement goes to zero. For a given value of latency, persistence controls the patterned movement. Values of persistence that are small, or near one, result in larger patterned movement. 
The relation with bias is less straightforward and seems less sensitive to this control parameter. Complementarily, the entropy density shows a maximum around {r,p} = {1/2,1/2} for a fixed latency value. Entropy density witnesses the wandering movement of the walker. The movement for small values of persistence is patterned, but the system alternates between three states. If latency is small, the movement gets closer to an antiferromagnetic order and the walker does not get far from its initial position; the drift velocity is near zero. The reduction of drift velocity is not a consequence of wandering movement, but of its patterned alternating character. As persistence increases, still with small latency, the excess entropy decreases because the system has longer runs on the <img src="/img/revistas/rcci/v10s1/fo2219517.jpg" alt="fo22" width="5" height="5"> state. Once persistence goes above one half, excess entropy starts climbing and the dynamics gets increasingly closer to ferromagnetic order.</font></p>     <p><font size="2"><strong><font face="Verdana, Arial, Helvetica, sans-serif">Ising models for the study of stacking disorder in layered crystals: &epsilon;-machine analysis</font></strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Close packed structures are a special type of layered structure, ubiquitous in nature. The close-packed condition refers to the constraint that two consecutive layers with the same lateral displacement are forbidden. Two periodic arrangements that differ only in their stacking order are said to belong to the same polytypic family, and each member of a family is called a polytype. 
Experimentally, it has been found that perfect periodic stackings are the exception; usually, stacking disorder is present in varying degrees, from low density to almost complete disruption of any underlying order. If the stacking ordering in CPS is coded as some binary code, then it is possible to study polytypism and stacking disorder by writing the Hamiltonian that describes the interaction between the binary codes, treated as spins (Uppal, 1980; Kabra, 1988; Shaw, 1990).</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">When considering only finite-range interactions, a large class of systems can be cast in the framework of the Ising model, which has a Hamiltonian of the type:</font></p>     <p align="center"><img src="/img/revistas/rcci/v10s1/fo2319517.jpg" alt="fo23" width="227" height="52"></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">As usual, B describes an external field intensity, J<sub>k</sub> is the interaction parameter for range k, and s<sub>i</sub> is the spin (pair of layers) at site i. The system now reduces to a dynamical system, typical in complexity analysis, where, because of the different interaction terms, patterns and disorder can arise, witnessed by the resulting string of characters.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">&epsilon;-machine analysis of Ising models has been performed before (Feldman, 1998). As in the case of the random walker, the process can be cast into a first-order Markov process, in this case by using the transfer matrix formalism. An FSM description of the system arises. The FSM of maximum connectivity is shown in <a href="/img/revistas/rcci/v10s1/f0619517.jpg" target="_blank">Figure 6</a>, together with the statistical complexity contour map, for a second-neighbor interaction.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">The complexity map directly describes the phase diagram. 
The appearance of different polytypes comes as a result of the J<sub>1</sub>/J<sub>2</sub> ratio. Phase transformations in this type of system are considered to be a result of the parameters J<sub>1</sub> and J<sub>2</sub> depending on external factors, such as temperature or pressure. Also, the &epsilon;-machine description allows discovering the polytypes appearing at the boundary between stable phases. Such a calculation was performed for all boundaries in the phase diagram, and <a href="/img/revistas/rcci/v10s1/t0119517.jpg" target="_blank">Table I </a>shows the polytypes at the FCC-DHCP border whose probability of occurrence is above 10%.</font></p>     ]]></body>
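For concreteness, a finite-range Hamiltonian of this kind can be evaluated directly on a given spin string. A minimal sketch, assuming the common convention H = -B &Sigma;<sub>i</sub> s<sub>i</sub> - &Sigma;<sub>k</sub> J<sub>k</sub> &Sigma;<sub>i</sub> s<sub>i</sub>s<sub>i+k</sub> with open boundaries (the sign convention and boundary treatment are assumptions on our part, and the function name is illustrative):

```python
def ising_energy(spins, J, B=0.0):
    # spins: sequence of +1/-1 values coding the stacking of layer pairs.
    # J: dict mapping interaction range k to the coupling J_k.
    # Assumed convention: H = -B * sum_i s_i - sum_k J_k * sum_i s_i * s_{i+k},
    # with open boundary conditions; signs may differ from the paper's formula.
    E = -B * sum(spins)
    for k, Jk in J.items():
        E -= Jk * sum(spins[i] * spins[i + k]
                      for i in range(len(spins) - k))
    return E

# Second-neighbor model: positive (ferromagnetic) J_k favor aligned spins.
spins = [1, 1, 1, 1, -1, -1, -1, -1]
print(ising_energy(spins, J={1: 1.0, 2: 0.5}))  # -> -6.0
```

Scanning such an energy over candidate periodic stackings, for varying J<sub>1</sub>/J<sub>2</sub>, is the brute-force counterpart of the transfer-matrix and &epsilon;-machine treatment described in the text.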
<body><![CDATA[<p>&nbsp;</p>     <p><font face="Verdana, Arial, Helvetica, sans-serif"><strong><font size="3">DISCUSSION AND CONCLUSIONS</font></strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Complexity analysis has a long history of development; still, grasping what we mean by complex behavior has turned out to be a very complex problem. Perhaps there is no single definition of complexity; yet it can be agreed, for a number of systems, that the emergence of new behavior from seemingly simple rules, when a large number of variables are involved, can be a fingerprint of complexity. If the systems can be mathematized as codes or strings over a numerable alphabet, then complexity is tractable from a number of angles. In this paper, we have explored two avenues, which have proven useful in a number of cases. Computational mechanics, pioneered by Crutchfield and coworkers, is reaching a point of maturity where increasing practical applications can be foreseen. One of the outstanding aspects of computational mechanics is that it has allowed exploring, in a quantitative and deep way, the &ldquo;quality&rdquo; of disorder and pattern. In its framework, the meaning of entropic measures such as the entropy rate, the excess entropy and the statistical complexity has been clarified. It is also a practical tool; its use in the analysis of layered solids just scratches the surface of its usefulness.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Although much less developed than computational mechanics, Lempel-Ziv analysis has its own merits. It is very practical; it allows estimating the entropy density from raw data if enough observations are made. If combined with ideas from Kolmogorov complexity, it can be used to define empirical metrics over system dynamics. 
We have shown the power of Lempel-Ziv based analysis of the CA spatio-temporal evolution. The method used is readily extendable to other systems.</font></p>     <p><font face="Verdana, Arial, Helvetica, sans-serif"><strong><font size="3">ACKNOWLEDGMENTS</font></strong></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">This work was partially financed by FAPEMIG under the project BPV-00047-13. EER wishes to thank PVE/CAPES for financial support under the grant 1149-14-8. Infrastructure support was given under project FAPEMIG APQ-02256-12. EDRH and PSA wish to thank MES for master's degree support.</font></p>     <p>&nbsp;</p>     <p align="left"><font face="Verdana, Arial, Helvetica, sans-serif" size="3"><B>REFERENCIAS BIBLIOGR&Aacute;FICAS</B></font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">M. Aboy, R. Homero, D. Abasolo, and D. Alvarez. Interpretation of the Lempel-Ziv complexity measure in the context of biomedical signal analysis. IEEE Trans. Biom. Eng., 2006, 53, p. 2282&ndash;2288.</font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">C. S. Calude. Information and Randomness. 2002, Springer Verlag.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">A. B. Chelani. Complexity analysis of CO concentrations at a traffic site in Delhi. Transp. Res. D, 2011, 16, p. 57&ndash;60.</font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">S. Constantinescu and L. Ilie. Mathematical Foundations of Computer Science, 2006, 4/62. Springer Verlag, Berlin.</font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">T. M. Cover and J. A. Thomas. Elements of information theory. 2006, Second edition. Wiley Interscience, New Jersey.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">J. P. Crutchfield. Knowledge and meaning ... 
chaos and complexity. In L. Lam and V. Narodditsty, editors, Modeling complex phenomena, 1992, Springer, Berlin, p. 66&ndash;101.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">J. Crutchfield and D. P. Feldman. Regularities unseen, randomness observed: Levels of entropy convergence. Chaos, 2003, 13, p. 25&ndash;54.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">J. P. Crutchfield. Between order and chaos. Nature, 2012, 8, p. 17&ndash;24.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">J.-C. Dubacq, B. Durand, and E. Formenti. Kolmogorov complexity and cellular automata classification. Th. Comp. Science, 2001, 259, p. 271&ndash;285.</font></p>     ]]></body>
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">F. Emmert-Streib. Exploratory analysis of spatiotemporal patterns of cellular automata by clustering compressibility. Phys. Rev. E, 2010, 81, p. 026103&ndash;026114.</font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">E. Estevez-Rams, R. Lora-Serrano, C. A. J. Nunes, and B. Arag&oacute;n Fern&aacute;ndez. Lempel-Ziv complexity analysis of one dimensional cellular automata. Chaos, 2015, 25, p. 123106&ndash;123116.</font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">D. P. Feldman. Computational mechanics of classical spin systems. 1998.</font></p><!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">D. P. Feldman, C. S. McTague, and J. P. Crutchfield. Chaos, 2008, 18, p. 043106.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">R. Haslinger, K. L. Klinker, and C. R. Shalizi. The computational structure of spike trains. Neural computation, 2010, 22, p. 121&ndash;157.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">J. Kari. Theory of cellular automata: A survey. Th. Comp. Science, 2005, 334, p. 3&ndash;33.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">V. K. Kabra and D. Pandey, 1988, 61, p. 1493.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">A. Lempel and J. Ziv. On the complexity of finite sequences. IEEE Trans. Inf. Th., 1976, IT-22, p. 75&ndash;81.</font></p>     ]]></body>
<body><![CDATA[<!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">M. Li and P. Vitanyi. An Introduction to Kolmogorov Complexity and Its Applications. Springer Verlag, 1993.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">M. Li, X. Chen, X. Li, B. Ma, and P. M. B. Vitanyi. The similarity metric. IEEE Trans. Inf. Th., 2004, 50, p. 3250&ndash;3264.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">L. Liu, D. Li, and F. Bai. A relative Lempel-Ziv complexity: Application to comparing biological sequences. Chem. Phys. Lett., 2012, 530, p. 107&ndash;112.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">A. N. Kolmogorov. Three approaches to the concept of the amount of information. Probl. Inf. Transm. (English Trans.), 1965, 1, p. 1&ndash;7.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">M. Rajkovic and Z. Mihailovic. Quantifying complexity in the minority game. Physica A, 2003, 325, p. 40&ndash;47.</font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">E. Rodriguez-Horta, E. Estevez-Rams, R. Lora-Serrano, and B. Aragon-Fernandez. Correlated biased random walk with latency in one and two dimensions: Asserting patterned and unpredictable movement. Physica A, 2016, 458, p. 303&ndash;312.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">E. Rodriguez-Horta, E. Estevez-Rams, R. Neder, and R. Lora-Serrano. Computational mechanics of stacking faults with finite range interaction: layer pair interaction, in press.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">V. Ryabov and D. Neroth. Chaos, 2011, 21, p. 037113.</font></p>     ]]></body>
<body><![CDATA[<p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">C. R. Shalizi and J. P. Crutchfield. Computational mechanics: pattern and prediction. J. Stat. Phys., 2001, 104, p. 817&ndash;879.</font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">J. J. A. Shaw and V. Heine. J. Phys.: Condens. Matter., 1990, 2, p. 4351.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">J. Szczepanski, J. M. Amigo, E. Wajnryb, and M. V. Sanchez-Vives. Characterizing spike trains with Lempel-Ziv complexity. Neurocomp., 2004, 58-60, p. 77&ndash;84.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">M. Talebinejad, A. D. C. Chan, and A. Miri. A Lempel-Ziv complexity measure for muscle fatigue estimation. J. of Electro. and Kinesi., 2011, 21, p. 236&ndash;241.</font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">M. K. Uppal, S. Ramasesha, and C. N. R. Rao. Acta Cryst., 1980, A36, p. 356.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">D. P. Varn, G. S. Canright, and J. P. Crutchfield. &epsilon;-machine spectral reconstruction theory: a direct method for inferring planar disorder and structure from x-ray diffraction. Acta Cryst. A, 2013, 69, p. 197&ndash;206.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">M. J. Weinberger, A. Lempel, and J. Ziv. A sequential algorithm for the universal coding of finite memory sources. IEEE Trans. Inf. Th., 1992, 38, p. 1002&ndash;1014.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">S. Wolfram. Universality and complexity in cellular automata. Phys. D, 1984, 10, p. 1&ndash;35.</font></p>     ]]></body>
<body><![CDATA[<!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">S. Wolfram. Theory and applications of cellular automata. 1986, World Scientific Press, Singapore.</font></p>     <!-- ref --><p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">S. Wolfram. A new kind of science. 2002, Wolfram Media Inc., Champaign, Illinois.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Y. Zhang, J. Hao, C. Zhou, and K. Chang. Normalized Lempel-Ziv complexity and its applications in biosequence analysis. J. Math. Chem., 2009, 46, p. 1203&ndash;1212.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">J. Ziv. Coding theorems for individual sequences. IEEE Trans. Inf. Th., 1978, IT-24, p. 405&ndash;412.</font></p>     <p><font size="2" face="Verdana, Arial, Helvetica, sans-serif">Recibido: 15/06/2016    ]]></body>
<body><![CDATA[<br> Aceptado: 10/10/2016</font></p>      ]]></body><back>
<ref-list>
<ref id="B1">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Aboy]]></surname>
<given-names><![CDATA[M]]></given-names>
</name>
<name>
<surname><![CDATA[Homero]]></surname>
<given-names><![CDATA[R]]></given-names>
</name>
<name>
<surname><![CDATA[Abasolo]]></surname>
<given-names><![CDATA[D]]></given-names>
</name>
<name>
<surname><![CDATA[Alvarez]]></surname>
<given-names><![CDATA[D]]></given-names>
</name>
</person-group>
<source><![CDATA[Interpretation of the Lempel-Ziv complexity measure in the context of biomedical signal analysis.]]></source>
<year>2006</year>
<volume>53</volume>
<page-range>2282-2288</page-range></nlm-citation>
</ref>
<ref id="B2">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Calude]]></surname>
<given-names><![CDATA[C. S]]></given-names>
</name>
</person-group>
<source><![CDATA[Information and Randomness]]></source>
<year>2002</year>
<publisher-name><![CDATA[Springer Verlag]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B3">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Chelani]]></surname>
<given-names><![CDATA[A.B]]></given-names>
</name>
</person-group>
<source><![CDATA[Complexity analysis of CO concentrations at a traffic site in Delhi]]></source>
<year>2011</year>
<volume>16</volume>
<page-range>57-60</page-range></nlm-citation>
</ref>
<ref id="B4">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Constantinescu]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
<name>
<surname><![CDATA[Ilie]]></surname>
<given-names><![CDATA[L]]></given-names>
</name>
</person-group>
<source><![CDATA[Mathematical Foundations of Computer Science]]></source>
<year>2006</year>
<publisher-loc><![CDATA[Berlin]]></publisher-loc>
<publisher-name><![CDATA[Springer Verlag]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B5">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Cover]]></surname>
<given-names><![CDATA[T. M.]]></given-names>
</name>
<name>
<surname><![CDATA[Thomas]]></surname>
<given-names><![CDATA[J. A.]]></given-names>
</name>
</person-group>
<source><![CDATA[Elements of information theory]]></source>
<year>2006</year>
<edition>Second edition</edition>
<publisher-loc><![CDATA[New Jersey]]></publisher-loc>
<publisher-name><![CDATA[Wiley Interscience]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B6">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Crutchfield]]></surname>
<given-names><![CDATA[J. P]]></given-names>
</name>
</person-group>
<source><![CDATA[Modeling complex phenomena]]></source>
<year>1992</year>
<page-range>66-101</page-range><publisher-loc><![CDATA[Berlin]]></publisher-loc>
<publisher-name><![CDATA[Springer]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B7">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Crutchfield]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
<name>
<surname><![CDATA[Feldman]]></surname>
<given-names><![CDATA[D. P]]></given-names>
</name>
</person-group>
<source><![CDATA[Regularities unseen, randomness observed: Levels of entropy convergence]]></source>
<year>2003</year>
<volume>13</volume>
<page-range>25-54</page-range></nlm-citation>
</ref>
<ref id="B8">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Crutchfield]]></surname>
<given-names><![CDATA[J. P]]></given-names>
</name>
</person-group>
<source><![CDATA[Between order and chaos]]></source>
<year>2012</year>
<volume>8</volume>
<page-range>17-24</page-range></nlm-citation>
</ref>
<ref id="B9">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Dubacq]]></surname>
<given-names><![CDATA[J.-C]]></given-names>
</name>
<name>
<surname><![CDATA[Durand]]></surname>
<given-names><![CDATA[B]]></given-names>
</name>
<name>
<surname><![CDATA[Formenti]]></surname>
<given-names><![CDATA[E]]></given-names>
</name>
</person-group>
<source><![CDATA[Kolmogorov complexity and cellular automata classification]]></source>
<year>2001</year>
<volume>259</volume>
<page-range>271-285</page-range></nlm-citation>
</ref>
<ref id="B10">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Emmert-Streib]]></surname>
<given-names><![CDATA[F]]></given-names>
</name>
</person-group>
<source><![CDATA[Exploratory analysis of spatiotemporal patterns of cellular automata by clustering compressibility.]]></source>
<year>2010</year>
<volume>81</volume>
<page-range>026103-026114</page-range></nlm-citation>
</ref>
<ref id="B11">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Estevez-Rams]]></surname>
<given-names><![CDATA[E]]></given-names>
</name>
<name>
<surname><![CDATA[Lora-Serrano]]></surname>
<given-names><![CDATA[R]]></given-names>
</name>
<name>
<surname><![CDATA[Nunes]]></surname>
<given-names><![CDATA[C. A. J.]]></given-names>
</name>
<name>
<surname><![CDATA[Aragón Fernández]]></surname>
<given-names><![CDATA[B]]></given-names>
</name>
</person-group>
<source><![CDATA[Lempel-Ziv complexity analysis of one dimensional cellular automata]]></source>
<year>2015</year>
<volume>25</volume>
<page-range>123106-123116</page-range></nlm-citation>
</ref>
<ref id="B12">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Feldman]]></surname>
<given-names><![CDATA[D.P]]></given-names>
</name>
</person-group>
<source><![CDATA[Computational mechanics of classical spin systems]]></source>
<year>1998</year>
</nlm-citation>
</ref>
<ref id="B13">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Feldman]]></surname>
<given-names><![CDATA[D.P]]></given-names>
</name>
<name>
<surname><![CDATA[McTague]]></surname>
<given-names><![CDATA[C.S]]></given-names>
</name>
<name>
<surname><![CDATA[Crutchfield]]></surname>
<given-names><![CDATA[J.P.]]></given-names>
</name>
</person-group>
<source><![CDATA[Chaos]]></source>
<year>2008</year>
<volume>18</volume>
<page-range>043106</page-range></nlm-citation>
</ref>
<ref id="B14">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Haslinger]]></surname>
<given-names><![CDATA[R]]></given-names>
</name>
<name>
<surname><![CDATA[Klinker]]></surname>
<given-names><![CDATA[K. L]]></given-names>
</name>
<name>
<surname><![CDATA[Shalizi]]></surname>
<given-names><![CDATA[C. R.]]></given-names>
</name>
</person-group>
<source><![CDATA[The computational structure of spike trains]]></source>
<year>2010</year>
<volume>22</volume>
<page-range>121-157</page-range></nlm-citation>
</ref>
<ref id="B15">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Kari]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
</person-group>
<source><![CDATA[Theory of cellular automata: A survey]]></source>
<year>2005</year>
<volume>334</volume>
<page-range>3-33</page-range></nlm-citation>
</ref>
<ref id="B16">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Kabra]]></surname>
<given-names><![CDATA[V. K]]></given-names>
</name>
<name>
<surname><![CDATA[Pandey]]></surname>
<given-names><![CDATA[D]]></given-names>
</name>
</person-group>
<year>1988</year>
<volume>61</volume>
<page-range>1493</page-range></nlm-citation>
</ref>
<ref id="B17">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Lempel]]></surname>
<given-names><![CDATA[A]]></given-names>
</name>
<name>
<surname><![CDATA[Ziv]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
</person-group>
<source><![CDATA[On the complexity of finite sequences]]></source>
<year>1976</year>
<page-range>75-81</page-range></nlm-citation>
</ref>
<ref id="B18">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Li]]></surname>
<given-names><![CDATA[M]]></given-names>
</name>
<name>
<surname><![CDATA[Vitanyi]]></surname>
<given-names><![CDATA[P]]></given-names>
</name>
</person-group>
<source><![CDATA[An Introduction to Kolmogorov Complexity and Its Applications]]></source>
<year>1993</year>
<publisher-name><![CDATA[Springer Verlag]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B19">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Li]]></surname>
<given-names><![CDATA[M]]></given-names>
</name>
<name>
<surname><![CDATA[Chen]]></surname>
<given-names><![CDATA[X]]></given-names>
</name>
<name>
<surname><![CDATA[Li]]></surname>
<given-names><![CDATA[X]]></given-names>
</name>
<name>
<surname><![CDATA[Ma]]></surname>
<given-names><![CDATA[B]]></given-names>
</name>
<name>
<surname><![CDATA[Vitanyi]]></surname>
<given-names><![CDATA[P. M. B]]></given-names>
</name>
</person-group>
<source><![CDATA[The similarity metric]]></source>
<year>2004</year>
<volume>50</volume>
<page-range>3250-3264</page-range></nlm-citation>
</ref>
<ref id="B20">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Liu]]></surname>
<given-names><![CDATA[L]]></given-names>
</name>
<name>
<surname><![CDATA[Li]]></surname>
<given-names><![CDATA[D]]></given-names>
</name>
<name>
<surname><![CDATA[Bai]]></surname>
<given-names><![CDATA[F]]></given-names>
</name>
</person-group>
<source><![CDATA[A relative Lempel-Ziv complexity: Application to comparing biological sequences]]></source>
<year>2012</year>
<volume>530</volume>
<page-range>107-112</page-range></nlm-citation>
</ref>
<ref id="B21">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Kolmogorov]]></surname>
<given-names><![CDATA[A. N]]></given-names>
</name>
</person-group>
<source><![CDATA[Three approaches to the concept of the amount of information]]></source>
<year>1965</year>
<volume>1</volume>
<page-range>1-7</page-range></nlm-citation>
</ref>
<ref id="B22">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Rajkovic]]></surname>
<given-names><![CDATA[M]]></given-names>
</name>
<name>
<surname><![CDATA[Mihailovic]]></surname>
<given-names><![CDATA[Z]]></given-names>
</name>
</person-group>
<source><![CDATA[Quantifying complexity in the minority game]]></source>
<year>2003</year>
<volume>325</volume>
<page-range>40-47</page-range></nlm-citation>
</ref>
<ref id="B23">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Rodriguez-Horta]]></surname>
<given-names><![CDATA[E]]></given-names>
</name>
<name>
<surname><![CDATA[Estevez-Rams]]></surname>
<given-names><![CDATA[E]]></given-names>
</name>
<name>
<surname><![CDATA[Lora-Serrano]]></surname>
<given-names><![CDATA[R]]></given-names>
</name>
<name>
<surname><![CDATA[Aragon-Fernandez]]></surname>
<given-names><![CDATA[B]]></given-names>
</name>
</person-group>
<source><![CDATA[Correlated biased random walk with latency in one and two dimensions: Asserting patterned and unpredictable movement]]></source>
<year>2016</year>
<volume>458</volume>
<page-range>303-312</page-range></nlm-citation>
</ref>
<ref id="B24">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Rodriguez-Horta]]></surname>
<given-names><![CDATA[E]]></given-names>
</name>
<name>
<surname><![CDATA[Estevez-Rams]]></surname>
<given-names><![CDATA[E]]></given-names>
</name>
</person-group>
<source><![CDATA[Computational mechanics of stacking faults with finite range interaction: layer pair interaction]]></source>
<year>2011</year>
<volume>21</volume>
<page-range>037113</page-range></nlm-citation>
</ref>
<ref id="B25">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Shalizi]]></surname>
<given-names><![CDATA[C. R]]></given-names>
</name>
<name>
<surname><![CDATA[Crutchfield]]></surname>
<given-names><![CDATA[J. P]]></given-names>
</name>
</person-group>
<source><![CDATA[Computational mechanics: pattern and prediction]]></source>
<year>2001</year>
<volume>104</volume>
<page-range>817-879</page-range></nlm-citation>
</ref>
<ref id="B26">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Shaw]]></surname>
<given-names><![CDATA[J. J. A]]></given-names>
</name>
<name>
<surname><![CDATA[Heine]]></surname>
<given-names><![CDATA[V]]></given-names>
</name>
</person-group>
<source><![CDATA[J. Phys.: Condens. Matter]]></source>
<year>1990</year>
<volume>2</volume>
<page-range>4351</page-range></nlm-citation>
</ref>
<ref id="B27">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Szczepanski]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
<name>
<surname><![CDATA[Amigo]]></surname>
<given-names><![CDATA[J. M]]></given-names>
</name>
<name>
<surname><![CDATA[Wajnryb]]></surname>
<given-names><![CDATA[E]]></given-names>
</name>
<name>
<surname><![CDATA[Sanchez-Vives]]></surname>
<given-names><![CDATA[M. V]]></given-names>
</name>
</person-group>
<source><![CDATA[Characterizing spike trains with lempel-ziv complexity]]></source>
<year>2004</year>
<volume>58-60</volume>
<page-range>77-84</page-range></nlm-citation>
</ref>
<ref id="B28">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Talebinejad]]></surname>
<given-names><![CDATA[M]]></given-names>
</name>
<name>
<surname><![CDATA[Chan]]></surname>
<given-names><![CDATA[A. D. C]]></given-names>
</name>
<name>
<surname><![CDATA[Miri]]></surname>
<given-names><![CDATA[A]]></given-names>
</name>
</person-group>
<source><![CDATA[A Lempel-Ziv complexity measure for muscle fatigue estimation]]></source>
<year>2011</year>
<volume>21</volume>
<page-range>236-241</page-range></nlm-citation>
</ref>
<ref id="B29">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Uppal]]></surname>
<given-names><![CDATA[M. K]]></given-names>
</name>
<name>
<surname><![CDATA[Ramasesha]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
<name>
<surname><![CDATA[Rao]]></surname>
<given-names><![CDATA[N. R]]></given-names>
</name>
</person-group>
<source><![CDATA[Acta Cryst]]></source>
<year>1980</year>
<page-range>356</page-range></nlm-citation>
</ref>
<ref id="B30">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Varn]]></surname>
<given-names><![CDATA[D. P]]></given-names>
</name>
<name>
<surname><![CDATA[Canright]]></surname>
<given-names><![CDATA[G. S]]></given-names>
</name>
<name>
<surname><![CDATA[Crutchfield]]></surname>
<given-names><![CDATA[J. P]]></given-names>
</name>
</person-group>
<source><![CDATA[ε-machine spectral reconstruction theory: a direct method for inferring planar disorder and structure from x-ray diffraction]]></source>
<year>2013</year>
<volume>69</volume>
<page-range>197-206</page-range></nlm-citation>
</ref>
<ref id="B31">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Weinberger]]></surname>
<given-names><![CDATA[M. J]]></given-names>
</name>
<name>
<surname><![CDATA[Lempel]]></surname>
<given-names><![CDATA[A]]></given-names>
</name>
<name>
<surname><![CDATA[Ziv]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
</person-group>
<source><![CDATA[A sequential algorithm for the universal coding of finite memory sources]]></source>
<year>1992</year>
<volume>38</volume>
<page-range>1002-1014</page-range></nlm-citation>
</ref>
<ref id="B32">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Wolfram]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
</person-group>
<source><![CDATA[Universality and complexity in cellular automata]]></source>
<year>1984</year>
<volume>10</volume>
<page-range>1-35</page-range></nlm-citation>
</ref>
<ref id="B33">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Wolfram]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
</person-group>
<source><![CDATA[Theory and applications of cellular automata]]></source>
<year>1986</year>
<publisher-name><![CDATA[World Scientific Press]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B34">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Wolfram]]></surname>
<given-names><![CDATA[S]]></given-names>
</name>
</person-group>
<source><![CDATA[A new kind of science]]></source>
<year>2002</year>
</nlm-citation>
</ref>
<ref id="B35">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Zhang]]></surname>
<given-names><![CDATA[Y]]></given-names>
</name>
<name>
<surname><![CDATA[Hao]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
<name>
<surname><![CDATA[Zhou]]></surname>
<given-names><![CDATA[C]]></given-names>
</name>
<name>
<surname><![CDATA[Chang]]></surname>
<given-names><![CDATA[K]]></given-names>
</name>
</person-group>
<source><![CDATA[Normalized Lempel-Ziv complexity and its applications in biosequence analysis]]></source>
<year>2009</year>
<volume>46</volume>
<page-range>1203-1212</page-range></nlm-citation>
</ref>
<ref id="B36">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Lempel]]></surname>
<given-names><![CDATA[A]]></given-names>
</name>
<name>
<surname><![CDATA[Ziv]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
</person-group>
<source><![CDATA[On the complexity of finite sequences]]></source>
<year>1976</year>
<volume>22</volume>
<page-range>75-81</page-range></nlm-citation>
</ref>
<ref id="B37">
<nlm-citation citation-type="">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Ziv]]></surname>
<given-names><![CDATA[J]]></given-names>
</name>
</person-group>
<source><![CDATA[Coding theorems for individual sequences]]></source>
<year>1978</year>
<page-range>405-412</page-range></nlm-citation>
</ref>
</ref-list>
</back>
</article>
