Review Article - (2016) Volume 5, Issue 3

The Relation between Thermodynamics and the Information Theories: The Introduction of the Term Enmorphy

Dimitrios Samios*
Institute of Chemistry, Federal University of Rio Grande do Sul, Brazil
*Corresponding Author: Dimitrios Samios, Institute of Chemistry, Federal University of Rio Grande do Sul, Bento Gonçalves, Porto Alegre-RS, Brazil, Tel: 555133089883, Fax: 555133167304

Abstract

This study aims to present the relation of thermodynamics to the information theories. Thermodynamically, material systems are understood in terms of Matter, Energy and Entropy. We discuss the significance of the work of Clausius and of the others who contributed to the development of thermodynamics. In parallel, the evolution of the information theories begins with the mathematization of information. Stonier, however, discussed material systems in terms of Matter and Information, considering order to be the inverse of disorder, Or = 1/D. Extending the information theories, we introduce the term Enmorphy. We present the concept of Matter-Information-Enmorphy as a system parallel to Matter-Energy-Entropy. The new concept includes the terms Matter, Information, related to structure, organization and dynamics, and Enmorphy, related to disorder, that is, to Entropy.

Keywords: Matter-energy-entropy; Matter-information-enmorphy; Analogical thinking

Introduction

Concerning thermodynamics and information theory

In recent years the literature on physics, chemistry, biology, medicine, the social sciences, the arts and other related areas has demonstrated a very strong trend toward the use of Information Theory. In particular, the strong interdisciplinary interest [1-3], the complexity studies [4-6] and some recent publications [7,8] demonstrate that there is no consensus on what information is, nor on the necessity of an epistemological conception of information. Such a goal would require the exploration of a wide variety of disciplines, including philosophy and science.

Many of the publications mentioned above [1-8] use the term “Information” as defined by Shannon in 1948 [9]. The way Shannon introduced the definition of information is similar and parallel to the Boltzmann-Gibbs entropy [10-14].

The following paragraphs summarize the evolution of the notions of the Boltzmann-Gibbs Entropy and of Shannon's information. We must keep in mind that Clausius gave the statements of the first and second laws of thermodynamics in 1850 [10], overcoming the caloric theory but preserving Carnot's principle. In 1854, in the Clausius theorem, he established [11] the importance of dQ/T, but he did not name the respective quantity. Only in 1865 did he introduce the modern macroscopic concept of Entropy [12,13]. Inspired by the term Energy, the Greek combination of εν and εργια (energia in Latin languages), meaning 'work-content' in English (Werkinhalt in German), he devised the term Entropy as a corresponding designation for the transformation-content (Verwandlungsinhalt) of a system; hence the terms “Εντροπια, Entropie, Entropia, Entropy”.

In the latter part of the nineteenth century, Ludwig Boltzmann almost single-handedly established the field now known as statistical mechanics. One of his major contributions, and undoubtedly the most controversial, was his H-theorem. According to Tolman [14], Boltzmann stated the equation for the temporal development of distribution functions in phase space and published his H-theorem in 1872. According to Brush [15], the scientific community was initially skeptical of Boltzmann's result and a lengthy controversy ensued. Boltzmann's H-theorem was incapable of relating H directly to the entropy except very near the final or equilibrium state, and it failed to identify the source of irreversibility [16]. The H-theorem was published with the intent of showing that the second law of thermodynamics derives from the laws of mechanics; it is now agreed that the attempted proof was unsuccessful. Boltzmann began by defining the function H for a dilute gas composed of spherical particles [15]. At this point we should remember that the content of Boltzmann's functional also appears in Shannon's theory of information.

According to Müller [17], Gibbs published in 1876 and 1878 two papers discussing phase equilibria, statistical ensembles, the free energy as the driving force behind chemical reactions, and chemical thermodynamics in general. At the same time, in 1877, Boltzmann stated [17,18] the relationship between entropy and probability. This evolution of classical thermodynamics was crowned in 1909 with Carathéodory's [19] axiomatic description of thermodynamics. In his conception, temperature is mathematically defined as the integrating factor that permits the integration of the non-exact differential in the energy-conservation expression (first law), thereby establishing the classical thermodynamic relationship between Energy, Entropy and Temperature. Quantum physics received a very important contribution in 1927 from von Neumann [20], who introduced the density-matrix representation and established quantum statistical mechanics.

Concerning information, the period before and immediately after the Second World War was characterized by a pressing need for efficient and accurate acquisition of information and for fast, efficient and error-free communication. It was in 1948 that Claude Elwood Shannon established “A mathematical theory of communication” and related Information to Entropy [21]. Conceptions similar to Shannon's were originally presented by Nyquist [22] and Hartley [23], although not exactly in the same context. Nyquist's paper, “Certain Factors Affecting Telegraph Speed” [22], contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation

W = K log m        (1)

Where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step and K is a constant. Hartley's paper [23], “Transmission of Information”, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as

H = log S^n = n log S               (2)

Where S is the number of possible symbols and n is the number of symbols in a transmission.
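To make these two early measures concrete, the short sketch below evaluates equations (1) and (2) for arbitrarily chosen, hypothetical values of K, m, S and n; the numbers are not taken from the original papers and serve only to illustrate the logarithmic form.

```python
import math

# Hypothetical illustration of Nyquist's relation (1) and Hartley's measure (2).
K = 1.0   # proportionality constant (assumed)
m = 4     # number of distinct voltage levels per time step (assumed)
W = K * math.log2(m)          # eq. (1): 2.0 "intelligence" units per step

S = 26    # size of the symbol alphabet (assumed)
n = 10    # number of symbols in the transmission (assumed)
H = n * math.log2(S)          # eq. (2): equals log2(S**n), about 47.0 bits

print(W, H)
```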

It is our opinion that Shannon's information theory formed the basis for a development comparable to the industrial revolution induced by the steam engine. The field lies at the intersection of mathematics, statistics, computer science, physics, chemistry, biology, materials science, neurobiology, communications and electrical engineering, in addition to other scientific disciplines.

The Shannon-Boltzmann-Gibbs “Information-Entropy” isomorphism was strongly criticized by different authors, from different scientific and philosophical disciplines. Shannon never claimed to have developed a theory of information; rather, he considered his contribution to be a theory of communication. In order to follow the criticism of the Shannon theory, let us present the main conception and the respective definitions.

In the Shannon theory the measure of information is known as entropy, which is equivalent to the average number of bits needed for storage or communication. Entropy quantifies the uncertainty involved when encountering a random variable. The inspiration for adopting the word entropy in information theory came from the close resemblance between Shannon's formula and very similar formulae already known from thermodynamics. Exactly when and how Shannon developed his theory is still uncertain. According to Thomsen [24], there is some evidence concerning the genesis of Shannon's information theory: John von Neumann encouraged him to use the word “entropy” rather than “uncertainty” for the measure of information. “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage.” In statistical thermodynamics the most general formula for the thermodynamic entropy S of a thermodynamic system is the Boltzmann-Gibbs entropy,

S = −kB Σi pi ln pi               (3)

Where kB is the Boltzmann constant and pi is the probability of state i. The B-G entropy was given by J. Willard Gibbs in 1878 after earlier work by Boltzmann. In quantum physics the entropy is expressed as the von Neumann entropy, introduced by von Neumann [20] in 1927,

S = −Tr(ρ ln ρ)               (4)

Where ρ is the density matrix of the quantum mechanical system. In Shannon's formalism the entropy, H, of a discrete random variable X is a measure of the amount of uncertainty associated with the value of X. Information can be quantified as follows: if x ranges over the set of all messages that X could be and p(x) is the probability of the message x, then the entropy of X is defined as

H(X) = EX[I(x)] = −Σx p(x) log p(x)               (5)

In equation (5), I(x) is the self-information, which is the entropy contribution of an individual message, and EX denotes the expected value. In this representation Shannon equates the expected information with the total entropy. An important property of the entropy is that it is maximized when all the messages in the message space are equiprobable, p(x) = 1/n, i.e., most unpredictable, in which case H(X) = log n.

The special case of the information entropy for a random variable with two outcomes is the binary entropy function Hb(p), given by equation (6).

Hb(p) = −p log p − (1−p) log(1−p)                    (6)
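As a minimal numerical sketch of equations (5) and (6), the following Python fragment computes the Shannon entropy of hypothetical message distributions and the binary entropy function; the distributions are invented for illustration only, and the Boltzmann-Gibbs form (3) is obtained from the same sum by using the natural logarithm and multiplying by kB.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(X) = -sum p(x) log2 p(x) in bits (eq. 5)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def binary_entropy(p):
    """Binary entropy function Hb(p) of eq. (6)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Equiprobable messages maximize the entropy: H = log2(n).
uniform = [1 / 8] * 8
print(shannon_entropy(uniform))      # 3.0 bits = log2(8)

# A biased (hypothetical) distribution carries less entropy, i.e. less "surprise value".
biased = [0.7, 0.1, 0.1, 0.05, 0.05]
print(shannon_entropy(biased))       # about 1.46 bits < log2(5), about 2.32

# The binary entropy function peaks at p = 1/2.
print(binary_entropy(0.5), binary_entropy(0.9))   # 1.0 and about 0.47
```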

The formalism of Shannon, stemming from Hartley's original ideas, received strong conceptual criticism from different authors. Cherry [25] commented in 1978 that "it is a pity that the mathematical concepts stemming from Hartley have been called 'information' at all." Cherry goes on to point out that the formula derived by Shannon for the average information contained in a long series of symbols is really a measure of the statistical rarity or 'surprise value' of a course of message signs: "This is hardly a true measure of the information content of a message".

Another important criticism was formulated by Wicken [26] in 1987, who showed that: "While the Shannon equation is symbolically isomorphic with the Boltzmann equation, the meanings of the respective equations bear little in common".

Information related to structure and organization

Stonier, in his approach "Information and the Internal Structure of the Universe" [27,28], presents an information theory and assumes that information is a basic property of the universe, as fundamental as matter and energy. Specifically, a system may be said to contain information if it exhibits organization. Just as mass is a reflection of a system containing matter and heat is a reflection of a system containing energy, organization is the physical expression of a system containing information. By 'organization' is meant the existence of a nonrandom pattern of particles, energy fields, or other sub-units comprising the system. Considering that the Shannon theory deals mainly with messages, it is of great importance to discuss the question of information versus message: it is necessary to differentiate between the concept 'information' and the concept 'message'. Similarly, in thermodynamics, energy, in pre-relativity physics, was considered as the more abstract quantity which, when added to matter, manifested itself as heat. Information may be considered as the more abstract quantity which, when added to matter, manifests itself as structure, as organization. "This concept of information leads to a different quantitative definition from that of Shannon". Stonier's theory assumes that there exists a direct linear relationship between the degree of organization and the quantity of structural information.

On this basis it is not unreasonable to consider that there exists a direct linear relationship between the amount of organization and the quantity of information. Stonier affirms that, needless to say, the verification of this assumption must await the confrontation of theory with physical reality. The starting point of Stonier's alternative theoretical approach is Boltzmann's equation as presented by Schrödinger [29] in 1944.

S = k log D             (7)

Where S = the entropy of the system, k = Boltzmann's constant and D = the state of disorder of the system. Although Schrödinger did not write it as an equation, it is clear from the text that he considered 'order' to be the inverse of 'disorder', Or = 1/D, and implicit in his writing is the equation:

S = k log (1/Or)          (8)

Where Or = the state of order of the system. Using the assumption of a direct linear relationship between the amount of organization (Or) and the quantity of information (I), that is:

I = c(Or)              (9)

Where c represents a constant to be defined. By substituting equation (8) into equation (9) one may derive the relationship between information and entropy,

I = c e^(−S/k)              (10)

The constant c, upon further examination, proves to be equal to the information content of the system when its absolute entropy equals zero. Equation (10) can therefore be rewritten as:

I = Io e^(−S/k)              (11)

Where Io is the information content of the system when S=0.   

The ratio (Io/I) may be restated as a probability function, at which point its values correspond to Boltzmann's original W (Wahrscheinlichkeit). It is important to realize that Boltzmann's equations were derived [15,17,18] from studies of gases in which I would never exceed Io.
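A minimal sketch of Stonier's relation (11) is given below, with Io, k and S in arbitrary, hypothetical units; it only illustrates that information decays exponentially as entropy grows, so that entropy acts as a multiplicative rather than an additive inverse of information.

```python
import math

def stonier_information(S, I0=1.0, k=1.0):
    """Stonier's relation I = I0 * exp(-S/k) (eq. 11); I0, k and S are in
    arbitrary, hypothetical units chosen only for illustration."""
    return I0 * math.exp(-S / k)

# Each unit increase in entropy multiplies the information content by the
# same factor exp(-1); it does not subtract a fixed amount.
for S in [0.0, 1.0, 2.0, 3.0]:
    print(S, round(stonier_information(S), 3))   # 1.0, 0.368, 0.135, 0.05
```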

This approach indicates a very different relationship between information and entropy from that understood by the communication engineers (Shannon and others). Instead of a direct relationship between information and entropy, as proposed by Shannon and Weaver [21], or a negative one, as modified by Brillouin [30], this approach states that information and entropy are inversely related. To be more precise, entropy is the multiplicative inverse of information, not, as is generally viewed today, the additive inverse. Furthermore, entropy must not be viewed as a simple metaphor for describing information.

The attempt to unify theories of information

In trying to elucidate and understand the nature of information and the related theories, we must comment on the very broad interdisciplinary contribution edited by Hofkirchner [31] in 1999. “The quest of a unified theory of information” includes a series of works related to the unification of the information theories presented above.

Among other papers, Flückiger presents one [32] with the title “Towards a Unified Concept of Information: Presentation of a New Approach”. In this paper he differentiates between two main streams of information theories, namely: a) functional-cybernetic information theories, in which information is understood as functionality, functional meaning, or as a feature of organized or self-organized systems, which explains some dynamical aspects of information (this category includes mainly the conceptions of Shannon and the neo-Shannonites); and b) structural-attributive information theories, in which information is understood as structure, diversity, order, etc. These theories are based on the statement that each animate or inanimate individual has an inner structure, the diversity of which constitutes the individual's information content. In this category he includes Szilard [33], Wiener [34], Brillouin [30], McKay [35], Devlin [36] and Stonier [28].

The reality is that the two types of information theories explain different aspects of information, and both types have remained useful to this day. However, many issues remain objects of strong controversy between the two. To the best of our knowledge, no unification attempt has so far succeeded, which confirms the present historical situation.

Toward the introduction of enmorphy

The attempts to unify the functional-cybernetic information theories with the structural-attributive information theories were, and remain, without success. At this point we try to contribute a critical analysis aimed at elucidating the origin of the difference between the two main information theories. In order to proceed, let us keep in mind some fundamental notions of science, such as: a) the wave function, which describes the spatial and temporal periodicity characteristics of a material system; b) the time correlation and autocorrelation functions; c) the spatial correlation function; and d) the numerical simulation analysis techniques, among other similar issues. These notions are some of the fundamentals for the description of the spatial and temporal characteristics of physical systems, which constitute the main elements of system morphology [37-40]. Indeed, the quantification of morphological analysis is creating the rapidly growing area of “Morphometry” [41-44]. The evolution of informatics has become a “tsunami” for the scientist: counting, measuring and analyzing aspects of morphometry have increased strongly in the last decade. Davatzikos [45-47] is one of the innumerable examples in brain morphology; Alain [48] and Adams [49,50] and their coworkers are examples from the area of biology and dimensional analysis. Recall that the definition of Entropy was given in relation to heat, in other words in relation to the energy involved in a process. Connections between information-theoretic entropy and thermodynamic entropy are explored by Jaynes [51], including the important contributions by Landauer [52]. The absolute temperature is related to the differential of entropy, dS, through the inequality dS ≥ dQ/T. According to Carathéodory [19], the factor 1/T turns the heat into an integrable quantity, and the result of the integration is the Entropy. Summarizing, we observe that the entropy (Boltzmann-Gibbs or von Neumann) is strictly related to the total probability of the possible distinct energy states of the material system. Similarly, the Rényi entropy theory [53] and the q-statistics of Tsallis [54,55] generalize the conception of entropy. Tsallis introduced the entropic index q, which is intimately related to the microscopic dynamics of the observed system and characterizes the degree of non-extensivity. The values q<1, q=1 and q>1 correspond respectively to superadditivity (superextensivity), additivity (extensivity) and subadditivity (subextensivity). The introduction of the generalized q-statistics surely affects the area of the information theories, since they are intimately related to the entropy [49].
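As a hypothetical numerical sketch of the Tsallis generalization, the fragment below evaluates the q-entropy Sq = k(1 − Σ pi^q)/(q − 1) for an invented three-state distribution at q < 1, q = 1 and q > 1; the probabilities and the constant k are assumptions made only to show how the entropic index deforms the Boltzmann-Gibbs value.

```python
import math

def tsallis_entropy(p, q, k=1.0):
    """Tsallis entropy S_q = k * (1 - sum(p_i**q)) / (q - 1).
    For q -> 1 it reduces to the Boltzmann-Gibbs form -k * sum(p_i * ln p_i)."""
    if abs(q - 1.0) < 1e-12:
        return -k * sum(pi * math.log(pi) for pi in p if pi > 0)
    return k * (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

p = [0.5, 0.3, 0.2]          # hypothetical three-state distribution
for q in (0.5, 1.0, 2.0):    # q<1 superextensive, q=1 extensive (B-G), q>1 subextensive
    print(q, round(tsallis_entropy(p, q), 3))   # about 1.404, 1.030, 0.620
```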

Again, trying to elucidate the functional-cybernetic and the structural-attributive information theories, we observe the following. The first is strongly related to the “message” and consequently to communication (storage, emission, reception), and thus to the dynamical aspects of the message structure. The second is related to static organization, static order and, generally, the static structural aspects of information. As we have observed, both theories include structural aspects of information; however, the first is mainly dominated by the dynamic (emission, reception) aspects, while the second appears to be time invariant. Let us rationalize in a way similar to classical thermodynamics. Clausius [11-13] added to the terms Matter and Energy the term Entropy, independently of the dynamical aspect. In the same way, it seems obvious that a generalized information theory must include, in addition to Matter, the term Information (related to structure and organization) and a new term, introduced for the first time in this paper, namely Enmorphy. The choice of the term Information (parallel to Energy) seems generally plausible and acceptable. By information is meant the existence of a structure or organization, a nonrandom pattern of particles, energy fields, or other sub-units comprising a system; it is the term parallel and analogous to energy. Energy is the entity which produces the energetic states of the material system; similarly, a system can include different and various forms of information expressed as organization, as structure. For this case the term Enmorphy is proposed, composed of en (Greek εν) and morphe (Greek μορφη), meaning “turn the appearance (the morphe) inward”, just as Entropy means “turn the energy inward”. Both terms carry the meaning of losing, hiding or occulting a property: in the case of Entropy a loss of energy occurs inside the system, while in the case of Enmorphy a loss of information, in terms of structure or organization, occurs inside the system. The term Enmorphy must be understood, similarly to Entropy, as the physical entity which, in the case of spontaneous processes, tends to increase. The above analysis and the respective definitions primarily eliminate the isomorphism question raised against Shannon's theory: the centrality of energy is replaced by information, and finally the notion of the Boltzmann-Gibbs or von Neumann Entropy is replaced by the notion of Enmorphy. The proposed concept creates a new conceptual basis, namely that of matter, information and enmorphy, parallel to the thermodynamic conception of matter, energy and entropy. By including the notion of time in the concept of matter-information-enmorphy, we automatically integrate the dynamical aspects, the processing and the transmission of information.

Of course, there are many similarities between the Matter-Information-Enmorphy and the Matter-Energy-Entropy conceptions; however, it would be a challenge to pursue the mathematical relations which govern these two conceptions. The construction of a generalized Information Theory based on the proposed concept would need heightened attention precisely because of the similarity with thermodynamics. The tendency toward “analogical thinking” was, and continues to be, an important “instrument” of the human way of thinking.

Some additional remarks

The proposed concept of Matter-Information-Enmorphy is not philosophically equivalent to the concept of Matter-Energy-Entropy. Certainly, there are very important relations between the entities included in the two concepts; however, it would not be correct to equate energy with information without defining a process, and it would be premature to formulate laws of “information conservation” parallel to energy conservation, or rules of “Enmorphy additivity” parallel to the Entropy additivity proposed by Boltzmann-Gibbs. Equally, the validity of the q-statistics has to be discussed within the Matter-Information-Enmorphy concept. Dynamical aspects (time dependence) related to the proposed concept would need elucidation; notions such as periodicity (wave functions) and time and space correlations may be discussed adequately in relation to Enmorphy. Finally, the notion of Enmorphy and its relation to temperature has to be evaluated and elucidated. Intuitively, it would be necessary to introduce a new relation involving Information, Enmorphy and Temperature. These remarks aim to indicate the possibility of opening a new scientific field around Enmorphy.

Conclusion

An analysis of the fundamental ideas of Clausius, Boltzmann, Gibbs and von Neumann, followed by the introduction of the early information theory through the ideas of Shannon, Nyquist and Hartley, and later of Stonier, permitted the formulation of the new concept of Matter-Information-Enmorphy as a construction parallel to Matter-Energy-Entropy. The concept, by introducing the notion of Enmorphy, overcomes the isomorphism between the Shannon-Stonier information theories and the Boltzmann-Gibbs-von Neumann entropy theory. In other words, Thermodynamics and the Information Theories are described by the same mathematical formalism; what they have in common is Matter, which is what we perceive with our senses. Energy is taken as the parallel to Information and Entropy as the parallel to Enmorphy, yet these are four totally different properties with different content. This is the reason why we have introduced Enmorphy. The proposed concept of Matter-Information-Enmorphy may well initiate a new field in the sciences.

References

  1. Hernandez-Lemus E (2009) Information theoretical methods to deconvolute genetic regulatory networks applied to thyroid neoplasms. Physica 388: 5057-5069.
  2. Martins AFT, Smith NA, Xing EP, Aguiar PMQ, et al. (2009) Non-extensive information theoretic kernels on measures. J Mach Learn Res 10: 935-975.
  3. Abdallah S, Plumbley M (2009) Information dynamics: Patterns of expectation and surprise in the perception of music. Conn Sci 21: 89-117.
  4. Estrada E (2009) Information mobility in complex networks. Phys Rev E Stat Nonlin Soft Matter Phys 80: 026104.
  5. Prokopenko M, Boschietti F, Ryan A J (2009) An information-theoretic primer on complexity, self-organization and emergence. Complexity 15: 11-28.
  6. Anand K, Bianconi G (2009) Entropy measures for networks: Toward an information theory of complex topologies. Phys Rev E 80: 045102.
  7. Karamuftuoglu M (2009) Situating logic and information in information science. J Am Soc Inf Sci Technol 60: 2019-2031.
  8. Thomsen SW (2009) Some evidence concerning the genesis of Shannon's information theory. Stud Hist Philos Sci 40: 81-91.
  9. Shannon CE (1948) A mathematical theory of communication. Bell System Technical Journal 27: 379-423.
  10. Clausius R (1850) On the driving force of the heat, Part I, Part II. Ann Phys 79: 368-397.
  11. Clausius R (1855) Poggendorff's Annalen, translated in the Journal de Mathematiques Paris, p: 481.
  12. Clausius R (1865) On the heat conduction of gaseous bodies. Ann Phys 125: 353-400.
  13. Clausius R (1865) The mechanical theory of heat – with its applications to the steam engine and to physical properties of bodies. London: John van Voorst 1 Paternoster Row.
  14. Tolman RC (1938) The principles of statistical mechanics, Oxford University Press, London, pp: 134-140.
  15. Brush SG (1976) The kind of motion we call heat. North-Holland Publishing Co., Amsterdam, pp: 598-612.
  16. Prigogine I (1961). Introduction to thermodynamics of irreversible processes. New York: Interscience.
  17. Müller I (2007) A History of Thermodynamics-the Doctrine of Energy and Entropy. Springer, Berlin.
  18. Ehrenfest P, Ehrenfest T (1911) Conceptual foundations of the statistical view in mechanics: Encyclopedia of mathematical sciences, including their applications. Leipzig-Teubner, pp: 3-90.
  19. Carathéodory C (1909) Studies on the fundamentals of thermodynamics. Math Ann 67: 355-386.
  20. Von Neumann J (1932) Mathematical Foundations of Quantum Mechanics, Springer, Berlin.
  21. Shannon CE, Weaver W (1964) The mathematical theory of communication. University of Illinois Press, Urbana, Illinois.
  22. Nyquist H (1924) Certain factors affecting telegraph speed. Bell System Technical Journal 3: 324-346.
  23. Hartley RVL (1928) Transmission of Information. Bell System Technical Journal 7: 535-563.
  24. Thomsen SW (2009) Some evidence concerning the genesis of Shannon’s information theory. Stud Hist Philos Sci 40: 81-91.
  25. Cherry C (1978) On human communication. The MIT Press. Cambridge, Massachusetts.
  26. Wicken W (1987) Entropy and information: Suggestions for a common language. Philos Sci 54: 176-193.
  27. Stonier T (1990) Information and the internal structure of the universe. Springer Verlag, London (UK).
  28. Stonier T (1996) Information as a basic property of the universe. BioSystems 38: 135-140.
  29. Schrödinger E (1944) What is life? Cambridge University Press, Cambridge, UK.
  30. Hofkirchner W (1999) The quest of a unified theory of information. World futures general evolution studies, Vol. 13 Gordon and Breach Publishers, Amsterdam, the Netherlands.
  31. Flückiger F (1999) Towards a unified concept of information: Presentation of a new approach. In: The quest of a Unified theory of information” Ed. Wolfgang Hofkirchner. Gordon and Breach Publishers, Amsterdam, the Netherland, pp: 110-112.
  32. Szilard L (1929) On the decrease of entropy in a thermodynamic system by the intervention of intelligent beings. Zeitschrift für Physik 53: 840-856.
  33. Wiener N (1961) Cybernetics or control and communication in the animal and the machine 2nd Edition, MIT press and Wiley, New York and London.
  34. McKay DM (1969) Information mechanism and meaning, The MIT press, Cambridge, Massachusetts and London, England.
  35. Devlin KJ (1992) Logic and information. Cambridge University Press, Cambridge, England.
  36. Berman PR, Le Gouet JL (2009) Quantum-information storage: A schrodinger-picture approach. Phys Rev 79: 2314.
  37. Shor PW (2009) Quantum information theory. The bits don't add up. Nat Phys 5: 247-248.
  38. Machado G et al (2009) Morphological and crystalline studies of isotactic polypropylene plastically deformed and evaluated by small-angle X-ray scattering, scanning electron microscopy and X-ray diffraction. Eur Polym J 45: 700-713.
  39. Guzatto R, Da Roza MB, Denardin ELG, Samios D (2009) Dynamical, morphological and mechanical properties of poly(ethylene terephthalate) deformed by plane strain compression. Polym Test 28: 24-29.
  40. Yanovsky I (2009) Comparing registration methods for mapping brain change using tensor-based morphometry. Med Image Anal 13: 679-700.
  41. Elewa AMT (2004) Morphometrics: Applications in biology and palaeontology. Berlin: Springer.
  42. McLellan T, John AE (1998) The relative success of some methods for measuring and describing the shape of complex objects. System Biol 47: 264-281.
  43. Zelditch M, Swiderski D, Sheets DH, Fink W (2004) Geometric morphometrics for biologists. Academic Press.
  44. Davatzikos C (2004) Why voxel-based morphometric analysis should be used with great caution when characterizing group differences. NeuroImage 23: 17-20.
  45. Gaonkar B, Pohl K, Davatzikos C (2011) Pattern based morphometry. Medical image computing and computer-assisted intervention: MICCAI. Lecture Notes in Computer Science 14: 459-466.
  46. Zhang T, Koutsouleris N, Meisenzahl E, Davatzikos C (2014) Heterogeneity of structural brain changes in subtypes of schizophrenia revealed using MRI pattern analysis. Schizophr Bull 41: 74-84.
  47. Miller MI, Younes L, Trouvé A (2013) Diffeomorphometry and geodesic positioning systems for human anatomy. Technology 2: 36-43.
  48. Kaliontzopoulou A, Adams DC (2016) Phylogenies, the comparative method and the conflation of tempo and mode. System Biol 65: 1-15.
  49. Collyer ML, Sekora DJ, Adams DC (2015) A method for analysis of phenotypic change for phenotypes described by high-dimensional data. Heredity 115: 357-365.
  50. Jaynes ET (1957) Information theory and statistical mechanics. Phys Rev 106: 620.
  51. Landauer R (1961) Irreversibility and heat generation in the computing process. IBM Journal of Research and Development 5: 183-191.
  52. Rényi A (1961) On measures of information and entropy. Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, pp: 547-561.
  53. Tsallis C (1988) Possible generalization of Boltzmann-Gibbs statistics. J Stat Phys 52: 479-487.
  54. Tsallis C (2009) Non-additive entropy: The concept and its use. Eur Phys J A 40: 257-266.
Citation: Samios D (2016) The Relation between Thermodynamics and the Information Theories: The Introduction of the Term Enmorphy. Int J Swarm Intel Evol Comput 5:140.

Copyright: © 2016 Samios D. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.