



There is a physical quantity closely linked to free energy (free enthalpy), with a unit of entropy and isomorphic to negentropy as known in statistics and information theory. In 1873, Willard Gibbs created a diagram illustrating the concept of free energy corresponding to free enthalpy. On the diagram one can see the quantity called capacity for entropy. This quantity is the amount by which the entropy may be increased without changing the internal energy or increasing the volume. In other words, it is the difference between the maximum possible entropy, under the assumed conditions, and the actual entropy. It corresponds exactly to the definition of negentropy adopted in statistics and information theory. A similar physical quantity was introduced in 1869 by Massieu for the isothermal process (the two quantities differ only in sign) and then by Planck for the isothermal-isobaric process. More recently, the Massieu–Planck thermodynamic potential, known also as free entropy, has been shown to play a great role in the so-called entropic formulation of statistical mechanics, applied among others in molecular biology and thermodynamic non-equilibrium processes.

J = S_max − S = −Φ = −k ln Z

where S is entropy, J is negentropy (Gibbs' capacity for entropy), Φ is the Massieu potential, Z is the partition function, and k is the Boltzmann constant.
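The idea that negentropy is the gap between the maximum possible entropy and the actual entropy can be illustrated numerically. Below is a minimal Python sketch in information-theoretic terms (Shannon entropy of a discrete distribution, with S_max taken as the entropy of the uniform distribution over the same outcomes); the function name `negentropy` is illustrative, not a standard API:

```python
import math

def negentropy(p):
    """Illustrative negentropy J = S_max - S of a discrete distribution p:
    the gap between the maximum possible entropy (uniform distribution
    over the same number of outcomes) and the actual Shannon entropy."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    s = -sum(pi * math.log(pi) for pi in p if pi > 0)  # actual entropy S (nats)
    s_max = math.log(len(p))                           # S_max for len(p) outcomes
    return s_max - s

# A uniform distribution has (numerically) zero negentropy: no order.
print(negentropy([0.25, 0.25, 0.25, 0.25]))
# A peaked distribution has positive negentropy: it carries structure.
print(negentropy([0.7, 0.1, 0.1, 0.1]))
```

The more concentrated the distribution, the larger J, matching the reading of negentropy as a measure of order or distance from equilibrium.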
