• John Baez and Blake Pollard, Relative entropy in biological systems. (Blog article here.)

But now Blake Pollard has a new paper, and I want to talk about that:

• Blake Pollard, Open Markov processes: A compositional perspective on non-equilibrium steady states in biology.

I’ll focus on just one aspect: the principle of minimum entropy production.


Since I like abstract generalities, this information-theoretic way of understanding free energy appeals to me.

And of course free energy is *useful*, so an organism should care about it—and we should be able to track what an organism actually does with it. This is one of my main goals: understanding better what it means for a system to ‘do something with free energy’.

In glycolysis, some of the free energy of glucose gets transferred to ATP. ATP is a bit like ‘money’: it carries free energy in a way that the cell can easily ‘spend’ to do interesting things. So, at some point I want to look at an example of how the cell actually spends this money. But for now I want to think about glycolysis—which may be more like ‘cashing a check and getting money’. […]

I am thinking that Landauer’s principle could be used to prove the entropy increase via the heat flow (graph) of a chemical reaction, so that each chemical reaction can give ordered phases, and life = consciousness, through a large reduction of the inner energy (with self-replication as the usual way to obtain order over long times).

I am thinking that if information is a bit string, and each program occupies a number of memory units, then it could be possible to generate computer programs using the Helmholtz free energy as a reservoir of bit changes, with the environment as another reservoir of bit changes: the graphs coding the flow of energy would act like an adaptive program (a thermodynamic computer science).

If there exists an upper limit on the entropy (from the third law of thermodynamics), then Landauer’s principle would no longer hold, and there would be a quantum restriction: some information could not be stored in the thermodynamic system, because every increase in the quantum levels would have an energy greater than the Landauer limit (so that a principle of thermodynamics could turn out to be a quantum principle).

“alwasy infinitesimal stochastic”

The **Jensen–Shannon divergence** of two probability distributions $p$ and $q$ is defined in terms of relative information by

$$\mathrm{JSD}(p,q) = \tfrac{1}{2} I(p,m) + \tfrac{1}{2} I(q,m)$$

where $m$ is the arithmetic mean of the probability distributions $p$ and $q$:

$$m = \tfrac{1}{2}(p + q)$$

The Jensen–Shannon divergence is obviously symmetric in its arguments. More interesting is that its square root is actually a metric on the space of probability distributions! In particular, it obeys the triangle inequality. Even better, it is nonincreasing whenever $p$ and $q$ evolve in time via a Markov process. Without some property like this, a metric on probability distributions is not very interesting to me.
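These properties are easy to check numerically. Here is a minimal sketch in Python (the function names `rel_info` and `jsd` are mine, not from the post), taking relative information to be the Kullback–Leibler divergence:

```python
import math

def rel_info(p, q):
    """Relative information (Kullback-Leibler divergence) I(p, q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    """Jensen-Shannon divergence: average relative information to the mean m."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * rel_info(p, m) + 0.5 * rel_info(q, m)

p = [0.9, 0.1]
q = [0.1, 0.9]
r = [0.5, 0.5]

# Symmetry in the two arguments:
assert abs(jsd(p, q) - jsd(q, p)) < 1e-12

# The square root is a metric, so it obeys the triangle inequality:
d = lambda a, b: math.sqrt(jsd(a, b))
assert d(p, q) <= d(p, r) + d(r, q)
```

Of course a few spot checks are no substitute for the proof, but they make the claims concrete.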

Markov processes are linear. For nonlinear systems like the replicator equation or the rate equation of a chemical reaction network, it seems hard to find any metric where distances between populations decrease as time passes. There are however nice theorems about how relative information decreases, as summarized in this introduction.
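The monotonicity under Markov processes can also be illustrated numerically. In the sketch below (the transition matrix `T` is an arbitrary example of my own choosing), relative information between two distributions never increases as both evolve under the same column-stochastic matrix — an instance of the data processing inequality:

```python
import math

def rel_info(p, q):
    """Relative information (Kullback-Leibler divergence) I(p, q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def step(T, p):
    """One step of a Markov process: p_i -> sum_j T[i][j] p_j."""
    return [sum(T[i][j] * p[j] for j in range(len(p))) for i in range(len(p))]

# Column-stochastic transition matrix: each column sums to 1.
T = [[0.9, 0.2],
     [0.1, 0.8]]

p = [0.99, 0.01]
q = [0.01, 0.99]

prev = rel_info(p, q)
for _ in range(5):
    p, q = step(T, p), step(T, q)
    cur = rel_info(p, q)
    assert cur <= prev + 1e-12   # relative information never increases
    prev = cur
```

The same monotone behavior holds for the Jensen–Shannon divergence, since it is built from relative information to the mean, and the mean also evolves by the same stochastic map.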

If 1-2 sentences could do it, you might add whether or not any of the “other divergences” are more like metrics, and if not, what that means, and if so, why they’re not “better”. (If some divergence is symmetric, maybe nonlinearly rescaling it could also fix the triangle inequality — if not, it would be interesting to understand why.)

• John Baez, Entropic forces, *Azimuth*, 1 February 2012.