
This is my talk for the workshop Biological Complexity: Can It Be Quantified?
• John Baez, Biology as information dynamics, 2 February 2017.
Abstract. If biology is the study of self-replicating entities, and we want to understand the role of information, it makes sense to see how information theory is connected to the ‘replicator equation’—a simple model of population dynamics for self-replicating entities. The relevant concept of information turns out to be the information of one probability distribution relative to another, also known as the Kullback–Leibler divergence. Using this we can get a new outlook on free energy, see evolution as a learning process, and give a clean general formulation of Fisher’s fundamental theorem of natural selection.
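To fix notation for what follows (this is my paraphrase of the standard definitions, not text from the slides): write $P_i$ for the population of the $i$th kind of replicator, $f_i$ for its fitness, and $p_i = P_i / \sum_j P_j$ for the fraction of the whole population it makes up. Then the replicator equation, its normalized form for the probabilities $p_i$, and the relative entropy mentioned in the abstract are

\[
\frac{d P_i}{d t} = f_i \, P_i
\quad\Longrightarrow\quad
\frac{d p_i}{d t} = \big( f_i - \langle f \rangle \big)\, p_i ,
\qquad
\langle f \rangle = \sum_j f_j \, p_j ,
\]

\[
I(q,p) = \sum_i q_i \ln \frac{q_i}{p_i} .
\]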
For more, read:
• Marc Harper, The replicator equation as an inference dynamic.
• Marc Harper, Information geometry and evolutionary game theory.
• Barry Sinervo and Curt M. Lively, The rock-paper-scissors game and the evolution of alternative male strategies, Nature 380 (1996), 240–243.
• John Baez, Diversity, entropy and thermodynamics.
• John Baez, Information geometry.
The last reference contains proofs of the equations shown in red in my slides.
In particular, Part 16 contains a proof of my updated version of Fisher’s fundamental theorem.
Thank you very much! I didn’t know about the Fisher information metric! It will be my weekend read. Do you think it can somehow be related to the Ryu–Takayanagi formula?
I don’t know the Ryu–Takayanagi formula. The Fisher information metric is fundamental to information geometry, meaning the geometry of the space of probability distributions, or more general probability measures. I wrote about it in this series:
• Information geometry.
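Here is a minimal sketch of the definition in the finite-dimensional setting relevant here (my summary, not a quotation from that series). For a smooth family of probability distributions $p(\theta)$ on a finite set, the Fisher information metric is

\[
g_{ij}(\theta) = \sum_x p_x(\theta)\,
\frac{\partial \ln p_x(\theta)}{\partial \theta^i}\,
\frac{\partial \ln p_x(\theta)}{\partial \theta^j} ,
\]

and if we use the probabilities $p_1, \dots, p_n$ themselves as coordinates on the simplex, this reduces to $g_{ij} = \delta_{ij}/p_i$, so a path $p(t)$ has squared speed $\sum_i \dot p_i(t)^2 / p_i(t)$.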
The word “distribution” is repeated on slide 12.
Thanks! I fixed it, but it will take some time for the fixed version to appear, since (unusually) this airport wifi is not letting me use WinSCP to transfer files.
Okay, the fixed version is up now—and I’ve put the most important equations in cute red letters.
By the way, I’m very happy with my statement of Fisher’s fundamental theorem of natural selection—but I’ve never seen it anywhere else, though I’m having trouble imagining it’s new. It’s on the second to last page of my talk.
Ronald Fisher’s original statement and proof became famous for their obscurity. Quoth Wikipedia:
Price’s paper is here:
• George R. Price, Fisher’s ‘fundamental theorem’ made clear, Annals of Human Genetics 36 (1972), 129–140.
I don’t find it very clear, perhaps because I didn’t spend enough time on it.
My result is a theorem in the mathematical sense, though quite an easy one. I assume a population distribution evolves according to the replicator equation and derive an equation whose right-hand side matches that of Fisher’s original equation: the variance of the fitness.
But my left-hand side is different: it’s the square of the speed of the corresponding probability distribution, where speed is measured using the ‘Fisher information metric’. This metric was discovered by the same Fisher, but I don’t think he used it in his work on the fundamental theorem.
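Concretely, here is the computation as I understand it, using the Fisher metric on the simplex in the form $g_{ij} = \delta_{ij}/p_i$; the careful statement and proof are in Part 16 of the Information geometry series linked above. If $p(t)$ obeys the replicator equation $\dot p_i = (f_i - \langle f \rangle)\, p_i$, then

\[
\| \dot p \|^2
= \sum_i \frac{\dot p_i^{\,2}}{p_i}
= \sum_i p_i \big( f_i - \langle f \rangle \big)^2
= \mathrm{Var}(f) :
\]

the squared speed of the probability distribution, measured in the Fisher information metric, is the variance of the fitness.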
Something similar to my statement appears as Theorem 2 of Marc Harper’s paper:
• Marc Harper, Information geometry and evolutionary game theory.
and for that theorem he cites:
• Josef Hofbauer and Karl Sigmund, Evolutionary Games and Population Dynamics, Cambridge University Press, Cambridge, 1998.
However, his Theorem 2 assumes that the probability distribution flows along the gradient of a function, and I’m not assuming that: indeed, my version applies to the rock-paper-scissors game where the probability distribution of players moves round and round!
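For anyone who wants to see this numerically, here is a small sketch in Python (the payoff matrix, step size and initial condition are illustrative choices of mine, not taken from the talk). It integrates the replicator equation for the usual zero-sum rock-paper-scissors game and checks at each step that the squared speed in the Fisher metric equals the variance of the fitness, even though the distribution just circulates around $(1/3, 1/3, 1/3)$ instead of flowing down a gradient:

import numpy as np

# Zero-sum rock-paper-scissors payoff matrix: each strategy beats one
# opponent and loses to another.  (Illustrative choice, not from the talk.)
A = np.array([[ 0.0, -1.0,  1.0],
              [ 1.0,  0.0, -1.0],
              [-1.0,  1.0,  0.0]])

def replicator_rhs(p):
    """Right-hand side of the replicator equation: dp_i/dt = p_i (f_i - <f>)."""
    f = A @ p                  # fitness of each strategy against the current mix
    return p * (f - p @ f)     # p @ f is the mean fitness

p = np.array([0.5, 0.3, 0.2])  # initial probability distribution
dt = 1e-3                      # crude forward-Euler step, fine for a picture

for _ in range(20_000):
    pdot = replicator_rhs(p)

    # The theorem in the form stated in the talk: squared speed in the
    # Fisher information metric = variance of the fitness.
    f = A @ p
    speed_sq = np.sum(pdot**2 / p)
    var_f = p @ (f - p @ f)**2
    assert np.isclose(speed_sq, var_f)

    p = p + dt * pdot

print(p)   # still circling (1/3, 1/3, 1/3): the motion never settles down

Forward Euler slowly drifts outward for this conservative system, but over a short run like this it stays well inside the simplex; the point is the step-by-step identity, not long-run accuracy.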