I’m giving a short 30-minute talk at a workshop on Biological and Bio-Inspired Information Theory at the Banff International Research Station.
I’ll say more about the workshop later, but here’s my talk, in PDF and video form:
• Biodiversity, entropy and thermodynamics.
Most of the people at this workshop study neurobiology and cell signalling, not evolutionary game theory or biodiversity. So, the talk is just a quick intro to some things we’ve seen before here. Starting from scratch, I derive the replicator equation, which describes how the distribution of organisms of different species changes with time. Then I use it to prove a version of the Second Law of Thermodynamics.
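In case it helps, here is the standard form of the replicator equation (the notation here is mine, not necessarily that of the slides). If $p_i(t)$ is the fraction of organisms belonging to the $i$th species at time $t$, and $f_i(p)$ is that species’ fitness, which may depend on the whole distribution $p$, then

$$ \frac{dp_i}{dt} = p_i\Big( f_i(p) - \langle f(p) \rangle \Big), \qquad \langle f(p)\rangle = \sum_j p_j f_j(p). $$

In words: a species grows in relative abundance exactly when its fitness exceeds the mean fitness of the whole population. The ‘Second Law’ in question is a statement about how populations governed by this equation approach a dominant distribution.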
This law says that if there is a ‘dominant distribution’—a distribution of species whose mean fitness is at least as great as that of any population it finds itself amidst—then as time passes, the information any population has ‘left to learn’ always decreases!
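More precisely, stating the standard result in the notation above rather than quoting the slides: if $q$ is a dominant distribution, the information ‘left to learn’ is the relative information, or Kullback–Leibler divergence,

$$ I(q\|p(t)) = \sum_i q_i \ln\frac{q_i}{p_i(t)}, $$

and a short computation using the replicator equation gives

$$ \frac{d}{dt}\, I(q\|p(t)) = \langle f(p)\rangle - \sum_i q_i f_i(p) \;\le\; 0, $$

where the inequality is exactly the statement that $q$’s mean fitness is at least as great as that of the population $p$ it finds itself amidst.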
Of course reality is more complicated, but this result is a good start.
This was proved by Siavash Shahshahani in 1979. For more, see:
• Lou Jost, Entropy and diversity.
• Marc Harper, The replicator equation as an inference dynamic.
• Marc Harper, Information geometry and evolutionary game theory.
The video of John’s talk this morning has just been posted to the BIRS.ca website: http://www.birs.ca/events/2014/5-day-workshops/14w5170/videos/watch/201410291038-Baez.mp4
Thanks!