Here are two talks on the categorical semantics of entropy, given on Wednesday May 11th 2022 at CUNY. First there’s one by me, and then starting around 1:31:00 there’s one by Tai-Danae Bradley:
My talk is called “Shannon entropy from category theory”:
Shannon entropy is a powerful concept. But what properties single out Shannon entropy as special? Instead of focusing on the entropy of a probability measure on a finite set, it can help to focus on the “information loss”, or change in entropy, associated with a measure-preserving function. Shannon entropy then gives the only concept of information loss that is functorial, convex-linear and continuous. This is joint work with Tom Leinster and Tobias Fritz.
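To make the quantities in this abstract concrete (this is just an illustrative sketch, not the categorical machinery of the paper), here is a small Python example: a measure-preserving function pushes a probability distribution forward, and the “information loss” is the drop in Shannon entropy. The function names `entropy` and `pushforward` are my own, chosen for illustration.

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a finite probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def pushforward(p, f, n_out):
    """Push the distribution p on {0,...,len(p)-1} forward along
    f: {0,...,len(p)-1} -> {0,...,n_out-1}, giving a distribution q
    with q[j] = sum of p[i] over all i with f(i) = j."""
    q = [0.0] * n_out
    for i, x in enumerate(p):
        q[f(i)] += x
    return q

p = [0.25, 0.25, 0.25, 0.25]     # uniform measure on a 4-element set
f = lambda i: i // 2             # merge the points pairwise: 4 elements -> 2
q = pushforward(p, f, 2)         # q = [0.5, 0.5]

loss = entropy(p) - entropy(q)   # information loss: 2 bits - 1 bit = 1 bit
```

Merging the four equally likely outcomes pairwise destroys exactly one bit of information, which is what the change in entropy records.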
Tai-Danae Bradley’s is called “Operads and entropy”:
This talk will open with a basic introduction to operads and their representations, with the main example being the operad of probabilities. I’ll then give a light sketch of how this framework leads to a small, but interesting, connection between information theory, abstract algebra, and topology, namely a correspondence between Shannon entropy and derivations of the operad of probabilities.
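A quick way to see the “derivation” flavor of this correspondence (a hedged sketch in my own notation, not the operadic formalism of the talk) is the chain rule for Shannon entropy: substituting distributions q₁,…,qₙ into the slots of a distribution p gives a composite whose entropy is H(p) + Σᵢ pᵢ H(qᵢ), which looks like a Leibniz rule for the operadic composition. A small numeric check:

```python
import math

def H(p):
    """Shannon entropy (natural log) of a finite probability distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

def compose(p, qs):
    """Operadic composition of probabilities: substitute q_i into slot i of p,
    yielding the distribution (p_1*q_1[0], ..., p_1*q_1[-1], p_2*q_2[0], ...)."""
    return [pi * x for pi, q in zip(p, qs) for x in q]

p  = [0.5, 0.5]
qs = [[0.2, 0.8], [1/3, 1/3, 1/3]]

lhs = H(compose(p, qs))                              # entropy of the composite
rhs = H(p) + sum(pi * H(q) for pi, q in zip(p, qs))  # chain rule / Leibniz form
assert abs(lhs - rhs) < 1e-9
```

The two sides agree for any choice of p and the qᵢ; this identity is the algebraic seed of the correspondence between entropy and operad derivations.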
My talk is mainly about this paper:
• John Baez, Tobias Fritz and Tom Leinster, A characterization of entropy in terms of information loss, 2011.
and hers is mainly about this:
• Tai-Danae Bradley, Entropy as a topological operad derivation, 2021.
Here are some related readings:
• Tom Leinster, An operadic introduction to entropy, 2011.
• John Baez and Tobias Fritz, A Bayesian characterization of relative entropy, 2014.
• Tom Leinster, A short characterization of relative entropy, 2017.
• Nicolas Gagné and Prakash Panangaden, A categorical characterization of relative entropy on standard Borel spaces, 2017.
• Tom Leinster, Entropy and Diversity: The Axiomatic Approach, 2020.
• Arthur Parzygnat, A functorial characterization of von Neumann entropy, 2020.
• Arthur Parzygnat, Towards a functorial description of quantum relative entropy, 2021.
I’m giving my talk again on Thursday, June 16th, at 18:00 Lisbon time, which is 17:00 UTC or 10:00 in California:
• Shannon entropy from category theory, Lisbon Webinar on Mathematics, Physics and Machine Learning.
The link here is a Zoom link which will work when the talk actually starts.
I still think back sometimes on our conversation about drought vs. floods in a hotter world. We’re having serious droughts in the western US, but floods quite often in the east.
A CSIRO climate report said that air can hold 7% more water for every degree of warming. But climate change affects different places differently, and some will be drier. Climate models are claimed to show which places those will be; they say southern Australia, for instance. Yet we’ve had 2 very wet years after 9 years of drought, so I wonder how much of a handle they have on the climate. Lake sediments show that 500 years ago eastern Australia had a drought that lasted 80 years.
Thanks! The ability of hot air to hold more water is a double-edged sword: it also means that the soil dries out more in hot weather. We see that a lot here in the deserts of Southern California.