My talk is called “Shannon entropy from category theory”:
Shannon entropy is a powerful concept. But what properties single out Shannon entropy as special? Instead of focusing on the entropy of a probability measure on a finite set, it can help to focus on the “information loss”, or change in entropy, associated with a measure-preserving function. Shannon entropy then gives the only concept of information loss that is functorial, convex-linear and continuous. This is joint work with Tom Leinster and Tobias Fritz.
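To make "information loss" concrete, here is a minimal numerical sketch. In the paper, a measure-preserving function between finite probability spaces pushes a distribution p forward to a distribution q, and the information loss is the entropy drop H(p) − H(q). The function names below are my own, not from the paper:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i in nats, with 0 log 0 = 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

def information_loss(p, f, n_out):
    """Entropy drop H(p) - H(q), where q is the pushforward of p
    along the measure-preserving map f: {0,...,len(p)-1} -> {0,...,n_out-1}."""
    q = [0.0] * n_out
    for i, x in enumerate(p):
        q[f(i)] += x
    return shannon_entropy(p) - shannon_entropy(q)

# Collapsing the uniform distribution on 4 points to a single point
# loses all log(4) nats of information:
p = [0.25, 0.25, 0.25, 0.25]
loss = information_loss(p, lambda i: 0, 1)
```

Functoriality says information loss adds up when you compose such maps; convex-linearity and continuity are the other two conditions that pin down Shannon entropy.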
Tai-Danae Bradley’s talk is called “Operads and entropy”:
This talk will open with a basic introduction to operads and their representations, with the main example being the operad of probabilities. I’ll then give a light sketch of how this framework leads to a small, but interesting, connection between information theory, abstract algebra, and topology, namely a correspondence between Shannon entropy and derivations of the operad of probabilities.
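The correspondence rests on the chain rule for Shannon entropy, which is exactly a derivation-like rule with respect to operadic composition of probability distributions: composing p with distributions q_1, …, q_n gives a distribution with entries p_i·(q_i)_j, and H of the composite is H(p) + Σ p_i H(q_i). A quick numerical check (the helper names here are mine):

```python
import math

def H(p):
    """Shannon entropy in nats, with 0 log 0 = 0."""
    return -sum(x * math.log(x) for x in p if x > 0)

def compose(p, qs):
    """Operadic composition: p a distribution on n points, qs a list of
    n distributions; the composite has entries p[i] * qs[i][j]."""
    return [p[i] * x for i in range(len(p)) for x in qs[i]]

p = [0.5, 0.5]
qs = [[0.2, 0.8], [1.0]]
r = compose(p, qs)           # = [0.1, 0.4, 0.5]
lhs = H(r)
rhs = H(p) + sum(p[i] * H(qs[i]) for i in range(len(p)))
```

Here `lhs` and `rhs` agree, which is the chain rule; Bradley's result packages this as saying entropy behaves like a derivation of the operad of probabilities.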
My talk is mainly about this paper:
• John Baez, Tobias Fritz and Tom Leinster, A characterization of entropy in terms of information loss, 2011.
and hers is mainly about this:
• Tai-Danae Bradley, Entropy as a topological operad derivation, 2021.
Here are some related readings:
• Tom Leinster, An operadic introduction to entropy, 2011.
• John Baez and Tobias Fritz, A Bayesian characterization of relative entropy, 2014.
• Tom Leinster, A short characterization of relative entropy, 2017.
• Nicolas Gagné and Prakash Panangaden, A categorical characterization of relative entropy on standard Borel spaces, 2017.
• Tom Leinster, Entropy and Diversity: The Axiomatic Approach, 2020.
• Arthur Parzygnat, A functorial characterization of von Neumann entropy, 2020.
• Arthur Parzygnat, Towards a functorial description of quantum relative entropy, 2021.