Information Processing and Biology

7 November, 2016


The Santa Fe Institute, in New Mexico, is a place for studying complex systems. I’ve never been there! Next week I’ll go there to give a colloquium on network theory, and also to participate in this workshop:

Statistical Mechanics, Information Processing and Biology, November 16–18, Santa Fe Institute. Organized by David Krakauer, Michael Lachmann, Manfred Laubichler, Peter Stadler, and David Wolpert.

Abstract. This workshop will address a fundamental question in theoretical biology: Does the relationship between statistical physics and the need of biological systems to process information underpin some of their deepest features? It recognizes that a core feature of biological systems is that they acquire, store and process information (i.e., perform computation). However, to manipulate information in this way they require a steady flux of free energy from their environments. These two interrelated attributes of biological systems are often taken for granted; they are not part of standard analyses of either the homeostasis or the evolution of biological systems. In this workshop we aim to fill in this major gap in our understanding of biological systems, by gaining deeper insight into the relation between the need for biological systems to process information and the free energy they need to pay for that processing.

The goal of this workshop is to address these issues by focusing on a set of three specific questions: 1) How has the fraction of free energy flux on earth that is used by biological computation changed with time? 2) What is the free energy cost of biological computation or functioning? 3) What is the free energy cost of the evolution of biological computation or functioning? In all of these cases we are interested in the fundamental limits that the laws of physics impose on various aspects of living systems, as expressed by these three questions.
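The abstract doesn't put numbers on any of this, but the usual benchmark for question 2 is Landauer's principle: erasing one bit of information costs at least kT ln 2 of free energy. Here's a quick back-of-the-envelope calculation of that limit (my own illustration, not part of the workshop material):

```python
import math

# Landauer's bound: erasing one bit costs at least k_B * T * ln(2) joules.
k_B = 1.380649e-23  # Boltzmann constant in J/K (exact, 2019 SI definition)
T = 300.0           # roughly room/body temperature, in kelvin

landauer_joules_per_bit = k_B * T * math.log(2)
print(f"Landauer limit at {T} K: {landauer_joules_per_bit:.3e} J per bit")

# For scale: a hypothetical device erasing 10^20 bits per second
# right at the Landauer limit would dissipate this much power.
power_watts = landauer_joules_per_bit * 1e20
print(f"Minimum power for 1e20 bit erasures/s: {power_watts:.3f} W")
```

That's about 3 zeptojoules per bit at room temperature; real computers, and real cells, dissipate many orders of magnitude more, which is part of why the questions above are interesting.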

I think it’s not open to the public, but I will try to blog about it. The speakers include a lot of experts on information theory, statistical mechanics, and biology. Here they are:

Wednesday November 16: Chris Jarzynski, Seth Lloyd, Artemy Kolchinsky, John Baez, Manfred Laubichler, Harold de Vladar, Sonja Prohaska, Chris Kempes.

Thursday November 17: Phil Ball, Matina C. Donaldson-Matasci, Sebastian Deffner, David Wolpert, Daniel Polani, Christoph Flamm, Massimiliano Esposito, Hildegard Meyer-Ortmanns, Blake Pollard, Mikhail Prokopenko, Peter Stadler, Ben Machta.

Friday November 18: Jim Crutchfield, Sara Walker, Hyunju Kim, Takahiro Sagawa, Michael Lachmann, Wojciech Zurek, Christian Van den Broeck, Susanne Still, Chris Stephens.


Compositionality Workshop

1 November, 2016

I’m excited! In early December I’m going to a workshop on ‘compositionality’, meaning how big complex things can be built by sticking together smaller, simpler parts:

Compositionality, December 5–9, workshop at the Simons Institute for the Theory of Computing, Berkeley. Organized by Samson Abramsky, Lucien Hardy and Michael Mislove.

In 2007 Jim Simons, the guy who helped invent Chern–Simons theory and then went on to make billions using math to run a hedge fund, founded a research center for geometry and physics on Long Island. More recently he’s also set up this institute for theoretical computer science, in Berkeley. I’ve never been there before.

‘Compositionality’ sounds like an incredibly broad topic, but since it’s part of a semester-long program on Logical structures in computation, this workshop will be aimed at theoretical computer scientists, who have specific ideas about compositionality. And these theoretical computer scientists tend to like category theory. After all, category theory is about morphisms, which you can compose.

Here’s the idea:

The compositional description of complex objects is a fundamental feature of the logical structure of computation. The use of logical languages in database theory and in algorithmic and finite model theory provides a basic level of compositionality, but establishing systematic relationships between compositional descriptions and complexity remains elusive. Compositional models of probabilistic systems and languages have been developed, but inferring probabilistic properties of systems in a compositional fashion is an important challenge. In quantum computation, the phenomenon of entanglement poses a challenge at a fundamental level to the scope of compositional descriptions. At the same time, compositionality has been proposed as a fundamental principle for the development of physical theories. This workshop will focus on the common structures and methods centered on compositionality that run through all these areas.

So, some physics and quantum computation will get into the mix!

A lot of people working on categories and computation will be at this workshop. Here’s what I know about the talks so far. If you click on the talk titles you’ll get abstracts, at least for most of them.

The program

 

Monday, December 5th, 2016
9 – 9:20 am
Coffee and Check-In
9:20 – 9:30 am
Opening Remarks
9:30 – 10:30 am
10:30 – 11 am
Break
11 – 11:35 am
11:40 am – 12:15 pm
12:20 – 2 pm
Lunch
2 – 2:35 pm
2:40 – 3:15 pm
3:30 – 4 pm
Break
4 – 5 pm
Discussion
5 – 6 pm
Reception

 

Tuesday, December 6th, 2016
9 – 9:30 am
Coffee and Check-In
9:30 – 10:30 am
10:30 – 11 am
Break
11 – 11:35 am
11:40 am – 12 pm
12:05 – 12:25 pm
12:30 – 2 pm
Lunch
2 – 2:35 pm
2:40 – 3:15 pm
3:30 – 4 pm
Break
4 – 5 pm
Discussion

 

Wednesday, December 7th, 2016
9 – 9:30 am
Coffee and Check-In
9:30 – 10:30 am
10:30 – 11 am
Break
11 – 11:20 am
11:25 – 11:45 am
11:50 am – 12:25 pm
12:30 – 2 pm
Lunch

 

Thursday, December 8th, 2016
9 – 9:30 am
Coffee and Check-In
9:30 – 10:05 am
10:10 – 10:30 am
10:35 – 11 am
Break
11 – 11:20 am
11:25 – 11:45 am
11:50 am – 12:10 pm
12 pm – 2 pm
Lunch
2 – 2:35 pm
2:40 – 3:15 pm
3:15 – 3:50 pm
Break
3:50 – 4:25 pm
4:30 – 4:50 pm

 

Friday, December 9th, 2016
9:30 – 10:05 am
10:10 – 10:45 am
10:50 – 11:20 am
Break
11:20 – 11:55 am
12 – 12:35 pm
12:40 – 2 pm
Lunch
2 – 3 pm
Discussion
3 – 3:40 pm

Network Calculus

14 March, 2016

If anyone here can go to this talk and report back, I’d be most grateful! I just heard about it from Jamie Vicary.

Jens Schmitt, Network calculus, Thursday 17 March 2016, 10:00 am, Tony Hoare Room, Robert Hooke Building, the University of Oxford.

Abstract. This talk is about Network Calculus: a recent methodology for providing performance guarantees in concurrent programs, digital circuits, and communication networks. There is a deterministic and a stochastic version of network calculus, providing worst-case and probabilistic guarantees, respectively. The deterministic network calculus has been developed over the last 25 years and is by now a fairly settled methodology; the stochastic network calculus is ten years younger, and while some of the dust has settled there are still many open fundamental research questions. For both, the key innovation over existing methods is that they work with bounds on the arrival and service processes of the system under analysis. This often enables dealing with complex systems where conventional methods become intractable—of course at a loss of exact results, but still with rigorous performance bounds.
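The abstract stays high-level, so here's a sketch (mine, not from the talk) of the kind of bound deterministic network calculus produces. For a flow constrained by a token-bucket arrival curve α(t) = σ + ρt passing through a server that guarantees the rate-latency service curve β(t) = R·max(0, t − T), the classic single-node results bound the worst-case delay by T + σ/R and the worst-case backlog by σ + ρT:

```python
def delay_and_backlog_bounds(sigma, rho, R, T):
    """Deterministic network-calculus bounds for a flow with a
    token-bucket arrival curve alpha(t) = sigma + rho*t, served by a
    rate-latency server beta(t) = R * max(0, t - T).

    Requires rho <= R; otherwise the backlog grows without bound.
    Returns (max_delay, max_backlog).
    """
    if rho > R:
        raise ValueError("arrival rate exceeds service rate: no finite bound")
    max_delay = T + sigma / R      # horizontal deviation between the curves
    max_backlog = sigma + rho * T  # vertical deviation between the curves
    return max_delay, max_backlog

# Example: a bursty flow (burst 10 kb, sustained rate 1 Mb/s) through a
# server guaranteeing 2 Mb/s after at most 5 ms of latency.
d, b = delay_and_backlog_bounds(sigma=10e3, rho=1e6, R=2e6, T=5e-3)
print(f"worst-case delay: {d*1e3:.1f} ms, worst-case backlog: {b/1e3:.1f} kb")
```

These are worst-case guarantees over all arrival processes consistent with the curves, which is exactly the "bounds rather than exact results" trade-off the abstract mentions.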


Higher-Dimensional Rewriting in Warsaw (Part 2)

26 June, 2015

Today I’m going to this workshop:

Higher-Dimensional Rewriting and Applications, 28–29 June 2015, Warsaw, Poland.

Many of the talks will be interesting to people who are trying to use category theory as a tool for modelling networks!

For example, though they can’t actually attend, Lucius Meredith and my student Mike Stay hope to use Google Hangouts to present their work on Higher category models of the π-calculus. The π-calculus is a way of modelling networks where messages get sent here and there, e.g. the internet. Check out Mike’s blog post about this:

• Mike Stay, A 2-categorical approach to the pi calculus, The n-Category Café, 26 May 2015.

Krzysztof Bar, Aleks Kissinger and Jamie Vicary will be speaking about Globular, a proof assistant for computations in n-categories:

This talk is a progress report on Globular, an online proof assistant for semistrict higher-dimensional rewriting. We aim to produce a tool which can visualize higher-dimensional categorical diagrams, assist in their construction with a point-and-click interface, perform type checking to prevent incorrect composites, and automatically handle the interchanger data at each dimension. Hosted on the web, it will have a low barrier to use, and allow hyperlinking of formalized proofs directly from research papers. We outline the theoretical basis for the tool, and describe the challenges we have overcome in its design.

Eric Finster will be talking about another computer system for dealing with n-categories, based on the ‘opetopic’ formalism that James Dolan and I invented. And Jason Morton is working on a computer system for computation in compact closed categories! I’ve seen it, and it’s cool, but he can’t attend the workshop, so David Spivak will be speaking on his work with Jason on the theoretical foundations of this software:

We consider the linked problems of (1) finding a normal form for morphism expressions in a compact closed category and (2) the word problem, that is, deciding if two morphism expressions are equal up to the axioms of a compact closed category. These are important ingredients for a practical monoidal category computer algebra system. Previous approaches to these problems include rewriting and graph-based methods. Our approach is to re-interpret a morphism expression in terms of an operad, and thereby obtain a single composition which is strictly associative and applied according to the abstract syntax tree. This yields the same final operad morphism regardless of the tree representation of the expression or order of execution, and solves the normal form problem up to automorphism.

Recently Eugenia Cheng has been popularizing category theory, touring to promote her book Cakes, Custard and Category Theory. But she’ll be giving two talks in Warsaw, I believe on distributive laws for Lawvere theories.

As for me, I’ll be promoting my dream of using category theory to understand networks in electrical engineering. I’ll be giving a talk on control theory and a talk on electrical circuits: two sides of the same coin, actually.

• John Baez, Jason Erbele and Nick Woods, Categories in control.

If you’ve seen a previous talk of mine with the same title, don’t despair—this one has new stuff! In particular, it talks about a new paper by Nick Woods and Simon Wadsley.

Abstract. Control theory is the branch of engineering that studies dynamical systems with inputs and outputs, and seeks to stabilize these using feedback. Control theory uses “signal-flow diagrams” to describe processes where real-valued functions of time are added, multiplied by scalars, differentiated and integrated, duplicated and deleted. In fact, these are string diagrams for the symmetric monoidal category of finite-dimensional vector spaces, but where the monoidal structure is direct sum rather than the usual tensor product. Jason Erbele has given a presentation for this symmetric monoidal category, which amounts to saying that it is the PROP for bicommutative bimonoids with some extra structure.

A broader class of signal-flow diagrams also includes “caps” and “cups” to model feedback. This amounts to working with a larger symmetric monoidal category where objects are still finite-dimensional vector spaces but the morphisms are linear relations. Erbele also found a presentation for this larger symmetric monoidal category. It is the PROP for a remarkable thing: roughly speaking, an object with two special commutative dagger-Frobenius structures, such that the multiplication and unit of either one and the comultiplication and counit of the other fit together to form a bimonoid.
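Since composing linear relations is less familiar than multiplying matrices, here is a small numerical sketch (my own illustration using NumPy and SciPy, not part of the talk). A relation from ℝ^m to ℝ^n is represented as a subspace of ℝ^(m+n), given by a matrix whose columns span it; composition solves for the matching "middle" vectors:

```python
import numpy as np
from scipy.linalg import null_space, orth

def compose(R, S, m, n, p):
    """Compose linear relations R: R^m -> R^n and S: R^n -> R^p.

    A relation is a matrix whose columns span it as a subspace:
    R is (m+n) x k, S is (n+p) x l.  The composite is
    {(v, w) : there exists u with (v, u) in R and (u, w) in S}.
    """
    A_v, A_u = R[:m, :], R[m:, :]   # split R into its V- and U-parts
    B_u, B_w = S[:n, :], S[n:, :]   # split S into its U- and W-parts
    # Find all coefficient pairs (x, y) with A_u x = B_u y,
    # i.e. the null space of the block matrix [A_u | -B_u]:
    N = null_space(np.hstack([A_u, -B_u]))
    x, y = N[:R.shape[1], :], N[R.shape[1]:, :]
    # The composite is spanned by the pairs (A_v x, B_w y):
    return orth(np.vstack([A_v @ x, B_w @ y]))

# Sanity check: composing the graphs of matrices M and N gives the graph of N M.
M = np.array([[1.0, 2.0], [3.0, 4.0]])
N = np.array([[0.0, 1.0], [1.0, 1.0]])
graph = lambda F: np.vstack([np.eye(F.shape[1]), F])  # columns are (v, F v)
C = compose(graph(M), graph(N), 2, 2, 2)
G = orth(graph(N @ M))
print(np.allclose(C @ C.T, G @ G.T))  # projections onto the two subspaces agree
```

The point of the sanity check is that relations generalize maps: the graph of a linear map is a relation, and composing graphs reproduces matrix multiplication, while relations like caps and cups have no such functional description.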

• John Baez and Brendan Fong, Circuits, categories and rewrite rules.

Abstract. We describe a category where a morphism is an electrical circuit made of resistors, inductors and capacitors, with marked input and output terminals. In this category we compose morphisms by attaching the outputs of one circuit to the inputs of another. There is a functor called the ‘black box functor’ that takes a circuit, forgets its internal structure, and remembers only its external behavior. Two circuits have the same external behavior if and only if they impose the same relation between currents and potentials at their terminals. This is a linear relation, so the black box functor goes from the category of circuits to the category of finite-dimensional vector spaces and linear relations. Constructing this functor makes use of Brendan Fong’s theory of ‘decorated cospans’—and the question of whether two ‘planar’ circuits map to the same relation has an interesting answer in terms of rewrite rules.

The answer to the last question, in the form of a single picture, is this:


How can you change an electrical circuit made out of resistors without changing what it does? Five ways are shown here:

  1. You can remove a loop of wire with a resistor on it. It doesn’t do anything.
  2. You can remove a wire with a resistor on it if one end is unattached. Again, it doesn’t do anything.
  3. You can take two resistors in series—one after the other—and replace them with a single resistor. But this new resistor must have a resistance that’s the sum of the old two.
  4. You can take two resistors in parallel and replace them with a single resistor. But this resistor must have a conductance that’s the sum of the old two. (Conductance is the reciprocal of resistance.)
  5. Finally, the really cool part: the Y-Δ transform. You can replace a Y made of 3 resistors by a triangle of resistors. But their resistances must be related by the equations shown here.
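Rules 3, 4 and 5 are just formulas, so the resistance bookkeeping is easy to sketch in code (my own illustration of the standard formulas):

```python
def series(r1, r2):
    """Two resistors in series: resistances add."""
    return r1 + r2

def parallel(r1, r2):
    """Two resistors in parallel: conductances (1/R) add."""
    return 1.0 / (1.0 / r1 + 1.0 / r2)

def y_to_delta(r1, r2, r3):
    """Y-Delta transform: convert a Y of resistors (r1, r2, r3 meeting at a
    central node) into a triangle (R12, R23, R31) with the same external
    behavior.  Standard formula: R_ij = (r1*r2 + r2*r3 + r3*r1) / r_k,
    where r_k is the Y-resistor attached to the terminal opposite edge ij."""
    s = r1 * r2 + r2 * r3 + r3 * r1
    return s / r3, s / r1, s / r2   # R12, R23, R31

print(series(2.0, 3.0))           # 5.0
print(parallel(2.0, 2.0))         # 1.0
print(y_to_delta(1.0, 1.0, 1.0))  # a symmetric Y becomes a triangle of 3-ohm resistors
```

Note the pleasant duality between rules 3 and 4: series composition is addition of resistances, parallel composition is addition of conductances.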

For circuits drawn on the plane, these are all the rules you need! This was proved here:

• Yves Colin de Verdière, Isidoro Gitler and Dirk Vertigan, Réseaux électriques planaires II.

It’s just the beginning of a cool story, which I haven’t completely blended with the categorical approach to circuits. Doing so clearly calls for 2-categories: those double arrows are 2-morphisms! For more, see:

• Joshua Alman, Carl Lian and Brandon Tran, Circular planar electrical networks I: The electrical poset EPn.


Network Theory in Turin

23 May, 2015

Here are the slides of the talk I’m giving on Monday to kick off the Categorical Foundations of Network Theory workshop in Turin:

Network theory.

This is a long talk, starting with the reasons I care about this subject, and working into the details of one particular project: the categorical foundations of networks as applied to electrical engineering and control theory. There are lots of links in blue; click on them for more details!


Kinetic Networks: From Topology to Design

16 April, 2015

Here’s an interesting conference for those of you who like networks and biology:

Kinetic networks: from topology to design, Santa Fe Institute, 17–19 September, 2015. Organized by Yoav Kallus, Pablo Damasceno, and Sidney Redner.

Proteins, self-assembled materials, virus capsids, and self-replicating biomolecules go through a variety of states on the way to or in the process of serving their function. The network of possible states and possible transitions between states plays a central role in determining whether they do so reliably. The goal of this workshop is to bring together researchers who study the kinetic networks of a variety of self-assembling, self-replicating, and programmable systems to exchange ideas about, methods for, and insights into the construction of kinetic networks from first principles or simulation data, the analysis of behavior resulting from kinetic network structure, and the algorithmic or heuristic design of kinetic networks with desirable properties.


Information and Entropy in Biological Systems (Part 3)

6 April, 2015

I think you can watch live streaming video of our workshop on Information and Entropy in Biological Systems, which runs Wednesday April 8th to Friday April 10th. Later, videos will be made available in a permanent location.

To watch the workshop live, go here. Go down to where it says

Investigative Workshop: Information and Entropy in Biological Systems

Then click where it says live link. There’s nothing there now, but I’m hoping there will be when the show starts!

Below you can see the schedule of talks and a list of participants. The hours are in Eastern Daylight Time: add 4 hours to get Greenwich Mean Time. The talks start at 10 am EDT, which is 2 pm GMT.

Schedule

There will be 1½ hours of talks in the morning and 1½ hours in the afternoon for each of the 3 days, Wednesday April 8th to Friday April 10th. The rest of the time will be for discussions on different topics. We’ll break up into groups, based on what people want to discuss.

Each invited speaker will give a 30-minute talk summarizing the key ideas in some area, not their latest research so much as what everyone should know to start interesting conversations. After that, 15 minutes for questions and/or coffee.

Here’s the schedule. Slides or other material are already available for the talks with links!

Wednesday April 8

• 9:45-10:00 — the usual introductory fussing around.
• 10:00-10:30 — John Baez, Information and entropy in biological systems.
• 10:30-11:00 — questions, coffee.
• 11:00-11:30 — Chris Lee, Empirical information, potential information and disinformation.
• 11:30-11:45 — questions.

• 11:45-1:30 — lunch, conversations.

• 1:30-2:00 — John Harte, Maximum entropy as a foundation for theory building in ecology.
• 2:00-2:15 — questions, coffee.
• 2:15-2:45 — Annette Ostling, The neutral theory of biodiversity and other competitors to the principle of maximum entropy.
• 2:45-3:00 — questions, coffee.
• 3:00-5:30 — break up into groups for discussions.

• 5:30 — reception.

Thursday April 9

• 10:00-10:30 — David Wolpert, The Landauer limit and thermodynamics of biological organisms.
• 10:30-11:00 — questions, coffee.
• 11:00-11:30 — Susanne Still, Efficient computation and data modeling.
• 11:30-11:45 — questions.

• 11:45-1:30 — group photo, lunch, conversations.

• 1:30-2:00 — Matina Donaldson-Matasci, The fitness value of information in an uncertain environment.
• 2:00-2:15 — questions, coffee.
• 2:15-2:45 — Roderick Dewar, Maximum entropy and maximum entropy production in biological systems: survival of the likeliest?
• 2:45-3:00 — questions, coffee.
• 3:00-6:00 — break up into groups for discussions.

Friday April 10

• 10:00-10:30 — Marc Harper, Information transport and evolutionary dynamics.
• 10:30-11:00 — questions, coffee.
• 11:00-11:30 — Tobias Fritz, Characterizations of Shannon and Rényi entropy.
• 11:30-11:45 — questions.

• 11:45-1:30 — lunch, conversations.

• 1:30-2:00 — Christina Cobbold, Biodiversity measures and the role of species similarity.
• 2:00-2:15 — questions, coffee.
• 2:15-2:45 — Tom Leinster, Maximizing biological diversity.
• 2:45-3:00 — questions, coffee.
• 3:00-6:00 — break up into groups for discussions.

Participants

Here are the confirmed participants. This list may change a little bit:

• John Baez – mathematical physicist.

• Romain Brasselet – postdoc in cognitive neuroscience knowledgeable about information-theoretic methods and methods of estimating entropy from samples of probability distributions.

• Katharina Brinck – grad student at the Centre for Complexity Science at Imperial College; did her master’s in John Harte’s lab, where she extended his Maximum Entropy Theory of Ecology (METE) to trophic food webs, to study how entropy maximization on the macro scale, together with maximum entropy production (MEP) on the scale of individuals, drives the structural development of model ecosystems.

• Christina Cobbold – mathematical biologist, has studied the role of species similarity in measuring biodiversity.

• Troy Day – mathematical biologist, works with population dynamics, host-parasite dynamics, etc.; influential and could help move population dynamics to a more information-theoretic foundation.

• Roderick Dewar – physicist who studies the principle of maximal entropy production.

• Barrett Deris – MIT postdoc studying the factors that influence the evolvability of drug resistance in bacteria.

• Charlotte de Vries – a biology master’s student who studied particle physics to the master’s level at Oxford and the Perimeter Institute. Interested in information theory.

• Matina Donaldson-Matasci – a biologist who studies information, uncertainty and collective behavior.

• Chris Ellison – a postdoc who worked with James Crutchfield on “information-theoretic measures of structure and memory in stationary, stochastic systems – primarily, finite state hidden Markov models”. He coauthored “Intersection Information based on Common Randomness” (http://arxiv.org/abs/1310.1538). The idea: “The introduction of the partial information decomposition generated a flurry of proposals for defining an intersection information that quantifies how much of “the same information” two or more random variables specify about a target random variable. As of yet, none is wholly satisfactory.” He also works on mutual information between organisms and environment (along with David Krakauer and Jessica Flack), and on entropy rates.

• Cameron Freer – MIT postdoc in Brain and Cognitive Sciences working on maximum entropy production principles, algorithmic entropy etc.

• Tobias Fritz – a physicist who has worked on “resource theories” and on characterizations of Shannon and Rényi entropy.

• Dashiell Fryer – works with Marc Harper on information geometry and evolutionary game theory.

• Michael Gilchrist – an evolutionary biologist studying how errors and costs of protein translation affect the codon usage observed within a genome. Works at NIMBioS.

• Manoj Gopalkrishnan – an expert on chemical reaction networks who understands entropy-like Lyapunov functions for these systems.

• Marc Harper – works on evolutionary game theory using ideas from information theory, information geometry, etc.

• John Harte – an ecologist who uses the maximum entropy method to predict the structure of ecosystems.

• Ellen Hines – studies habitat modeling and mapping for marine endangered species and ecosystems, sea level change scenarios, and documentation of human use and values. Her lab has used MaxEnt methods.

• Elizabeth Hobson – behavior ecology postdoc developing methods to quantify social complexity in animals. Works at NIMBioS.

• John Jungck – works on graph theory and biology.

• Chris Lee – in bioinformatics and genomics; applies information theory to experiment design and evolutionary biology.

• Maria Leites – works on dynamics, bifurcations and applications of coupled systems of non-linear ordinary differential equations with applications to ecology, epidemiology, and transcriptional regulatory networks. Interested in information theory.

• Tom Leinster – a mathematician who applies category theory to study various concepts of ‘magnitude’, including biodiversity and entropy.

• Timothy Lezon – a systems biologist in the Drug Discovery Institute at Pitt, who has used entropy to characterize phenotypic heterogeneity in populations of cultured cells.

• Maria Ortiz Mancera – statistician working at CONABIO, the National Commission for Knowledge and Use of Biodiversity, in Mexico.

• Yajun Mei – statistician who studies Kullback–Leibler divergence and how to efficiently compute entropy for two-state hidden Markov models.

• Robert Molzon – mathematical economist who has studied deterministic approximation of stochastic evolutionary dynamics.

• David Murrugarra – works on discrete models in mathematical biology; interested in learning about information theory.

• Annette Ostling – studies community ecology, focusing on the influence of interspecific competition on community structure, and what insights patterns of community structure might provide about the mechanisms by which competing species coexist.

• Connie Phong – grad student at Chicago’s Institute for Genomics and Systems Biology, working on how “certain biochemical network motifs are more attuned than others at maintaining strong input to output relationships under fluctuating conditions.”

• Petr Plechak – works on information-theoretic tools for estimating and minimizing errors in coarse-graining stochastic systems. Wrote “Information-theoretic tools for parametrized coarse-graining of non-equilibrium extended systems”.

• Blake Pollard – physics grad student working with John Baez on various generalizations of Shannon and Rényi entropy, and how these entropies change with time in Markov processes and open Markov processes.

• Timothee Poisot – works on species interaction networks; developed a “new suite of tools for probabilistic interaction networks”.

• Richard Reeve – works on biodiversity studies and the spread of antibiotic resistance. Ran a program on entropy-based biodiversity measures at a mathematics institute in Barcelona.

• Rob Shaw – works on entropy and information in biotic and pre-biotic systems.

• Matteo Smerlak – postdoc working on nonequilibrium thermodynamics and its applications to biology, especially population biology and cell replication.

• Susanne Still – a computer scientist who studies the role of thermodynamics and information theory in prediction.

• Alexander Wissner-Gross – Institute Fellow at the Harvard University Institute for Applied Computational Science and Research Affiliate at the MIT Media Laboratory, interested in lots of things.

• David Wolpert – works at the Santa Fe Institute on i) information theory and game theory, ii) the second law of thermodynamics and dynamics of complexity, iii) multi-information source optimization, iv) the mathematical underpinnings of reality, v) evolution of organizations.

• Matthew Zefferman – works on evolutionary game theory, institutional economics and models of gene-culture co-evolution. No work on information, but a postdoc at NIMBioS.