Information Processing in Chemical Networks (Part 2)

13 June, 2017

I’m in Luxembourg, and I’ll be blogging a bit about this workshop:

Dynamics, Thermodynamics and Information Processing in Chemical Networks, 13-16 June 2017, Complex Systems and Statistical Mechanics Group, University of Luxembourg. Organized by Massimiliano Esposito and Matteo Polettini.

I’ll do it in the comments!

I explained the idea of this workshop here:

Information processing in chemical networks.

and now you can see the program here.


Applied Category Theory

6 April, 2017

The American Mathematical Society is having a meeting here at U. C. Riverside during the weekend of November 4th and 5th, 2017. I’m organizing a session on Applied Category Theory, and I’m looking for people to give talks.

The goal is to start a conversation about applications of category theory, not within pure math or fundamental physics, but to other branches of science and engineering—especially those where the use of category theory is not already well-established! For example, my students and I have been applying category theory to chemistry, electrical engineering, control theory and Markov processes.

Alas, we have no funds for travel and lodging. If you’re interested in giving a talk, please submit an abstract here:

General information about abstracts, American Mathematical Society.

More precisely, please read the information there and then click on the link on that page to submit an abstract. It should then magically fly through cyberspace to me! Abstracts are due September 12th, but the sooner you submit one, the greater the chance that we’ll have space.

For the program of the whole conference, go here:

Fall Western Sectional Meeting, U. C. Riverside, Riverside, California, 4–5 November 2017.

We’ll be having some interesting plenary talks:

• Paul Balmer, UCLA, An invitation to tensor-triangular geometry.

• Pavel Etingof, MIT, Double affine Hecke algebras and their applications.

• Monica Vazirani, U.C. Davis, Combinatorics, categorification, and crystals.


Quantifying Biological Complexity

23 January, 2017

Next week I’m going to this workshop:

Biological Complexity: Can It Be Quantified?, 1-3 February 2017, Beyond Center for Fundamental Concepts in Science, Arizona State University, Tempe, Arizona. Organized by Paul Davies.

I haven’t heard that any of it will be made publicly available, but I’ll see if there’s something I can show you. Here’s the schedule:

Wednesday February 1st

9:00 – 9:30 am Paul Davies

Brief welcome address, outline of the subject and aims of the meeting

Session 1. Life: do we know it when we see it?

9:30 – 10:15 am: Chris McKay, “Mission to Enceladus”

10:15 – 10:45 am: Discussion

10:45 – 11:15 am: Tea/coffee break

11:15 – 12:00 pm: Kate Adamala, “Alive but not life”

12:00 – 12:30 pm: Discussion

12:30 – 2:00 pm: Lunch

Session 2. Quantifying life

2:00 – 2:45 pm: Lee Cronin, “The living and the dead: molecular signatures of life”

2:45 – 3:30 pm: Sara Walker, “Can we build a life meter?”

3:30 – 4:00 pm: Discussion

4:00 – 4:30 pm: Tea/coffee break

4:30 – 5:15 pm: Manfred Laubichler, “Complexity is smaller than you think”

5:15 – 5:30 pm: Discussion

The Beyond Annual Lecture

7:00 – 8:30 pm: Sean Carroll, “Our place in the universe”

Thursday February 2nd

Session 3: Life, information and the second law of thermodynamics

9:00 – 9:45 am: James Crutchfield, “Vital bits: the fuel of life”

9:45 – 10:00 am: Discussion

10:00 – 10:45 am: John Baez, “Information and entropy in biology”

10:45 – 11:00 am: Discussion

11:00 – 11:30 am: Tea/coffee break

11:30 – 12:15 pm: Chris Adami, “What is biological information?”

12:15 – 12:30 pm: Discussion

12:30 – 2:00 pm: Lunch

Session 4: The emergence of agency

2:00 – 2:45 pm: Olaf Khang Witkowski, “When do autonomous agents act collectively?”

2:45 – 3:00 pm: Discussion

3:00 – 3:45 pm: William Marshall, “When macro beats micro”

3:45 – 4:00 pm: Discussion

4:00 – 4:30 pm: Tea/coffee break

4:30 – 5:15 pm: Alexander Boyd, “Biology’s demons”

5:15 – 5:30 pm: Discussion

Friday February 3rd

Session 5: New physics?

9:00 – 9:45 am: Sean Carroll, “Laws of complexity, laws of life?”

9:45 – 10:00 am: Discussion

10:00 – 10:45 am: Andreas Wagner, “The arrival of the fittest”

10:45 – 11:00 am: Discussion

11:00 – 11:30 am: Tea/coffee break

11:30 – 12:30 pm: George Ellis, “Top-down causation demands new laws”

12:30 – 2:00 pm: Lunch


Information Processing in Chemical Networks (Part 1)

4 January, 2017

There’s a workshop this summer:

Dynamics, Thermodynamics and Information Processing in Chemical Networks, 13-16 June 2017, Complex Systems and Statistical Mechanics Group, University of Luxembourg. Organized by Massimiliano Esposito and Matteo Polettini.

They write, “The idea of the workshop is to bring in contact a small number of high-profile research groups working at the frontier between physics and biochemistry, with particular emphasis on the role of Chemical Networks.”

The speakers may include John Baez, Sophie de Buyl, Massimiliano Esposito, Arren Bar-Even, Christoph Flamm, Ronan Fleming, Christian Gaspard, Daniel Merkle, Philippe Nge, Thomas Ouldridge, Luca Peliti, Matteo Polettini, Hong Qian, Stefan Schuster, Alexander Skupin, Pieter Rein ten Wolde. I believe attendance is by invitation only, so I’ll endeavor to make some of the ideas presented available here at this blog.

Some of the people involved

I’m looking forward to this, in part because there will be a mix of speakers I’ve met, speakers I know but haven’t met, and speakers I don’t know yet. I feel like reminiscing a bit, and I hope you’ll forgive me these reminiscences, since if you try the links you’ll get an introduction to the interface between computation and chemical reaction networks.

In part 25 of the network theory series here, I imagined an arbitrary chemical reaction network and said:

We could try to use these reactions to build a ‘chemical computer’. But how powerful can such a computer be? I don’t know the answer.

Luca Cardelli answered my question in part 26. This was just my first introduction to the wonderful world of chemical computing. Erik Winfree has a DNA and Natural Algorithms Group at Caltech, practically next door to Riverside, and the people there do a lot of great work on this subject. David Soloveichik, now at U. T. Austin, is an alumnus of this group.
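To give a taste of what a ‘chemical computer’ can do, here is a minimal stochastic simulation of the standard approximate-majority network studied by Angluin, Aspnes and Eisenstat, which Cardelli has analyzed extensively. (This is my own illustrative sketch, not code from any of the linked posts; I drop the simulated time variable since only the final state matters here.)

```python
import random

# Approximate-majority chemical reaction network: species X and Y
# compete, B is "blank". The four reactions, with mass-action rates:
#   X + Y -> X + B      X + Y -> Y + B
#   X + B -> 2X         Y + B -> 2Y
# With high probability the network converges to an all-X or all-Y
# state matching the initial majority -- a chemical "vote".

def approximate_majority(x, y, b=0, seed=0):
    rng = random.Random(seed)
    while True:
        # propensities of the four reactions (all rate constants = 1)
        props = [x * y, x * y, x * b, y * b]
        total = sum(props)
        if total == 0:                # no reaction can fire: consensus
            return x, y, b
        r = rng.uniform(0, total)
        if r < props[0]:                          # X + Y -> X + B
            y, b = y - 1, b + 1
        elif r < props[0] + props[1]:             # X + Y -> Y + B
            x, b = x - 1, b + 1
        elif r < props[0] + props[1] + props[2]:  # X + B -> 2X
            b, x = b - 1, x + 1
        else:                                     # Y + B -> 2Y
            b, y = b - 1, y + 1

x, y, b = approximate_majority(80, 20)
print(x, y, b)   # one species takes over the whole population
```

Note that each reaction preserves the total molecule count, and the only terminal states are all-X or all-Y, so the network really does compute a (probabilistic) majority decision.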

In 2014 I met all three of these folks, and many other cool people working on this theme, at a workshop I tried to summarize here:

Programming with chemical reaction networks, Azimuth, 23 March 2014.

The computational power of chemical reaction networks, 10 June 2014.

Chemical reaction network talks, 26 June 2014.

I met Matteo Polettini about a year later, at a really big workshop on chemical reaction networks run by Elisenda Feliu and Carsten Wiuf:

Trends in reaction network theory (part 1), Azimuth, 27 January 2015.

Trends in reaction network theory (part 2), Azimuth, 1 July 2015.

Polettini has his own blog, very much worth visiting. For example, you can see his view of the same workshop here:

• Matteo Polettini, Mathematical trends in reaction network theory: part 1 and part 2, Out of Equilibrium, 1 July 2015.

Finally, I met Massimiliano Esposito and Christoph Flamm recently at the Santa Fe Institute, at a workshop summarized here:

Information processing and biology, Azimuth, 7 November 2016.

So, I’ve gradually become educated in this area, and I hope that by June I’ll be ready to say something interesting about the semantics of chemical reaction networks. Blake Pollard and I are writing a paper about this now.


Information Processing and Biology

7 November, 2016


The Santa Fe Institute, in New Mexico, is a place for studying complex systems. I’ve never been there! Next week I’ll go there to give a colloquium on network theory, and also to participate in this workshop:

Statistical Mechanics, Information Processing and Biology, November 16–18, Santa Fe Institute. Organized by David Krakauer, Michael Lachmann, Manfred Laubichler, Peter Stadler, and David Wolpert.

Abstract. This workshop will address a fundamental question in theoretical biology: Does the relationship between statistical physics and the need of biological systems to process information underpin some of their deepest features? It recognizes that a core feature of biological systems is that they acquire, store and process information (i.e., perform computation). However, to manipulate information in this way they require a steady flux of free energy from their environments. These two inter-related attributes of biological systems are often taken for granted; they are not part of standard analyses of either the homeostasis or the evolution of biological systems. In this workshop we aim to fill in this major gap in our understanding of biological systems, by gaining deeper insight into the relation between the need for biological systems to process information and the free energy they need to pay for that processing.

The goal of this workshop is to address these issues by focusing on a set of three specific questions: 1) How has the fraction of free energy flux on earth that is used by biological computation changed with time? 2) What is the free energy cost of biological computation or functioning? 3) What is the free energy cost of the evolution of biological computation or functioning? In all of these cases we are interested in the fundamental limits that the laws of physics impose on various aspects of living systems as expressed by these three questions.
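The canonical starting point for such fundamental limits is Landauer’s principle: erasing one bit of information dissipates at least kT ln 2 of free energy. A quick back-of-envelope calculation at body temperature (my own illustrative numbers, not from the workshop):

```python
import math

# Landauer's principle: erasing one bit costs at least k_B * T * ln(2)
# of free energy -- a baseline for the "free energy cost of biological
# computation" asked about in questions 2) and 3).
k_B = 1.380649e-23        # Boltzmann constant, J/K (exact SI value)
T_body = 310.0            # roughly human body temperature, K

landauer_J = k_B * T_body * math.log(2)
print(f"Minimum cost to erase one bit at {T_body} K: {landauer_J:.3e} J")

# For scale: ATP hydrolysis releases roughly 5e-20 J per molecule
# under standard conditions, a few dozen times the Landauer bound.
print(f"ATP hydrolysis / Landauer bound: {5e-20 / landauer_J:.0f}x")
```

So real biochemistry operates one to two orders of magnitude above the thermodynamic floor, which is exactly the kind of gap these workshop questions probe.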

I think it’s not open to the public, but I will try to blog about it. The speakers include a lot of experts on information theory, statistical mechanics, and biology. Here they are:

Wednesday November 16: Chris Jarzynski, Seth Lloyd, Artemy Kolchinski, John Baez, Manfred Laubichler, Harold de Vladar, Sonja Prohaska, Chris Kempes.

Thursday November 17: Phil Ball, Matina C. Donaldson-Matasci, Sebastian Deffner, David Wolpert, Daniel Polani, Christoph Flamm, Massimiliano Esposito, Hildegard Meyer-Ortmanns, Blake Pollard, Mikhail Prokopenko, Peter Stadler, Ben Machta.

Friday November 18: Jim Crutchfield, Sara Walker, Hyunju Kim, Takahiro Sagawa, Michael Lachmann, Wojciech Zurek, Christian Van den Broeck, Susanne Still, Chris Stephens.


Compositionality Workshop

1 November, 2016

I’m excited! In early December I’m going to a workshop on ‘compositionality’, meaning how big complex things can be built by sticking together smaller, simpler parts:

Compositionality, 5-9 December 2016, workshop at the Simons Institute for the Theory of Computing, Berkeley. Organized by Samson Abramsky, Lucien Hardy and Michael Mislove.

In 2007 Jim Simons, the guy who helped invent Chern–Simons theory and then went on to make billions using math to run a hedge fund, founded a research center for geometry and physics on Long Island. More recently he’s also set up this institute for theoretical computer science, in Berkeley. I’ve never been there before.

‘Compositionality’ sounds like an incredibly broad topic, but since it’s part of a semester-long program on Logical structures in computation, this workshop will be aimed at theoretical computer scientists, who have specific ideas about compositionality. And these theoretical computer scientists tend to like category theory. After all, category theory is about morphisms, which you can compose.
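To make that last point concrete, here is the simplest category lurking in everyday programming: types as objects, functions as morphisms. Composition is associative and identity functions act as units, which is all it takes. (A toy illustration of mine, not tied to any talk at the workshop.)

```python
# The category of types and functions: morphisms compose, composition
# is associative, and identities act as units.

def compose(g, f):
    """Morphism composition: (g o f)(x) = g(f(x))."""
    return lambda x: g(f(x))

def identity(x):
    return x

f = lambda n: n + 1        # f : int -> int
g = lambda n: 2 * n        # g : int -> int
h = lambda n: n * n        # h : int -> int

for n in range(5):
    # associativity: h o (g o f) = (h o g) o f
    assert compose(h, compose(g, f))(n) == compose(compose(h, g), f)(n)
    # identity laws: id o f = f = f o id
    assert compose(identity, f)(n) == f(n) == compose(f, identity)(n)

print(compose(h, compose(g, f))(3))   # ((3 + 1) * 2) ** 2 = 64
```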

Here’s the idea:

The compositional description of complex objects is a fundamental feature of the logical structure of computation. The use of logical languages in database theory and in algorithmic and finite model theory provides a basic level of compositionality, but establishing systematic relationships between compositional descriptions and complexity remains elusive. Compositional models of probabilistic systems and languages have been developed, but inferring probabilistic properties of systems in a compositional fashion is an important challenge. In quantum computation, the phenomenon of entanglement poses a challenge at a fundamental level to the scope of compositional descriptions. At the same time, compositionality has been proposed as a fundamental principle for the development of physical theories. This workshop will focus on the common structures and methods centered on compositionality that run through all these areas.

So, some physics and quantum computation will get into the mix!

A lot of people working on categories and computation will be at this workshop. Here’s the schedule as I know it so far.

The program

 

Monday, December 5th, 2016: opening remarks at 9:20 am, talks from 9:30 am to 3:15 pm, discussion 4–5 pm, reception 5–6 pm.

Tuesday, December 6th, 2016: talks from 9:30 am to 3:15 pm, discussion 4–5 pm.

Wednesday, December 7th, 2016: talks from 9:30 am to 12:25 pm.

Thursday, December 8th, 2016: talks from 9:30 am to 4:50 pm.

Friday, December 9th, 2016: talks from 9:30 am to 12:35 pm, discussion 2–3 pm, and a closing talk from 3 to 3:40 pm.

Network Calculus

14 March, 2016

If anyone here can go to this talk and report back, I’d be most grateful! I just heard about it from Jamie Vicary.

Jens Schmitt, Network calculus, Thursday 17 March 2016, 10:00 am, Tony Hoare Room, Robert Hooke Building, the University of Oxford.

Abstract. This talk is about Network Calculus: a recent methodology to provide performance guarantees in concurrent programs, digital circuits, and communication networks. There is a deterministic and a stochastic version of network calculus, providing worst-case and probabilistic guarantees, respectively. The deterministic network calculus has been developed over the last 25 years and is by now a fairly settled methodology; the stochastic network calculus is ten years younger, and while some of the dust has settled there are still many open fundamental research questions. For both, the key innovation over existing methods is that they work with bounds on the arrival and service processes of the system under analysis. This often enables dealing with complex systems where conventional methods become intractable—of course at a loss of exact results, but still with rigorous performance bounds.
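In the deterministic theory, those bounds take the form of an arrival curve α and a service curve β, and the worst-case delay is the ‘horizontal deviation’ between them. For a token-bucket arrival curve α(t) = σ + ρt and a rate-latency server β(t) = R·max(0, t − T), the delay bound is T + σ/R whenever ρ ≤ R. Here is a sketch checking that numerically, with made-up illustrative parameters:

```python
# Deterministic network calculus sketch: the worst-case delay is the
# horizontal deviation  h(alpha, beta) = sup_t inf{ d : alpha(t) <= beta(t + d) }.
# For token-bucket arrivals and a rate-latency server this has the
# closed form T + sigma/R (provided rho <= R).

sigma, rho = 5.0, 1.0      # token bucket: burst sigma, sustained rate rho
R, T = 2.0, 0.5            # server: service rate R, latency T

alpha = lambda t: sigma + rho * t        # arrival curve
beta = lambda t: R * max(0.0, t - T)     # service curve

def delay_bound(alpha, beta, horizon=10.0, step=0.01):
    """Horizontal deviation between alpha and beta, on a grid."""
    worst = 0.0
    t = 0.0
    while t <= horizon:
        d = 0.0
        while beta(t + d) < alpha(t):    # smallest d serving alpha(t)
            d += step
        worst = max(worst, d)
        t += step
    return worst

numeric = delay_bound(alpha, beta)
closed_form = T + sigma / R
print(numeric, closed_form)   # both close to 3.0
```

The numeric search agrees with the closed form up to the grid resolution; in real tools the bound is computed symbolically from the curve parameters rather than by scanning like this.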