Compositionality in Network Theory

29 November, 2016

Here are the slides of my talk at the workshop on compositionality at the Simons Institute for the Theory of Computing next week:

• John Baez, Compositionality in network theory, 6 December 2016.

Abstract. To describe systems composed of interacting parts, scientists and engineers draw diagrams of networks: flow charts, Petri nets, electrical circuit diagrams, signal-flow graphs, chemical reaction networks, Feynman diagrams and the like. In principle all these different diagrams fit into a common framework: the mathematics of symmetric monoidal categories. This has been known for some time. However, the details are more challenging, and ultimately more rewarding, than this basic insight. Two complementary approaches are presentations of symmetric monoidal categories using generators and relations (which are more algebraic in flavor) and decorated cospan categories (which are more geometrical). In this talk we focus on the latter.
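Roughly speaking, here is the shape of the 'geometrical' approach. A cospan in a category with finite colimits is a diagram

\displaystyle{ X \rightarrow S \leftarrow Y }

and two cospans X \rightarrow S \leftarrow Y and Y \rightarrow T \leftarrow Z compose by taking a pushout of S \leftarrow Y \rightarrow T, yielding a cospan from X to Z. In a 'decorated' cospan the apex S is additionally equipped with extra structure (a circuit, a graph, a Petri net) via a suitable functor; the slides make this precise.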

This talk assumes considerable familiarity with category theory. For a much gentler talk on the same theme, see:

Monoidal categories of networks.


Compositional Frameworks for Open Systems

27 November, 2016


Here are the slides of Blake Pollard’s talk at the Santa Fe Institute workshop on Statistical Mechanics, Information Processing and Biology:

• Blake Pollard, Compositional frameworks for open systems, 17 November 2016.

He gave a really nice introduction to how we can use categories to study open systems, with his main example being ‘open Markov processes’, where probability can flow in and out of the set of states. People liked it a lot!
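To make the idea concrete in code, here is a toy numerical sketch of an open Markov process: two internal states evolving by a master equation, with probability injected at one state and leaking out at another. The rates below are arbitrary choices of mine for illustration, not an example from Blake's talk.

```python
import numpy as np

# Toy 'open Markov process': two internal states with rate matrix H,
# plus probability flowing in at state 0 and leaking out at state 1.
# dp/dt = H p + inflow - leak * p; total probability need not be conserved.
H = np.array([[-1.0, 0.5],
              [ 1.0, -0.5]])   # columns sum to zero: closed dynamics alone
inflow = np.array([0.2, 0.0])  # constant injection at state 0
leak = np.array([0.0, 0.3])    # leak rate out of state 1

p = np.array([1.0, 0.0])
dt = 0.001
for _ in range(50_000):        # forward Euler, integrate to t = 50
    p = p + dt * (H @ p + inflow - leak * p)

# Steady state is approximately (0.53, 0.67); the total is about 1.2,
# so probability is genuinely not conserved for the open system.
print(p, p.sum())
```
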



Under2 Coalition

24 November, 2016

I’ve been thinking hard about climate change since at least 2010. That’s why I started this blog. But for the last couple of years I’ve focused on basic research in network theory as a preliminary step toward green mathematics. Basic research is what I’m best at, and there are plenty of people working on the more immediate, more urgent aspects of climate change.

Indeed, after the Paris Agreement, I started hoping that politicians were taking this issue seriously and that we’d ultimately deal with it—even though I knew this agreement was not itself enough to keep warming below 2° C:

There is a troubling paradox at the heart of climate policy. On the one hand, nobody can doubt the historic success of the Paris Agreement. On the other hand, everybody willing to look can see the impact of our changing climate. People already face rising seas, expanding desertification and coastal erosion. They take little comfort from agreements to adopt mitigation measures and finance adaptation in the future. They need action today.

That is why the Emissions Gap Report tracks our progress in restricting global warming to 1.5 – 2 degrees Celsius above pre-industrial levels by the end of this century. This year’s data shows that overall emissions are still rising, but more slowly, and in the case of carbon dioxide, hardly at all. The report foresees further reductions in the short term and increased ambition in the medium term. Make no mistake; the Paris Agreement will slow climate change. The recent Kigali Amendment to the Montreal Protocol will do the same.

But not enough: not nearly enough and not fast enough. This report estimates we are actually on track for global warming of up to 3.4 degrees Celsius. Current commitments will reduce emissions by no more than a third of the levels required by 2030 to avert disaster. The Kigali Amendment will take off 0.5 degrees Celsius, although not until well after 2030. Action on short-lived climate pollutants, such as black carbon, can take off a further 0.5 degrees Celsius. This means we need to find another one degree from somewhere to meet the stronger, and safer, target of 1.5 degrees Celsius warming.

So, we must take urgent action. If we don’t, we will mourn the loss of biodiversity and natural resources. We will regret the economic fallout. Most of all, we will grieve over the avoidable human tragedy; the growing numbers of climate refugees hit by hunger, poverty, illness and conflict will be a constant reminder of our failure to deliver.

That’s from an annual report put out by the United Nations Environment Programme, or UNEP:

• United Nations Environment Programme, The Emissions Gap Report 2016.

As this report makes clear, we can bridge the gap and keep global warming below 2° C, if we work very hard.

But my limited optimism was shaken by the US presidential election, and especially by the choice of Myron Ebell to head the ‘transition team’ for the Environmental Protection Agency. For the US government to dismantle the Clean Power Plan and abandon the Paris Agreement would seriously threaten the fight against climate change.

Luckily, people already recognize that even with the Paris Agreement, a lot of work must happen at the ‘subnational’ level. This work will go on even if the US federal government gives up. So I want to learn more about it, and get involved somehow.

This is where the Under2 Coalition comes in.

The Under2 Coalition

California, Connecticut, Minnesota, New Hampshire, New York, Oregon, Rhode Island, Vermont and Washington have signed onto a spinoff of the Paris Climate Agreement. It’s called the Under2 Memorandum of Understanding, or Under2 MOU for short.

“Under 2” stands for two goals:

• under 2 degrees Celsius of global warming, and
• under 2 tonnes of carbon dioxide emitted per person per year.

These states have agreed to cut greenhouse gas emissions to 80-95% below 1990 levels by 2050. They’ve also agreed to share technology and scientific research, expand use of zero-emission vehicles, etc., etc.

And it’s not just US states that are involved in this! A total of 165 jurisdictions in 33 countries on six continents have signed or endorsed the Under2 MOU. Together, they form the Under2 Coalition. They represent more than 1.08 billion people and $25.7 trillion in GDP, more than a third of the global economy:

Under2 Coalition.

I’ll list the members, starting with ones near the US. If you go to the link you can find out exactly what each of these ‘sub-national entities’ is promising to do. In a future post, I’ll say more about the details, since I want Riverside to join this coalition. Jim Stuttard has already started a page about Birmingham, a UK city that is not a member of the Under2 Coalition but has done a lot of work to figure out how to cut carbon emissions:

• Azimuth Wiki, Birmingham Green Commission.

This sort of information will be useful for other cities.

UNITED STATES

Austin
California
Connecticut
Los Angeles
Massachusetts
Minnesota
New Hampshire
New York City
New York State
Oakland City
Oregon
Portland City
Rhode Island
Sacramento
San Francisco
Seattle
Vermont
Washington

CANADA

British Columbia
Northwest Territories
Ontario
Québec
Vancouver City

MEXICO

Baja California
Chiapas
Hidalgo
Jalisco
Mexico City
Mexico State
Michoacán
Quintana Roo
Tabasco
Yucatán

BRAZIL

Acre
Amazonas
Mato Grosso
Pernambuco
Rondônia
São Paulo City
São Paulo State
Tocantins

CHILE

Santiago City

COLOMBIA

Guainía
Guaviare

PERU

Loreto
San Martín
Ucayali

AUSTRIA

Lower Austria

FRANCE

Alsace
Aquitaine
Auvergne-Rhône-Alpes
Bas-Rhin
Midi-Pyrénées
Pays de la Loire

GERMANY

Baden-Württemberg
Bavaria
Hesse
North Rhine-Westphalia
Schleswig-Holstein
Thuringia

HUNGARY

Budapest

ITALY

Abruzzo
Basilicata
Emilia-Romagna
Lombardy
Piedmont
Sardinia
Veneto

THE NETHERLANDS

Drenthe
North Brabant
North Holland
South Holland

PORTUGAL

Azores
Madeira

SPAIN

Andalusia
Basque Country
Catalonia
Navarra

SWEDEN

Jämtland Härjedalen

SWITZERLAND

Basel-Landschaft
Basel-Stadt

UNITED KINGDOM

Bristol
Greater Manchester
Scotland
Wales

AUSTRALIA

Australian Capital Territory (ACT)
South Australia

CHINA

Alliance of Peaking Pioneer Cities (represents 23 cities)
Jiangsu Province
Sichuan
Zhenjiang City

INDIA

Telangana

INDONESIA

East Kalimantan
South Sumatra
West Kalimantan

JAPAN

Gifu

NEPAL

Kathmandu Valley

KENYA

Laikipia County

IVORY COAST

Assemblée des Régions de Côte d’Ivoire (represents 33 subnationals)

NIGERIA

Cross River State

MOZAMBIQUE

Nampula

SENEGAL

Guédiawaye


Jarzynski on Non-Equilibrium Statistical Mechanics

18 November, 2016


Here at the Santa Fe Institute we’re having a workshop on Statistical Mechanics, Information Processing and Biology. Unfortunately the talks are not being videotaped, so it’s up to me to spread the news of what’s going on here.

Christopher Jarzynski is famous for discovering the Jarzynski equality. It says

\displaystyle{ e^ { -\Delta F / k T} = \langle e^{ -W/kT } \rangle }

where k is Boltzmann’s constant and T is the temperature of a system that’s in equilibrium before some work is done on it. \Delta F is the change in free energy, W is the amount of work, and the angle brackets represent an average over the possible options for what takes place—this sort of process is typically nondeterministic.
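The equality is easy to check numerically in a case where everything is exactly solvable: an instantaneous quench of a harmonic potential. The following sketch (my own illustration, not from Jarzynski's talk) samples the initial equilibrium, computes the work done by a sudden stiffness change, and compares the two sides:

```python
import numpy as np

# Jarzynski equality check for an instantaneous quench of a harmonic
# potential U(x) = kappa * x^2 / 2 (stiffness kappa1 -> kappa2).
# Work done on the particle in a sudden quench: W = (kappa2 - kappa1) x^2 / 2,
# with x drawn from the initial equilibrium (Gaussian) distribution.
kT = 1.0
kappa1, kappa2 = 1.0, 4.0

rng = np.random.default_rng(0)
x = rng.normal(0.0, np.sqrt(kT / kappa1), size=500_000)  # equilibrium samples
W = 0.5 * (kappa2 - kappa1) * x**2

# Exact free energy change: F = -kT ln Z with Z proportional to
# sqrt(kT / kappa), so Delta F = (kT/2) ln(kappa2 / kappa1).
dF = 0.5 * kT * np.log(kappa2 / kappa1)

lhs = np.exp(-dF / kT)          # e^{-Delta F / kT} = 0.5 here
rhs = np.mean(np.exp(-W / kT))  # <e^{-W/kT}>, estimated by sampling

print(lhs, rhs)  # the two sides agree to within sampling error
```

Note that the average work ⟨W⟩ exceeds ΔF, consistent with the Second Law; the equality holds only for the exponential average, which is dominated by rare low-work trajectories.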

We’ve seen a good quick explanation of this equation here on Azimuth:

• Eric Downes, Crooks’ Fluctuation Theorem, Azimuth, 30 April 2011.

We’ve also gotten a proof, where it was called the ‘integral fluctuation theorem’:

• Matteo Smerlak, The mathematical origin of irreversibility, Azimuth, 8 October 2012.

It’s a fundamental result in nonequilibrium statistical mechanics—a subject where inequalities are so common that this equation is called an ‘equality’.

Two days ago, Jarzynski gave an incredibly clear hour-long tutorial on this subject, starting with the basics of thermodynamics and zipping forward to modern work. With his permission, you can see the slides here:

• Christopher Jarzynski, A brief introduction to the delights of non-equilibrium statistical physics.

Also try this review article:

• Christopher Jarzynski, Equalities and inequalities: irreversibility and the Second Law of thermodynamics at the nanoscale, Séminaire Poincaré XV Le Temps (2010), 77–102.


Algorithmic Thermodynamics (Part 3)

15 November, 2016


This is my talk for the Santa Fe Institute workshop on Statistical Mechanics, Information Processing and Biology:

Computation and thermodynamics.

It’s about the link between computation and entropy. I take the idea of a Turing machine for granted, but starting with that I explain recursive functions, the Church-Turing thesis, Kolmogorov complexity, the relation between Kolmogorov complexity and Shannon entropy, the uncomputability of Kolmogorov complexity, the ‘complexity barrier’, Levin’s computable version of complexity, and finally my work with Mike Stay on algorithmic thermodynamics.
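One fact worth keeping in mind here: although Kolmogorov complexity is uncomputable, upper bounds on it are easy, since the length of any compressed encoding of a string bounds its complexity from above, up to an additive constant for the decompressor. Here is a minimal illustration of that idea using off-the-shelf compression, my own example rather than anything from the talk:

```python
import random
import zlib

# The length of zlib's output is a computable *upper bound* on Kolmogorov
# complexity (up to an additive constant for the decompressor): the true
# complexity can never be much larger, but it may be far smaller.
random.seed(0)
structured = b"ab" * 5000                                   # highly regular
noise = bytes(random.getrandbits(8) for _ in range(10000))  # incompressible

print(len(zlib.compress(structured)))  # tiny: the regularity is exploited
print(len(zlib.compress(noise)))       # near 10000: no pattern to exploit
```
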

In my talk slides I mention the ‘complexity barrier’, and state this theorem:

Theorem. Choose your favorite set of axioms for math. If it’s finite and consistent, there exists C ≥ 0, the complexity barrier, such that for no natural number n can you prove the Kolmogorov complexity of n exceeds C.

For a sketch of the proof of this result, go here:

Chaitin’s incompleteness theorem.

In my talk I showed a movie related to this: an animated video created in 2009 using a program less than 4 kilobytes long that runs on a Windows XP machine.

For more

For more details, read our paper:

• John Baez and Mike Stay, Algorithmic thermodynamics, Math. Struct. Comp. Sci. 22 (2012), 771-787.

or these blog articles:

Algorithmic thermodynamics (part 1).

Algorithmic thermodynamics (part 2).

They all emphasize slightly different aspects!


Monoidal Categories of Networks

12 November, 2016

Here are the slides of my colloquium talk at the Santa Fe Institute at 11 am on Tuesday, November 15th. I’ll explain some not-yet-published work with Blake Pollard on a monoidal category of ‘open Petri nets’:

Monoidal categories of networks.

Nature and the world of human technology are full of networks. People like to draw diagrams of networks: flow charts, electrical circuit diagrams, chemical reaction networks, signal-flow graphs, Bayesian networks, food webs, Feynman diagrams and the like. Far from mere informal tools, many of these diagrammatic languages fit into a rigorous framework: category theory. I will explain a bit of how this works and discuss some applications.

There I will be using the vaguer, less scary title ‘The mathematics of networks’. In fact, all the monoidal categories I discuss are symmetric monoidal, but I decided that too many definitions would make people unhappy.

The main new thing in this talk is my work with Blake Pollard on symmetric monoidal categories where the morphisms are ‘open Petri nets’. This allows us to describe ‘open’ chemical reactions, where chemicals flow in and out. Composing these morphisms then corresponds to sticking together open Petri nets to form larger open Petri nets.
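To give a rough feel for what this composition means, the following toy sketch encodes an open Petri net as a set of species, a list of transitions, and ordered lists of boundary species on each side, and composes two nets by gluing the outputs of one to the inputs of the other. The data structure and all names here are my own illustrative choices, not the actual formalism in our work:

```python
# Toy 'open Petri net': species, transitions (consumed -> produced multisets,
# represented as dicts), and ordered lists of boundary species on each side.
class OpenPetriNet:
    def __init__(self, species, transitions, inputs, outputs):
        self.species = set(species)
        self.transitions = list(transitions)  # [(consumed, produced), ...]
        self.inputs = list(inputs)
        self.outputs = list(outputs)

def compose(f, g):
    """Glue f's output boundary to g's input boundary (equal lengths)."""
    assert len(f.outputs) == len(g.inputs)
    glue = dict(zip(g.inputs, f.outputs))  # identify g's inputs with f's outputs

    def ren(s):
        return glue.get(s, s)

    species = f.species | {ren(s) for s in g.species}
    transitions = f.transitions + [
        ({ren(s): n for s, n in c.items()}, {ren(s): n for s, n in p.items()})
        for c, p in g.transitions
    ]
    return OpenPetriNet(species, transitions, f.inputs,
                        [ren(s) for s in g.outputs])

# Two toy reactions: A -> B, then (after gluing B with C) C -> D.
f = OpenPetriNet({"A", "B"}, [({"A": 1}, {"B": 1})], ["A"], ["B"])
g = OpenPetriNet({"C", "D"}, [({"C": 1}, {"D": 1})], ["C"], ["D"])
h = compose(f, g)
print(sorted(h.species))  # B and C were identified by the gluing
```

This toy version assumes the non-glued species names of the two nets are already disjoint; the real construction handles such identifications via pushouts of finite sets.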


Information Processing and Biology

7 November, 2016


The Santa Fe Institute, in New Mexico, is a place for studying complex systems. I’ve never been there! Next week I’ll go there to give a colloquium on network theory, and also to participate in this workshop:

Statistical Mechanics, Information Processing and Biology, November 16–18, Santa Fe Institute. Organized by David Krakauer, Michael Lachmann, Manfred Laubichler, Peter Stadler, and David Wolpert.

Abstract. This workshop will address a fundamental question in theoretical biology: Does the relationship between statistical physics and the need of biological systems to process information underpin some of their deepest features? It recognizes that a core feature of biological systems is that they acquire, store and process information (i.e., perform computation). However to manipulate information in this way they require a steady flux of free energy from their environments. These two, inter-related attributes of biological systems are often taken for granted; they are not part of standard analyses of either the homeostasis or the evolution of biological systems. In this workshop we aim to fill in this major gap in our understanding of biological systems, by gaining deeper insight in the relation between the need for biological systems to process information and the free energy they need to pay for that processing.

The goal of this workshop is to address these issues by focusing on a set of three specific questions: 1) How has the fraction of free energy flux on earth that is used by biological computation changed with time? 2) What is the free energy cost of biological computation or functioning? 3) What is the free energy cost of the evolution of biological computation or functioning? In all of these cases we are interested in the fundamental limits that the laws of physics impose on various aspects of living systems as expressed by these three questions.

I think it’s not open to the public, but I will try to blog about it. The speakers include a lot of experts on information theory, statistical mechanics, and biology. Here they are:

Wednesday November 16: Chris Jarzynski, Seth Lloyd, Artemy Kolchinski, John Baez, Manfred Laubichler, Harold de Vladar, Sonja Prohaska, Chris Kempes.

Thursday November 17: Phil Ball, Matina C. Donaldson-Matasci, Sebastian Deffner, David Wolpert, Daniel Polani, Christoph Flamm, Massimiliano Esposito, Hildegard Meyer-Ortmanns, Blake Pollard, Mikhail Prokopenko, Peter Stadler, Ben Machta.

Friday November 18: Jim Crutchfield, Sara Walker, Hyunju Kim, Takahiro Sagawa, Michael Lachmann, Wojciech Zurek, Christian Van den Broeck, Susanne Still, Chris Stephens.