Here’s a video of a talk I gave yesterday, made by Brendan Fong. You can see the slides here—and then click the items in blue, and the pictures, for more information!

The idea: nature and the world of human technology are full of networks! People like to draw diagrams of networks. Mathematical physicists know that in principle these diagrams can be understood using category theory. But why should physicists have all the fun? This is the century of understanding living systems and adapting to life on a finite planet. Math isn’t the main thing we need for this, but it’s got to be part of the solution… so one thing we should do is develop a unified and powerful theory of networks.

We are still far from doing this. In this overview, I briefly described three parts of the jigsaw puzzle, and invited everyone to help fit them together:

• electrical circuits and signal-flow graphs.

• stochastic Petri nets, chemical reaction networks and Feynman diagrams.

• Bayesian networks, information and entropy.

In my talks coming up, I’ll go into more detail on each of these. With luck, you’ll be able to see videos here.

But if you’re near Oxford, you might as well actually attend! You can see dates, times, locations, my slides, and the talks themselves as they show up by going here.



You could keep a lot of graduate students busy applying network theory to crowd activity seen through surveillance cameras in prisons. See: Ming-Ching Chang, Weina Ge and Nils Krahnstoever, Advanced behavior recognition in crowded environments, U.S. Department of Justice publication, Document #240575, Award #2009-SQ-B9-K013, 2013.

This concerns your interesting criticism, near the talk’s end, that computer scientists should use more 2-cells. There are cases where they *must* use 2-cells to avoid trivial models: when one models propositional classical logic in such a way that the objects are the propositions and the morphisms are the proofs, the proof-simplification process called “cut elimination” suggests itself as the basis of the 2-cells. However, a widespread view is to ignore that and treat a proof and its cut-reducts as *equal*. But then the category collapses to a boolean lattice. (More technically, any cartesian closed category with an involutive negation is a boolean lattice.) That is, eschewing 2-cells seems to be a hard obstacle to studying classical proofs.
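For readers who want the collapse made explicit, here is a sketch of the standard folklore argument (this reconstruction is mine, following the usual proof often attributed to Joyal):

```latex
\paragraph{Sketch of the collapse.}
Let $\mathcal{C}$ be cartesian closed with initial object $0$, negation
$\neg A = A \Rightarrow 0$, and $\neg\neg A \cong A$.

\emph{Step 1.} If there is some $f \colon X \to 0$, then any two maps out
of $X$ agree. Indeed, $\mathrm{hom}(X \times 0, Y) \cong \mathrm{hom}(0, Y^X)$
is a singleton, so $X \times 0 \cong 0$; and since
\[
  \pi_1 \circ \langle \mathrm{id}_X, f \rangle = \mathrm{id}_X ,
\]
every $g \colon X \to Y$ factors as
$(g \circ \pi_1) \circ \langle \mathrm{id}_X, f \rangle$ through the initial
object $X \times 0$, where $g \circ \pi_1$ is the unique such map.

\emph{Step 2.} For arbitrary $A$ and $B$,
\[
  \mathrm{hom}(A, B) \cong \mathrm{hom}(A, \neg\neg B)
  \cong \mathrm{hom}(A \times \neg B,\; 0),
\]
which has at most one element by Step 1. Hence $\mathcal{C}$ is a preorder,
and with its products, coproducts and involutive negation it is
(equivalent to) a boolean lattice.
```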

Hence the beauty of linear logic…

Yes, linear logic is important in this context. However, it differs from classical logic with respect to both the propositions and the proofs (even though there are of course connections between the two logics). What I’m trying to get at is that, using bicategories, one can get a non-collapsing categorical model of classical logic with cut elimination, *without changing the original logic*, and without putting another kind of maths or logic in the middle. Admittedly, I don’t know how useful those bicategorical models are, but I *do* know they exist and have hardly been studied. And I suspect that neglect is because, historically, categorical models of logic have focused on natural deduction: there’s always some monoidal closure modelling some beta-reduction, but folks rarely talk about modelling cut elimination, which can be non-deterministic.

I’m not really following what you’re pointing to here, but if you could describe the 2-cells (and what it means for *them* to be equal) or maybe give a suitable reference for the bicategories you have in mind, that may be of some interest to people. I do recall that Girard touches on the same themes you brought up in his Proofs and Types, but probably more as an advertisement of linear logic, and obviously without mentioning bicategories.

Todd, the answer I’ll give now is built on *your* (great!) work with Cockett and Seely! Please believe me that what I’ll do now is not meant to advertise, just to give you a quick idea of what I’m talking about. It’s in this paper by David Pym and myself: http://carstenfuehrmann.org/?attachment_id=835. Right on the second page, we give an example of non-deterministic cut reduction which goes back to Yves Lafont. Later in the paper, we build on top of *your* symmetric linearly distributive categories to give models where the above non-deterministic cut reduction is modelled by 2-cells. More specifically, we add monoids and comonoids in a certain way, and an order-enrichment, which turns out to be not extra structure but derivable from the rest. The partial order on the hom-spaces models cut reduction; it provides our 2-cells. But maybe one could use more substantial 2-cells. My point is *not* to sell that paper, though – I’m not sure myself whether it’s useful. I’m rather making the meta-point that nobody ever seems to have tried this type of model, even though attempting it seems a totally obvious way to go. That ties in with John’s remark that in computer science there’s a strange lack of 2-cells, which is what got me started here.

Carsten, thank you very much for pointing to this! I’ve begun to take a look (but don’t have any response yet). We were actually discussing GoI at the nForum recently, and it would be great to have your insights there as well.

Thanks for pointing me to the nForum/nLab! I must say to my chagrin that I hadn’t been aware of it. Because of my current occupation as a software engineer, my scientific endeavours are “only” recreational, and very moderate. Still, it will be great to keep an eye on this.

Carsten, I’ve had a little more time now to look at your paper. It seems a key example of a bicategory to keep in mind here is Rel (the bicategory of sets and relations), where the inequality from the redex to the reduct in Lafont-type examples should be compared to the ever-present lax morphisms between commutative comonoids considered by Carboni and Walters in their work on cartesian bicategories. Anyway, it’s a nice paper; thanks very much for bringing it to our attention!

Todd, I had a first glance at cartesian bicategories, as presented in your nLab article and in the original paper by Carboni and Walters. I once got within an inch of that paper – but didn’t read it. So thanks! It does indeed look as if the “cut-elimination models” I mentioned are closely related to posetal cartesian bicategories. The catch seems to be that the latter are logically degenerate in that they have only one monoidal product (am I right?), which models both “and” and “or”; in other words, they are compact. What irked me about my own work was that I never managed to give *nice* non-compact models. (Contrived non-compact models can be obtained by taking the product of a compact model and a distributive/boolean lattice, but that’s unsatisfactory.) The non-posetal cartesian bicategories you describe on the nLab could be an interesting way towards a finer-grained analysis of cut elimination. On the other hand, the partial order I used is not extra data, but derived with the help of a natural transformation called “MIX”. So it remains to be explored what this bodes for a move towards non-posetal 2-cells.

Finally, I have a stupid question: am I right in assuming that a locally posetal bicategory is the same thing as a poset-enriched category?

Just to get some easy stuff out of the way: yes, a locally posetal bicategory is the same thing as a Pos-enriched category. Generally, the composition of 1-cells in a bicategory is associative up to coherent isomorphism, but since isomorphisms in posets are identities (by antisymmetry), one has strictly associative composition of 1-cells, so a locally posetal bicategory is a locally posetal 2-category is a Pos-enriched category. I should also mention that the general definition of cartesian bicategory is meant to encompass non-(locally posetal) examples like Span and Prof (the bicategory of categories, profunctors, and natural transformations between them), but for Rel-like examples the Carboni-Walters definition is probably easier to follow. There’s another way to present the general definition, due I think to Carboni, Kelly, Verity, and Wood.
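As a concrete sanity check on the Rel example, here is a small sketch in Python (my own toy code, not from the discussion; the names `compose` and `two_cell` are illustrative), treating a relation as a set of pairs and the unique 2-cell from R to S as the inclusion of R in S:

```python
# Rel as a locally posetal bicategory: 0-cells are sets, 1-cells are
# relations (sets of pairs), and there is a (unique) 2-cell R => S
# exactly when R is contained in S.

def compose(R, S):
    """Composite of relations R <= A x B and S <= B x C, a subset of A x C."""
    return {(a, c) for (a, b1) in R for (b2, c) in S if b1 == b2}

def two_cell(R, S):
    """True iff the unique 2-cell R => S exists, i.e. R is a subset of S."""
    return R <= S

R = {(1, 'x'), (2, 'y')}                        # relation A -> B
S = {('x', True), ('y', False), ('x', False)}   # relation B -> C
T = {(True, 0), (False, 1)}                     # relation C -> D

# Composition is strictly associative: isomorphic 2-cells in a poset are
# equalities (antisymmetry), so no associator is needed.
assert compose(compose(R, S), T) == compose(R, compose(S, T))

# Whiskering a 2-cell (an inclusion) with a 1-cell yields a 2-cell:
# composition is monotone in each argument.
S_bigger = S | {('y', True)}
assert two_cell(S, S_bigger)
assert two_cell(compose(R, S), compose(R, S_bigger))
```

This makes the “locally posetal bicategory = Pos-enriched category” point tangible: the hom-data between two sets is just a poset of relations under inclusion.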

Yes, good point about the monoidal products: the definition of cartesian bicategory just mentions one (corresponding to “and”). I’ve not thought about two! That would be worth considering. The compactness condition for cartesian bicategories most naturally arises through something called the Frobenius law (more on this at the nLab article on bicategory of relations), but there are notable examples like Prof which do not obey this law, and perhaps one should think about possible interpretations of your set-up in such examples. (I mean, Prof *is* compact as a monoidal bicategory, but the dual of an object is not equivalent to the same object as it is in Rel, or really any cartesian bicategory obeying the Frobenius law. Objects in Prof being categories, the compact dual of a category is the opposite category.)

I’d not paid too much attention to MIX in the past, but your paper gives me some motivation to revisit this axiom. :-)

Thanks, Todd. By putting together the nLab article on cartesian bicategories and your comments here, I obtained a great overview of the situation. Painlessly, too, which is a bonus :)

Carsten: when I get back to Erlangen I’d love you to show me what you know about a treatment of proofs in classical logic using bicategories. Maybe after you tell me the other things you were planning to tell me about!

John, I’m looking forward to this. Actually, I never intended to bother you with logic, but with diagrams: the diagrams (“nets” or “circuits”) that Todd, Cockett, and Seely introduced, which later turned out to have a lot to do with logic and, I suspect, with the diagram work you and some of your students do.

PS: In previous comments I had a typo in my email, which I now corrected.

John, I actually went to a lecture called “Conflict in International Development” by Paul-Andre Wilton – it had a network in a PowerPoint slide. I asked him, and it turned out that they do indeed draw rather more complex diagrams in his offices.

Cool! He may be interested in this new post here on Azimuth:

• Alaistair Jamieson-Lane, Markov models of social change.

The search for ‘self-consistent scenarios’, which is what this post is about, is important in studying international development as well as in climate science.
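To make the “self-consistent scenario” idea concrete in the simplest possible terms (a toy example of my own, not taken from the post): in a Markov model, a stationary distribution is a scenario that reproduces itself under the model’s own dynamics.

```python
import numpy as np

# Toy 3-state Markov model; the state labels are purely illustrative.
# P[i, j] = probability of moving from state i to state j in one step.
P = np.array([
    [0.8, 0.15, 0.05],
    [0.2, 0.6,  0.2 ],
    [0.3, 0.3,  0.4 ],
])

# A "self-consistent scenario" here is a distribution pi with pi P = pi:
# the left eigenvector of P for eigenvalue 1, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, k])
pi = pi / pi.sum()

# pi reproduces itself under the dynamics: a fixed point of the model.
assert np.allclose(pi @ P, pi)
```

Searching for such fixed points, rather than just simulating forward, is one way to read the “self-consistent scenarios” phrase.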
