Heh. I didn’t mean to suggest you’re studying death and decay, at least not exclusively. I imagine your nonequilibrium steady states would have equal amounts of birth and death.

John and Blake are looking at the right end.

It almost sounds from this like we’re studying death and decay, but in fact there are many processes in living systems that can be (approximately) modeled by ‘nonequilibrium steady states’: processes where something flows in one end and out the other end at a constant rate. This is more interesting than ‘equilibrium’ but less interesting than truly dynamic processes, where flows change with time.

There’s a lot of work on nonequilibrium steady states in thermodynamics, and we’re trying to gather it up and package it in our formalism. What’s mostly interesting to us is how you can take a nonequilibrium steady state on a big complicated network and think of it as built from nonequilibrium steady states on the pieces: category theory helps formalize this.

Nonequilibrium steady states are a lot more interesting than equilibrium (the bread and butter of classical thermodynamics) but less interesting than truly dynamical processes: life in all its pulsing, throbbing splendor. We are working our way up the ladder of interestingness, and in a while we’ll start talking about truly dynamical processes, always taking the network viewpoint.
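Here’s a minimal numerical sketch of such a steady state (a hypothetical three-site chain with made-up rates: population flows in at one end at a constant rate and out at the other):

```python
import numpy as np

# Hypothetical open system: population flows in at site 1, hops along
# the chain 1 -> 2 -> 3, and flows out at site 3 (all rates made up).
rate = 2.0                 # hopping (and outflow) rate
inflow = 1.0               # constant inflow at site 1
H = np.array([
    [-rate,  0.0,   0.0],
    [ rate, -rate,  0.0],
    [ 0.0,   rate, -rate],  # the -rate here is the outflow from site 3
])
b = np.array([inflow, 0.0, 0.0])

# Nonequilibrium steady state: populations p with 0 = b + H p.
p = -np.linalg.solve(H, b)
print(p)                   # [0.5 0.5 0.5]

# The flow out the far end matches the inflow: a constant throughput.
throughput = rate * p[2]
print(throughput)          # 1.0
```

Nothing changes in time here, but unlike equilibrium there is a constant flow through the system: that’s the ‘in one end and out the other’ picture.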

This law is just the master equation.

I have thought about that but this is all not so clear to me.

Current is usually the time derivative of charge. So is the current in an electric circuit defined to be the time derivative of the potentials (“the charges”) at the two vertices of an edge?

Then yes, Ohm’s law would give you a “master equation” for the differences of potentials, i.e. an equation which “lives” on an edge. But for the Markov master equation, the time derivative of the difference of population deviations between the two vertices of an edge would give a sum over all sorts of edges.

Apart from that, it seems the detailed balance condition says something about the connection between vertices and not about the separate edges… but as I said, I skipped the category part.

Oh, okay. I guess I made up the name “flow law” to make it sound more analogous to “Ohm’s law”. That was silly.

This law is just the master equation in disguise. We should call it the master equation.

I don’t think I ever said “flow law”. I don’t know what you mean by “flow law”.

In the conclusions: in the analogy table on page 38, the “flow law” is listed as the analogue of Ohm’s law.

I saw the text in the conclusions, but unfortunately it didn’t help to answer the above questions.

Nad wrote:

Did he [Prigogine] give a definition of dissipation at all?

Again, I haven’t read everything he wrote. But in the book I’m reading now, *Modern Thermodynamics*, he defines quantities called ‘entropy production’ and ‘excess entropy production’ which are somewhat similar to our ‘dissipation’, but also somewhat different. In particular, he’s working in the context of thermodynamics rather than statistical mechanics, so he doesn’t have probabilities or Markov processes.

(For anyone who has this book, look at equation 18.3.6, which is a formula for ‘excess entropy production’: it looks a lot like the quantity

which we discuss in Part 16 of this series. But they have different meanings, since ours is defined in terms of probabilities and his is not.)

I find it in particular confusing that if you have an edge between two points i and j, then there could be a flow on the edge (as defined in your article) but no flow from j to i, which makes the analogy between current and flow (see the analogy table in your article) a bit hard to see.

We explain this issue, and various other tricky issues in the analogy between circuits and detailed balanced Markov processes, in the conclusions to our paper with Brendan. Very *very* briefly, current can flow in either direction along an edge in a circuit, while probability can only flow in one direction along an edge in a Markov process, so the detailed balanced Markov process analogous to a specific circuit needs to have two edges for each edge of the circuit.
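A tiny numerical illustration of that point (a made-up two-state process, not taken from the paper): each one-way flow H_ij p_j is nonnegative, while the net flow, the analogue of the current, can take either sign.

```python
import numpy as np

# Made-up two-state Markov process: H[i, j] is the rate of jumping
# from state j to state i.
H = np.array([[0.0, 3.0],
              [1.0, 0.0]])        # off-diagonal rates only, for the flows
p = np.array([0.2, 0.8])          # some probability distribution

flow_2_to_1 = H[0, 1] * p[1]      # one-way probability flow, always >= 0
flow_1_to_2 = H[1, 0] * p[0]      # one-way probability flow, always >= 0
net_flow = flow_2_to_1 - flow_1_to_2   # the analogue of the current

# Each one-way flow needs its own edge on the Markov side; only their
# difference behaves like the (signed) current on the circuit edge.
print(flow_2_to_1 >= 0 and flow_1_to_2 >= 0)   # True
print(net_flow)                                 # positive here, but it can be < 0
```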

But please read the conclusions if you want my real thoughts on this analogy! Category theory is the way to make analogies into actual mappings (functors), so our paper is all about a functor from detailed balanced Markov processes to circuits, but in the conclusions we try to avoid talking about functors.

But maybe I haven’t rightly understood what you mean here; in particular, in my brief browsing I couldn’t find a definition, just what you call the “flow law”, which (if you call it a “law”) seems in particular not to be a “definition”.

I don’t think I ever said “flow law”. I don’t know what you mean by “flow law”.

Detailed balanced Markov processes do obey a law analogous to Ohm’s law.
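Very roughly, here is the calculation behind that analogy as I understand it (a sketch, writing H_ij for the rate of transitions from j to i and q for the equilibrium distribution, and assuming detailed balance H_ij q_j = H_ji q_i; check the paper for the exact conventions):

```latex
J_{ij} \;=\; H_{ij}\,p_j \;-\; H_{ji}\,p_i
       \;=\; H_{ij}\,q_j \left( \frac{p_j}{q_j} - \frac{p_i}{q_i} \right)
```

So if one calls c_ij = H_ij q_j a ‘conductance’ and φ_i = p_i/q_i a ‘potential’, the net flow J_ij = c_ij(φ_j − φ_i) has the same form as Ohm’s law: current equals conductance times a potential difference.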

But I’ve never seen our definition of dissipation in anything he wrote or anything anyone else wrote.

Did he give a definition of dissipation at all?

If I were less experienced, I would have said our definition is new.

Experienced? It could also be due to culture and/or character.

Anyway, I haven’t yet figured out how “complete” the correspondence between electrical circuits and the above Markov processes is. Part of the problem is of course that I do not understand the category-theoretic language and that I can’t find the time to learn it. I find it in particular confusing that if you have an edge between two points i and j, then there could be a flow on the edge (as defined in your article) but no flow from j to i, which makes the analogy between current and flow (see the analogy table in your article) a bit hard to see. But maybe I haven’t rightly understood what you mean here; in particular, in my brief browsing I couldn’t find a definition, just what you call the “flow law”, which (if you call it a “law”) seems in particular not to be a “definition”.

Moreover, you relate the “flow law” to Ohm’s law, and I don’t really see that analogy either.

Nad wrote:

Is it the same definition as in Prigogine?

Prigogine has written about 10 books and lots of papers, and I haven’t read all of them. But I’ve never seen our definition of dissipation in anything he wrote or anything anyone else wrote. That’s why I said it “might be slightly new”.

If I were less experienced, I would have said our definition is new. But I’ve learned how hard it is to invent something simple that nobody has ever thought of before.

Thanks Graham, I actually just wanted to get an overview, and I don’t understand your last comment.

I briefly looked into cpsrep.pdf, but I can already see at first sight that it would take me way too much time to understand it.

There also seems to be a lot of guesswork necessary, like: is the S in equation (2) entropy? If yes, what kind of entropy? And so on.

I searched for whether the term “dissipation” shows up, and it didn’t, so it’s not even possible for me to see how on earth this article relates to the pop science article.

Here is the article by Jeremy England to which the pop science article refers:

The big difference is that he is considering the early stages of a Markov process, starting with a single cell in a closed (but friendly) system. A somewhat mangled quote (from the PDF) follows:

… we expect that the eventual steady-state for this system will be a Boltzmann distribution over microstates in which detailed balance holds. What makes the scenario of interest here a far-from-equilibrium process, then, is only that its initial conditions correspond to a highly non-Boltzmann distribution over microstates. The bacterial growth that takes place at the very beginning of the long process of relaxation to equilibrium is in this sense a mere transient, the fuel for which comes from chemical energy stored in the system’s starting microscopic arrangement. By the time detailed balance sets in, we expect all bacteria initially in the system will have long since overgrown their container, starved, perished, and disintegrated into their constituent parts.

See Fig 1 in this paper: http://www.cs.cornell.edu/cv/researchpdf/19ways+.pdf. Jeremy England is looking at the left end of the graph, well before s/m. John and Blake are looking at the right end.
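A toy version of that picture (a made-up two-state master equation, with rates chosen so that detailed balance holds for the equilibrium q = (0.75, 0.25)): starting from a highly non-Boltzmann distribution, the distribution relaxes until the net flows vanish and detailed balance sets in.

```python
import numpy as np

# Two-state master equation dp/dt = H p, with rates chosen so that
# detailed balance holds at q = (0.75, 0.25): 3.0 * 0.25 == 1.0 * 0.75.
H = np.array([[-1.0,  3.0],
              [ 1.0, -3.0]])
p = np.array([0.0, 1.0])          # far-from-equilibrium initial condition

dt = 0.01
for _ in range(2000):             # crude Euler integration up to t = 20
    p = p + dt * (H @ p)

print(p)                          # close to the equilibrium [0.75, 0.25]

# Once relaxed, the net flow along the edge is (numerically) zero:
net_flow = H[0, 1] * p[1] - H[1, 0] * p[0]
print(abs(net_flow) < 1e-6)       # True
```

The interesting early transient, large flows while the distribution is still far from Boltzmann, is the regime England is talking about; the long flat tail is the detailed balanced equilibrium.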
