Network Theory (Part 10)

Last time Brendan showed us a proof of the Anderson-Craciun-Kurtz theorem on equilibrium states. Today we’ll look at an example that illustrates this theorem. This example brings up an interesting ‘paradox’—or at least a puzzle. Resolving this will get us ready to think about a version of Noether’s theorem relating conserved quantities and symmetries. Next time Brendan will state and prove a version of Noether’s theorem that applies to stochastic Petri nets.

A reversible reaction

In chemistry a type of atom, molecule, or ion is called a chemical species, or species for short. Since we’re applying our ideas to both chemistry and biology, it’s nice that ‘species’ is also used for a type of organism in biology. This stochastic Petri net describes the simplest reversible reaction of all, involving two species:

We have species 1 turning into species 2 with rate constant \beta, and species 2 turning back into species 1 with rate constant \gamma. So, the rate equation is:

\begin{array}{ccc} \displaystyle{ \frac{d x_1}{d t} } &=& -\beta x_1 + \gamma x_2  \\  \qquad \mathbf{} \\ \displaystyle{ \frac{d x_2}{d t} } &=&  \beta x_1 - \gamma x_2  \end{array}

where x_1 and x_2 are the amounts of species 1 and 2, respectively.

Equilibrium solutions of the rate equation

Let’s look for equilibrium solutions of the rate equation, meaning solutions where the amount of each species doesn’t change with time. Equilibrium occurs when each species is getting created at the same rate at which it’s getting destroyed.

So, let’s see when

\displaystyle{ \frac{d x_1}{d t} = \frac{d x_2}{d t} = 0 }

Clearly this happens precisely when

\beta x_1 = \gamma x_2

This says the rate at which 1’s are turning into 2’s equals the rate at which 2’s are turning back into 1’s. That makes perfect sense.
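We can check this numerically by integrating the rate equation. Here's a minimal Python sketch using forward Euler; the values of \beta, \gamma and the initial amounts are illustrative assumptions, not anything fixed by the discussion above:

```python
# Forward Euler integration of the rate equation
#   dx1/dt = -beta*x1 + gamma*x2,   dx2/dt = beta*x1 - gamma*x2
# The rate constants and initial amounts below are made-up example values.
beta, gamma = 2.0, 1.0
x1, x2 = 5.0, 0.0            # start with everything in species 1
dt = 0.001
for _ in range(20_000):      # integrate out to t = 20
    dx1 = -beta * x1 + gamma * x2
    x1, x2 = x1 + dx1 * dt, x2 - dx1 * dt   # dx2 = -dx1, so x1 + x2 is conserved
print(x1, x2)                # settles down to an equilibrium with beta*x1 = gamma*x2
```

With these example values the solution relaxes to x_1 = 5/3, x_2 = 10/3, which indeed satisfies \beta x_1 = \gamma x_2.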

Complex balanced equilibrium

In general, a chemical reaction involves a bunch of species turning into a bunch of species. Since ‘bunch’ is not a very dignified term, a bunch of species is usually called a complex. We saw last time that it’s very interesting to study a strong version of equilibrium: complex balanced equilibrium, in which each complex is being created at the same rate at which it’s getting destroyed.

However, in the Petri net we’re studying today, all the complexes being produced or destroyed consist of a single species. In this situation, any equilibrium solution is automatically complex balanced. This is great, because it means we can apply the Anderson-Craciun-Kurtz theorem from last time! This says how to get from a complex balanced equilibrium solution of the rate equation to an equilibrium solution of the master equation.

First remember what the master equation says. Let \psi_{n_1, n_2}(t) be the probability that we have n_1 things of species 1 and n_2 things of species 2 at time t. We summarize all this information in a formal power series:

\Psi(t) = \sum_{n_1, n_2 = 0}^\infty \psi_{n_1, n_2}(t) z_1^{n_1} z_2^{n_2}

Then the master equation says

\frac{d}{d t} \Psi(t) = H \Psi (t)

where following the general rules laid down in Part 8,

\begin{array}{ccl} H &=& \beta (a_2^\dagger - a_1^\dagger) a_1 + \gamma (a_1^\dagger - a_2^\dagger )a_2 \end{array}

This may look scary, but the annihilation operator a_i and the creation operator a_i^\dagger are just funny ways of writing the partial derivative \partial/\partial z_i and multiplication by z_i, so

\begin{array}{ccl} H &=& \displaystyle{ \beta (z_2 - z_1) \frac{\partial}{\partial z_1} + \gamma (z_1 - z_2) \frac{\partial}{\partial z_2} } \end{array}

or if you prefer,

\begin{array}{ccl} H &=& \displaystyle{ (z_2 - z_1) \, (\beta \frac{\partial}{\partial z_1} - \gamma \frac{\partial}{\partial z_2}) } \end{array}

The first term describes species 1 turning into species 2. The second describes species 2 turning back into species 1.

Now, the Anderson-Craciun-Kurtz theorem says that whenever (x_1,x_2) is a complex balanced solution of the rate equation, this recipe gives an equilibrium solution of the master equation:

\displaystyle{ \Psi = \frac{e^{x_1 z_1 + x_2 z_2}}{e^{x_1 + x_2}} }

In other words: whenever \beta x_1 = \gamma x_2, we have

H\Psi = 0

Let’s check this! For starters, the constant in the denominator of \Psi doesn’t matter here, since H is linear. It’s just a normalizing constant, put in to make sure that our probabilities \psi_{n_1, n_2} sum to 1. So, we just need to check that

\displaystyle{ (z_2 - z_1) (\beta \frac{\partial}{\partial z_1} - \gamma \frac{\partial}{\partial z_2}) e^{x_1 z_1 + x_2 z_2} = 0 }

If we do the derivatives on the left hand side, it’s clear we want

\displaystyle{ (z_2 - z_1) (\beta x_1 - \gamma x_2) e^{x_1 z_1 + x_2 z_2} = 0 }

And this is indeed true when \beta x_1 = \gamma x_2 .
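We can double-check this working directly with the probabilities \psi_{n_1,n_2} instead of the generating function. Written out in components, H moves probability into the state (n_1, n_2) from (n_1+1, n_2-1) at rate \beta(n_1+1) and from (n_1-1, n_2+1) at rate \gamma(n_2+1), and moves it out at total rate \beta n_1 + \gamma n_2. Here's a small Python sketch, with illustrative rate constants and amounts chosen so that \beta x_1 = \gamma x_2, confirming that H annihilates the product of Poisson distributions:

```python
from math import exp, factorial

# Illustrative values, chosen so that beta*x1 = gamma*x2 holds
beta, gamma = 2.0, 1.0
x1, x2 = 1.5, 3.0

def psi(n1, n2):
    # product of two independent Poisson distributions with means x1, x2
    return exp(-x1) * x1**n1 / factorial(n1) * exp(-x2) * x2**n2 / factorial(n2)

def H_psi(n1, n2):
    # probability flowing into the state (n1, n2) minus probability flowing out
    inflow = beta * (n1 + 1) * psi(n1 + 1, n2 - 1) if n2 >= 1 else 0.0
    inflow += gamma * (n2 + 1) * psi(n1 - 1, n2 + 1) if n1 >= 1 else 0.0
    outflow = (beta * n1 + gamma * n2) * psi(n1, n2)
    return inflow - outflow

print(max(abs(H_psi(a, b)) for a in range(10) for b in range(10)))  # ~ 0
```

Since both transitions preserve n_1 + n_2, each equation only involves states with the same total, so the check is exact up to floating-point roundoff.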

So, the theorem works as advertised. And now we can work out the probability \psi_{n_1,n_2} of having n_1 things of species 1 and n_2 of species 2 in our equilibrium state \Psi. To do this, we just expand the function \Psi as a power series and look at the coefficient of z_1^{n_1} z_2^{n_2}. We have

\Psi = \displaystyle{ \frac{e^{x_1 z_1 + x_2 z_2}}{e^{x_1 + x_2}} } = \displaystyle{ \frac{1}{e^{x_1} e^{x_2}} \sum_{n_1,n_2=0}^\infty \frac{(x_1 z_1)^{n_1}}{n_1!} \frac{(x_2 z_2)^{n_2}}{n_2!} }

so we get

\psi_{n_1,n_2} = \displaystyle{ \frac{1}{e^{x_1}} \frac{x_1^{n_1}}{n_1!} \; \cdot \; \frac{1}{e^{x_2}} \frac{x_2^{n_2}}{n_2!} }

This is just a product of two independent Poisson distributions!

In case you forget, a Poisson distribution gives the probability of k events occurring in some interval of time, if they occur at a fixed average rate and independently of the time since the last event. If the expected number of events is \lambda, the Poisson distribution is

\displaystyle{ \frac{1}{e^\lambda} \frac{\lambda^k}{k!} }

Its shape depends on \lambda: the distribution looks almost like a Gaussian when \lambda is large, but when \lambda is small it becomes very lopsided.
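Here is a tiny Python sketch, with some arbitrarily chosen values of \lambda, checking that these probabilities sum to 1 and that the mean really is \lambda:

```python
from math import exp, factorial

def poisson(k, lam):
    # P(k events) = e^(-lambda) * lambda^k / k!
    return exp(-lam) * lam**k / factorial(k)

for lam in (0.5, 4.0, 20.0):      # arbitrary example values of lambda
    probs = [poisson(k, lam) for k in range(100)]
    mean = sum(k * p for k, p in zip(range(100), probs))
    print(lam, sum(probs), mean)  # probabilities sum to ~1, and the mean is ~lambda
```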

Anyway: we’ve seen that in our equilibrium state, the number of things of species i = 1,2 is given by a Poisson distribution with mean x_i. That’s very nice and simple… but the amazing thing is that these distributions are independent.

Mathematically, this means we just multiply them to get the probability of finding n_1 things of species 1 and n_2 of species 2. But it also means that knowing how many things there are of one species says nothing about the number of the other.

But something seems odd here. One transition in our Petri net consumes a 1 and produces a 2, while the other consumes a 2 and produces a 1. The total number of particles in the system never changes. The more 1’s there are, the fewer 2’s there should be. But we just said knowing how many 1’s we have tells us nothing about how many 2’s we have!

At first this seems like a paradox. Have we made a mistake? Not exactly. But we’re neglecting something.

Conserved quantities

Namely: the equilibrium solutions of the master equation we’ve found so far are not the only ones! There are other solutions that fit our intuitions better.

Suppose we take any of our equilibrium solutions \Psi and change it like this: set the probability \psi_{n_1,n_2} equal to 0 unless

n_1 + n_2 = n

but otherwise leave it unchanged. Of course the probabilities no longer sum to 1, but we can rescale them so they do.

The result is a new equilibrium solution, say \Psi_n. Why? Because, as we’ve already seen, no transitions will carry us from one value of n_1 + n_2 to another. And in this new solution, the number of 1’s is clearly not independent from the number of 2’s. The bigger one is, the smaller the other is.
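We can watch this happen by simulating the underlying jump process directly, using the standard Gillespie algorithm. In the sketch below the rate constants and initial state are illustrative assumptions. Both transitions preserve n_1 + n_2, so a single trajectory never leaves its sector, and the time-averaged number of 1's settles near n\gamma/(\beta+\gamma), the conditional mean of the Poisson product given n_1 + n_2 = n:

```python
import random
random.seed(0)

beta, gamma = 2.0, 1.0        # made-up rate constants
n1, n2 = 10, 0                # n = n1 + n2 = 10 is conserved by both transitions
t, t_end = 0.0, 10_000.0
weighted_sum = 0.0            # time-weighted sum of n1, for the time average

while t < t_end:
    r1, r2 = beta * n1, gamma * n2       # rates of the transitions 1->2 and 2->1
    total = r1 + r2
    dt = random.expovariate(total)       # waiting time until the next transition
    weighted_sum += n1 * min(dt, t_end - t)
    t += dt
    if t < t_end:
        if random.random() < r1 / total:
            n1, n2 = n1 - 1, n2 + 1      # a 1 turns into a 2
        else:
            n1, n2 = n1 + 1, n2 - 1      # a 2 turns back into a 1

avg = weighted_sum / t_end
print(avg)    # close to n * gamma / (beta + gamma) = 10/3
```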

Puzzle 1. Show that this new solution \Psi_n depends only on n and the ratio x_1/x_2 = \gamma/\beta, not on anything more about the values of x_1 and x_2 in the original solution

\Psi = \displaystyle{ \frac{e^{x_1 z_1 + x_2 z_2}}{e^{x_1 + x_2}} }

Puzzle 2. What is this new solution like when \beta = \gamma? (This particular choice makes the problem symmetrical when we interchange species 1 and 2.)

What’s happening here is that this particular stochastic Petri net has a ‘conserved quantity’: the total number of things never changes with time. So, we can take any equilibrium solution of the master equation and—in the language of quantum mechanics—‘project down to the subspace’ where this conserved quantity takes a definite value, and get a new equilibrium solution. In the language of probability theory, we say it a bit differently: we’re ‘conditioning on’ the conserved quantity taking a definite value. But the idea is the same.
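Here's a small Python sketch of this conditioning recipe, with illustrative values of x_1, x_2 and n. It restricts the product of Poisson distributions to the sector n_1 + n_2 = n, renormalizes, and compares the result with the biased-coin (binomial) distribution discussed in the comments below:

```python
from math import exp, factorial, comb

x1, x2 = 1.5, 3.0      # example equilibrium values; only the ratio x1/x2 matters
n = 6                  # condition on the conserved quantity n1 + n2 = n

# product-of-Poissons probabilities, restricted to the sector n1 + n2 = n
sector = [exp(-x1 - x2) * x1**k * x2**(n - k) / (factorial(k) * factorial(n - k))
          for k in range(n + 1)]
total = sum(sector)
conditioned = [p / total for p in sector]

# conditioning should give a binomial distribution with success probability x1/(x1+x2)
p = x1 / (x1 + x2)
binomial = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
print(max(abs(a - b) for a, b in zip(conditioned, binomial)))  # agrees up to roundoff
```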

This important feature of conserved quantities suggests that we should try to invent a new version of Noether’s theorem. This theorem links conserved quantities and symmetries of the Hamiltonian.

There are already a couple versions of Noether’s theorem for classical mechanics, and for quantum mechanics… but now we want a version for stochastic mechanics. And indeed one exists, and it’s relevant to what we’re doing here. We’ll show it to you next time.

8 Responses to Network Theory (Part 10)

  1. You need the z's in the right hand side of

    \Psi = \displaystyle{ \frac{e^{x_1 z_1 + x_2 z_2}}{e^{x_1 + x_2}} } = \displaystyle{ \frac{1}{e^{x_1} e^{x_2}} \sum_{n_1,n_2}^\infty \frac{x_1^{n_1}}{n_1!} \frac{x_2^{n_2}}{n_2!} } .

    As for puzzle 1, I reached

    \psi_{n_1,n_2} = \displaystyle{ \frac{n!}{(\frac{x_1}{x_2} + 1)^n} \cdot \frac{(\frac{x_1}{x_2})^{n_1}}{n_1!n_2!} }

    • So, if we put \alpha = \frac{x_1}{x_2}, we’d have

      \Psi = \displaystyle{ (\alpha z_1 + z_2)^n/(\alpha + 1)^n }

      or something like that.

      • John Baez says:

        Nice! There must be some more symmetrical way to express this, probably by using both x_1 and x_2 instead of

        \alpha = x_1/x_2.

        How about

        (x_1 z_1 + x_2 z_2)^n / (x_1 + x_2)^n \; ?

        I like ways of writing solutions that leave the symmetries of the problem manifest.

        This should remind you of one of those Pascal’s triangle gadgets where a ball can fall from the top to the bottom making a bunch of random binary choices en route. But here the probability of going left (picking ‘1’) is x_1/(x_1 + x_2), while the probability of going right (picking ‘2’) is x_2/(x_1 + x_2).

        • That would do.

          I like ways of writing solutions that leave the symmetries of the problem manifest.

          Then you need to word your puzzles more carefully, and not break the symmetry there. :)

        • John Baez says:

          True, true. I found that hard to do.

          Anyway, I hope the point of the puzzle is clear. We start with a product of independent Poisson distributions to get a probability distribution on pairs (n_1, n_2). Then we ‘slice’ it, fixing a constant value of n_1 + n_2. We get a probability distribution on (n_1, n_2) which is exactly like the distribution for flipping a biased coin and getting n_1 heads and n_2 tails.

  2. Ilya Surdin says:

    Hello, here’s the solution to puzzle 1:

    \frac{1}{ \sum_{k_1+k_2 = n} {C^{k_1}_n x_1^{k_1}x_2^{k_2}}}  \sum_{n_1+n_2 = n} {C^{n_1}_n (x_1z_1)^{n_1}(x_2z_2)^{n_2}}=
    \frac{1}{(x_1+x_2)^n } (x_1z_1+x_2z_2)^n=
    (\frac{ x_1}{x_1+x_2}z_1+\frac{x_2}{x_1+x_2 }z_2)^n=
    (\frac{ 1}{1+\frac{x_2}{x_1}}z_1+\frac{1}{1+\frac{x_1}{x_2} }z_2)^n

    In Puzzle 2, we get: (\frac{z_1+z_2}{2})^n

    I hope I’m correct this time, and haven’t hurried too much. :)

  3. John Baez says:

    Here are Graham Jones’ answers to the puzzles:

    Puzzle 1. Expand the \exp(x_1z_1 + x_2z_2) in your expression for \Psi, and the term in (x_1z_1 + x_2z_2)^n is \Psi_n. It can be written as ((\gamma/\beta)z_1 + z_2)^n multiplied by something involving x_1,x_2 (and n). When \Psi_n is normalised to make the sum of coefficients equal to one, x_1 and x_2 will go away.

    Puzzle 2. It looks like you tossed a fair coin n times.

  4. Arjun Jain says:

    The particular stochastic network considered here had a conserved quantity, n_1 + n_2. For a general network, how do we identify all the conserved quantities, and therefore all the symmetries of the corresponding Hamiltonian?
