guest post by Arjun Jain
I am a master’s student in the physics department of the Indian Institute of Technology Roorkee. I’m originally from Delhi. For some time now I’ve been wanting to go into mathematical physics, and I hope to do a PhD in that area. Apart from maths and physics, I am also quite passionate about art and music.
Right now I am visiting John Baez at the Centre for Quantum Technologies, and we’re working on chemical reaction networks. This post can be considered an annotation to the last paragraph of John’s paper, Quantum Techniques for Reaction Networks, where he raises the question of when a solution of the master equation that starts as a coherent state will remain coherent for all times. Remember, the ‘master equation’ describes the random evolution of collections of classical particles, and a ‘coherent state’ is one where the probability distribution of particles of each type is a Poisson distribution.
If you’ve been following the network theory series on this blog, you’ll know these concepts, and you’ll know the Anderson-Craciun-Kurtz theorem gives many examples of coherent states that remain coherent. However, all these are equilibrium solutions of the master equation: they don’t change with time. Moreover they are complex balanced equilibria: the rate at which any complex is produced equals the rate at which it is consumed.
There are also non-equilibrium examples where coherent states remain coherent. But they seem rather rare, and I would like to explain why. So, I will give a necessary condition for it to happen. I’ll give the proof first, and then discuss some simple examples. We will see that while the condition is necessary, it is not sufficient.
First, recall the setup. If you’ve been following the network theory series, you can skip the next section.
Reaction networks
Definition. A reaction network consists of:
• a finite set $S$ of species,
• a finite set $K$ of complexes, where a complex is a finite sum of species, or in other words, an element of $\mathbb{N}^S$,
• a graph with $K$ as its set of vertices and some set $T$ of edges.
You should have in mind something like this:
[picture of a reaction network]
where our set of species is $S$, the complexes are the vertices of the graph (finite sums of species, like $A + B$), and the arrows are the elements of $T$, called transitions or reactions. So, we have functions

$$s, t \colon T \to K$$

saying the source and target of each transition.
Next:
Definition. A stochastic reaction network is a reaction network together with a function $r \colon T \to (0,\infty)$ assigning a rate constant to each reaction.
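For readers who like to compute along, here is a minimal sketch, not part of the original post, of how one might encode a stochastic reaction network in Python. The `Transition` class and its field names are my own choices; the example network is the amoeba network used later in the post.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Transition:
    source: dict  # the complex s(tau): species name -> multiplicity
    target: dict  # the complex t(tau): species name -> multiplicity
    rate: float   # the rate constant r(tau)

# One species 'A' (amoebas), with fission and competition:
species = ['A']
transitions = [
    Transition(source={'A': 1}, target={'A': 2}, rate=2.0),  # fission:      A -> 2A
    Transition(source={'A': 2}, target={'A': 1}, rate=1.0),  # competition: 2A ->  A
]
```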
From this we can write down the master equation, which describes how a stochastic state evolves in time:

$$\frac{d}{dt} \Psi(t) = H \Psi(t)$$

Here $\Psi(t)$ is a vector in the stochastic Fock space, which is the space of formal power series in a bunch of variables, one for each species, and $H$ is an operator on this space, called the Hamiltonian.
From now on I’ll number the species with numbers from $1$ to $k$, so

$$S = \{1, \dots, k\}$$

Then the stochastic Fock space consists of real formal power series in variables that I’ll call $z_1, \dots, z_k$. We can write any of these power series as

$$\Psi = \sum_{\ell \in \mathbb{N}^k} \psi_\ell \, z^\ell$$

where

$$z^\ell = z_1^{\ell_1} \cdots z_k^{\ell_k}$$
We have annihilation and creation operators on the stochastic Fock space:

$$a_i \Psi = \frac{\partial}{\partial z_i} \Psi, \qquad a_i^\dagger \Psi = z_i \Psi$$

and the Hamiltonian is built from these as follows:

$$H = \sum_{\tau \in T} r(\tau) \left( {a^\dagger}^{t(\tau)} - {a^\dagger}^{s(\tau)} \right) a^{s(\tau)}$$
John explained this here (using slightly different notation), so I won’t go into much detail now, but I’ll say what all the symbols mean. Remember that the source of a transition $\tau$ is a complex, or list of natural numbers:

$$s(\tau) = (s_1(\tau), \dots, s_k(\tau))$$

So, the power $a^{s(\tau)}$ is really an abbreviation for a big product of annihilation operators, like this:

$$a^{s(\tau)} = a_1^{s_1(\tau)} \cdots a_k^{s_k(\tau)}$$

This describes the annihilation of all the inputs to the transition $\tau$. Similarly, we define

$${a^\dagger}^{s(\tau)} = {a_1^\dagger}^{s_1(\tau)} \cdots {a_k^\dagger}^{s_k(\tau)}$$

and

$${a^\dagger}^{t(\tau)} = {a_1^\dagger}^{t_1(\tau)} \cdots {a_k^\dagger}^{t_k(\tau)}$$
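If you prefer matrices to operators on formal power series, here is a small sketch, again my own addition, of how these operators look for a single species when we truncate the Fock space at $N$ particles. The truncation size, the variable names, and the `hamiltonian` helper are all assumptions made for illustration.

```python
import numpy as np

N = 50  # truncate the Fock space at N particles (an approximation)

# a and a† as matrices acting on the coefficient vector (psi_0, ..., psi_N)
# of the power series Psi = sum_n psi_n z^n.
a = np.zeros((N + 1, N + 1))
a_dag = np.zeros((N + 1, N + 1))
for n in range(1, N + 1):
    a[n - 1, n] = n      # a z^n = n z^(n-1)
    a_dag[n, n - 1] = 1  # a† z^(n-1) = z^n

def hamiltonian(transitions):
    """H = sum over transitions of r * (a†^t - a†^s) a^s, for one species.
    Each transition is a tuple (rate, s, t) of particle numbers in and out."""
    mp = np.linalg.matrix_power
    H = np.zeros((N + 1, N + 1))
    for rate, s, t in transitions:
        H += rate * (mp(a_dag, t) - mp(a_dag, s)) @ mp(a, s)
    return H

# Example: fission A -> 2A at rate 2, competition 2A -> A at rate 1.
H = hamiltonian([(2.0, 1, 2), (1.0, 2, 1)])
```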
The result
Here’s the result:
Theorem. If a solution $\Psi(t)$ of the master equation is a coherent state for all times $t \ge 0$, then $\Psi(t)$ must be complex balanced except for complexes of degree 0 or 1.
This requires some explanation.
First, saying that $\Psi(t)$ is a coherent state means that it is an eigenvector of all the annihilation operators. Concretely this means

$$\Psi(t) = \frac{e^{c(t) \cdot z}}{e^{c_1(t) + \cdots + c_k(t)}}$$

where

$$c(t) = (c_1(t), \dots, c_k(t)) \in [0,\infty)^k$$

and

$$c(t) \cdot z = c_1(t) z_1 + \cdots + c_k(t) z_k$$

It will be helpful to write

$$\Psi_c = \frac{e^{c \cdot z}}{e^{c_1 + \cdots + c_k}}$$

so we can write

$$\Psi(t) = \Psi_{c(t)}$$
Second, we say that a complex has degree $d$ if it is a sum of exactly $d$ species. For example, a complex built from two species, such as $A + B$, has degree 2, while a complex consisting of a single species has degree 1. We use the word ‘degree’ because each complex $\kappa$ gives a monomial

$$z^\kappa = z_1^{\kappa_1} \cdots z_k^{\kappa_k}$$

and the degree of the complex is the degree of this monomial, namely $\kappa_1 + \cdots + \kappa_k$.
Third and finally, we say a solution $\Psi(t)$ of the master equation is complex balanced for a specific complex $\kappa$
if the total rate at which that complex is produced equals the total rate at which it’s destroyed.
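To make the ‘eigenvector’ description concrete, here is a short check, added for this writeup rather than taken from the original argument. Applying $a_i = \partial/\partial z_i$ to the coherent state gives

$$a_i \Psi_c = \frac{\partial}{\partial z_i} \, \frac{e^{c \cdot z}}{e^{c_1 + \cdots + c_k}} = c_i \, \Psi_c,$$

and expanding the exponential shows that

$$\Psi_c = \sum_{\ell \in \mathbb{N}^k} \left( \prod_{i=1}^{k} \frac{c_i^{\ell_i} e^{-c_i}}{\ell_i!} \right) z^\ell,$$

so the coefficient of $z^\ell$ is a product of independent Poisson distributions with means $c_1, \dots, c_k$, as claimed in the introduction.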
Now we are ready to prove the theorem:
Proof. Consider the master equation

$$\frac{d \Psi(t)}{dt} = H \Psi(t)$$

Assume that $\Psi(t)$ is a coherent state for all $t \ge 0$. This means

$$\Psi(t) = \Psi_{c(t)} = \frac{e^{c(t) \cdot z}}{e^{c_1(t) + \cdots + c_k(t)}}$$

For convenience, we write $c(t)$ simply as $c$, and similarly for the components $c_i$. Then we have

$$\frac{d \Psi(t)}{dt} = \sum_{i=1}^k \left( \dot{c}_i z_i - \dot{c}_i \right) \Psi(t)$$

On the other hand, the master equation gives

$$\frac{d \Psi(t)}{dt} = \sum_{\tau \in T} r(\tau) \left( {a^\dagger}^{t(\tau)} - {a^\dagger}^{s(\tau)} \right) a^{s(\tau)} \, \Psi(t)$$

Since $\Psi(t)$ is an eigenvector of the annihilation operators, $a^{s(\tau)} \Psi(t) = c^{s(\tau)} \Psi(t)$, while the creation operators simply multiply by the corresponding variables. So,

$$\sum_{i=1}^k \left( \dot{c}_i z_i - \dot{c}_i \right) \Psi(t) = \sum_{\tau \in T} r(\tau) \left( z^{t(\tau)} - z^{s(\tau)} \right) c^{s(\tau)} \, \Psi(t)$$

As a result, we get

$$\sum_{i=1}^k \left( \dot{c}_i z_i - \dot{c}_i \right) = \sum_{\tau \in T} r(\tau) \left( z^{t(\tau)} - z^{s(\tau)} \right) c^{s(\tau)}$$
Comparing the coefficients of all the monomials $z^\kappa$, we obtain the following. For $\kappa = 0$, which is the only complex of degree zero, we get

$$\sum_{\tau : \, t(\tau) = 0} r(\tau) \, c^{s(\tau)} \;-\; \sum_{\tau : \, s(\tau) = 0} r(\tau) \, c^{s(\tau)} \;=\; -\sum_{i=1}^k \dot{c}_i$$

For the complexes of degree one, we get these equations:

$$\sum_{\tau : \, t(\tau) = e_1} r(\tau) \, c^{s(\tau)} \;-\; \sum_{\tau : \, s(\tau) = e_1} r(\tau) \, c^{s(\tau)} \;=\; \dot{c}_1$$

$$\sum_{\tau : \, t(\tau) = e_2} r(\tau) \, c^{s(\tau)} \;-\; \sum_{\tau : \, s(\tau) = e_2} r(\tau) \, c^{s(\tau)} \;=\; \dot{c}_2$$

and so on, where $e_i$ is the complex consisting of a single unit of the $i$th species. For all the remaining complexes $\kappa$ we have

$$\sum_{\tau : \, t(\tau) = \kappa} r(\tau) \, c^{s(\tau)} \;=\; \sum_{\tau : \, s(\tau) = \kappa} r(\tau) \, c^{s(\tau)}$$
This says that the total rate at which this complex is produced equals the total rate at which it’s destroyed. So, our solution of the master equation is complex balanced for all complexes of degree greater than one. This is our necessary condition. █
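As a small sanity check of the coefficient comparison, and not part of the original proof, here is a sketch using SymPy for a one-species network, taking for concreteness the network from the third example below: introduction at rate 3, fission at rate 2, and competition at rate 1. The symbol `cdot` stands for $\dot{c}$.

```python
import sympy as sp

z, c, cdot = sp.symbols('z c cdot')  # cdot stands for dc/dt

# One-species network: 0 -> A (rate 3), A -> 2A (rate 2), 2A -> A (rate 1).
# Each transition contributes r * (z^t - z^s) * c^s to (H Psi)/Psi.
transitions = [(3, 0, 1), (2, 1, 2), (1, 2, 1)]  # (rate, s, t)

lhs = sp.expand(cdot * (z - 1))  # (d/dt Psi_c) / Psi_c
rhs = sp.expand(sum(r * (z**t - z**s) * c**s for r, s, t in transitions))

# Comparing coefficients of z^0, z^1, z^2 reproduces the equations in the proof.
for n in range(3):
    print(f"z^{n}:", sp.Eq(lhs.coeff(z, n), rhs.coeff(z, n)))
```

For this network the degree-two line is equivalent to $2c = c^2$, which is the complex balance condition for the complex $2A$.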
To illustrate the theorem, I’ll consider three simple examples. The third example shows that the condition in the theorem, though necessary, is not sufficient. Note that our proof also gives a necessary and sufficient condition for a coherent state to remain coherent: namely, that all the equations we listed hold, not just initially but for all times. But this condition seems a bit complicated.
Introducing amoebae into a Petri dish

Suppose that there is an inexhaustible supply of amoebae, randomly floating around in a huge pond. Each time an amoeba comes into our collection area, we catch it and add it to the population of amoebae in the Petri dish. Suppose that the rate constant for this process is 3.
So, the Hamiltonian is

$$H = 3 (a^\dagger - 1)$$

If we start with a coherent state, say

$$\Psi(0) = \frac{e^{c z}}{e^{c}}$$

then

$$\Psi(t) = \frac{e^{(c + 3t) z}}{e^{c + 3t}}$$

which is coherent at all times.
We can see that the condition of the theorem is satisfied, as all the complexes in the reaction network have degree 0 or 1.
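As a quick check, added here for completeness: with $H = 3(a^\dagger - 1)$ the master equation reads

$$\frac{d}{dt} \Psi(t) = 3 (z - 1) \Psi(t),$$

and differentiating the claimed solution with respect to $t$ gives

$$\frac{d}{dt} \, \frac{e^{(c + 3t) z}}{e^{c + 3t}} = 3 (z - 1) \, \frac{e^{(c + 3t) z}}{e^{c + 3t}},$$

so the state stays coherent, with mean growing linearly as $c + 3t$.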
Amoebae reproducing and competing

This example shows a Petri dish with one species, amoebae, and two transitions: fission and competition. We suppose that the rate constant for fission is 2, while that for competition is 1. The Hamiltonian is then

$$H = 2 \left( {a^\dagger}^2 - a^\dagger \right) a + \left( a^\dagger - {a^\dagger}^2 \right) a^2$$

If we start off with the coherent state

$$\Psi(0) = \frac{e^{2z}}{e^{2}}$$

we find that

$$\Psi(t) = \frac{e^{2z}}{e^{2}}$$

which is coherent. It should be noted that the chosen initial state, with $c = 2$, was a complex balanced equilibrium solution. So, the Anderson–Craciun–Kurtz Theorem applies to this case.
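Indeed, one can verify directly, a small check added here, that this state is an equilibrium. Since $a$ acts as $\partial/\partial z$, it multiplies $e^{2z}/e^2$ by 2, and $a^2$ multiplies it by 4, so

$$H \, \frac{e^{2z}}{e^{2}} = \Big[ 2 (z^2 - z) \cdot 2 + (z - z^2) \cdot 4 \Big] \frac{e^{2z}}{e^{2}} = 0.$$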
Amoebae reproducing, competing, and being introduced

This is a combination of the previous two examples, where apart from ongoing reproduction and competition, amoebae are being introduced into the dish with a rate constant 3.
As in the above examples, we might think that coherent states could remain coherent forever here too. Let’s check that.
Assuming that this was true, if

$$\Psi(t) = \frac{e^{c(t) z}}{e^{c(t)}}$$

then $c(t)$ would have to satisfy the following:

$$\dot{c} = 3 + c^2 - 2c$$

and

$$c^2 = 2c$$

Using the second equation, we get $c = 0$ or $c = 2$, so $c$ would have to be constant. But a constant is certainly not a solution of the first equation. So, here we find that initially coherent states do not remain coherent for all times.
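Here is a numerical illustration of this failure, my own sketch rather than part of the post. It integrates a truncated version of this master equation starting from a Poisson distribution with mean 2, and prints the mean and variance of the number of amoebas; for a Poisson distribution these would stay equal, so watching them separate shows the state ceasing to be coherent. The truncation size `N` and the time grid are arbitrary choices.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.stats import poisson

N = 60  # truncate at N amoebas (an approximation; the population stays small here)
a = np.zeros((N + 1, N + 1))
adag = np.zeros((N + 1, N + 1))
for n in range(1, N + 1):
    a[n - 1, n] = n
    adag[n, n - 1] = 1.0
I = np.eye(N + 1)
mp = np.linalg.matrix_power

# H for: 0 -> A (rate 3), A -> 2A (rate 2), 2A -> A (rate 1)
H = 3 * (adag - I) + 2 * (mp(adag, 2) - adag) @ a + (adag - mp(adag, 2)) @ mp(a, 2)

psi0 = poisson.pmf(np.arange(N + 1), mu=2)  # coherent state with c = 2
sol = solve_ivp(lambda t, p: H @ p, (0.0, 2.0), psi0, t_eval=[0.0, 1.0, 2.0])

ns = np.arange(N + 1)
for t, p in zip(sol.t, sol.y.T):
    mean = ns @ p
    var = ns**2 @ p - mean**2
    print(f"t = {t:.1f}:  mean = {mean:.3f},  variance = {var:.3f}")
```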
However, if we choose

$$\Psi(0) = \frac{e^{2z}}{e^{2}}$$

then this coherent state is complex balanced except for complexes of degree 0 or 1: it was complex balanced in the previous example, and the only new feature of this example, at time zero, is that single amoebas are being introduced, a transition whose complexes have degree 0 and 1. So, the condition of the theorem does hold.
So, the condition in the theorem is necessary but not sufficient. However, it is easy to check, and we can use it to show that in many cases, coherent states must cease to be coherent.
Those necessary and sufficient equations in the proof are very interesting…they look like some sort of complicated continuity equation. I haven’t wrapped my head around them yet, but very interesting.
BTW, I think there are a few minor typos in the proof of the theorem. It looks like maybe they’re errors in translation from the old $m, n$ notation (which I always found confusing) to the newer $s, t$ notation? Anyway, in a couple of the equations in the proof, I think some of the exponents involving $s(\tau)$ and $t(\tau)$ got switched. After that, I think it looks fine.
And I managed to screw up the block quote…wrong slash probably…. Oh, well.
You’re right, these are typos the editor introduced when translating Arjun’s post from the old $m, n$ notation to the new $s, t$ notation. Fixed, thanks!
I’m glad you like the new notation better; it’s a pain switching notations but it seems to have more mnemonic and conceptual value to talk about the source and target.
But you are also using t for time. Maybe d for destination?
Ugh, we’d have to rewrite the whole book.
There’s no serious ambiguity, I think, since the target is always the target of some reaction, like $t(\tau)$, while time always appears inside a function, like $c(t)$ or $\Psi(t)$.
Oh, and my mind was so busy trying to wrap itself around those interesting equations that I forgot to say: “Great post!”
So, allow me to remedy that.
Great post!
I am not a physicist, and I only have a vague idea about what coherence is. It sounds like a property I have met in probability theory but maybe the resemblance is superficial.
An example is the Chinese restaurant process, see http://en.wikipedia.org/wiki/Chinese_restaurant_process. This gives a family of distributions on partitions of $\{1, \dots, n\}$ for any positive integer $n$. A customer arriving is like a creation operator, going from $n$ to $n + 1$, and a (randomly chosen) customer leaving is like an annihilation operator. (I’m pretty sure these are inverses of one another.) Does this mean we can make a Hamiltonian which acts on partitions?
Other examples are families of distributions on binary trees. (E.g. Aldous, D. 1996. Probability distributions on cladograms, http://www.lix.polytechnique.fr/~ponty/enseignement/BIBS09articles/Aldous-Cladograms-1996.pdf; Ford, D. 2005. Probabilities on cladograms: introduction to the alpha model, http://arxiv.org/pdf/math/0511246.pdf.) These can have a property called ‘sampling consistency’ or ‘deletion stability’, meaning that randomly removing a tip from a tree leaves you in the same family of distributions.
Graham wrote:

“I am not a physicist, and I only have a vague idea about what coherence is.”
In the context of quantum physics, a typical coherent state is the state of photons in a laser beam. But in the current context, a coherent state is just another name for a product of independent Poisson distributions! It just so happens that the math is very similar.
Arjun is studying reaction networks, which describe random reaction processes involving random assortments of molecules of different kinds. At any moment there’s some probability of having $n_i$ molecules of the $i$th kind. When this probability distribution is a product of independent Poisson distributions, one for each kind of molecule, we call it a coherent state.
If you start with a coherent state, will it stay coherent? Sometimes, but not often.
The Anderson-Craciun-Kurtz theorem says that for a large class of reaction networks, there are lots of equilibrium states that are products of independent Poisson distributions. This is quite an amazing theorem, given how complicated the reactions can be. This gives one source of examples.
Another source of examples consists of reaction networks where each reaction has just one molecule as input and one molecule as output. These examples are a bit atypical in chemistry… but mathematically they give interesting examples, because for these, any state that’s initially a product of Poisson distributions will always evolve to other such states.
Arjun put severe limits on what other kinds of examples we can have.
I’ll write another comment addressing your actual idea… when I’m feeling a bit more peppy and intelligent.
Note from a theoretical physicist: Deterministic evolution of a pure quantum state should be unitary, which requires that your H be skew-Hermitian. You seem to ignore that restriction. Also, as a matter of terminology, physicists define the Hamiltonian as an Hermitian operator, and insert an i in Schroedinger’s equation so that the H that appears there has that property.
Arjun is discussing stochastic mechanics, not quantum mechanics. For the difference, read Part 12 of the network theory series on this blog, or for more detail my book on the subject.
Briefly: $\Psi$ represents a probability distribution, not a wavefunction. Time evolution operators should thus preserve the sum of its coefficients, $\sum_\ell \psi_\ell$, not $\sum_\ell |\psi_\ell|^2$. We thus say these operators are ‘stochastic’ rather than unitary. The Hamiltonian $H$ must therefore be ‘infinitesimal stochastic’, not skew-adjoint. There is also no $i$ in the master equation, which otherwise looks formally like Schrödinger’s equation.
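For reference, and added here rather than quoted from the thread, the standard conditions are: in stochastic mechanics the master equation

$$\frac{d}{dt} \Psi(t) = H \Psi(t)$$

has $H$ infinitesimal stochastic, meaning its off-diagonal matrix entries are non-negative and its columns sum to zero, so that probabilities stay non-negative and sum to 1; in quantum mechanics Schrödinger’s equation

$$\frac{d}{dt} \psi(t) = -i H \psi(t)$$

has $H$ self-adjoint, so that the evolution is unitary.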
Well I’ll shut up then! :-) The problem description certainly makes a lot more sense for classical master equations. In my defence, I presume that here “annihilation operators”, “Hamiltonians” and “coherent states” (not to mention Fock spaces) are all borrowed from QM in a deliberate attempt to confuse physicists ;-) But these coherent states don’t correspond to Poissonian distributions in this case, do they? Weird. I have no intuition what their significance is.
Howard wrote:

“But these coherent states don’t correspond to Poissonian distributions in this case, do they?”

I don’t know which case “this case” is: quantum or stochastic.
In the stochastic case, a coherent state for one type of particle looks like

$$\Psi = \frac{e^{c z}}{e^{c}}$$

so the probability of having $n$ particles of this type is given by the $n$th coefficient of this series:

$$\frac{c^n e^{-c}}{n!}$$

which is a Poisson distribution. Here we need $c \ge 0$.
I guess in the quantum case you get a Poisson distribution too, since while you have to take the absolute value squared of an amplitude to get a probability, you also need to use the coefficients with respect to the orthonormal basis

$$\frac{z^n}{\sqrt{n!}}$$

instead of with respect to the basis

$$z^n$$

as in stochastic mechanics. So, you get amplitudes

$$\frac{c^n e^{-c^2/2}}{\sqrt{n!}}$$

and probabilities

$$\frac{c^{2n} e^{-c^2}}{n!}$$

at least when $c$ is real. So, again a Poisson distribution. (I’m too lazy to think about the case of complex $c$ right this instant.)
I meant the classical stochastic case. I didn’t read enough of your notes to catch that the basis in that case is $z^n$, so that you end up with a Poissonian distribution in that case too. (Yes, it is certainly Poissonian for any coherent amplitude in the quantum case.) Good, I’m glad that “coherent state” means something similar in both cases, and now I understand that the question posed by Arjun is a natural one in classical stochastic processes (even if attacked in a way unfamiliar to me). Thanks for taking the time to explain.
It would be nice to move the definition of ‘coherent state’ to the intro, not after the statement of the theorem.
Unfortunately the full definition of ‘coherent state’ relies on lots of notation that’s brought in only after the introduction. But I added a quick rough definition of coherent state to the intro… and since someone had gotten fooled into thinking Arjun was talking about quantum mechanics, I clarified that too.