guest post by Blake S. Pollard
Over a century ago James Clerk Maxwell created a thought experiment that has helped shape our understanding of the Second Law of Thermodynamics: the law that says the entropy of an isolated system can never decrease.
Maxwell’s proposed experiment was simple. Suppose you had a box filled with an ideal gas at equilibrium at some temperature. You stick in an insulating partition, splitting the box into two halves. These two halves are isolated from one another except for one important detail: somewhere along the partition resides a being capable of opening and closing a door, allowing gas particles to flow between the two halves. This being is also capable of observing the velocities of individual gas particles. Every time a particularly fast molecule is headed towards the door the being opens it, letting it fly into the other half of the box. When a slow particle heads towards the door the being keeps it closed. After some time, fast molecules would build up on one side of the box, meaning half the box would heat up! To an observer it would seem like the box, originally at a uniform temperature, would for some reason start splitting up into a hot half and a cold half. This seems to violate the Second Law (as well as all our experience with boxes of gas).
Of course, this apparent violation probably has something to do with positing the existence of intelligent microscopic doormen. This being, and the thought experiment itself, are typically referred to as Maxwell’s demon.
When people cook up situations that seem to violate the Second Law there is typically a simple resolution: you have to consider the whole system! In the case of Maxwell’s demon, while the entropy of the box decreases, the entropy of the system as a whole, demon included, goes up. Precisely quantifying how Maxwell’s demon doesn’t violate the Second Law has led people to a better understanding of the role of information in thermodynamics.
At the American Physical Society March Meeting in San Antonio, Texas, I had the pleasure of hearing some great talks on entropy, information, and the Second Law. Jordan Horowitz, a postdoc at Boston University, gave a talk on his work with Massimiliano Esposito, a researcher at the University of Luxembourg, on how one can understand situations like Maxwell’s demon (and a whole lot more) by analyzing the flow of information between subsystems.
Consider a system made up of two parts, $X$ and $Y$. Each subsystem has a discrete set of states, and the joint system makes stochastic transitions among the joint states $(x,y)$. These dynamics can be modeled as a Markov process. Horowitz and Esposito are interested in modeling the thermodynamics of information flow between subsystems. To this end they consider a bipartite system, meaning that in each transition either $X$ changes state or $Y$ changes state, never both at the same time. The probability distribution $p(x,y)$ of the whole system evolves according to the master equation:

$$\frac{d p(x,y)}{dt} = \sum_{x',y'} \left[ W_{x,y;x',y'}\, p(x',y') - W_{x',y';x,y}\, p(x,y) \right],$$

where $W_{x,y;x',y'}$ is the rate at which the system transitions from $(x',y')$ to $(x,y)$. The ‘bipartite’ condition means that $W$ has the form

$$W_{x,y;x',y'} = \begin{cases} W^{y}_{x,x'} & \text{if } x \neq x' \text{ and } y = y', \\ W^{x}_{y,y'} & \text{if } x = x' \text{ and } y \neq y', \\ 0 & \text{otherwise.} \end{cases}$$
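To get a feel for this setup, here is a minimal sketch in Python of a bipartite master equation on a 2 × 2 state space. The rates are toy numbers invented for illustration, and the array convention `W[x, y, xp, yp]` for the rate from $(x',y')$ to $(x,y)$ is my own; none of this comes from the paper.

```python
import numpy as np

# Toy bipartite Markov process on states (x, y) with x, y in {0, 1}.
# W[x, y, xp, yp] = rate of the transition (xp, yp) -> (x, y).
# Bipartite condition: a transition changes x or y, never both.
nx, ny = 2, 2
W = np.zeros((nx, ny, nx, ny))
W[1, 0, 0, 0] = 2.0; W[0, 0, 1, 0] = 1.0   # X-transitions at fixed y = 0
W[1, 1, 0, 1] = 0.5; W[0, 1, 1, 1] = 1.5   # X-transitions at fixed y = 1
W[0, 1, 0, 0] = 1.0; W[0, 0, 0, 1] = 2.0   # Y-transitions at fixed x = 0
W[1, 1, 1, 0] = 1.5; W[1, 0, 1, 1] = 0.5   # Y-transitions at fixed x = 1

def master_step(p, dt):
    """One Euler step of dp(x,y)/dt = sum_{x',y'} [W p' - W~ p]."""
    gain = np.einsum('xyab,ab->xy', W, p)   # flow into (x, y)
    loss = np.einsum('abxy,xy->xy', W, p)   # flow out of (x, y)
    return p + dt * (gain - loss)

p = np.full((nx, ny), 0.25)   # start at the uniform distribution
for _ in range(5000):
    p = master_step(p, 1e-2)
print(p.round(4))
```

Since the gain and loss terms balance when summed over all states, the total probability is conserved exactly at each step.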
The joint system is an open system that satisfies the second law of thermodynamics:

$$\frac{dS_i}{dt} = \frac{dS}{dt} + \frac{dS_e}{dt} \geq 0,$$

where

$$S = -\sum_{x,y} p(x,y) \ln p(x,y)$$

is the Shannon entropy of the system, satisfying

$$\frac{dS}{dt} = -\sum_{x,y} \frac{d p(x,y)}{dt} \ln p(x,y),$$

and

$$\frac{dS_e}{dt} = \sum_{x,y,x',y'} W_{x,y;x',y'}\, p(x',y') \ln \frac{W_{x,y;x',y'}}{W_{x',y';x,y}}$$

is the entropy change of the environment.
We want to investigate how the entropy production of the whole system relates to the entropy production of the two subsystems $X$ and $Y$. To this end they define a new flow, the information flow, as the time rate of change of the mutual information

$$I = \sum_{x,y} p(x,y) \ln \frac{p(x,y)}{p(x)\, p(y)}.$$

Its time derivative can be split up as

$$\frac{dI}{dt} = \dot{I}^X + \dot{I}^Y,$$

where

$$\dot{I}^X = \sum_{x > x',\, y} \left[ W^{y}_{x,x'}\, p(x',y) - W^{y}_{x',x}\, p(x,y) \right] \ln \frac{p(y|x)}{p(y|x')}$$

and

$$\dot{I}^Y = \sum_{x,\, y > y'} \left[ W^{x}_{y,y'}\, p(x,y') - W^{x}_{y',y}\, p(x,y) \right] \ln \frac{p(x|y)}{p(x|y')}$$

are the information flows associated with transitions of the subsystems $X$ and $Y$, respectively. When $\dot{I}^X > 0$, transitions in $X$ increase the mutual information, meaning that $X$ ‘knows’ more about $Y$, and vice versa.
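As a sanity check on this decomposition, the following Python sketch verifies numerically that the two information flows add up to $dI/dt$. The rates and distribution are toy numbers invented for illustration (none of this is from the paper); the flows are computed from the pairwise formulas above and compared against a finite-difference derivative of $I$ along the master-equation flow.

```python
import numpy as np

nx, ny = 2, 2
W = np.zeros((nx, ny, nx, ny))   # W[x, y, xp, yp] = rate (xp, yp) -> (x, y)
W[1, 0, 0, 0] = 2.0; W[0, 0, 1, 0] = 1.0   # X-transitions at fixed y
W[1, 1, 0, 1] = 0.5; W[0, 1, 1, 1] = 1.5
W[0, 1, 0, 0] = 1.0; W[0, 0, 0, 1] = 2.0   # Y-transitions at fixed x
W[1, 1, 1, 0] = 1.5; W[1, 0, 1, 1] = 0.5

def pdot(p):
    """Right-hand side of the master equation."""
    return np.einsum('xyab,ab->xy', W, p) - np.einsum('abxy,xy->xy', W, p)

def mutual_info(p):
    px, py = p.sum(axis=1), p.sum(axis=0)
    return float(np.sum(p * np.log(p / np.outer(px, py))))

def info_flows(p):
    """Idot_X and Idot_Y from the pairwise formulas."""
    px, py = p.sum(axis=1), p.sum(axis=0)
    iX = iY = 0.0
    for y in range(ny):               # pairs x > x' at fixed y
        for x in range(nx):
            for xp in range(x):
                J = W[x, y, xp, y] * p[xp, y] - W[xp, y, x, y] * p[x, y]
                iX += J * np.log((p[x, y] / px[x]) / (p[xp, y] / px[xp]))
    for x in range(nx):               # pairs y > y' at fixed x
        for y in range(ny):
            for yp in range(y):
                J = W[x, y, x, yp] * p[x, yp] - W[x, yp, x, y] * p[x, y]
                iY += J * np.log((p[x, y] / py[y]) / (p[x, yp] / py[yp]))
    return iX, iY

p = np.array([[0.3, 0.2], [0.1, 0.4]])
dt = 1e-6   # central finite difference of I along the flow
dI_dt = (mutual_info(p + dt * pdot(p)) - mutual_info(p - dt * pdot(p))) / (2 * dt)
iX, iY = info_flows(p)
print(dI_dt, iX + iY)   # the two numbers agree
```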
We can rewrite the entropy production entering into the second law in terms of these information flows as

$$\frac{dS_i}{dt} = \dot{S}_i^X + \dot{S}_i^Y,$$

where

$$\dot{S}_i^X = \frac{dS^X}{dt} + \dot{S}_e^X - \dot{I}^X$$

and similarly for $\dot{S}_i^Y$. Here $S^X$ is the Shannon entropy of the marginal distribution $p(x)$, and $\dot{S}_e^X$ is the entropy change of the environment due to transitions of $X$. This gives the following decomposition of entropy production in each subsystem:

$$\dot{S}_i^X \geq 0, \qquad \dot{S}_i^Y \geq 0,$$

where the inequalities hold for each subsystem separately. To see this, if you write out the left-hand side of each inequality you will find that it is a sum of terms of the form

$$(a - b) \ln \frac{a}{b},$$

with $a = W^{y}_{x,x'}\, p(x',y)$ and $b = W^{y}_{x',x}\, p(x,y)$ (and similarly for $Y$). Each such term is non-negative for $a, b > 0$, since $\ln$ is increasing, so both factors have the same sign.
The interaction between the subsystems is contained entirely in the information flow terms. Neglecting these terms gives rise to situations like Maxwell’s demon where a subsystem seems to violate the second law.
Lots of Markov processes have boring equilibria where there is no net flow among the states. Markov processes also admit non-equilibrium steady states, where there may be some constant flow of information. In this steady state all explicit time derivatives are zero, including the net information flow:

$$\frac{dI}{dt} = 0,$$

which implies that $\dot{I}^X = -\dot{I}^Y \equiv \dot{I}$. In this situation the above inequalities become

$$\dot{S}_e^X \geq \dot{I}$$

and

$$\dot{S}_e^Y \geq -\dot{I}.$$

If $\dot{I} > 0$, then $X$ is learning something about $Y$, or acting as a sensor. The first inequality, $\dot{S}_e^X \geq \dot{I}$, quantifies the minimum amount of energy $X$ must supply to do this sensing. Similarly, $\dot{S}_e^Y \geq -\dot{I}$ bounds the amount of useful energy available to $Y$ as a result of this information transfer.
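These steady-state inequalities can be checked numerically. The Python sketch below (toy rates invented for illustration; none of it is from the paper) relaxes a small bipartite system to its non-equilibrium steady state, then verifies that $\dot{I}^X = -\dot{I}^Y$ and that $\dot{S}_e^X \geq \dot{I}^X$ and $\dot{S}_e^Y \geq \dot{I}^Y$:

```python
import numpy as np

nx, ny = 2, 2
W = np.zeros((nx, ny, nx, ny))   # W[x, y, xp, yp] = rate (xp, yp) -> (x, y)
W[1, 0, 0, 0] = 2.0; W[0, 0, 1, 0] = 1.0   # X-transitions at fixed y
W[1, 1, 0, 1] = 0.5; W[0, 1, 1, 1] = 1.5
W[0, 1, 0, 0] = 1.0; W[0, 0, 0, 1] = 2.0   # Y-transitions at fixed x
W[1, 1, 1, 0] = 1.5; W[1, 0, 1, 1] = 0.5

def pdot(p):
    return np.einsum('xyab,ab->xy', W, p) - np.einsum('abxy,xy->xy', W, p)

# Relax to the steady state (these rates break detailed balance,
# so the steady state carries nonzero currents).
p = np.full((nx, ny), 0.25)
for _ in range(30000):
    p = p + 1e-2 * pdot(p)

px, py = p.sum(axis=1), p.sum(axis=0)

iX = seX = 0.0                    # information flow and environmental
for y in range(ny):               # entropy change due to X-transitions
    for x in range(nx):
        for xp in range(x):
            J = W[x, y, xp, y] * p[xp, y] - W[xp, y, x, y] * p[x, y]
            iX += J * np.log((p[x, y] / px[x]) / (p[xp, y] / px[xp]))
            seX += J * np.log(W[x, y, xp, y] / W[xp, y, x, y])
iY = seY = 0.0                    # ...and due to Y-transitions
for x in range(nx):
    for y in range(ny):
        for yp in range(y):
            J = W[x, y, x, yp] * p[x, yp] - W[x, yp, x, y] * p[x, y]
            iY += J * np.log((p[x, y] / py[y]) / (p[x, yp] / py[yp]))
            seY += J * np.log(W[x, y, x, yp] / W[x, yp, x, y])

print(iX, iY)              # opposite signs: iX = -iY at steady state
print(seX - iX, seY - iY)  # both non-negative: each subsystem obeys the second law
```

In this toy model one subsystem pays an entropic cost to acquire information while the other can exploit the information it receives, which is exactly the budget expressed by the two inequalities.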
In their paper Horowitz and Esposito explore a few other examples and show the utility of this simple breakup of a system into two interacting subsystems in explaining various interesting situations in which the flow of information has thermodynamic significance.
For the whole story, read their paper!
• Jordan Horowitz and Massimiliano Esposito, Thermodynamics with continuous information flow, Phys. Rev. X 4 (2014), 031015.

Posted by John Baez