Quantropy (Part 2)

In my first post in this series, we saw that filling in a well-known analogy between statistical mechanics and quantum mechanics requires a new concept: ‘quantropy’. To get some feeling for this concept, we should look at some examples. But to do that, we need to develop some tools to compute quantropy. That’s what we’ll do today.

All these tools will be borrowed from statistical mechanics. So, let me remind you how to compute the entropy of a system in thermal equilibrium starting from the energy of every state. Then we’ll copy this and get a formula for the quantropy of a system if we know the action of every history.

Computing entropy

Everything in this section is bog-standard. In case you don’t know, that’s British slang for ‘extremely, perhaps even depressingly, familiar’. Apparently it rains so much in England that bogs are not only standard, they’re the standard of what counts as standard!

Let X be a measure space: physically, the set of states of some system. In statistical mechanics we suppose the system occupies states with probabilities given by some probability distribution

p : X \to [0,\infty)

where of course

\int_X p(x) \, dx = 1

The entropy of this probability distribution is

S = - \int_X p(x) \ln(p(x)) \, dx

There’s a nice way to compute the entropy when our system is in thermal equilibrium. This idea makes sense when we have a function

H : X \to \mathbb{R}

saying the energy of each state. Our system is in thermal equilibrium when p maximizes entropy subject to a constraint on the expected value of energy:

\langle H \rangle = \int_X H(x) p(x) \, dx

A famous calculation shows that thermal equilibrium occurs precisely when p is the so-called Gibbs state:

\displaystyle{ p(x) = \frac{e^{-\beta H(x)}}{Z} }

for some real number \beta, where Z is a normalization factor called the partition function:

Z = \int_X e^{-\beta H(x)} \, dx

The number \beta is called the coolness, since physical considerations say that

\displaystyle{ \beta = \frac{1}{T} }

where T is the temperature in units where Boltzmann’s constant is 1.
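
If you’d like to see this concretely, here’s a minimal numerical sketch in Python. The three energies are made up purely for illustration; the point is just the shape of the Gibbs state formula:

```python
import numpy as np

# A toy system: three states with made-up energies, in units where
# Boltzmann's constant is 1 (so temperature has the units of energy).
H = np.array([0.0, 1.0, 2.0])    # energy of each state
T = 0.5                          # temperature
beta = 1.0 / T                   # coolness

Z = np.sum(np.exp(-beta * H))    # partition function
p = np.exp(-beta * H) / Z        # Gibbs state

print(p)          # probabilities, weighted toward the low-energy states
print(p.sum())    # 1.0: a genuine probability distribution
```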

There’s a famous way to compute the entropy of the Gibbs state; I don’t know who did it first, but it’s both straightforward and tremendously useful. We take the formula for entropy

S = - \int_X p(x) \ln(p(x)) \, dx

and substitute the Gibbs state

\displaystyle{ p(x) = \frac{e^{-\beta H(x)}}{Z} }

getting

\begin{array}{ccl} S &=& \int_X p(x) \left( \beta H(x) + \ln Z \right)\, dx \\   \\  &=& \beta \, \langle H \rangle + \ln Z \end{array}

Reshuffling this a little bit, we obtain:

- T \ln Z = \langle H \rangle - T S

If we define the free energy by

F = - T \ln Z

then we’ve shown that

F = \langle H \rangle - T S

This justifies the term ‘free energy’: it’s the expected energy minus the energy in the form of heat, namely T S.

It’s nice that we can compute the free energy purely in terms of the partition function and the temperature, or equivalently the coolness \beta:

\displaystyle{ F = - \frac{1}{\beta} \ln Z }

Can we also do this for the entropy? Yes! First we’ll do it for the expected energy:

\begin{array}{ccl} \langle H \rangle &=& \displaystyle{ \int_X H(x) p(x) \, dx } \\   \\  &=& \displaystyle{ \frac{1}{Z} \int_X H(x) e^{-\beta H(x)} \, dx } \\   \\  &=& \displaystyle{ -\frac{1}{Z} \frac{d}{d \beta} \int_X e^{-\beta H(x)} \, dx } \\ \\  &=& \displaystyle{ -\frac{1}{Z} \frac{dZ}{d \beta} } \\ \\  &=& \displaystyle{ - \frac{d}{d \beta} \ln Z } \end{array}

This gives

\begin{array}{ccl} S &=& \beta \, \langle H \rangle + \ln Z \\ \\ &=& \displaystyle{ \ln Z - \beta \, \frac{d \ln Z}{d \beta} }\end{array}

So, if we know the partition function of a system in thermal equilibrium as a function of the coolness \beta, we can work out its entropy, expected energy and free energy.
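
Here’s a sketch checking this numerically for the same toy three-state system: we compute the entropy directly from -\sum_x p(x) \ln p(x), and again from the partition function, approximating d/d\beta by a finite difference. (The energies and \beta are made-up numbers.)

```python
import numpy as np

H = np.array([0.0, 1.0, 2.0])          # made-up energies, as before

def lnZ(beta):
    return np.log(np.sum(np.exp(-beta * H)))

beta = 2.0
p = np.exp(-beta * H - lnZ(beta))      # Gibbs state

S_direct = -np.sum(p * np.log(p))      # S = -sum_x p(x) ln p(x)

eps = 1e-6                             # finite-difference step for d/dbeta
dlnZ = (lnZ(beta + eps) - lnZ(beta - eps)) / (2 * eps)

E = -dlnZ                              # expected energy <H> = -d(ln Z)/d(beta)
S_from_Z = lnZ(beta) - beta * dlnZ     # S = ln Z - beta d(ln Z)/d(beta)
F = -lnZ(beta) / beta                  # free energy F = -(1/beta) ln Z

print(S_direct, S_from_Z)              # the two entropies agree up to rounding
print(F, E - S_from_Z / beta)          # and F = <H> - T S, with T = 1/beta
```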

Computing quantropy

Now we’ll repeat everything for quantropy! The idea is simply to replace energy by action and the temperature T by i \hbar, where \hbar is Planck’s constant. It’s harder to get the integrals to converge in interesting examples, but we’ll worry about that next time, when we actually do an example!

It’s annoying that in physics S stands for both entropy and action, since in this article we need to think about both. People also use H to stand for entropy, but that’s no better, since that letter also stands for ‘Hamiltonian’! To avoid this, let’s use A to stand for action. This letter is also used to mean ‘Helmholtz free energy’, but we’ll just have to live with that. It would be a real bummer if we failed to unify physics just because we ran out of letters.

Let X be a measure space: physically, the set of histories of some system. In quantum mechanics we suppose the system carries out histories with amplitudes given by some function

a : X \to \mathbb{C}

where perhaps surprisingly

\int_X a(x) \, dx = 1

The quantropy of this function is

Q = - \int_X a(x) \ln(a(x)) \, dx

There’s a nice way to compute the quantropy in Feynman’s path integral formalism. This formalism makes sense when we have a function

A : X \to \mathbb{R}

saying the action of each history. Feynman proclaimed that in this case we have

\displaystyle{ a(x) = \frac{e^{i A(x)/\hbar}}{Z} }

where \hbar is Planck’s constant and Z is a normalization factor called the partition function:

Z = \int_X e^{i A(x)/\hbar} \, dx

Last time I showed that we obtain Feynman’s prescription for a by demanding that it’s a stationary point of the quantropy

Q = - \int_X a(x) \, \ln (a(x)) \, dx

subject to a constraint on the expected action:

\langle A \rangle = \int_X A(x) a(x) \, dx

As I mentioned last time, the formula for quantropy is dangerous, since we’re taking the logarithm of a complex-valued function. There’s not really a ‘best’ logarithm for a complex number: if we have one choice we can add any multiple of 2 \pi i and get another. So in general, to define quantropy we need to pick a choice of \ln (a(x)) for each point x \in X. That’s a lot of ambiguity!
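
To see the ambiguity concretely, here’s a tiny Python illustration: the ‘principal branch’ of the logarithm that a library like numpy picks need not invert the exponential.

```python
import numpy as np

z = np.exp(5j)         # a unit complex number with phase 5
print(np.log(z))       # principal branch: about -1.2832j, i.e. (5 - 2*pi)j, not 5j
print(5j - np.log(z))  # the two candidate logarithms differ by 2*pi*i
```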

Luckily, the ambiguity is much less when we use Feynman’s prescription for a. Why? Because then a(x) is defined in terms of an exponential, and it’s easy to take the logarithm of an exponential! So, we can declare that

\ln (a(x)) = \displaystyle{ \ln \left( \frac{e^{iA(x)/\hbar}}{Z}\right) } = \frac{i}{\hbar} A(x) - \ln Z

Once we choose a logarithm for Z, this formula will let us define \ln (a(x)) and thus the quantropy.

So let’s do this, and say the quantropy is

\displaystyle{ Q = - \int_X a(x) \left( \frac{i}{\hbar} A(x) - \ln Z \right)\, dx }

We can simplify this a bit, since the integral of a is 1:

\displaystyle{ Q = \frac{1}{i \hbar} \langle A \rangle + \ln Z }

Reshuffling this a little bit, we obtain:

- i \hbar \ln Z = \langle A \rangle - i \hbar Q

By analogy to free energy in statistical mechanics, let’s define the free action by

F = - i \hbar \ln Z

I’m using the same letter for free energy and free action, but they play exactly analogous roles, so it’s not so bad. Indeed we now have

F = \langle A \rangle - i \hbar Q

which is the analogue of a formula we saw for free energy in thermodynamics.
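
Here’s a sketch of all this for a toy ‘sum over histories’: three histories with made-up actions, and \hbar set to 1 just for the numerics. It uses the branch of the logarithm we chose above.

```python
import numpy as np

# Toy path integral: three histories with made-up actions, hbar = 1.
A = np.array([0.0, 0.7, 1.3])        # action of each history
hbar = 1.0

Z = np.sum(np.exp(1j * A / hbar))    # complex partition function
a = np.exp(1j * A / hbar) / Z        # Feynman amplitudes

# The branch we chose: ln a(x) = (i/hbar) A(x) - ln Z,
# with np.log picking a branch of ln Z for us.
ln_a = 1j * A / hbar - np.log(Z)
Q = -np.sum(a * ln_a)                # quantropy: a complex number!

exp_A = np.sum(a * A)                # expected action <A>
F = -1j * hbar * np.log(Z)           # free action F = -i hbar ln Z

print(np.sum(a))                     # 1 (up to rounding)
print(F, exp_A - 1j * hbar * Q)      # F = <A> - i hbar Q
```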

It’s nice that we can compute the free action purely in terms of the partition function and Planck’s constant. Can we also do this for the quantropy? Yes!

It’ll be convenient to introduce a parameter

\displaystyle{ \beta = \frac{1}{i \hbar} }

which is analogous to ‘coolness’. We could call it ‘quantum coolness’, but a better name might be classicality, since it’s big when our system is close to classical. Whatever we call it, the main thing is that unlike ordinary coolness, it’s imaginary!

In terms of classicality, we have

\displaystyle{ a(x) = \frac{e^{- \beta A(x)}}{Z} }

Now we can compute the expected action just as we computed the expected energy in thermodynamics:

\begin{array}{ccl} \langle A \rangle &=& \displaystyle{ \int_X A(x) a(x) \, dx } \\ \\  &=& \displaystyle{ \frac{1}{Z} \int_X A(x) e^{-\beta A(x)} \, dx } \\   \\  &=& \displaystyle{ -\frac{1}{Z} \frac{d}{d \beta} \int_X e^{-\beta A(x)} \, dx } \\ \\  &=& \displaystyle{ -\frac{1}{Z} \frac{dZ}{d \beta} } \\ \\  &=& \displaystyle{ - \frac{d}{d \beta} \ln Z } \end{array}

This gives:

\begin{array}{ccl} Q &=& \beta \,\langle A \rangle + \ln Z \\ \\ &=& \displaystyle{ \ln Z - \beta \,\frac{d \ln Z}{d \beta} } \end{array}

So, if we can compute the partition function in the path integral approach to quantum mechanics, we can also work out the quantropy, expected action and free action!
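
And here’s a numerical check of these last formulas, with the same made-up actions: the classicality \beta is imaginary, but \ln Z is analytic in \beta, so a finite difference with a small real step still approximates d/d\beta.

```python
import numpy as np

A = np.array([0.0, 0.7, 1.3])          # made-up actions, as before
hbar = 1.0
beta = 1.0 / (1j * hbar)               # classicality: an imaginary number!

def lnZ(beta):
    return np.log(np.sum(np.exp(-beta * A)))

a = np.exp(-beta * A - lnZ(beta))      # amplitudes e^{-beta A}/Z
Q_direct = -np.sum(a * (-beta * A - lnZ(beta)))   # our branch of ln a

eps = 1e-6                             # finite-difference step for d/dbeta
dlnZ = (lnZ(beta + eps) - lnZ(beta - eps)) / (2 * eps)

exp_A = -dlnZ                          # <A> = -d(ln Z)/d(beta)
Q_from_Z = lnZ(beta) - beta * dlnZ     # Q = ln Z - beta d(ln Z)/d(beta)

print(Q_direct, Q_from_Z)              # the two quantropies agree up to rounding
```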

Next time I’ll use these formulas to compute quantropy in an example: the free particle. We’ll see some strange and interesting things.

Summary

Here’s where our analogy stands now:

| Statistical Mechanics | Quantum Mechanics |
|---|---|
| states: x \in X | histories: x \in X |
| probabilities: p : X \to [0,\infty) | amplitudes: a : X \to \mathbb{C} |
| energy: H : X \to \mathbb{R} | action: A : X \to \mathbb{R} |
| temperature: T | Planck’s constant times i: i \hbar |
| coolness: \beta = 1/T | classicality: \beta = 1/i \hbar |
| partition function: Z = \sum_{x \in X} e^{-\beta H(x)} | partition function: Z = \sum_{x \in X} e^{-\beta A(x)} |
| Boltzmann distribution: p(x) = e^{-\beta H(x)}/Z | Feynman sum over histories: a(x) = e^{-\beta A(x)}/Z |
| entropy: S = - \sum_{x \in X} p(x) \ln(p(x)) | quantropy: Q = - \sum_{x \in X} a(x) \ln(a(x)) |
| expected energy: \langle H \rangle = \sum_{x \in X} p(x) H(x) | expected action: \langle A \rangle = \sum_{x \in X} a(x) A(x) |
| free energy: F = \langle H \rangle - T S | free action: F = \langle A \rangle - i \hbar Q |
| \langle H \rangle = - \frac{d}{d \beta} \ln Z | \langle A \rangle = - \frac{d}{d \beta} \ln Z |
| F = -\frac{1}{\beta} \ln Z | F = -\frac{1}{\beta} \ln Z |
| S = \ln Z - \beta \, \frac{d}{d \beta} \ln Z | Q = \ln Z - \beta \, \frac{d}{d \beta} \ln Z |

I should also say a word about units and dimensional analysis. There’s enormous flexibility in how we do dimensional analysis. Amateurs often don’t realize this, because they’ve just learned one system, but experts take full advantage of this flexibility to pick a setup that’s convenient for what they’re doing. The fewer independent units you use, the fewer dimensionful constants like the speed of light, Planck’s constant and Boltzmann’s constant you see in your formulas. That’s often good. But here I don’t want to set Planck’s constant equal to 1 because I’m treating it as analogous to temperature—so it’s important, and I want to see it. I’m also finding dimensional analysis useful to check my formulas.

So, I’m using units where mass, length and time count as independent dimensions in the sense of dimensional analysis. On the other hand, I’m not treating temperature as an independent dimension: instead, I’m setting Boltzmann’s constant to 1 and using that to translate from temperature into energy. This is fairly common in some circles. And for me, treating temperature as an independent dimension would be analogous to treating Planck’s constant as having its own independent dimension! I don’t feel like doing that.

So, here’s how the dimensional analysis works in my setup:

| Statistical Mechanics | Quantum Mechanics |
|---|---|
| probabilities: dimensionless | amplitudes: dimensionless |
| energy: ML^2/T^2 | action: ML^2/T |
| temperature: ML^2/T^2 | Planck’s constant: ML^2/T |
| coolness: T^2/ML^2 | classicality: T/ML^2 |
| partition function: dimensionless | partition function: dimensionless |
| entropy: dimensionless | quantropy: dimensionless |
| expected energy: ML^2/T^2 | expected action: ML^2/T |
| free energy: ML^2/T^2 | free action: ML^2/T |

I like this setup because I often think of entropy as closely allied to information, measured in bits or nats depending on whether I’m using base 2 or base e. From this viewpoint, it should be dimensionless.

Of course, in thermodynamics it’s common to put a factor of Boltzmann’s constant in front of the formula for entropy. Then entropy has units of energy/temperature. But I’m using units where Boltzmann’s constant is 1 and temperature has the same units as energy! So for me, entropy is dimensionless.

19 Responses to Quantropy (Part 2)

  1. some guy on the street says:

    “Feynman proclaimed that in this case we have…” two Ss that we said were to be A!

  2. Mike Stay says:

    I think it’s a mistake to say \hbar is analogous to T; it’s much more like Boltzmann’s constant k, a constant that’s there just to make the units work. It makes more sense to introduce a unitless parameter \lambda to play the role of temperature and have kT (with units of energy) analogous to \hbar \lambda (with units of action).

    • John Baez says:

      That sounds good, but what’s the physical meaning of \lambda? Maybe it’s a dimensionless version of an adjustable Planck’s constant, with the original \hbar now playing the role of an arbitrary constant for the purposes of changing units?

      The real physical problem is that in reality we can change the temperature but not \hbar.

    • Mike Stay says:

      Lambda’s a measure of classicality; it changes with the constraint on the expected action. By increasing the frequency of a particle (whether due to a photon’s momentum or a particle’s mass), the action along the path increases. For a particle of mass m, we can take \lambda to be m over the Planck mass.

      • John Baez says:

        Okay, this is getting good. Here’s another idea. When dealing with entropy we are doing integrals over states; when dealing with quantropy we are doing integrals over histories. Histories often have a built-in parameter that states don’t: how long the history lasts. For example, for a particle moving on a line, a history is a path

        q : [0,t_0] \to \mathbb{R}

        so we have this parameter t_0 at hand. This parameter has dimensions of time, obviously. If we try to make this parameter go away by taking t_0 = +\infty, we get nonsense because the action of most histories becomes infinite. So we need t_0, and it’s probably good to make it explicit.

        I noticed this while calculating the quantropy of a free particle.

        So: in statistical mechanics we can adjust the temperature, but not really Boltzmann’s constant. In quantum mechanics we can adjust the length of time our histories last, but not really Planck’s constant. This seems like a promising analogy, especially since we’ve already thought about a famous analogy relating temperature to imaginary time.

        This extra parameter t_0 is closely related to how action has units of energy × time. I think it should explain all the ‘discrepancies’ in units here:

        | Statistical Mechanics | Quantum Mechanics |
        |---|---|
        | probabilities: dimensionless | amplitudes: dimensionless |
        | energy: ML^2/T^2 | action: ML^2/T |
        | temperature: ML^2/T^2 | Planck’s constant: ML^2/T |
        | coolness: T^2/ML^2 | classicality: T/ML^2 |
        | partition function: dimensionless | partition function: dimensionless |
        | entropy: dimensionless | quantropy: dimensionless |
        | expected energy: ML^2/T^2 | expected action: ML^2/T |
        | free energy: ML^2/T^2 | free action: ML^2/T |

        Finally, I can’t resist adding that one of Hawking’s famous calculations of black hole entropy treats the coolness \beta = 1/k T as how long the universe lasts in imaginary time.

        (If you click the link I hope you get taken to page 47 of Hawking and Penrose’s book The Nature of Space and Time. The portion of this book written by Hawking is a truly wonderful introduction to black hole thermodynamics and the concept of imaginary time. I recommend it to everyone who knows enough physics to understand, say, this blog entry. It’s accurate and meaty without becoming extremely technical. It explains what Hawking was actually thinking when he did his most famous work.)

        • Jim says:

          What if you associate with your histories an energy E\equiv \frac{A}{t_0} and a “temperature” T\equiv \frac{i\hbar}{kt_0}?

        • John Baez says:

          That sounds interesting. I’d call your quantity

          E = A/t_0

          ‘action per time’, or activity, or something. I will investigate these different options soon. It may be more fun after I explain an example of the formalism, namely the free particle. Then we can get more of a feeling for these concepts.

          By the way, this business about ‘action = energy × time’ is closely related to Blake Stacey’s observation that in classical mechanics, conjugate quantities multiply to give something with units of action, while in thermodynamics, they multiply to give something with units of energy. This should ultimately resolve the puzzle of ‘if we try to quantize thermodynamics, what takes the place of Planck’s constant?’… though I can’t say I see how yet.

      • Mike Stay says:

        I like that idea even better. What both of them have in common is that lambda is the ratio of the time traversing the path to the time for a complete quantum vibration.

        This is related to the ability to accurately determine the pitch of a note: it’s much more important to get your fingers in exactly the right place when you’re bowing a bass than when you’re plucking it.

        It’s also related to the variant of Heisenberg’s inequality \Delta E \Delta t \ge \hbar, where we have an observable Q that does not commute with the Hamiltonian and we define \Delta t to be the time that it takes the expected value of Q to change by one standard deviation.

  3. Toby Bartels says:

    I notice the T versus T² in the dimensions. This reminds me of how thermodynamics uses the heat equation and quantum mechanics uses the wave equation, the difference between which is ∂/∂t versus (∂/∂t)². Except that it’s backwards!

  4. Jim Stuttard says:

    Just for fun. Bog standard was a standard phrase in Lancashire, UK at least from the 1960s.

    One meaning of bog has always been a synonym for toilet in my experience of common English parlance.

    Iiuc the US equivalent is “brown bag”.

    Wiktionary has:

    “Believed to derive from early ceramic toilets being produced to exactly the same specification in white only which became known as the bog standard. [1] [2]”

    http://bit.ly/y2sbYW

    Much enjoyed quantropy part 1. Thanks.

    • John Baez says:

      Thanks! I didn’t really believe in my folk etymology of ‘bog standard’, but I hoped my joke would elicit more accurate information. It’s interesting that ‘bog’ is a synonym for ‘toilet’. I’m (again somewhat jokingly) imagining that this goes back to primitive Anglo-Saxon customs.

      I hope you liked part 2, too.

      • Tom Lowe says:

        Funny, I heard a different and interesting etymology (probably from QI). Old (Hornby?) train sets used to come in two versions labelled ‘box standard’ and ‘box deluxe’; these were so common that they got converted and distorted into the negative and positive phrases: ‘bog standard’ and ‘the dog’s bollocks’.

  5. I’ve been talking a lot about ‘quantropy’. Last time we figured out a trick for how to compute it starting from the partition function of a quantum system. But it’s hard to get a feeling for this concept without doing some examples […]

  6. If you have carefully read all my previous posts on quantropy (Part 1, Part 2 and Part 3), there’s only a little new stuff here. But still, it’s better organized […]

  7. Besides questions, I like ‘analogy charts’, consisting of two or more columns with analogous items lined up side by side. You can see one near the bottom of my 2nd article on quantropy. Quantropy is an idea born of the analogy between thermodynamics and quantum mechanics […]

  8. I really like ‘analogy charts’ with two or more columns, with analogous items lined up side by side. You can see one near the bottom of my 2nd article on quantropy. Quantropy is an idea born of the analogy between thermodynamics and quantum mechanics. […]
