In 2009, Erik Verlinde argued that gravity is an entropic force. This created a big stir—and it helped him win about $6,500,000 in prize money and grants! But what the heck is an ‘entropic force’, anyway?

Entropic forces are nothing unusual: you’ve felt one if you’ve ever stretched a rubber band. Why does a rubber band pull back when you stretch it? You might think it’s because a stretched rubber band has *more energy* than an unstretched one. That would indeed be a fine explanation for a metal spring. But rubber doesn’t work that way. Instead, a stretched rubber band mainly has *less entropy* than an unstretched one—and this too can cause a force.

You see, molecules of rubber are like long chains. When unstretched, these chains can curl up in lots of random wiggly ways. ‘Lots of random ways’ means lots of entropy. But when you stretch one of these chains, the number of ways it can be shaped decreases, until it’s pulled taut and there’s just one way! Only past that point does stretching the molecule take a lot of energy; before that, you’re mainly decreasing its entropy.
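This counting argument can be made concrete with a toy model. Here is a minimal sketch (my own illustration, not from the post): a freely jointed chain in one dimension, where each of $N$ links points left or right, so the number of chain shapes with a given end-to-end extension is a binomial coefficient:

```python
from math import comb, log

# 1D freely jointed chain: N links of length b, each pointing left or
# right.  If n_right links point right, the extension is
# x = (2*n_right - N)*b, and the number of shapes with that extension
# is C(N, n_right).  Entropy S = k*ln(omega), in units where k = 1.

N = 100  # number of links

def entropy(n_right):
    """Entropy (in units of k) of the chain shapes with n_right links pointing right."""
    return log(comb(N, n_right))

for n_right in (50, 75, 90, 100):
    x = 2 * n_right - N  # extension in units of the link length b
    print(f"extension = {x:4d}*b   entropy = {entropy(n_right):6.2f}*k")
```

The entropy is largest at zero extension and drops to zero when the chain is pulled taut: only one shape is left, with all links pointing right.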

So, the force of a stretched rubber band is an entropic force.

But how can changes in either energy or entropy give rise to forces? That’s what I want to explain. But instead of talking about force, I’ll start out talking about pressure. This too arises both from changes in energy and changes in entropy.

### Entropic pressure — a sloppy derivation

If you’ve ever studied thermodynamics you’ve probably heard about an ideal gas. You can think of this as a gas consisting of point particles that almost never collide with each other—because they’re just points—and bounce elastically off the walls of the container they’re in. If you have a box of gas like this, it’ll push on the walls with some pressure. But the cause of this pressure is *not* that slowly making the box smaller increases the energy of the gas inside: in fact, it doesn’t! The cause is that making the box smaller decreases the *entropy* of the gas.

To understand how pressure has an ‘energetic’ part and an ‘entropic’ part, let’s start with the basic equation of thermodynamics:

$dE = T \, dS - P \, dV$

What does this mean? It means the internal energy $E$ of a box of stuff changes when you heat or cool it, meaning that you change its entropy $S$—but also when you shrink or expand it, meaning that you change its volume $V$. Increasing its entropy raises its internal energy at a rate proportional to its temperature $T$. Increasing its volume lowers its internal energy at a rate proportional to its pressure $P$.

We can already see that changes in both energy and entropy can affect the pressure $P$. Pressure is like force—indeed it’s just force per area—so we should try to solve for $P$.

First let’s do it in a sloppy way. One reason people don’t like thermodynamics is that they don’t understand partial derivatives when there are lots of different coordinate systems floating around—which is what thermodynamics is all about! So, they manipulate these partial derivatives sloppily, feeling a sense of guilt and unease, and sometimes it works, but other times it fails disastrously. The cure is *not* to learn more thermodynamics; the cure is to learn about differential forms. All the expressions in the basic equation are differential forms. If you learn what they are and how to work with them, you’ll never get in trouble with partial derivatives in thermodynamics—as long as you proceed slowly and carefully.

But let’s act like we don’t know this! Let’s start with the basic equation

$dE = T \, dS - P \, dV$

and solve for $P$. First we get

$P \, dV = T \, dS - dE$

This is fine. Then we divide by $dV$ and get

$P = T \frac{dS}{dV} - \frac{dE}{dV}$

This is not so fine: here the guilt starts to set in. After all, we’ve been told that we need to use ‘partial derivatives’ when we have functions of several variables—and the main fact about partial derivatives, the one that everybody remembers, is that these are written with curly d’s, not ordinary letter d’s. So we must have done something wrong. So, we make the d’s curly:

$P = T \frac{\partial S}{\partial V} - \frac{\partial E}{\partial V}$

But we still feel guilty. First of all, who gave us the right to make those d’s curly? Second of all, a partial derivative like $\frac{\partial S}{\partial V}$ makes no sense unless $V$ is one of a set of coordinate functions: only then can we talk about how much some function changes as we change $V$ *while keeping the other coordinates fixed*. The value of $\frac{\partial S}{\partial V}$ actually depends on what other coordinates we’re keeping fixed! So what coordinates are we using?

Well, it seems like one of them is $V$, and the other is… we don’t know! It could be $E$, or $S$, or $T$, or perhaps even $P$. This is where real unease sets in. If we’re taking a test, we might in desperation think something like this: “Since the easiest things to control about our box of stuff are its volume and its temperature, let’s take these as our coordinates!” And then we might write

$P = T \left(\frac{\partial S}{\partial V}\right)_T - \left(\frac{\partial E}{\partial V}\right)_T$

And then we might do okay on this problem, because this formula is in fact *correct!* But I hope you agree that this is an unsatisfactory way to manipulate partial derivatives: we’re shooting in the dark and hoping for luck.

#### Entropic pressure and entropic force

So, I want to show you a better way to get this result. But first let’s take a break and think about what it means. It means there are two possible reasons a box of gas may push back with pressure as we try to squeeze it smaller while keeping its temperature constant. One is that the energy may go up:

$-\left(\frac{\partial E}{\partial V}\right)_T$

will be positive if the internal energy goes up as we squeeze the box smaller. But the other reason is that entropy may go down:

$T \left(\frac{\partial S}{\partial V}\right)_T$

will be positive if the entropy goes down as we squeeze the box smaller, assuming $T > 0$.

Let’s turn this fact into a result about force. Remember that pressure is just force per area. Say we have some stuff in a cylinder with a piston on top. Say the position of the piston is given by some coordinate $x$, and its area is $A$. Then the stuff will push on the piston with a force

$F = P A$

and the change in the cylinder’s volume as the piston moves is

$dV = A \, dx$

Then

$P = T \left(\frac{\partial S}{\partial V}\right)_T - \left(\frac{\partial E}{\partial V}\right)_T$

gives us

$F = T \left(\frac{\partial S}{\partial x}\right)_T - \left(\frac{\partial E}{\partial x}\right)_T$

So, the force consists of two parts: the **energetic force**

$-\left(\frac{\partial E}{\partial x}\right)_T$

and the **entropic force**:

$T \left(\frac{\partial S}{\partial x}\right)_T$

Energetic forces are familiar from classical statics: for example, a rock pushes down on the table because its energy would decrease if it could go down. Entropic forces enter the game when we generalize to thermal statics, as we’re doing now. But when we set $T = 0$, these entropic forces go away and we’re back to classical statics!

### Entropic pressure — a better derivation

Okay, enough philosophizing. To conclude, let’s derive

$P = T \left(\frac{\partial S}{\partial V}\right)_T - \left(\frac{\partial E}{\partial V}\right)_T$

in a less sloppy way. We start with

$dE = T \, dS - P \, dV$

which is true no matter what coordinates we use. We can choose 2 of the 5 variables here ($E$, $T$, $S$, $P$ and $V$) as local coordinates, generically at least, so let’s choose $T$ and $V$. Then

$dE = \left(\frac{\partial E}{\partial T}\right)_V dT + \left(\frac{\partial E}{\partial V}\right)_T dV$

and similarly

$dS = \left(\frac{\partial S}{\partial T}\right)_V dT + \left(\frac{\partial S}{\partial V}\right)_T dV$

Using these, our equation

$dE = T \, dS - P \, dV$

becomes

$\left(\frac{\partial E}{\partial T}\right)_V dT + \left(\frac{\partial E}{\partial V}\right)_T dV = T \left(\frac{\partial S}{\partial T}\right)_V dT + \left( T \left(\frac{\partial S}{\partial V}\right)_T - P \right) dV$

If you know about differential forms, you know that the differentials of the coordinate functions, namely $dT$ and $dV$, form a basis of 1-forms. Thus we can equate the coefficients of $dV$ in the equation above and get:

$\left(\frac{\partial E}{\partial V}\right)_T = T \left(\frac{\partial S}{\partial V}\right)_T - P$

and thus:

$P = T \left(\frac{\partial S}{\partial V}\right)_T - \left(\frac{\partial E}{\partial V}\right)_T$

which is what we wanted! There should be no bitter aftertaste of guilt this time.
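As a sanity check, the formula $P = T \left(\frac{\partial S}{\partial V}\right)_T - \left(\frac{\partial E}{\partial V}\right)_T$ can be verified numerically. Here is a minimal sketch (my own example, not from the post) using the standard textbook expressions for a van der Waals gas, where, unlike the ideal gas, *both* terms are nonzero; Boltzmann’s constant is set to 1 and the derivatives are taken by central differences:

```python
from math import log

# Numerical check of  P = T*(dS/dV)_T - (dE/dV)_T  for a van der Waals
# gas, where BOTH the entropic and energetic terms are nonzero.
# Standard textbook expressions, with k = 1 and N = 1 particle:
N, a, b = 1.0, 1.0, 0.01

def E(T, V):
    return 1.5 * N * T - a * N**2 / V             # internal energy

def S(T, V):
    return N * log((V - N * b) * T**1.5)          # entropy, additive constant dropped

def P_state(T, V):
    return N * T / (V - N * b) - a * N**2 / V**2  # equation of state

def dV(f, T, V, h=1e-6):
    """Central-difference derivative of f with respect to V at fixed T."""
    return (f(T, V + h) - f(T, V - h)) / (2 * h)

T0, V0 = 2.0, 1.0
P_thermo = T0 * dV(S, T0, V0) - dV(E, T0, V0)
print(P_thermo, P_state(T0, V0))  # the two values should agree
```

Here the entropic term $T \left(\frac{\partial S}{\partial V}\right)_T = \frac{N T}{V - N b}$ and the energetic term $-\left(\frac{\partial E}{\partial V}\right)_T = -\frac{a N^2}{V^2}$ add up to the van der Waals pressure.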

### The big picture

That’s almost all I want to say: a simple exposition of well-known stuff that’s not quite as well-known as it should be. If you know some thermodynamics and are feeling mildly ambitious, you can now work out the pressure of an ideal gas and show that it’s *completely* entropic in origin: only the first term in the right-hand side above is nonzero. If you’re feeling a lot more ambitious, you can try to read Verlinde’s papers and explain them to me. But my own goal was not to think about gravity. Instead, it was to ponder a question raised by Allen Knutson: how does the ‘entropic force’ idea fit into my ruminations on classical mechanics versus thermodynamics?

It seems to fit in this way: as we go from classical statics (governed by the principle of least energy) to thermal statics at fixed temperature (governed by the principle of least *free* energy), the definition of force familiar in classical statics must be adjusted. In classical statics we have

$F_i = -\frac{\partial E}{\partial q_i}$

where

$E : Q \to \mathbb{R}$

is the energy as a function of some coordinates $q_i$ on the configuration space of our system, some manifold $Q$. But in thermal statics at temperature $T$, our system will try to minimize, not the energy $E$, but the Helmholtz free energy

$A = E - T S$

where

$S : Q \to \mathbb{R}$

is the entropy. So now we should define force by

$F_i = -\frac{\partial A}{\partial q_i}$

and we see that force has an entropic part and an energetic part:

$F_i = T \frac{\partial S}{\partial q_i} - \frac{\partial E}{\partial q_i}$

When $T = 0$, the entropic part goes away and we’re back to classical statics!
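The mildly ambitious exercise suggested earlier — showing that the pressure of an ideal gas is *completely* entropic — can also be done numerically. A minimal sketch (my own, with the Sackur–Tetrode entropy truncated to its volume- and temperature-dependent part, and $k = 1$):

```python
from math import log

# For a monatomic ideal gas the internal energy has no V-dependence,
# so the energetic term -(dE/dV)_T vanishes and the whole pressure
# comes from the entropic term T*(dS/dV)_T = N*T/V.  Units: k = 1.
N = 1.0

def E(T, V):
    return 1.5 * N * T            # internal energy: independent of V

def S(T, V):
    return N * log(V * T**1.5)    # entropy, additive constant dropped

def dV(f, T, V, h=1e-6):
    """Central-difference derivative of f with respect to V at fixed T."""
    return (f(T, V + h) - f(T, V - h)) / (2 * h)

T0, V0 = 3.0, 2.0
entropic = T0 * dV(S, T0, V0)     # T (dS/dV)_T
energetic = -dV(E, T0, V0)        # -(dE/dV)_T
print(entropic, energetic)        # entropic term equals N*T/V; energetic term is zero
```

The entropic term reproduces the ideal gas law $P = N k T / V$, while the energetic term is exactly zero.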

*I’m subject to the natural forces.* – Lyle Lovett

Nice post!

Suggestion for next post: Deriving the force-extension curve of a freely-jointed chain model for a polymer (“rubber band”). Then the worm-like-chain model, and compare with single-molecule DNA-stretching experiments.

So there should be a similar splitting of the momentum, with a part due to the free action and a part due to quantropy.

Darn, you beat me to it. Shh!

Yes, the nice thing about having two analogies to play with (classical statics versus thermal statics, thermal statics versus quantum dynamics) is that one can generate a lot of ideas; it takes longer for both analogies to ‘saturate’ than if you have just one.

I’m busy writing a post on quantropy, where I try to work it out in an example so we can explore ideas like the one you mentioned in detail. It’s hard to develop a good intuition for quantropy without looking at some examples. Of course one can follow the analogies and make a lot of very good guesses about it. But the hands-on feel for entropy that I’ve built up through many calculations is something I’m still lacking for quantropy.

Shouldn’t there be a + sign in the equation for the entropic force?

Anyway I always have trouble with entropy and especially with the notion of it as a fundamental quantity (same goes for information).

For example let’s look at the simplest case I can think of: a one-dimensional piston of length $L$ with just one molecule of ideal gas going back and forth between the walls. The molecule will hit the walls with a certain average frequency dependent on its average momentum (i.e. temperature). So if I understand it correctly, in this case entropy is directly related to the length of the piston, since to describe the microscopic state we have to specify the position of the molecule and its direction. So decreasing the piston length $L$ while keeping temperature (and therefore average momentum) constant will decrease the entropy and also result in the molecule hitting the walls more frequently, so the average force exerted by the molecule on the walls will increase.
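The one-molecule piston described above can be worked out with a few lines of arithmetic. This is a toy calculation added for illustration, not part of the comment:

```python
# A single molecule of mass m and speed v bounces elastically between
# walls at 0 and L.  Each hit on a wall delivers impulse 2*m*v, and
# hits on the same wall repeat every 2*L/v seconds, so the time-averaged
# force on a wall is (2*m*v)/(2*L/v) = m*v**2/L: it grows as L shrinks.

def average_force(m, v, L, total_time=1000.0):
    """Time-averaged force on one wall, from summed collision impulses."""
    period = 2 * L / v               # time between hits on the same wall
    hits = int(total_time / period)  # number of hits during total_time
    impulse = hits * 2 * m * v       # total momentum delivered to the wall
    return impulse / total_time

F1 = average_force(m=1.0, v=1.0, L=1.0)
F2 = average_force(m=1.0, v=1.0, L=0.5)
print(F1, F2)  # halving L doubles the average force
```

Since the average of $m v^2$ in one dimension is $k T$, this average force is $k T / L$: at constant temperature it is set by the piston length alone, exactly as the comment says.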

Ok, so one could say that the average force increased because of the decrease in entropy, but while correct, that seems (to me anyway) a much less informative statement than saying the average force increased due to the decrease in piston length. Here the piston length seems like a fundamental parameter of the problem, and entropy is just an abstract concept derived from it.

Now I understand the usefulness of entropy when talking about macroscopic processes, since it allows us to abstract away the details of microscopic behavior, so we can calculate useful quantities even when we don’t have a good grasp of the microscopic details of our problem. But I don’t see its usefulness at the microscopic level, where quantities like space, time, momentum and energy seem much more fundamental and relevant.

This is also why the notion of “gravity as an entropic force” seems much less appealing to me than gravity as spacetime curvature. (If only other forces could be derived from spacetime geometry… by the way, I’ve seen papers that show EM can be seen as a manifestation of spacetime torsion — is this a valid approach?)

Arrow wrote:

No, not if we’re talking about the same equation. But you may indeed have noticed an inconsistency in what I wrote, due to a typo. I wrote:

$F_{\mathrm{entropic}} = -T \left(\frac{\partial S}{\partial x}\right)_T$

But that minus sign was wrong. In fact

$F_{\mathrm{entropic}} = T \left(\frac{\partial S}{\partial x}\right)_T$

In other words, the entropic force points in the direction of *increasing* entropy (at least if $T > 0$, which is true except in rather unusual circumstances, which I will ignore henceforth).

It sounds like you’re saying the force *decreases* if the entropy increases as we expand the piston. The equations I’m throwing around say the force *is positive* if the entropy increases as we expand the piston:

$F_{\mathrm{entropic}} = T \left(\frac{\partial S}{\partial x}\right)_T > 0$

It’s true that as you expand a piston full of ideal gas, the force pushing on its top decreases. But my blog post is talking about the force $F$ itself, not how this force changes as you change $x$ (the length of the piston). Obviously the force changes like this:

$\frac{\partial F}{\partial x} < 0$

I will avoid discussing gravity, except for this:

I’d never seen such an idea, despite spending an unhealthy amount of time thinking about ‘teleparallel gravity’, a theory that’s almost equivalent to general relativity, but in which gravity is described using torsion rather than curvature. Now that you mention it, I see a paper that claims you can describe gravity coupled to electromagnetism and spinors using torsion. I can see that it’s not the work of a crackpot, but I can’t assure you that it’s correct.

This is an excellent post, and a jumping off point for lots of discussion.

Here is one —

If we were to use the rubber band analogy in terms of the greenhouse gas theory, how would it work?

I would suggest that a greenhouse gas serves to limit the outgoing radiation into bands of wavelength. This reduces the space of allowable energy states and thus reduces the entropy of the subsystem. However, we still must maintain an energy balance with the external system, and so the entropic part of the decrease in free energy is exactly compensated by a temperature increase.

At the most elemental level, that is why greenhouse gases raise the temperature of a planet’s surface. We can talk all we want about variability in climate dynamics and atmospheric lapse rate, etc, but this is the heart of the argument.

Stretching the rubber band is like putting notches in the emission spectrum. That decreases entropy of the photonic volume, and temperature has to compensate. Mathematically, this is calculated by rescaling the Planck gray-body response.

I bring this up because the complexity of the gravity=entropic force argument makes this look simple in comparison.

So now we have four very similar equations:

Hamilton’s principal function, evaluated along the minimal-action path, satisfies

d(Action) = Momentum * d(Position) – Energy * d(Time)

where all of these are functions of time.

The one you talked about here is

d(Energy) = kT * d(Entropy) – Force * d(Position).

If we have a statistical ensemble of paths and need to choose one based on a constraint on the mean action and, say, the mean position at a given time, we have

d(Action) = Lambda * d(Entropy) – Momentum * d(Position)

When we do quantum superpositions rather than statistical ensembles, we get your notion of quantropy.

If we have a rubber band under tension and increase the temperature (like in this heat engine described by Feynman) then the rubber band contracts:

d(Entropy) = Force * d(ThermalExpansionCoefficient) – Energy * d(Coolness)

Can we describe these last three in a similar way to the first? As we change the position of the piston, do the temperature and entropy change as though they were a particle moving in phase space with energy playing the role of Hamilton’s principal function? Similarly, if we change the temperature, do the force and thermal expansion coefficient change as though they were a particle moving in a phase space with entropy playing the role of the principal function?

Mike wrote:

Yes, I believe so! Blake mentioned some examples of this phenomenon here, where he wrote:

The boldface sentence is a way of saying ‘yes’ to your question in a bunch of thermodynamic examples. I’m pretty sure it’s a very general fact.

Hello, John! Are your posts (for example this one) available as PDFs? Some of them, like the network theory series, are on the Azimuth wiki, which can produce PDFs, but not this one. I wanted to read it on an e-book reader, but this HTML doesn’t suit it really well, especially the LaTeX rendered as images.

Hi! No, I haven’t made them available as PDFs. You can get these series of posts on my website:

• Information Geometry.

• Network Theory.

I think they look better there than here—just click the box on top to get the jsmath set up and the box will go away.

I haven’t put my posts on quantropy or ‘thermodynamics versus classical mechanics’ onto my website yet, but I will, and I’ll let people know when I do. It takes a bit of work. I’ll probably put them into a single series, because they belong together. (In fact all this stuff fits together into a big story, but that’s going to take a while for me to flesh out!)

I’m writing a paper based on the Network Theory series, and I plan to write a paper on quantropy too. They’ll be more polished than these blog posts…

Well, I do love thermodynamics, but the most difficult thing for me is to decide what sign goes in front of the work term. And what work is it — done by the system, or by the environment? Maybe there is some trick to remember?

Anyway, I hope what follows will be right. So consider a rubber band of length — let it be the only geometric parameter describing the band. Let be the force that

pulls your handwhen you are stretching the band. So if it pulls, it is positive. Then:or

Hence :

Thus if you heat the rubber band it will pull harder, it shrinks. I was just curious whether I could prove it :-) Maybe I failed but the fact still holds. One need to know this property of rubber in order to explain the rotating sense of a rubber band heat engine.

Hi! I don’t think there’s any ‘trick’ to remembering the sign of work. I agree that it’s an annoying issue. But it just means I need to spend a minute deciding whether I’m talking about

the work the system is doing on the environment, or the work the environment is doing on the system, which has the opposite sign.

I find it much more annoying when people tell me to set my watch “forward” one hour when Daylight Saving Time starts in the spring. Do they mean to set my watch to an earlier time, or a later time? The word “forward” is confusing. The foreword of a book is near the front, but as you read “forwards” through the book you move toward the back. Similarly, *ancient history* is the study of the time when everything was a lot *younger* than it is now! I had to learn category theory to really understand this stuff.

Of course, one can try to choose a convention and stick with it. President Kennedy famously said “ask not what your country can do for you—ask what you can do for your country!” So he preferred to always think about the work the system (you) did on its environment (your country).

I think your argument is correct, and it’s nice! My argument would be to use the formula I gave:

$F = T \left(\frac{\partial S}{\partial x}\right)_T - \left(\frac{\partial E}{\partial x}\right)_T$

The force of a rubber band or stretched spring has an entropic part (the first term) and an energetic part (the second term). The entropic part is proportional to temperature, so it gets bigger when it’s hot. The energetic part doesn’t change.

Before proceeding, note that my $f$ is opposite to your $F$ — when the band pulls, $F$ is *negative* (the pressure is negative here, unlike for the gas piston). So, to my point: actually both parts depend on temperature, and they both can change, so from your formula one should carefully work out how each term varies with $T$. So both derivations are identical indeed (despite the difference in notation).

But I’d like to emphasize again what really matters — the sign

$\frac{\partial S}{\partial L} < 0 $

For a metal rod or a piston (and I guess for a spring) it is positive. Stretching these systems increases the phase space allowed for the system so the entropy increases. Meanwhile if you heat the systems mentioned they expand. Well, just like we were taught at school "when a substance is heated it expands".

The story is opposite for a rubber band. If it is stretched, its entropy decreases. If it is heated it contracts. So the whole thing was to demonstrate how these two "anomalies" are interconnected.
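A quick numerical illustration of the rubber “anomaly” (my own sketch, using the ideal Gaussian-chain model of rubber, in which the tension is purely entropic; this model is not from the comment):

```python
# Ideal (Gaussian) chain model of rubber: for a chain of N links of
# length b, the entropy is S(L) = const - L**2/(2*N*b**2) in units of k,
# so dS/dL < 0 (stretching decreases entropy) and the tension
#   f = -T*(dS/dL) = T*L/(N*b**2)
# grows linearly with temperature: a hot stretched band pulls harder.

N_LINKS, B = 100, 1.0

def entropy_change(L):
    """Chain entropy relative to L = 0, in units of k."""
    return -L**2 / (2 * N_LINKS * B**2)

def tension(T, L):
    """Entropic tension f = -T*dS/dL, with k = 1."""
    return T * L / (N_LINKS * B**2)

L = 5.0
for T in (1.0, 2.0, 3.0):
    print(f"T = {T}: tension = {tension(T, L):.3f}")
```

So the two “anomalies” come from the same place: the sign of $\frac{\partial S}{\partial L}$ is negative, which simultaneously makes stretching decrease the entropy and makes heating increase the pull.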

[…] John Baez. Entropic forces, 2012. URL https://johncarlosbaez.wordpress.com/2012/02/01/entropic-forces/. […]

[…] The thing with rubber is that the elastic forces you experience are entropic, that is when you stretch a rubber band you (roughly speaking) do not increase its internal energy, you decrease its entropy. That’s because rubber molecules are long twisted chains and when you expand rubber you straighten them, thus ordering (decreasing their entropy). A simple kinetic theory of rubber based on entropic reasoning is presented in the book. For quick introduction on rubber thermodynamics I suggest you John Baez’s post about entropic forces. […]

I have been thinking about this post for a long time, John. The reason is that your expression for the force as the sum of an entropic term plus a potential (energy) term looks pretty similar (but not identical) to the expression for the force in Lagrangian dynamics with dissipation. The big “but” is that the dissipative part is generally assumed to take the form of the so-called Rayleigh dissipation function $D$:

$D = \frac{1}{2} \sum_i c_i \dot{q}_i^2$

so

$F_i^{\mathrm{dissipative}} = -\frac{\partial D}{\partial \dot{q}_i}$

and then

$\frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}_i}\right) - \frac{\partial L}{\partial q_i} = -\frac{\partial D}{\partial \dot{q}_i}$

Therefore, if we identify the entropic part with the dissipative term related to the Rayleigh function, we have

$T \frac{\partial S}{\partial q_i} = -\frac{\partial D}{\partial \dot{q}_i}$

Does this last equation make sense?

I don’t think it makes sense to identify an entropic force with a frictional force coming from a Rayleigh function, because a frictional force is almost always velocity-dependent while an entropic force is often not.

Furthermore, the entropic force

$T \frac{\partial S}{\partial q_i}$

involves a partial derivative with respect to $q_i$, while the frictional force

$-\frac{\partial D}{\partial \dot{q}_i}$

involves a partial derivative with respect to $\dot{q}_i$.

Furthermore, the entropic force is proportional to the temperature $T$, while the frictional force is not.

They seem very different.