Entropic Forces

In 2009, Erik Verlinde argued that gravity is an entropic force. This created a big stir—and it helped him win about $6,500,000 in prize money and grants! But what the heck is an ‘entropic force’, anyway?

Entropic forces are nothing unusual: you’ve felt one if you’ve ever stretched a rubber band. Why does a rubber band pull back when you stretch it? You might think it’s because a stretched rubber band has more energy than an unstretched one. That would indeed be a fine explanation for a metal spring. But rubber doesn’t work that way. Instead, a stretched rubber band mainly has less entropy than an unstretched one—and this too can cause a force.

You see, molecules of rubber are like long chains. When unstretched, these chains can curl up in lots of random wiggly ways. ‘Lots of random ways’ means lots of entropy. But when you stretch one of these chains, the number of ways it can be shaped decreases, until it’s pulled taut and there’s just one way! Only past that point does stretching the molecule take a lot of energy; before that, you’re mainly decreasing its entropy.

So, the force of a stretched rubber band is an entropic force.
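
In fact you can watch this force emerge from pure counting. Below is a minimal numerical sketch (my own illustration, not from the post) of the standard freely jointed chain: N links of length a, each pointing left or right, so a chain with end-to-end extension x has \binom{N}{n_+} configurations, where n_+ = (N + x/a)/2 links point right. The entropy is S = k \ln \Omega, the entropic force is T \, \partial S/\partial x, and the link length, chain size and temperature are made-up round numbers.

import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def chain_entropy(n_links, x, a=1e-9):
    """Entropy k ln(Omega) of a 1D freely jointed chain: n_links segments
    of length a, each pointing left or right, end-to-end extension x."""
    n_plus = (n_links + x / a) / 2   # number of segments pointing right
    # ln of the binomial coefficient C(n_links, n_plus), via log-gamma
    ln_omega = (math.lgamma(n_links + 1)
                - math.lgamma(n_plus + 1)
                - math.lgamma(n_links - n_plus + 1))
    return k_B * ln_omega

def entropic_force(n_links, x, T=300.0, a=1e-9, dx=1e-12):
    """F = T dS/dx at fixed T, estimated by a central finite difference."""
    dS = chain_entropy(n_links, x + dx, a) - chain_entropy(n_links, x - dx, a)
    return T * dS / (2 * dx)

# Stretch a 100-link chain: the force is negative (it pulls back toward
# x = 0) and roughly linear in x at small extension, like a Hookean spring.
for x in [0.0, 10e-9, 30e-9, 60e-9, 90e-9]:
    print(f"x = {x * 1e9:5.1f} nm   F = {entropic_force(100, x):+.3e} N")

For small extensions this reproduces the classic result F \approx -k T x/(N a^2): a Hooke’s law whose stiffness is proportional to T, which is why a heated rubber band pulls harder, a point that resurfaces in the comments below.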

But how can changes in either energy or entropy give rise to forces? That’s what I want to explain. But instead of talking about force, I’ll start out talking about pressure. This too arises both from changes in energy and changes in entropy.

Entropic pressure—a sloppy derivation

If you’ve ever studied thermodynamics you’ve probably heard about an ideal gas. You can think of this as a gas consisting of point particles that almost never collide with each other—because they’re just points—and bounce elastically off the walls of the container they’re in. If you have a box of gas like this, it’ll push on the walls with some pressure. But the cause of this pressure is not that slowly making the box smaller increases the energy of the gas inside: in fact, it doesn’t! The cause is that making the box smaller decreases the entropy of the gas.

To understand how pressure has an ‘energetic’ part and an ‘entropic’ part, let’s start with the basic equation of thermodynamics:

d U = T d S - P d V

What does this mean? It means the internal energy U of a box of stuff changes when you heat or cool it, meaning that you change its entropy S, but also when you shrink or expand it, meaning that you change its volume V. Increasing its entropy raises its internal energy at a rate proportional to its temperature T. Increasing its volume lowers its internal energy at a rate proportional to its pressure P.
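
Said with partial derivatives: if we use S and V as coordinates, the basic equation packages two statements into one line,

\displaystyle{ T = \left.\frac{\partial U}{\partial S}\right|_V , \qquad P = - \left.\frac{\partial U}{\partial V}\right|_S }

Note the subscripts recording which variable is held fixed: that bit of bookkeeping is exactly what we’re about to wrestle with.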

We can already see that changes in both the energy, U, and the entropy, S, can affect the P d V term. Pressure is like force—indeed it’s just force per area—so we should try to solve for P.

First let’s do it in a sloppy way. One reason people don’t like thermodynamics is that they don’t understand partial derivatives when there are lots of different coordinate systems floating around—which is what thermodynamics is all about! So, they manipulate these partial derivatives sloppily, feeling a sense of guilt and unease, and sometimes it works, but other times it fails disastrously. The cure is not to learn more thermodynamics; the cure is to learn about differential forms. All the expressions in the basic equation d U = T d S - P d V are differential forms. If you learn what they are and how to work with them, you’ll never get in trouble with partial derivatives in thermodynamics—as long as you proceed slowly and carefully.

But let’s act like we don’t know this! Let’s start with the basic equation

d U = T d S - P d V

and solve for P. First we get

P d V = T d S - d U

This is fine. Then we divide by d V and get

\displaystyle{ P = T \frac{d S}{d V} - \frac{d U}{d V} }

This is not so fine: here the guilt starts to set in. After all, we’ve been told that we need to use ‘partial derivatives’ when we have functions of several variables—and the main fact about partial derivatives, the one that everybody remembers, is that these are written with curly d’s, not ordinary letter d’s. So we must have done something wrong. So, we make the d’s curly:

\displaystyle{ P = T \frac{\partial S}{\partial V} - \frac{\partial U}{\partial V} }

But we still feel guilty. First of all, who gave us the right to make those d’s curly? Second of all, a partial derivative like \frac{\partial S}{\partial V} makes no sense unless V is one of a set of coordinate functions: only then can we talk about how much some function changes as we change V while keeping the other coordinates fixed. The value of \frac{\partial S}{\partial V} actually depends on what other coordinates we’re keeping fixed! So what coordinates are we using?

Well, it seems like one of them is V, and the other is… we don’t know! It could be S, or P, or T, or perhaps even U. This is where real unease sets in. If we’re taking a test, we might in desperation think something like this: “Since the easiest things to control about our box of stuff are its volume and its temperature, let’s take these as our coordinates!” And then we might write

\displaystyle{ P = T \left.\frac{\partial S}{\partial V}\right|_T - \left.\frac{\partial U}{\partial V}\right|_T }

And then we might do okay on this problem, because this formula is in fact correct! But I hope you agree that this is an unsatisfactory way to manipulate partial derivatives: we’re shooting in the dark and hoping for luck.

Entropic pressure and entropic force

So, I want to show you a better way to get this result. But first let’s take a break and think about what it means. It means there are two possible reasons a box of gas may push back with pressure as we try to squeeze it smaller while keeping its temperature constant. One is that the energy may go up:

\displaystyle{ -\left.\frac{\partial U}{\partial V}\right|_T }

will be positive if the internal energy goes up as we squeeze the box smaller. But the other reason is that entropy may go down:

\displaystyle{  T \left.\frac{\partial S}{\partial V}\right|_T }

will be positive if the entropy goes down as we squeeze the box smaller, assuming T > 0.

Let’s turn this fact into a result about force. Remember that pressure is just force per area. Say we have some stuff in a cylinder with a piston on top. Say the position of the piston is given by some coordinate x, and its area is A. Then the stuff will push on the piston with a force

F = P A

and the change in the cylinder’s volume as the piston moves is

d V = A d x

Then

\displaystyle{  P = T \left.\frac{\partial S}{\partial V}\right|_T - \left.\frac{\partial U}{\partial V}\right|_T }

gives us

\displaystyle{ F = T \left.\frac{\partial S}{\partial x}\right|_T - \left.\frac{\partial U}{\partial x}\right|_T }
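
Here we multiplied by A and used d V = A d x, which converts derivatives with respect to V into derivatives with respect to x. Spelled out:

\displaystyle{ F = P A = T A \left.\frac{\partial S}{\partial V}\right|_T - A \left.\frac{\partial U}{\partial V}\right|_T = T \left.\frac{\partial S}{\partial x}\right|_T - \left.\frac{\partial U}{\partial x}\right|_T }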

So, the force consists of two parts: the energetic force

\displaystyle{ F_{\mathrm{energetic}} = - \left.\frac{\partial U}{\partial x}\right|_T }

and the entropic force:

\displaystyle{ F_{\mathrm{entropic}} =  T \left.\frac{\partial S}{\partial x}\right|_T}

Energetic forces are familiar from classical statics: for example, a rock pushes down on the table because its energy would decrease if it could go down. Entropic forces enter the game when we generalize to thermal statics, as we’re doing now. But when we set T = 0, these entropic forces go away and we’re back to classical statics!

Entropic pressure—a better derivation

Okay, enough philosophizing. To conclude, let’s derive

\displaystyle{ P = T \left.\frac{\partial S}{\partial V}\right|_T - \left.\frac{\partial U}{\partial V}\right|_T }

in a less sloppy way. We start with

d U = T d S - P d V

which is true no matter what coordinates we use. We can choose 2 of the 5 variables here as local coordinates, generically at least, so let’s choose V and T. Then

\displaystyle{ d U = \left.\frac{\partial U}{\partial V}\right|_T d V + \left.\frac{\partial U}{\partial T}\right|_V d T }

and similarly

\displaystyle{ d S = \left.\frac{\partial S}{\partial V}\right|_T d V + \left.\frac{\partial S}{\partial T}\right|_V d T }

Using these, our equation

d U = T d S - P d V

becomes

\displaystyle{ \left.\frac{\partial U}{\partial V}\right|_T d V + \left.\frac{\partial U}{\partial T}\right|_V d T = T \left(\left.\frac{\partial S}{\partial V}\right|_T d V + \left.\frac{\partial S}{\partial T}\right|_V d T \right) - P dV }

If you know about differential forms, you know that the differentials of the coordinate functions, namely d T and d V, form a basis of 1-forms. Thus we can equate the coefficients of d V in the equation above and get:

\displaystyle{ \left.\frac{\partial U}{\partial V}\right|_T = T \left.\frac{\partial S}{\partial V}\right|_T - P }

and thus:

\displaystyle{ P = T \left.\frac{\partial S}{\partial V}\right|_T - \left.\frac{\partial U}{\partial V}\right|_T }

which is what we wanted! There should be no bitter aftertaste of guilt this time.
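
As a bonus, equating the coefficients of d T in the same way gives

\displaystyle{ \left.\frac{\partial U}{\partial T}\right|_V = T \left.\frac{\partial S}{\partial T}\right|_V }

which says the heat capacity at constant volume can be computed either from U or from S.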

The big picture

That’s almost all I want to say: a simple exposition of well-known stuff that’s not quite as well-known as it should be. If you know some thermodynamics and are feeling mildly ambitious, you can now work out the pressure of an ideal gas and show that it’s completely entropic in origin: only the first term on the right-hand side above is nonzero. If you’re feeling a lot more ambitious, you can try to read Verlinde’s papers and explain them to me. But my own goal was not to think about gravity. Instead, it was to ponder a question raised by Allen Knutson: how does the ‘entropic force’ idea fit into my ruminations on classical mechanics versus thermodynamics?

It seems to fit in this way: as we go from classical statics (governed by the principle of least energy) to thermal statics at fixed temperature (governed by the principle of least free energy), the definition of force familiar in classical statics must be adjusted. In classical statics we have

\displaystyle{ F_i = - \frac{\partial U}{\partial q^i}}

where

U: Q \to \mathbb{R}

is the energy as a function of some coordinates q^i on the configuration space of our system, some manifold Q. But in thermal statics at temperature T our system will try to minimize, not the energy U, but the Helmholtz free energy

A = U - T S

where

S : Q \to \mathbb{R}

is the entropy. So now we should define force by

\displaystyle{ F_i = - \frac{\partial A}{\partial q^i}}

and, differentiating A = U - T S while holding T fixed, we see that force has an entropic part and an energetic part:

\displaystyle{ F_i = T \frac{\partial S}{\partial q^i} - \frac{\partial U}{\partial q^i} }

When T = 0, the entropic part goes away and we’re back to classical statics!
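
As a parting sanity check, here is a small symbolic sketch of the earlier claim that the pressure of an ideal gas is completely entropic. This is my own illustration, not part of the original post: it assumes the textbook monatomic ideal gas, with U = (3/2) N k T depending on T alone, and with the entropy’s volume dependence being N k \ln V (all V-independent terms lumped into a constant S0).

import sympy as sp

# Coordinates and constants: temperature T, volume V, particle number N,
# Boltzmann's constant k; S0 stands in for the V-independent part of S.
T, V, N, k, S0 = sp.symbols('T V N k S0', positive=True)

U = sp.Rational(3, 2) * N * k * T   # depends on T only, so dU/dV at fixed T is 0
S = N * k * sp.log(V) + S0          # only the N k ln(V) term matters for dS/dV

# P = T (dS/dV)|_T - (dU/dV)|_T, the formula derived above
P = T * sp.diff(S, V) - sp.diff(U, V)
print(P)   # N*k*T/V: the ideal gas law, coming entirely from the entropy term

The energetic term contributes nothing; all of P = N k T / V comes from the entropy.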


“I’m subject to the natural forces.” – Lyle Lovett

18 Responses to Entropic Forces

  1. anders says:

    Nice post!
    Suggestion for next post: Deriving the force-extension curve of a freely-jointed chain model for a polymer (“rubber band”). Then the worm-like-chain model, and compare with single-molecule DNA-stretching experiments.

  2. Mike Stay says:

    So there should be a similar splitting of the momentum, with a part due to the free action and a part due to quantropy.

    • John Baez says:

      Darn, you beat me to it. Shh!

      Yes, the nice thing about having two analogies to play with (classical statics versus thermal statics, thermal statics versus quantum dynamics) is that one can generate a lot of ideas; it takes longer for both analogies to ‘saturate’ than if you have just one.

      I’m busy writing a post on quantropy, where I try to work it out in an example so we can explore in detail ideas like the one you mentioned. It’s hard to develop a good intuition for quantropy without looking at some examples. Of course one can follow the analogies and make a lot of very good guesses about it. But the hands-on feel for entropy that I’ve built up through many calculations, I’m still lacking for quantropy.

  3. Arrow says:

    Shouldn’t there be a + sign in equation for entropic force?

    Anyway I always have trouble with entropy and especially with the notion of it as a fundamental quantity (same goes for information).

    For example let’s look at the simplest case I can think of – a one-dimensional piston of length L with just one molecule of ideal gas going back and forth between the walls. The molecule will hit the walls with a certain average frequency dependent on its average momentum (i.e. temperature). So if I understand it correctly, in this case entropy is directly related to the length of the piston, since to describe the microscopic state we have to specify the position of the molecule and its direction. So decreasing the piston length L while keeping temperature (and therefore average momentum) constant will decrease the entropy and also result in the molecule hitting the walls more frequently, so the average force exerted by the molecule on the walls will increase.

    Ok, so one could say that the average force increased because of the decrease in entropy, but while correct, that is an abstract statement which seems (to me anyway) much less informative than stating that the average force increased due to the decrease in piston length. Here the piston length seems like a fundamental parameter of the problem and entropy is just an abstract concept derived from it.

    Now I understand the usefulness of entropy when talking about macroscopic processes, since it allows us to abstract away from the details of microscopic behavior so we can calculate useful quantities even when we don’t have a good grasp of the details of microscopic behavior in our problem. But I don’t see its usefulness at the microscopic level, where quantities like space, time, momentum and energy seem much more fundamental and relevant.

    This is also why the notion of “gravity as an entropic force” seems much less appealing to me than gravity as spacetime curvature (if only other forces could be derived from spacetime geometry… btw I’ve seen papers that show EM can be seen as a manifestation of spacetime torsion, is this a valid approach?).

    • John Baez says:

      Arrow wrote:

      Shouldn’t there be a + sign in equation for entropic force?

      No, not if we’re talking about the same equation. But you may indeed have noticed an inconsistency in what I wrote, due to a typo. I wrote:

      \displaystyle{ F = T \left.\frac{\partial S}{\partial x}\right|_T - \left.\frac{\partial U}{\partial x}\right|_T }

      So, the force consists of two parts: the energetic force

      \displaystyle{ F_{\mathrm{energetic}} = - \left.\frac{\partial U}{\partial x}\right|_T }

      and the entropic force:

      \displaystyle{ F_{\mathrm{entropic}} = - T \left.\frac{\partial S}{\partial x}\right|_T}

      But that last minus sign was wrong. In fact

      \displaystyle{ F_{\mathrm{entropic}} =  T \left.\frac{\partial S}{\partial x}\right|_T}

      In other words, the entropic force points in the direction of increasing entropy (at least if T > 0, which is true except in rather unusual circumstances, which I will ignore henceforth).

      So if I understand it correctly, in this case entropy is directly related to the length of the piston, since to describe the microscopic state we have to specify the position of the molecule and its direction. So decreasing the piston length L while keeping temperature (and therefore average momentum) constant will decrease the entropy and also result in the molecule hitting the walls more frequently, so the average force exerted by the molecule on the walls will increase.

      It sounds like you’re saying the force decreases if the entropy increases as we expand the piston. The equations I’m throwing around say the force is positive if the entropy increases as we expand the piston:

      \displaystyle{ F_{\mathrm{entropic}} =  T \left.\frac{\partial S}{\partial x}\right|_T }

      It’s true that as you expand a piston full of ideal gas, the force pushing on its top decreases. But my blog post is talking about F_{\mathrm{entropic}} itself, not how this force changes as you change x (the length of the piston). Obviously the force changes like this:

      \displaystyle{ \frac{\partial F_{\mathrm{entropic}}}{\partial x} =  T \left.\frac{\partial^2 S}{\partial x^2}\right|_T }

      I will avoid discussing gravity, except for this:

      btw I’ve seen papers that show EM can be seen as a manifestation of spacetime torsion, is this a valid approach?

      I’d never seen such an idea, despite spending an unhealthy amount of time thinking about ‘teleparallel gravity’, a theory that’s almost equivalent to general relativity, but in which gravity is described using torsion rather than curvature. Now that you mention it, I see a paper that claims you can describe gravity coupled to electromagnetism and spinors using torsion. I can see that it’s not the work of a crackpot, but I can’t assure you that it’s correct.

  4. This is an excellent post, and a jumping off point for lots of discussion.

    Here is one —

    If we were to use the rubber band analogy in terms of the greenhouse gas theory, how would it work?

    I would suggest that a greenhouse gas serves to limit the outgoing radiation into bands of wavelength. This reduces the space of allowable energy states and thus reduces the entropy of the subsystem. However, we still must maintain an energy balance with the external system, and so the entropic part of the TS decrease in free energy is exactly compensated by a temperature increase.

    At the most elemental level, that is why greenhouse gases raise the temperature of a planet’s surface. We can talk all we want about variability in climate dynamics and atmospheric lapse rate, etc, but this is the heart of the argument.

    Stretching the rubber band is like putting notches in the emission spectrum. That decreases entropy of the photonic volume, and temperature has to compensate. Mathematically, this is calculated by rescaling the Planck gray-body response.

    I bring this up because the complexity of the gravity=entropic force argument makes this look simple in comparison.

  5. Mike Stay says:

    So now we have four very similar equations:

    Along the minimal-action path, Hamilton’s principal function satisfies
    d(Action) = Momentum * d(Position) – Energy * d(Time)
    where all of these are functions of time.

    The one you talked about here is
    d(Energy) = kT * d(Entropy) – Force * d(Position).

    If we have a statistical ensemble of paths and need to choose one based on a constraint on the mean action and, say, the mean position at a given time, we have
    d(Action) = Lambda * d(Entropy) – Momentum * d(Position)
    When we do quantum superpositions rather than statistical ensembles, we get your notion of quantropy.

    If we have a rubber band under tension and increase the temperature (like in this heat engine described by Feynman) then the rubber band contracts:
    d(Entropy) = Force * d(ThermalExpansionCoefficient) – Energy * d(Coolness)

    Can we describe these last three in a similar way to the first? As we change the position of the piston, do the temperature and entropy change as though they were a particle moving in phase space with energy playing the role of Hamilton’s principal function? Similarly, if we change the temperature, do the force and thermal expansion coefficient change as though they were a particle moving in a phase space with entropy playing the role of the principal function?

    • John Baez says:

      Mike wrote:

      As we change the position of the piston, do the temperature and entropy change as though they were a particle moving in phase space with energy playing the role of Hamilton’s principal function? Similarly, if we change the temperature, do the force and thermal expansion coefficient change as though they were a particle moving in a phase space with entropy playing the role of the principal function?

      Yes, I believe so! Blake mentioned some examples of this phenomenon here, where he wrote:

      Here’s M. J. Peterson (1979), “Analogy between thermodynamics and mechanics” American Journal of Physics 47, 6: 488, DOI:10.1119/1.11788.

      We note that equations of state—by which we mean identical relations among the thermodynamic variables characterizing a system—are actually first‐order partial differential equations for a function which defines the thermodynamics of the system. Like the Hamilton‐Jacobi equation, such equations can be solved along trajectories given by Hamilton’s equations, the trajectories being quasistatic processes which obey the given equation of state. This gives rise to the notion of thermodynamic functions as infinitesimal generators of quasistatic processes, with a natural Poisson bracket formulation. This formulation of thermodynamic transformations is invariant under canonical coordinate transformations, just as classical mechanics is, which is to say that thermodynamics and classical mechanics have the same formal structure, namely a symplectic structure.

      The boldface sentence is a way of saying ‘yes’ to your question in a bunch of thermodynamic examples. I’m pretty sure it’s a very general fact.

  6. Yrogirg says:

    Hello, John! Are your posts (for example this one) available as PDFs? Some of them, like network theory, are on the Azimuth wiki, which can produce PDFs, but not this one. I wanted to read it on an e-book reader, but this HTML doesn’t fit it really well, especially the LaTeX as images.

    • John Baez says:

      Hi! No, I haven’t made them available as PDFs. You can get these series of posts on my website:

      Information Geometry.

      Network Theory.

      I think they look better there than here—just click the box on top to get the jsmath set up and the box will go away.

      I haven’t put my posts on quantropy or ‘thermodynamics versus classical mechanics’ onto my website yet, but I will, and I’ll let people know when I do. It takes a bit of work. I’ll probably put them into a single series, because they belong together. (In fact all this stuff fits together into a big story, but that’s going to take a while for me to flesh out!)

      I’m writing a paper based on the Network Theory series, and I plan to write a paper on quantropy too. They’ll be more polished than these blog posts…

  7. Yrogirg says:

    One reason people don’t like thermodynamics is that they don’t understand partial derivatives…

    Well, I do love thermodynamics, but the most difficult thing for me is to decide what sign the work term gets in dU = \delta Q - \delta A — and what work it is: done by the system or by the environment. Maybe there is some trick to remember?

    Anyway, I hope what follows will be right. So consider a rubber band of length L — let it be the only geometric parameter describing the band. Let R be the force that pulls your hand when you are stretching the band. So if it pulls, it is positive. Then:

    dU = T dS + R dL

    or

    dA = -SdT + R dL

    Hence, equating the mixed second derivatives \frac{\partial^2 A}{\partial T \, \partial L} = \frac{\partial^2 A}{\partial L \, \partial T}:

    \displaystyle{\frac{\partial R}{\partial T} \bigg|_L = - \frac{\partial S}{\partial L} \bigg|_T > 0}

    Thus if you heat the rubber band it will pull harder: it shrinks. I was just curious whether I could prove it :-) Maybe I failed but the fact still holds. One needs to know this property of rubber in order to explain the sense of rotation of a rubber band heat engine.

    • John Baez says:

      Hi! I don’t think there’s any ‘trick’ to remembering the sign of work. I agree that it’s an annoying issue. But it just means I need to spend a minute deciding whether I’m talking about the work the system is doing on the environment or the work the environment is doing on the system, which has the opposite sign.

      I find it much more annoying when people tell me to set my watch “forward” one hour when Daylight Saving Time starts in the spring. Do they mean to set my watch to an earlier time, or a later time? The word “forward” is confusing. The “foreword” of a book is near the front, but as you read “forwards” through the book you move toward the back. Similarly, ancient history is the study of the time when everything was a lot younger than it is now!

      I had to learn category theory to really understand this stuff.

      Of course, one can try to choose a convention and stick with it. President Kennedy famously said “ask not what your country can do for you—ask what you can do for your country!” So he preferred to always think about the work the system (you) did on its environment (your country).

      Thus if you heat the rubber band it will pull harder: it shrinks. I was just curious whether I could prove it :-)

      I think your argument is correct, and it’s nice! My argument would be to use the formula I gave:

      \displaystyle{ F_i = T \frac{\partial S}{\partial q^i} - \frac{\partial U}{\partial q^i} }

      The force of a rubber band or stretched spring has an entropic part (the first term) and an energetic part (the second term). The entropic part is proportional to temperature, so it gets bigger when it’s hot. The energetic part doesn’t change.

      • Yrogirg says:

        The entropic part is proportional to temperature, so it gets bigger when it’s hot. The energetic part doesn’t change.

        Before proceeding, note that my R is opposite to your F — when the band pulls, F is negative (pressure is negative here, unlike the gas piston). So, to my point: actually both parts depend on temperature and they both can change. So from your formula one should carefully find

        \frac{\partial F}{\partial T} = \frac{\partial}{\partial T}\left( - \frac{\partial A}{\partial L} \right) = - \frac{\partial^2 A}{\partial L \, \partial T} = \frac{\partial S}{\partial L}

        So both derivations are indeed identical (despite the notation difference F = -R).

        But I’d like to emphasize again what really matters — the sign

        \frac{\partial S}{\partial L} < 0

        For a metal rod or a piston (and I guess for a spring) it is positive. Stretching these systems increases the phase space allowed for the system so the entropy increases. Meanwhile if you heat the systems mentioned they expand. Well, just like we were taught at school "when a substance is heated it expands".

        The story is opposite for a rubber band. If it is stretched, its entropy decreases. If it is heated it contracts. So the whole thing was to demonstrate how these two "anomalies" are interconnected.

  8. […] The thing with rubber is that the elastic forces you experience are entropic; that is, when you stretch a rubber band you (roughly speaking) do not increase its internal energy, you decrease its entropy. That’s because rubber molecules are long twisted chains, and when you expand rubber you straighten them, thus ordering them (decreasing their entropy). A simple kinetic theory of rubber based on entropic reasoning is presented in the book. For a quick introduction to rubber thermodynamics I suggest John Baez’s post about entropic forces. […]

  9. amarashiki says:

    I have been thinking about this post for a long time, John. The reason is that your expression for the force as the sum of an entropic term plus a potential (energy) term looks pretty similar (but not identical) to the expression for the force in Lagrangian dynamics with dissipation. The big “but” is that the dissipative part is generally assumed to take the form of the so-called Rayleigh dissipative function D:

    D \equiv \dfrac{1}{2}c\dot{q}^2
    so

    F_d=-\dfrac{\partial D}{\partial \dot{q}}

    and then

    F_t = F_d+F_p = -\dfrac{\partial D}{\partial \dot{q}}-\dfrac{\partial V}{\partial q}

    Therefore, if we identify the entropic part with the dissipative term related to the Rayleigh function, we have

    T\dfrac{\partial S}{\partial  q}=\dfrac{\partial D}{\partial \dot{q}}

    Does this last equation make sense?

    • John Baez says:

      I don’t think it makes sense to identify an entropic force with a frictional force coming from a Rayleigh function, because a frictional force is almost always velocity-dependent while an entropic force is often not.

      Furthermore, the entropic force

      T \dfrac{\partial S}{\partial q}

      involves a partial derivative with respect to q, while the frictional force

      -\dfrac{\partial D}{\partial \dot{q}}

      involves a partial derivative with respect to \dot{q}.

      Furthermore, the entropic force is proportional to temperature, T, while the frictional force is not.

      They seem very different.

  10. […] Starting with the discussion of the inclined plane in school, most people are used to energetic forces. These are forces that arise due to the gradient of an energy function, often also called a potential function or just a potential. Examples are the electric forces drawing electrons through wires due to the potential difference created by a battery in simple circuits, or the force of gravity. Think again of our red ball on the slope, whose acceleration and deceleration is simply due to the gradient in its gravitational potential function, which just happens to be the slope at its current position. However, many seemingly familiar forces actually arise not from the drive to decrease potential energy, but from a gain in entropy. A very familiar example is the force pulling a rubber band back together when you stretch it. In its relaxed state, the long polymers making up the band can curl up and wiggle in many more ways, compared to when they are all stretched in parallel. Thus, the force pulling the band back together is not due to the potential energy of the molecular bonds, which would have a very different characteristic, but indeed to the potential increase in entropy, i.e. the increased volume of the accessible microscopic phase space in the relaxed state. And even seemingly more familiar forces, such as the force created by an expanding gas pushing on a cylinder, are in fact of entropic origin. A very insightful, and mathematically explicit discussion of entropic forces is given in this excellent blog post by John Baez. […]
