I showed you last time that in many branches of physics—including classical mechanics and thermodynamics—we can see our task as minimizing or maximizing some function. Today I want to show how we get from that task to symplectic geometry.

So, suppose we have a smooth function

$$f \colon Q \to \mathbb{R}$$

where $Q$ is some manifold. A minimum or maximum of $f$ can only occur at a point where

$$df = 0$$

Here $df$ is the differential of $f$, which is a 1-form on $Q$. If we pick local coordinates $q^i$ in some open set of $Q$, then we have

$$df = \frac{\partial f}{\partial q^i} \, dq^i$$

and these derivatives $\partial f / \partial q^i$ are very interesting. Let’s see why:

**Example 1.** In classical mechanics, consider a particle on a manifold $M$. Suppose the particle starts at some fixed position at some fixed time. Suppose that it ends up at the position $q$ at time $t$. Then the particle will seek to follow a path that minimizes the action given these conditions. Assume this path exists and is unique. The action of this path is then called **Hamilton’s principal function**, $W(q,t)$. Let

$$Q = M \times \mathbb{R}$$

and assume Hamilton’s principal function is a smooth function

$$W \colon Q \to \mathbb{R}$$

We then have

$$dW = \frac{\partial W}{\partial q^i} \, dq^i + \frac{\partial W}{\partial t} \, dt$$

where $q^i$ are local coordinates on $M$.

$$p_i = \frac{\partial W}{\partial q^i}$$

is called the **momentum** in the *i*th direction, and

$$E = -\frac{\partial W}{\partial t}$$

is called the **energy**. The minus signs here are basically just a mild nuisance. Time is different from space, and in special relativity the difference comes from a minus sign, but I don’t think that’s the explanation here. We could get rid of the minus signs by working with negative energy, but it’s not such a big deal.
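To make Example 1 concrete, here’s a quick numerical sanity check on an assumed toy case (not worked out above): a free particle of mass $m$ on a line, starting at $q = 0$ at time $0$. Its action-minimizing path is a straight line, so Hamilton’s principal function is $W(q,t) = m q^2 / 2t$, and finite differences confirm that $\partial W / \partial q$ is the momentum $m v$ while $-\partial W / \partial t$ is the kinetic energy $\tfrac{1}{2} m v^2$:

```python
# Toy check of Example 1: a free particle of mass m on a line, starting at
# q = 0 at time t = 0.  The minimizing path is straight, with action
# W(q, t) = m q**2 / (2 t); we verify p = dW/dq and E = -dW/dt numerically.

m = 2.0

def W(q, t):
    return m * q**2 / (2 * t)

def partial(f, x, y, i, h=1e-6):
    # central finite difference in the i-th argument of f(x, y)
    if i == 0:
        return (f(x + h, y) - f(x - h, y)) / (2 * h)
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

q, t = 3.0, 1.5
v = q / t                        # velocity of the straight-line path

p = partial(W, q, t, 0)          # momentum:  p =  dW/dq
E = -partial(W, q, t, 1)         # energy:    E = -dW/dt

print(abs(p - m * v) < 1e-5)         # p = m v
print(abs(E - 0.5 * m * v**2) < 1e-5)  # E = (1/2) m v^2
```

Nothing deep here, just the first derivative test at work, but it’s reassuring to see the momentum and energy pop out of $W$ by differentiation alone.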

**Example 2.** In thermodynamics, consider a system with internal energy $U$ and volume $V$. Then the system will choose a state that maximizes the entropy given these constraints. Assume this state exists and is unique. Call the entropy of this state $S(U,V)$. Let

$$Q = \mathbb{R}^2$$

and assume the entropy is a smooth function

$$S \colon Q \to \mathbb{R}$$

We then have

$$dS = \frac{1}{T} \, dU + \frac{P}{T} \, dV$$

where $T$ is the **temperature** of the system, and $P$ is the **pressure**. The slight awkwardness of this formula makes people favor other setups.
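Here’s a numerical check of this formula on an assumed concrete case: a monatomic ideal gas, with $S(U,V) = N k \left( \tfrac{3}{2} \ln U + \ln V \right)$ up to an additive constant, $U = \tfrac{3}{2} N k T$, and $P V = N k T$ (units chosen so that $N k = 1$):

```python
import math

# Sanity check of dS = (1/T) dU + (P/T) dV for an assumed illustrative case:
# a monatomic ideal gas, with S(U, V) = N k (3/2 ln U + ln V) + const,
# U = (3/2) N k T, and P V = N k T.

N_k = 1.0  # N times Boltzmann's constant, in arbitrary units

def S(U, V):
    return N_k * (1.5 * math.log(U) + math.log(V))

def partial(f, x, y, i, h=1e-7):
    # central finite difference in the i-th argument
    if i == 0:
        return (f(x + h, y) - f(x - h, y)) / (2 * h)
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

U, V = 3.0, 2.0
T = U / (1.5 * N_k)      # from U = (3/2) N k T
P = N_k * T / V          # from P V = N k T

dS_dU = partial(S, U, V, 0)
dS_dV = partial(S, U, V, 1)

print(abs(dS_dU - 1 / T) < 1e-6)   # dS/dU = 1/T
print(abs(dS_dV - P / T) < 1e-6)   # dS/dV = P/T
```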

**Example 3.** In thermodynamics there are many setups for studying the same system using different minimum or maximum principles. One of the most popular is called the **energy scheme**. If internal energy increases with increasing entropy, as is usually the case, this scheme is equivalent to the one we just saw.

In the energy scheme we fix the entropy $S$ and volume $V$. Then the system will choose a state that minimizes the internal energy given these constraints. Assume this state exists and is unique. Call the internal energy of this state $U(S,V)$. Let

$$Q = \mathbb{R}^2$$

and assume the internal energy is a smooth function

$$U \colon Q \to \mathbb{R}$$

We then have

$$dU = T \, dS - P \, dV$$

where

$$T = \frac{\partial U}{\partial S}$$

is the temperature, and

$$P = -\frac{\partial U}{\partial V}$$

is the pressure. You’ll note the formulas here closely resemble those in Example 1!

**Example 4.** Here are the four most popular schemes for thermodynamics:

• If we fix the entropy $S$ and volume $V$, the system will choose a state that minimizes the internal energy $U$.

• If we fix the entropy $S$ and pressure $P$, the system will choose a state that minimizes the enthalpy $H = U + P V$.

• If we fix the temperature $T$ and volume $V$, the system will choose a state that minimizes the Helmholtz free energy $F = U - T S$.

• If we fix the temperature $T$ and pressure $P$, the system will choose a state that minimizes the Gibbs free energy $G = U + P V - T S$.

These quantities are related by a pack of similar-looking formulas, from which we may derive a mind-numbing little labyrinth of Maxwell relations. But for now, all we need to know is that all these approaches to thermodynamics are equivalent given some reasonable assumptions, and all the formulas and relations can be derived using the Legendre transformation trick I explained last time. So, I won’t repeat what we did in Example 3 for all these other cases!
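Though I won’t repeat those derivations, here is a numerical spot-check of the Helmholtz scheme and one Maxwell relation, again on an assumed illustrative case (the monatomic ideal gas, constants set to 1, additive entropy constant dropped), whose Helmholtz free energy is $F(T,V) = -N k T \left( \ln V + \tfrac{3}{2} \ln T \right)$:

```python
import math

# Spot-check of the Helmholtz scheme on an assumed ideal-gas example:
# S = -dF/dT, P = -dF/dV, and the Maxwell relation
# dS/dV at fixed T  =  dP/dT at fixed V  (mixed partials of F commute).

N_k = 1.0  # N times Boltzmann's constant

def F(T, V):
    return -N_k * T * (math.log(V) + 1.5 * math.log(T))

def partial(f, x, y, i, h=1e-5):
    # central finite difference in the i-th argument
    if i == 0:
        return (f(x + h, y) - f(x - h, y)) / (2 * h)
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

T, V = 2.0, 3.0
S = lambda t, v: -partial(F, t, v, 0)   # entropy:  S = -dF/dT
P = lambda t, v: -partial(F, t, v, 1)   # pressure: P = -dF/dV

# P agrees with the ideal gas law P V = N k T:
ok_gas_law = abs(P(T, V) - N_k * T / V) < 1e-6

# Maxwell relation, as a pair of nested finite differences:
dS_dV = partial(S, T, V, 1)
dP_dT = partial(P, T, V, 0)
ok_maxwell = abs(dS_dV - dP_dT) < 1e-3

print(ok_gas_law, ok_maxwell)
```

The Maxwell relation here is nothing but the statement that the mixed partials of $F$ commute, which is the theme of the rest of this post.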

**Example 5.** In classical statics, consider a particle on a manifold $M$. This particle will seek to minimize its **potential energy**, which we’ll assume is some smooth function of its position:

$$V \colon M \to \mathbb{R}$$

We then have

$$dV = \frac{\partial V}{\partial q^i} \, dq^i$$

where $q^i$ are local coordinates on $M$, and

$$F_i = -\frac{\partial V}{\partial q^i}$$

is called the **force** in the *i*th direction.

#### Conjugate variables

So, the partial derivatives $\partial f / \partial q^i$ of the quantity $f$ we’re trying to minimize or maximize are very important! As a result, we often want to give them more of an equal status as independent quantities in their own right. Then we call them ‘conjugate variables’.

To make this precise, consider the cotangent bundle $T^* Q$, which has local coordinates $q^i$ (coming from the coordinates on $Q$) and $p_i$ (the corresponding coordinates on each cotangent space). We then call $p_i$ the **conjugate variable** of the coordinate $q^i$.

Given a smooth function

$$f \colon Q \to \mathbb{R}$$

the 1-form $df$ can be seen as a section of the cotangent bundle. The graph of this section is defined by the equation

$$p_i = \frac{\partial f}{\partial q^i}$$

and this equation ties together two intuitions about ‘conjugate variables’: as coordinates on the cotangent bundle, and as partial derivatives of the quantity we’re trying to minimize or maximize.

#### The tautological 1-form

There is a lot to say here, especially about Legendre transformations, but I want to hasten on to a bit of symplectic geometry. And for this we need the ‘tautological 1-form’ on $T^* Q$.

We can think of $df$ as a map

$$df \colon Q \to T^* Q$$

sending each point $q \in Q$ to the point $(q, p) \in T^* Q$, where $p$ is defined by the equation we just saw:

$$p_i = \frac{\partial f}{\partial q^i}$$

Using this map, we can pull back any 1-form on $T^* Q$ to get a 1-form on $Q$.

What 1-form on $Q$ might we like to get? Why, $df$, of course!

Amazingly, there’s a 1-form $\alpha$ on $T^* Q$ such that when we pull it back using the map $df$, we get the 1-form $df$—no matter what smooth function $f$ we started with!

Thanks to this wonderfully tautological property, $\alpha$ is called the **tautological 1-form** on $T^* Q$. You should check that it’s given by the formula

$$\alpha = p_i \, dq^i$$

If you get stuck, try this.
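The tautological property can also be seen numerically. In the simplest assumed case $Q = \mathbb{R}$, take $f(q) = \sin q$, lift the interval $[a,b]$ into $T^* Q$ via $q \mapsto (q, f'(q))$, and integrate $\alpha = p \, dq$ along the lifted path. The answer should be $f(b) - f(a)$:

```python
import math

# Check the tautological property of alpha = p dq in the toy case Q = R,
# f(q) = sin(q): integrating alpha along the lift q |-> (q, f'(q)) of [a, b]
# should give f(b) - f(a).

f = math.sin
dfdq = math.cos        # fiber coordinate p = df/dq along the lifted path

a, b, n = 0.3, 1.7, 100000
h = (b - a) / n

# midpoint-rule integral of alpha = p dq along the lifted path
integral = sum(dfdq(a + (k + 0.5) * h) * h for k in range(n))

print(abs(integral - (f(b) - f(a))) < 1e-8)
```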

So, if we want to see how much $f$ changes as we move along a path in $Q$, we can do this in three equivalent ways:

• Evaluate $f$ at the endpoint of the path and subtract off $f$ at the starting-point.

• Integrate the 1-form $df$ along the path.

• Use $df \colon Q \to T^* Q$ to map the path over to $T^* Q$, and then integrate $\alpha$ over this path in $T^* Q$.

The last method is equivalent thanks to the ‘tautological’ property of $\alpha$. It may seem overly convoluted, but it shows that if we work in $T^* Q$, where the conjugate variables are accorded equal status, *everything we want to know about the change in $f$ is contained in the 1-form $\alpha$, no matter which function $f$ we decide to use!*

So, in this sense, $\alpha$ knows everything there is to know about the change in Hamilton’s principal function in classical mechanics, or the change in entropy in thermodynamics… and so on!

But this means it must know about things like Hamilton’s equations, and the Maxwell relations.

#### The symplectic structure

We saw last time that the fundamental equations of classical mechanics and thermodynamics—Hamilton’s equations and the Maxwell relations—are mathematically *just the same*. They both say simply that partial derivatives commute:

$$\frac{\partial^2 f}{\partial q^i \, \partial q^j} = \frac{\partial^2 f}{\partial q^j \, \partial q^i}$$

where $f$ is the function we’re trying to minimize or maximize.

I also mentioned that this fact—the commuting of partial derivatives—can be stated in an elegant coordinate-free way:

$$d(df) = 0$$

Perhaps I should remind you of the proof:

$$d(df) = d\left( \frac{\partial f}{\partial q^i} \, dq^i \right) = \frac{\partial^2 f}{\partial q^j \, \partial q^i} \, dq^j \wedge dq^i$$

but

$$dq^j \wedge dq^i$$

changes sign when we switch $i$ and $j$, while

$$\frac{\partial^2 f}{\partial q^j \, \partial q^i}$$

does not, so $d(df) = 0$. It’s just a wee bit more work to show that conversely, starting from $d(df) = 0$, it follows that the mixed partials must commute.
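To see a Hamilton-style equation fall out of commuting partials, take the assumed free-particle toy case again, with $W(q,t) = m q^2 / 2t$. Since $p = \partial W / \partial q$ and $E = -\partial W / \partial t$, the symmetry of mixed partials of $W$ says $\partial p / \partial t = -\partial E / \partial q$:

```python
# Commuting mixed partials of W, made concrete on the assumed free-particle
# example W(q, t) = m q**2 / (2 t).  With p = dW/dq and E = -dW/dt, the
# symmetry of mixed partials reads dp/dt = -dE/dq.

m = 2.0

def W(q, t):
    return m * q**2 / (2 * t)

def partial(f, x, y, i, h=1e-5):
    # central finite difference in the i-th argument
    if i == 0:
        return (f(x + h, y) - f(x - h, y)) / (2 * h)
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

p = lambda q, t: partial(W, q, t, 0)    # p =  dW/dq
E = lambda q, t: -partial(W, q, t, 1)   # E = -dW/dt

q, t = 3.0, 1.5
dp_dt = partial(p, q, t, 1)   # nested finite difference: d^2 W / dt dq
dE_dq = partial(E, q, t, 0)   # nested finite difference: -d^2 W / dq dt

print(abs(dp_dt + dE_dq) < 1e-4)
```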

How can we state this fact using the tautological 1-form $\alpha$? I said that using the map

$$df \colon Q \to T^* Q$$

we can pull back $\alpha$ to $Q$ and get $df$. But pulling back commutes with the $d$ operator! So, if we pull back $d\alpha$, we get $d(df)$. But $d(df) = 0$. So, $d\alpha$ has the magical property that when we pull it back to $Q$, we always get zero, *no matter what $f$ we choose!*

This magical property captures Hamilton’s equations, the Maxwell relations and so on—for all choices of $f$ at once. So it shouldn’t be surprising that the 2-form

$$d\alpha$$

is colossally important: it’s the famous **symplectic structure** on the so-called **phase space** $T^* Q$.

Well, actually, most people prefer to work with

$$\omega = -d\alpha$$

It seems this whole subject is a monument of austere beauty… covered with minus signs, like bird droppings.

**Example 6.** In classical mechanics, let

$$Q = M \times \mathbb{R}$$

as in Example 1. If $M$ has local coordinates $q^i$, then $Q$ has these along with the time coordinate $t$, and $T^* Q$ has all of these along with the conjugate variables as coordinates. As we explained, it causes little trouble to call these conjugate variables by the same names we used for the partial derivatives of $W$: namely, $p_i$ and $-E$. So, we have

$$\alpha = p_i \, dq^i - E \, dt$$

and thus

$$\omega = -d\alpha = dq^i \wedge dp_i - dt \wedge dE$$

**Example 7.** In thermodynamics, let

$$Q = \mathbb{R}^2$$

as in Example 3. If $Q$ has coordinates $S, V$, then the conjugate variables deserve to be called $T$ and $-P$. So, we have

$$\alpha = T \, dS - P \, dV$$

and

$$\omega = -d\alpha = dS \wedge dT - dV \wedge dP$$

You’ll see that in these formulas for $\alpha$ and $\omega$, variables get paired with their conjugate variables. That’s nice.

But let me expand on what we just saw, since it’s important. And let me talk about $d\alpha$ without tossing in that extra sign.

What we saw is that the 2-form $d\alpha$ is a ‘measure of noncommutativity’. When we pull $d\alpha$ back to $Q$, we get zero. This says that *partial derivatives commute*—and this gives Hamilton’s equations, the Maxwell relations, and all that. But up in $T^* Q$, $d\alpha$ is not zero. And this suggests that there’s some built-in noncommutativity hiding in phase space!

Indeed, we can make this very precise. Consider a little parallelogram up in $T^* Q$.

Suppose we integrate the 1-form $\alpha$ up the left edge and across the top. Do we get the same answer if we integrate it across the bottom edge and then up the right?

No, not necessarily! The difference is the same as the integral of $\alpha$ all the way around the parallelogram. By Stokes’ theorem, this is the same as integrating $d\alpha$ over the parallelogram. And there’s no reason that should give zero.

However, suppose we got our parallelogram in $T^* Q$ by taking a parallelogram in $Q$ and applying the map

$$df \colon Q \to T^* Q$$

Then the integral of $\alpha$ around our parallelogram would be zero, since it would equal the integral of $df$ around a parallelogram in $Q$… and that’s the change in $f$ as we go around a loop from some point to… itself!

And indeed, the fact that a function doesn’t change when we go around a parallelogram is precisely what makes

$$d(df) = 0$$

So the story all fits together quite nicely.
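Both halves of this story can be checked numerically in an assumed simplest case. First, around a rectangle in $T^* \mathbb{R}$, the integral of $\alpha = p \, dq$ equals the (signed) symplectic area, which is not zero. Second, around the boundary of a square in $Q = \mathbb{R}^2$ lifted via $p = \nabla f$ with $f(x,y) = \sin x \cos y$, the integral vanishes:

```python
import math

# 1) Rectangle in T*R with (q, p) in [0, 2] x [1, 4], traversed
#    counterclockwise: only the horizontal edges contribute to the
#    integral of alpha = p dq, and the answer is minus the area.
loop = 1.0 * (2.0 - 0.0) + (-4.0) * (2.0 - 0.0)   # bottom + top edges
area = (2.0 - 0.0) * (4.0 - 1.0)
nonzero_ok = (loop == -area)     # equals integral of d(alpha) = dp ^ dq

# 2) Square in Q = R^2 lifted via p = grad f, f(x, y) = sin(x) cos(y):
#    integrate alpha = p_x dx + p_y dy around the lifted boundary.
fx = lambda x, y: math.cos(x) * math.cos(y)   # df/dx
fy = lambda x, y: -math.sin(x) * math.sin(y)  # df/dy

n, s = 20000, 1.3        # subdivisions per edge, side length
h = s / n
total = 0.0
for k in range(n):
    u = (k + 0.5) * h
    total += fx(u, 0.0) * h        # bottom: y = 0, x goes 0 -> s
    total += fy(s, u) * h          # right:  x = s, y goes 0 -> s
    total += -fx(s - u, s) * h     # top:    y = s, x goes s -> 0
    total += -fy(0.0, s - u) * h   # left:   x = 0, y goes s -> 0

lifted_ok = abs(total) < 1e-6    # the change in f around a loop is zero

print(nonzero_ok, lifted_ok)
```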

#### The big picture

I’ve tried to show you that the symplectic structure on the phase spaces of classical mechanics, and the lesser-known but utterly analogous one on the phase spaces of thermodynamics, is a natural outgrowth of utterly trivial reflections on the process of minimizing or maximizing a function $f$ on a manifold $Q$.

The first derivative test tells us to look for points with

$$df = 0$$

while the commutativity of partial derivatives says that

$$d(df) = 0$$

everywhere—and this gives Hamilton’s equations and the Maxwell relations. The 1-form $df$ is the pullback of the tautological 1-form $\alpha$ on $T^* Q$, and similarly $d(df)$ is the pullback of the symplectic structure $\omega = -d\alpha$. The fact that

$$\omega \ne 0$$

says that $T^* Q$ holds noncommutative delights, almost like a world where partial derivatives no longer commute! But of course we still have

$$d\omega = 0$$

everywhere, and this becomes part of the official definition of a **symplectic structure**.

All very simple. I hope, however, the experts note that to see this unified picture, we had to avoid the most common approaches to classical mechanics, which start with either a ‘Hamiltonian’

$$H \colon T^* M \to \mathbb{R}$$

or a ‘Lagrangian’

$$L \colon T M \to \mathbb{R}$$

Instead, we started with Hamilton’s principal function

$$W \colon Q \to \mathbb{R}$$

where $Q = M \times \mathbb{R}$ is not the usual configuration space $M$ describing possible positions for a particle, but the ‘extended’ configuration space, which also includes *time*. Only this way do Hamilton’s equations, like the Maxwell relations, become a trivial consequence of the fact that partial derivatives commute.

But what about those ‘noncommutative delights’? First, there’s a noncommutative Poisson bracket operation on functions on $T^* Q$. This makes the functions into a so-called Poisson algebra. In the classical mechanics of a point particle on the line, for example, it’s well-known that we have

$$\{ q, p \} = 1, \qquad \{ t, -E \} = 1$$

In thermodynamics, the analogous relations

$$\{ S, T \} = 1, \qquad \{ V, -P \} = 1$$

seem sadly little-known. But you can see them here, for example:

• M. J. Peterson, Analogy between thermodynamics and mechanics, *American Journal of Physics* **47** (1979), 488–490.

at least up to one of those pesky minus signs! We can use these Poisson brackets to study how one thermodynamic variable changes as we slowly change another, staying close to equilibrium all along.

Second, we can go further and ‘quantize’ the functions on $T^* Q$. This means coming up with an associative but noncommutative product of these functions that mimics the Poisson bracket to some extent. In the case of a particle on a line, we’d get commutation relations like

$$[q, p] = i \hbar, \qquad [t, -E] = i \hbar$$

where $\hbar$ is Planck’s constant. Now we can represent these quantities as operators on a Hilbert space, the uncertainty principle kicks in, and life gets really interesting.

In thermodynamics, the analogous relations would be

$$[S, T] = i \hbar, \qquad [V, -P] = i \hbar$$

The math works just the same, but *what does it mean physically?* Are we now thinking of temperature, entropy and the like as ‘quantum observables’—for example, operators on a Hilbert space? Are we just *quantizing thermodynamics?*

That’s one possible interpretation, but I’ve never heard anyone discuss it. Here’s one good reason: as Blake Stacey pointed out below, these equations don’t pass the test of dimensional analysis! The quantities at left have units of energy, while Planck’s constant has units of action. So maybe we need to introduce a quantity with units of time at right, or maybe there’s some other interpretation, where we don’t interpret the parameter $\hbar$ as the good old-fashioned Planck’s constant, but as something else instead.
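In the mechanical case, at least, the relation $[q, p] = i \hbar$ is easy to exhibit concretely. Here’s a toy representation (my own illustrative sketch, not from the discussion above), with $q$ acting as multiplication and $p = -i \hbar \, d/dq$ acting on polynomials stored as coefficient lists:

```python
# Toy representation of [q, p] = i*hbar on polynomials in q,
# stored as lists of complex coefficients (index = power of q).
# q acts by multiplication; p acts as -i*hbar d/dq.

hbar = 1.0  # in suitable units

def q_op(c):
    # multiply by q: shift all coefficients up one power
    return [0j] + list(c)

def p_op(c):
    # apply p = -i hbar d/dq
    return [-1j * hbar * k * c[k] for k in range(1, len(c))]

def sub(a, b):
    # subtract coefficient lists, padding with zeros
    n = max(len(a), len(b))
    a = list(a) + [0j] * (n - len(a))
    b = list(b) + [0j] * (n - len(b))
    return [x - y for x, y in zip(a, b)]

# test [q, p] psi = q(p psi) - p(q psi) on psi(q) = 1 + 2q + 3q^2
psi = [1 + 0j, 2 + 0j, 3 + 0j]
comm = sub(q_op(p_op(psi)), p_op(q_op(psi)))

expected = [1j * hbar * c for c in psi]   # i*hbar*psi
print(all(abs(x - y) < 1e-12 for x, y in zip(comm, expected)))
```

The thermodynamic analogue would represent $S$ and $T$ the same way, which is exactly the interpretive puzzle: on what Hilbert space, and with what playing the role of $\hbar$?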

And if you’ve really been paying attention, you may wonder how quantropy fits into this game! I showed that at least in a toy model, the path integral formulation of quantum mechanics arises, not exactly from maximizing or minimizing something, but from finding its critical points: that is, points where its first derivative vanishes. This something is a complex-valued quantity analogous to entropy, which I called ‘quantropy’.

Now, while I keep throwing around words like ‘minimize’ and ‘maximize’, most everything I’m doing works just fine for critical points. So, it seems that the apparatus of symplectic geometry may apply to the path-integral formulation of quantum mechanics.

But that would be weirdly interesting! In particular, what would happen when we go ahead and *quantize* the path-integral formulation of quantum mechanics?

If you’re a physicist, there’s a guess that will come tripping off your tongue at this point, without you even needing to think. Me too. But I don’t know if that guess is right.

Less mind-blowingly, there is also the question of how symplectic geometry enters into classical *statics* via the idea of Example 5.

But there’s a lot of fun to be had in this game already with thermodynamics.

#### Appendix

I should admit, just so you don’t think I failed to notice, that only rather esoteric physicists study the approach to quantum mechanics where time is an *operator* $t$ that doesn’t commute with the Hamiltonian $H$. In this approach $t$ *commutes* with the momentum and position operators. I didn’t write down those commutation equations, for fear you’d think I was a crackpot and stop reading! It is however a perfectly respectable approach, which can be reconciled with the usual one. And this issue is not only quantum-mechanical: it’s also important in classical mechanics.

Namely, there’s a way to start with the so-called **extended phase space** for a point particle on a manifold $M$:

$$T^* (M \times \mathbb{R})$$

with coordinates $q^i, p_i, t$ and $E$, and get back to the usual phase space:

$$T^* M$$

with just $q^i$ and $p_i$ as coordinates. The idea is to impose a constraint of the form

$$E = H(q, p)$$

to knock off one degree of freedom, and use a standard trick called ‘symplectic reduction’ to knock off another.

Similarly, in quantum mechanics we can start with a big Hilbert space

$$L^2 (M \times \mathbb{R})$$

on which $q^i, p_i, t$ and $E$ are all operators, then impose a constraint expressing $E$ in terms of the $q^i$ and $p_i$, and then use that constraint to pick out states lying in a smaller Hilbert space. This smaller Hilbert space is naturally identified with the usual Hilbert space for a point particle:

$$L^2 (M)$$

Here $M$ is called the configuration space for our particle; its cotangent bundle $T^* M$ is the usual phase space. We call $M \times \mathbb{R}$ the **extended configuration space** for a particle on $M$; its cotangent bundle is the extended phase space.

I’m having some trouble remembering where I first learned about these ideas, but here are some good places to start:

• Toby Bartels, Abstract Hamiltonian mechanics.

• Nikola Buric and Slobodan Prvanovic, Space of events and the time observable.

• Piret Kuusk and Madis Koiv, Measurement of time in nonrelativistic quantum and classical mechanics, *Proceedings of the Estonian Academy of Sciences, Physics and Mathematics* **50** (2001), 195–213.

Esoteric or not, I would like to learn more about this trick. Can you please provide a reference or elaborate this a little further?

Thanks for pushing me on this… it’s one of those things I’ve managed to learn in my work on quantum gravity without remembering the best reference. Here are two references I just dug up:

• Nikola Buric and Slobodan Prvanovic, Space of events and the time observable.

• Piret Kuusk and Madis Koiv, Measurement of time in nonrelativistic quantum and classical mechanics, *Proceedings of the Estonian Academy of Sciences, Physics and Mathematics* **50** (2001), 195–213.

I feel sure some of my quantum gravity pals have written about this too—maybe Carlo Rovelli. The phrase ‘extended phase space’ is a useful buzzword: it’s a name for the cotangent bundle of the extended configuration space. The issue I mention is not only quantum-mechanical: it’s also important in classical mechanics. Namely, there’s a way to get from the extended phase space to the usual phase space of a point particle by imposing a constraint of the form

$$E = H(q, p)$$

and doing a standard trick called ‘symplectic reduction’.

It’s good to get software that reads DjVu files, because they’re more efficient than PDF files, especially for scans of big books. For example, here you can download a free copy of my book *Introduction to Algebraic and Constructive Quantum Field Theory*, either in PDF (35 megabytes) or DjVu (3.26 megabytes).

Thanks for reminding me of your one-page summary, Toby! That’s very helpful. I’ve added an appendix to my blog article answering Uwe’s question and linking to some references, including a PDF version of this summary.

In my appendix I was trying to say “I don’t remember I learnt this stuff, so read these instead”.

Your appendix gives the impression that you might have learnt this from me. If anything, it was the reverse!

In another millennium, I wrote a one-page summary of this: http://tobybartels.name/notes/#Hamiltonian


I was going to comment on the previous post about “quantizing” the Poisson brackets $\{S, T\}$ and $\{V, -P\}$, and how I had no idea what a formal statement like $[S, T] = i\hbar$ actually means. (There’s an extra unit of time floating about somewhere, since $\hbar$ is a quantum of action whilst conjugate variables in thermodynamics multiply to give energies, but I don’t think that’s a major point.) There goes my claim to priority on that statement of ignorance…

I’m not sure how the “quantized thermodynamics” one gets in this way relates to the quantum statistical physics I’ve studied before. Maybe they match up somehow, but it’s not clear to me. More thinking required!


Noted here for future reference:

• R. Gilmore, Uncertainty relations of statistical mechanics, *Physical Review A* **31** (1985), 3237–3239. DOI:10.1103/PhysRevA.31.3237.

I believe the “energy representation” is closer to what we’ve been talking about. The bound on the right-hand side would then be the thermal Boltzmann energy $k T$, instead of Planck’s constant.

Maybe symplectic geometry is a way of talking about classical statistical mechanics in the limit where fluctuations are not quite negligible?

In this blog entry I had said it was ‘fine’ to write $[S, T] = i \hbar$, not noticing that it was dimensionally incorrect. Thanks for catching that, Blake! This makes the issue even more interesting.

I can’t stomach the thought of a blog article where I say it’s ‘fine’ to write down an impossible equation, so I’ve corrected this passage while giving you credit for noticing this.

Yes, I don’t see how they’d match up neatly. In ‘quantized thermodynamics’ entropy is an *operator*; in ordinary quantum statistical mechanics it’s not. I think it’s time to google the phrase ‘entropy operator’.

Hey thanks, Blake, that’s nice! Here are some thoughts before I break down and read the paper:

It’s a bit mysterious. Here I’m using the analogy between the conjugate pairs $(q, p)$ and $(t, -E)$ of mechanics and the conjugate pairs $(S, T)$ and $(V, -P)$ of thermodynamics, in which temperature is one of the *variables*. This makes it odd to use the analogy

$$\hbar \leftrightarrow k T$$

In other words, it’s odd to use $k T$ as the analogue of $\hbar$ in a formalism where $T$ is not a constant but a variable! It would be analogous to doing quantum mechanics in a setup where $\hbar$ was proportional to the momentum operator $p$.

(There *are* physicists who consider versions of quantum mechanics in which Planck’s constant becomes an operator, but I’ve never been remotely eager to join their ranks.)

On the other hand, you’ll note that I have a slightly *different* analogy going on in my quantropy post! Over there I was quite happily using the analogy between $1/k T$ and $1/i \hbar$.

So maybe this is what we need to do: study thermodynamics at *fixed* temperature, get a symplectic structure on the phase space of the remaining variables (for example volume and pressure, particle numbers and chemical potentials, etc.), and then ‘quantize’ it with $k T$ or maybe $i k T$ playing the role of Planck’s constant.

That’s a nice thought. By the way, the weird intrusion of $i$ in the analogy here reminds me of my formalism that unifies thermal fluctuations and quantum fluctuations as the real and imaginary parts of a complex-valued inner product on the tangent space of a manifold (here and here). However, in that setup, the thermal fluctuations are not described by a *skew-symmetric* tensor (like a symplectic structure), but rather a *symmetric* tensor.

So in short, there are too many thoughts buzzing around my brain… on lucky days most of them fly away and one sits there long enough for me to catch it.

Just to note before I have the chance to read this that symplectic geometry seems to be in the air in the math blogosphere at the moment.

David: your link is broken!

By the way, I found the title ‘why do symplectic manifolds need to be closed’ quite irksome, since there’s such a thing as a ‘closed manifold’ (meaning a compact manifold), and symplectic manifolds don’t need to be closed in that sense. I’d say ‘symplectic structures need to be closed’: a symplectic structure is a 2-form $\omega$, and it must obey $d\omega = 0$.

http://sbseminar.wordpress.com/2012/01/14/why-do-symplectic-manifolds-need-to-be-closed/

I really enjoyed this post. Thanks for writing it!

The tautological 1-form reminds me of a tautological vector-valued 1-form, i.e. the identity map $1 \colon TM \to TM$, viewed as a $TM$-valued 1-form.

Interesting. We have

for a 1-form and

for a vector field , but feeding a 0-form , we get

Given that , it makes me think there might be a commuting diagram hiding somewhere.

Since , what I meant is .

Glad you liked it. Yes, the tautological 1-form is a close relative of that vector-valued 1-form; not the same, but based on the same idea.

Interesting stuff! You say that the tautological 1-form “knows everything there is to know” about the mechanics, but it seems like such a trivial construction from the coordinates on T*Q…would it be fair to say that T*Q is what “knows” about the mechanics, i.e. the mechanics is encoded in the structure of the manifold? Or perhaps it’s the coordinates that “know” this, since alpha only has that nice formula in canonical coordinates on T*Q?

BTW, you seem to have two “Example 5″s. :)

From the definition of the cotangent bundle together with its map , all else follows. I just wanted to emphasize the role of the tautological 1-form.

Coordinates are not really required for mechanics; I only introduced them because people like them and in our examples they have exciting names like ‘position’, ‘momentum’, ‘time’, and ‘energy’.

Thanks for catching the misnumbered example. I also tried to fix a bunch of minus signs in the commutation relations… but I also added some more examples, so there are probably some new mistakes too.

I hope that you don’t mind off topic questions. I was mucking through the nits trying to understand this, and something bubbled up. Is a bilinear form the tool that we use to build up the concept of an ‘exact’ differential? [Which was a concept tossed at me in my undergraduate thermodynamics class, and contrasted with an inexact differential] [And so… are the Cauchy-Riemann equations constraints on a bilinear form?]

I’ve never thought of bilinear forms as a key tool for understanding the concept of exact differentials. To me what matters most is differential forms, which are something else, and extremely important throughout physics. There’s an operation $d$, the exterior derivative, that you can apply to any differential form $\omega$, and if $\omega = d\mu$ for some differential form $\mu$, we say the differential form $\omega$ is ‘exact’. An ‘exact differential’ is a slightly older name for an exact 1-form. (Differential forms come in various kinds: 0-forms, 1-forms, 2-forms, etcetera, and $d$ of an $n$-form is an $(n+1)$-form.)

All this is very good to study.

I don’t think of them that way.

Every time I read through the wikis on this stuff, it makes more sense. Thanks for the reply.

Is example 2 OK? I can’t make the units work for the terms of dS.

Sorry, that example was screwed up. Thanks for catching my mistake! The formula should be

$$dS = \frac{1}{T} \, dU + \frac{P}{T} \, dV$$

and the slight awkwardness of this formula is one reason people prefer the so-called ‘energy scheme’, where we talk about minimizing internal energy at fixed entropy and volume, rather than maximizing entropy at fixed energy and volume. Then we get

$$dU = T \, dS - P \, dV$$

and you’ll notice this formula, while nicer-looking, is equivalent to the previous one if we set $U = E$, which is in fact correct: the ‘internal energy’ $U$ is just what I’d been calling the ‘energy’ $E$ in Example 2.

To fix this, I rewrote Example 2 and added a new Example 3.


Hi John,

You might have noticed I’ve been writing a series of posts

Network Theory and Discrete Calculus

which largely follow what you are doing on this Azimuth blog, but formulated using discrete calculus. Your posts are providing interesting material faster than I can “discretize”.

Of course, this post would be very fun to try to discretize, but one focus here is on coordinates. As you said, this is mostly to allow you to use interesting words for various components and is not really necessary for the physics. For me, it is easiest to discretize something expressed in a coordinate free manner.

What would be a nice “coordinate free” example?

I also really like what you said about Hamilton’s principal function and extended phase space. That resonates very well with discrete calculus and what I call “diamonation.”

It would be fun to reformulate all these interconnections using discrete calculus.

Hi! Yes, I’ve been following your blog posts, but pretty distracted by other things.

It would be good if you could discretize the concept of “cotangent bundle” and the “tautological 1-form” on this bundle. Here’s a nice coordinate-free formula for the tautological 1-form.

For any manifold $M$ there’s a map

$$\pi \colon T^* M \to M$$

sending any cotangent vector at the point $x \in M$ to the point $x$. Differentiating this we get a map sending tangent vectors on $T^* M$ to tangent vectors on $M$:

$$d\pi \colon T(T^* M) \to T M$$

The tautological 1-form $\alpha$ on $T^* M$ eats any tangent vector $v$ at the point $(x, p) \in T^* M$ and gives a number as follows:

$$\alpha(v) = p(d\pi(v))$$

It’s mind-blowingly tautological, eh? In case your brain starts to melt, it may help to keep in mind that $p$ is a cotangent vector at $x$, so it can eat the tangent vector $d\pi(v)$ and spit out a number.

I only recently noticed the more appealing but equivalent formulation I gave in the blog article, but it was already in Wikipedia.

It’s a good exercise to see why the formula I just gave is equivalent to the tautological property of $\alpha$ that I gave in my blog article, and also to the explicit coordinate-ridden formula I gave for $\alpha$. For anyone who gets stuck, Wikipedia gives some hints.

Thanks for the challenge! However, I should manage expectations. I liked the extended phase space and the Hamilton-Jacobi approach precisely because it does not require defining (co)tangent spaces and I have a chance of formulating this discretely.

The notion of cotangent space is tricky because there is no such thing as a ‘covector at a *point*’. In discrete calculus, 1-forms are associated explicitly to 1-dimensional edges and not to 0-dimensional points.

I will try to do something sensible, but I may not have the maths chops.

Some of the pictures here might be helpful:

Discrete calculus on a 3-diamond and the discrete heat equation in n-dimension

A 2-diamond is a toy model for a (directed) configuration space . A 3-diamond can either be a toy model for a (directed) configuration space or possibly a (directed) phase space .

The projections are obvious.

The extended phase space is usually defined as the cotangent bundle of spacetime, or more generally, of the ‘extended configuration space’. See the Appendix in my blog article if you didn’t see it the first time: I added it later due to a question from Uwe.

You should figure out some good analogue of a cotangent bundle if you want to do classical mechanics discretely. It will require creativity, but getting something like a tautologous 1-form and symplectic structure will be a sign you’re on the right track. It may be okay if cotangent vectors are associated to edges rather than points: after all, in the continuum limit these edges are very short, so people with bad vision may have thought they were points.

Oops! I misspoke. I did not mean “extended phase space”, but “extended configuration space”.

I will take your advice and try to cook up a sensible discrete cotangent bundle, but I’d expect such a thing to allow a discrete calculus on it. Hmm…

Thanks!

What about Zariski cotangent space, which is constructed directly from the functions? (From this perspective cotangent space is the first thing, and tangent space comes later as the dual of cotangent space.)

I think I got it. It may take some time to write up. Thanks!

Possibly relevant:

http://arxiv.org/abs/0807.4632

“We show that when the thermal wavelength is comparable to the spatial size of a system, thermodynamic observables like Pressure and Volume have quantum fluctuations that cannot be ignored. They are now represented by operators; conventional (classical) thermodynamics is no longer applicable…”

http://arxiv.org/abs/math-ph/0703061

“The physical variables of classical thermodynamics occur in conjugate pairs such as pressure/volume, entropy/temperature, chemical potential/particle number. Nevertheless, and unlike in classical mechanics, there are an odd number of such thermodynamic co-ordinates. We review the formulation of thermodynamics and geometrical optics in terms of contact geometry. The Lagrange bracket provides a generalization of canonical commutation relations. Then we explore the quantization of this algebra by analogy to the quantization of mechanics…”

Wow, thanks!

(By the way, I can’t resist telling our readers that you are not the ‘bob’ that often posts here, namely Bob Coecke.)

These papers look *very* relevant, especially the second one:

• S. G. Rajeev, Quantization of contact manifolds and thermodynamics.

He starts by wondering what could play the role of Planck’s constant in the analogy between thermodynamics and classical mechanics—exactly what Blake and I have been talking about.

He emphasizes contact manifolds, which have already been mentioned in the comments to Part 1 of this series, by Matt Parry. So, Matt should read this! Personally, while I like contact geometry and think it *might* be more fundamental than symplectic geometry, I consider them part of the same story.

Here’s an ultra-quick introduction to the idea. In the classical mechanics of a point particle on the line, I’ve been talking about a 4-dimensional manifold with coordinates $q, p, t, E$ (position, momentum, time, energy). This manifold comes with a 1-form

$$\alpha = p \, dq - E \, dt$$

and then the 2-form

$$\omega = -d\alpha$$

is a ‘symplectic structure’, so our manifold becomes a ‘symplectic manifold’. But another approach is to treat Hamilton’s principal function $W$ as a fifth independent coordinate. On this new 5-dimensional manifold there is a 1-form

$$\beta = dW - p \, dq + E \, dt$$

which is a ‘contact structure’, so this new manifold becomes a ‘contact manifold’.

Similarly, in thermodynamics I’ve been working with the 4-dimensional manifold having (entropy, temperature, volume and pressure) as coordinates, but we can switch to a 5-dimensional manifold with the internal energy as an extra coordinate—and this is a contact manifold.

Since you mention “wondering what could play the role of Planck’s constant in the analogy between thermodynamics and classical mechanics”, I thought I’d point out yet another analogy:

Black-Scholes and Schrodinger

In the post, I highlighted a relationship between the Black-Scholes equation of mathematical finance and the Schrodinger equation.

It was mostly just for fun, but I also show the commutation relation

indicating that the standard deviation plays the role of Planck’s constant.

Interesting. But when my eye first hit the screen, I thought you’d said

Black-Holes and Schrodinger

[…] Here my goal is to ponder a question raised by Allen Knutson: how does the ‘entropic force’ idea fit into my ruminations on classical mechanics versus thermodynamics? […]