It looks like such an encrypted energy storage device should be possible, but it will probably be an engineering nightmare: first one has to build a reversible computer so efficient that it can decrypt a pseudorandom sequence using much less energy than one obtains from the energy storage device in the first place.

]]>It’s about the link between computation and entropy. I take the idea of a Turing machine for granted, but starting with that I explain recursive functions, the Church-Turing thesis, Kolmogorov complexity, the relation between Kolmogorov complexity and Shannon entropy, the uncomputability of Kolmogorov complexity, the ‘complexity barrier’, Levin’s computable version of complexity, and finally my work with Mike Stay on algorithmic thermodynamics.
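The relation between Kolmogorov complexity and its uncomputability can be made concrete in a few lines: K(x) cannot be computed, but any off-the-shelf compressor gives an upper bound, since the compressed string plus a fixed-size decompressor is a program that prints x. A minimal Python sketch (the function name is mine, just for illustration):

```python
import os
import zlib

def k_upper_bound(s: bytes) -> int:
    """Upper bound (in bytes) on the Kolmogorov complexity of s.

    K(s) itself is uncomputable, but any compressor gives an upper
    bound: the compressed string plus a constant-size decompressor
    is a program that outputs s.
    """
    return len(zlib.compress(s, 9))

regular = b"ab" * 5000        # highly regular: compresses to almost nothing
noisy = os.urandom(10000)     # incompressible with overwhelming probability

print(k_upper_bound(regular))   # tiny compared to 10000
print(k_upper_bound(noisy))     # roughly 10000
```

No compressor can certify a *lower* bound this way, which is one face of the uncomputability result.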

For more details, read our paper:

• John Baez and Mike Stay, Algorithmic thermodynamics, *Math. Struct. Comp. Sci.* **22** (2012), 771-787.

or these blog articles:

• Algorithmic thermodynamics (part 1).

• Algorithmic thermodynamics (part 2).

They all emphasize slightly different aspects!

]]>after thinking about it some more, ignore the comment above… sorry.

]]>assuming that there are two bytes per step, one instruction byte and one data byte, each four bits long. If you have an FSA, though (recalling from 33 years ago), you might have problems: might 0010 0101 0011 1100 1110 0001 1111 halt prematurely?

]]>@Tim: yes, that’s what I was thinking about, pretty much. I’m thinking of a program in the sense of byte-length instructions, the way I used to do things long ago in my assembly language days.
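To make that concrete, here is a toy sketch with a made-up instruction set (nibble-sized opcodes, with 1111 as HALT; nothing here is a real machine): once a HALT opcode is read, the rest of the bit string is never consumed, which is one way a string like the one above could halt prematurely.

```python
def run(bits: str) -> int:
    """Toy interpreter: read 4-bit opcodes left to right.

    Hypothetical instruction set, for illustration only:
    opcode 0b1111 means HALT; every other opcode is a no-op.
    Returns the number of opcodes actually consumed.
    """
    steps = 0
    for i in range(0, len(bits) - len(bits) % 4, 4):
        op = int(bits[i:i + 4], 2)
        steps += 1
        if op == 0b1111:   # HALT: the remaining bits are dead code
            break
    return steps

# The 28-bit string from the comment: HALT is the final nibble,
# so all 7 opcodes are consumed.
print(run("0010010100111100111000011111"))  # → 7

# Move HALT earlier and the program stops after 2 of 3 nibbles.
print(run("001011110011"))                  # → 2
```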

]]>Well, if it *was* a food fight, I think I was the only one who ended up with egg on my face. :)

I think we had read the same materials and started with different base cases in mind. I’ll remind myself and others in discussions of extensivity to explicitly specify whether (non-)independent subsystems are under consideration or if they are excluded. That changes the context considerably.

]]>John E. Gray wrote:

John B, I did not mean to start a food fight with Justin after seeing his comments.

Don’t worry: I wouldn’t characterize anything that happened here as a ‘food fight’, at worst a heated discussion of the dessert menu.

In particular, this was motivated by the editorial by John Maddox in Nature in 1993 “Is entropy extensive?” in reference to the Bekenstein-Hawking entropy of black holes that started a debate reexamining these issues because the entropy of a black hole is proportional to the area rather than the volume.

Interesting!

As I mentioned, ordinary Shannon entropy fails to add for correlated systems, and the correction terms are often roughly proportional to the area of the boundary between systems. This has made people wonder if black hole entropy arises from this mechanism, and much ink has been spilt on this question.
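That failure of additivity is easy to see numerically. A minimal sketch, using empirical Shannon entropy and two perfectly correlated bits; the 1-bit deficit H(A) + H(B) − H(A,B) is exactly the mutual information I(A;B):

```python
from collections import Counter
from math import log2

def H(samples):
    """Shannon entropy (in bits) of the empirical distribution."""
    counts = Counter(samples)
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in counts.values())

# Perfectly correlated bits: B always equals A.
pairs = [(0, 0), (1, 1)] * 500
a = [p[0] for p in pairs]
b = [p[1] for p in pairs]

# Each marginal carries 1 bit, but jointly they carry only 1 bit,
# not 2: the deficit is the mutual information I(A;B) = 1 bit.
print(H(a) + H(b))   # 2.0
print(H(pairs))      # 1.0
```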

I’ve never read anything trying to relate black hole entropy to Tsallis entropy. But that’s probably because I’m a bit scared of Tsallis entropy. Now that I’m looking, I’m seeing a bit about this combination of ideas…
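For independent subsystems Tsallis entropy obeys the pseudo-additivity rule S_q(A,B) = S_q(A) + S_q(B) + (1−q) S_q(A) S_q(B) rather than plain addition. A minimal numerical check (Boltzmann constant set to 1; the distributions are chosen arbitrarily for illustration):

```python
def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum(p_i**q)) / (q - 1), with k = 1."""
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

q = 2.0
pA = [0.5, 0.5]
pB = [0.25, 0.75]
# Joint distribution of two *independent* systems:
pAB = [a * b for a in pA for b in pB]

lhs = tsallis(pAB, q)
rhs = (tsallis(pA, q) + tsallis(pB, q)
       + (1 - q) * tsallis(pA, q) * tsallis(pB, q))
print(abs(lhs - rhs) < 1e-12)  # True: pseudo-additivity holds exactly
```

As q → 1 the extra term vanishes and S_q reduces to the ordinary additive Gibbs-Shannon entropy.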

Thanks for mentioning the announcement regarding the Mathematics and Climate Research Network! I’ll post a blog article about that.

]]>[4] S. R. Addison … “Is extensivity a fundamental property of entropy?”, *J. Phys. A: Math. Gen.* **34** (2001), 7733-7737.

In particular, this was motivated by the editorial by John Maddox in Nature in 1993, “Is entropy extensive?”, in reference to the Bekenstein-Hawking entropy of black holes, which started a debate reexamining these issues because the entropy of a black hole is proportional to the area rather than the volume. Tsallis maintains a database of hundreds of papers on non-extensive thermodynamics and entropy, which he regularly updates. The history of entropy from its inception is discussed by Jaynes in:

[5] E. T. Jaynes, in *Maximum Entropy and Bayesian Methods*, edited by C. R. Smith, G. J. Erickson, and P. O. Neudorfer (Kluwer Academic, 1992).

where he notes that the original definition of entropy due to Clausius, the integral of dQ/T, is path dependent, so it has nothing to do with extensivity *per se*; it depends on the particular physical system. For the Gibbs-Boltzmann entropy this is still the case for micro-systems as well, though most physical systems appear to be extensive under the logarithmic measure. Hill, well before many others, in this paper

[6] T. L. Hill, *J. Chem. Phys.* **36** (1962), 3182.

reviewed different types of systems, for some of which the logarithmic measure is extensive and for some of which it is not. This is a point of discussion in nano-systems… Hill’s discussion of this is available online in *Nano-systems Letters*. (He is an example to us all: in his nineties and still active.) It is this that led Addison to suggest that generalizing the notion of extensivity might be useful. I will quit rambling and not post any more.

One final note, not related, for you readers: SIAM has posted the following:

The Mathematics and Climate Research Network is a nation-wide NSF funded initiative. Our goal is to define, develop and solve mathematical problems that arise in climate science. A number of postdoctoral positions will be available starting in Summer or Fall, 2011. The successful applicants will be known as Ed Lorenz Postdoctoral Fellows in the Mathematics of Climate and will have an affiliation with one of the network nodes. The topics of research will range from sea-ice processes to paleoclimate studies and data assimilation in climate models. The network has twelve funded nodes and a number of other collaborating institutions. For more details, see http://www.mathclimate.org.

to the special functions SIAG, which may be of interest to any children with new PhDs. As always, it is a pleasure to learn from your posts.
