On a first reading (warning: I don’t fully understand that category thing and I don’t want to spend too much time studying it) this looks vaguely like a triangle inequality. I.e., relative Rényi entropy looks like a “deformed” (asymmetric) metric. And in fact, if p and q are not too far apart, then it seems you explained this already, if my interpretation from reading this Fisher information blog post diagonally is right.

I don’t know whether this also holds for this “modified” version, but by quickly setting e.g.

and it seems one would get

which looks, possibly apart from a minus sign, at first glance like it could be a metric, but I might be totally wrong. At least if it is, then it should be known.

and the corresponding “fudge factor” giving your modified “nonrelative” Rényi entropy (with the above change of parameters) would then be

On the other hand one could treat the two summands differently in order to deal with the “rescalability” of the energy:

Speaking of ambiguities, our choice of inverse temperature was arbitrary as long as it’s not zero. If we multiply the inverse temperature by some number and divide all the energies by that same number, the products, and thus the probabilities, don’t change.
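This invariance is easy to verify numerically. Below is a minimal sketch (my own illustration, not from the discussion above): the Gibbs probabilities depend only on the products of the inverse temperature with the energies, so multiplying the inverse temperature by a constant while dividing the energies by the same constant changes nothing.

```python
import numpy as np

def gibbs(energies, beta):
    """Gibbs distribution p_i = exp(-beta * E_i) / Z at inverse temperature beta."""
    w = np.exp(-beta * np.asarray(energies, dtype=float))
    return w / w.sum()

# Hypothetical energy levels, inverse temperature, and rescaling constant.
E = np.array([0.0, 1.0, 2.5, 4.0])
beta, c = 0.7, 3.0

p_original = gibbs(E, beta)
# Rescale: beta -> c * beta, E_i -> E_i / c.  The products beta * E_i are
# unchanged, so the probabilities are identical.
p_rescaled = gibbs(E / c, c * beta)

print(np.allclose(p_original, p_rescaled))  # True
```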

That is, with a “modified” relative Rényi entropy which reads as

one should get in the limit a “modified” relative Shannon entropy:

“modified” version of the relative Rényi entropy, which goes to the “usual” Rényi entropy for

this should read:

“modified” version of the relative Rényi entropy, which goes to the “usual” relative Rényi entropy for

That is, in particular, one gets the “nonrelative” Rényi entropy with a “fudge factor” if one uses the distribution , being the particle number. The fudge factor is then

Thanks for the comment. Unfortunately I haven’t followed the free energy discussion as closely as I should have in order to comment here. I didn’t even notice that you had written an article in this area. I have just now briefly looked at the article. “Briefly” because looking at this is now a “hobby” for me, to which I should rather not devote too much time given my current “pension plan”… this should serve as an excuse if my comments here don’t fully follow the discussion, e.g. if I missed some comments.

Anyway, it is a nice result that the Rényi entropy is the q-deformed derivative of the free energy with respect to .

And your seems to be what is my and your ; that is, if I use the above special distribution, then one would get your Rényi entropy (in your variables) by multiplying with the “fudge factor”

So if the above modified expression makes sense at all, then the relative Rényi entropy seems to be some kind of strange “q-integral” of the Rényi entropy at the “critical temperature”, that is, if

I find that constraint in here is a bit irritating.

That constraint was in the initial blog article, but we removed it in the following discussion. If you want to avoid reading the discussion, go straight to the arXiv paper I cited in the blog article:

• John Baez, Rényi entropy and free energy, 6 June 2011.

This is more polished than the initial blog article. Here’s the abstract:

The Rényi entropy is a generalization of the usual concept of entropy which depends on a parameter q. In fact, Rényi entropy is closely related to free energy. Suppose we start with a system in thermal equilibrium and then suddenly divide the temperature by q. Then the maximum amount of work the system can do as it moves to equilibrium at the new temperature, divided by the change in temperature, equals the system’s Rényi entropy in its original state. This result applies to both classical and quantum systems. Mathematically, we can express this result as follows: the Rényi entropy of a system in thermal equilibrium is minus the “1/q-derivative” of its free energy with respect to temperature. This shows that Rényi entropy is a q-deformation of the usual concept of entropy.
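The identity in the abstract can be checked numerically. Here is a sketch of my own (the energy levels, temperature, and function names are mine, chosen for illustration): for a Gibbs state at temperature T, the Rényi entropy should equal minus the difference quotient of the free energy F(T′) = −T′ ln Z(T′) between the temperatures T/q and T.

```python
import numpy as np

def Z(energies, T):
    """Partition function at temperature T (Boltzmann constant set to 1)."""
    return np.exp(-np.asarray(energies, dtype=float) / T).sum()

def free_energy(energies, T):
    """Free energy F(T) = -T ln Z(T)."""
    return -T * np.log(Z(energies, T))

# Hypothetical energy levels and parameters.
E = np.array([0.0, 1.0, 2.0, 3.5])
T, q = 1.0, 2.0

p = np.exp(-E / T) / Z(E, T)              # equilibrium (Gibbs) state at T
S_q = np.log((p ** q).sum()) / (1 - q)    # Renyi entropy of p

# Difference quotient of the free energy between T/q and T.
difference_quotient = (free_energy(E, T / q) - free_energy(E, T)) / (T / q - T)

print(np.isclose(S_q, -difference_quotient))  # True
```

This is an exact algebraic identity for Gibbs states, so the two sides agree up to floating-point error for any choice of energies, T > 0, and q ≠ 1.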

I second this puzzlement.

Sorry for not noticing Alan’s puzzlement earlier.

I probably just mistyped two characters while turning Dmitri’s equation into LaTeX: the matrix given above is not stochastic, and the matrix multiplication comes out wrong! He’s a smart guy, and I liked his solution, so he must have meant this:

Bruce wrote:

My own solution to Puzzle 1 was just to let T move every state to the first state.

Yes, mine too—that’s why I had replied to Dmitri as follows:

You could also replace those 0.1’s and 0.9’s by 0’s and 1’s. Then we’re taking a fair coin, flipping it, and turning it so it surely has heads up.

(My own solution to Puzzle 1 was just to let T move every state to the first state (so every column in T looks like 1 0 0 0). Note that this is also one of those T’s for which exactly one probability distribution is fixed, though it’s a degenerate one.)
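For concreteness, here is that degenerate T as an explicit column-stochastic matrix (my own illustration of the solution just described, with the columns of a stochastic matrix read as probability distributions):

```python
import numpy as np

# Every column of T is (1, 0, 0, 0): T moves every state to the first state.
n = 4
T = np.zeros((n, n))
T[0, :] = 1.0

# Column-stochastic: each column sums to 1.
assert np.allclose(T.sum(axis=0), 1.0)

p = np.full(n, 1.0 / n)   # start from the uniform distribution
print(T @ p)              # [1. 0. 0. 0.]

# The point distribution on the first state is the unique fixed distribution,
# though a degenerate one.
fixed = np.array([1.0, 0.0, 0.0, 0.0])
assert np.allclose(T @ fixed, fixed)
```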

I find that constraint in here is a bit irritating. But indeed it seems that if one supplies the relative Rényi entropy with an overall factor $T$ and defines and one should get for the relative Rényi entropy in those tilde variables:

and this should go for to the free energy term you gave in here (…if there aren’t, again, embarrassing errors)

but the problem here is of course that the don’t sum to 1; that is, it seems that if one would like to get this more general term of free energy, then this may eventually require some strange

“modified” version of the relative Rényi entropy, which goes to the “usual” Rényi entropy for

Is there a way to recover the relative Shannon entropy from a generalized version of a Rényi entropy?

Yes, relative Shannon entropy is the limit of relative Rényi entropy:

$S(p \| r) = \lim_{q \to 1} S_q(p \| r)$

where

$S(p \| r) = -\sum_i p_i \ln \frac{p_i}{r_i}$

is the **Shannon entropy of** $p$ **relative to** $r$, and

$S_q(p \| r) = \frac{1}{1 - q} \, \ln \sum_i p_i^q \, r_i^{1-q}$

is the **Rényi entropy of** $p$ **relative to** $r$. The negative of relative Rényi entropy is called the **Rényi divergence**, and you can read more about it here:

• Tim van Erven and Peter Harremoës, Rényi divergence and majorization.
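As a quick numerical illustration of this limit (my own sketch, using the standard formulas: the relative Rényi entropy of p relative to r is ln(∑ᵢ pᵢ^q rᵢ^{1−q})/(1−q), i.e. the negative of the Rényi divergence; the example distributions are mine):

```python
import numpy as np

def relative_renyi(p, r, q):
    """Renyi entropy of p relative to r (negative of the Renyi divergence)."""
    return np.log((p ** q * r ** (1 - q)).sum()) / (1 - q)

def relative_shannon(p, r):
    """Shannon entropy of p relative to r (negative of the KL divergence)."""
    return -(p * np.log(p / r)).sum()

p = np.array([0.5, 0.3, 0.2])
r = np.array([0.2, 0.3, 0.5])

# As q -> 1, the relative Renyi entropy approaches the relative Shannon entropy.
for q in (0.9, 0.99, 0.999):
    print(q, relative_renyi(p, r, q))
print(relative_shannon(p, r))
```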

I worked out the relation between Rényi entropy and free energy here:

• Rényi entropy and free energy, *Azimuth*, 10 February 2011.

In a nutshell, Rényi entropy is proportional to the change in free energy as we suddenly reduce the temperature of a system.
