Thanks, Blake! Wonmin Son of the Entropy Club also pointed this out.

Whenever a physics paper contains a figure showing a demon wearing a joker’s cap, you know you’re in trouble.

Thanks for the reference. Mike Stay also pointed it out to me, when it hit the arXiv:

• Joan A. Vaccaro, Stephen M. Barnett, Information erasure without an energy cost.

I also got an email from John Norton, pointing out another attack on the orthodoxy:

• John Norton, No go result for the thermodynamics of computation.

I hope that the Entropy Club, here at the CQT, figures out what’s going on with all these arguments.

Vaccaro and Barnett just published “Information erasure without an energy cost” in the Proceedings of the Royal Society A:

http://rspa.royalsocietypublishing.org/content/early/2011/01/07/rspa.2010.0577.full.pdf+html

Presumably the physical problem is the assumed decoupling between the spin and the heat bath, while somehow still maintaining the ability to “return to equilibrium”.

Once you know that photons can interact strongly with the fiber, you may not be surprised that the effect has been exploited for years. The most widely used application is distributed temperature sensing:

http://www.apsensing.com/applications/industrial/

although distributed vibration sensing has the longest history.

Where does light go when it gets absorbed? At a fundamental level, there’s a reaction where a charged particle like an electron can absorb a photon. People write it like this:

e⁻ + γ → e⁻

or draw a Feynman diagram where two particles come in and just one comes out. And this sort of reaction is taking place whenever ordinary everyday matter absorbs light.

I was surprised at how few photons make it very far down an optical fiber without being absorbed. I’m guessing that in ordinary applications of optical fibers one can use tricks to keep ‘reboosting’ the light, but these don’t work well for quantum cryptography, because they don’t preserve the quantum states of individual photons.
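To get a rough sense of the numbers, here is a back-of-envelope sketch. The 0.2 dB/km figure is a typical attenuation for telecom fiber at 1550 nm, not a number from the discussion above, and the function name is mine:

```python
def survival_probability(length_km, attenuation_db_per_km=0.2):
    """Fraction of photons surviving a fiber of the given length,
    assuming exponential (Beer-Lambert) attenuation."""
    return 10 ** (-attenuation_db_per_km * length_km / 10)

# Even with excellent fiber, single photons rarely survive long hauls:
for L in (10, 50, 100, 200):
    print(f"{L:4d} km: {survival_probability(L):.4f}")
```

This prints 0.6310, 0.1000, 0.0100 and 0.0001 for the four lengths: beyond a couple of hundred kilometers almost no photon gets through, which is why classical links use amplifiers while quantum cryptography, which can’t clone the photon’s state, cannot.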

Maybe the demon categorizes the information to create rules and after a while evolves a complete set of rules and perhaps even a rule for the outliers, too, cf. http://www.gocomics.com/calvinandhobbes/2009/07/05/ which doesn’t necessitate the “erasing” of information. You can have all the observations you want, but unless you somehow systematize them, you don’t have any useful information.

Probably an easily-answered question, but where do the photons that don’t make it go? Are they somehow annihilated?

Quantum information processing focuses on finite-dimensional Hilbert spaces. So, maybe Poulin is doing what people in this subject often do: considering a lattice with a finite-dimensional Hilbert space at each site, and a Hamiltonian that’s a sum of ‘n-particle Hamiltonians’, i.e., operators on Hilbert spaces of the form H ⊗ ⋯ ⊗ H, a tensor product of finitely many copies of the single-site space H.

For example, if we have a square lattice and each site only interacts with itself and its nearest neighbors, we’ll have just 1-particle and 2-particle terms in the sum. A famous example like this is the quantum Heisenberg model.
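To make “a sum of few-body terms” concrete, here is a small sketch (nothing here is from Poulin’s talk; the function names are mine) that builds the nearest-neighbor spin-1/2 Heisenberg Hamiltonian H = J Σᵢ Sᵢ·Sᵢ₊₁ on a short open chain, so only 2-particle terms appear in the sum:

```python
import numpy as np

# Spin-1/2 operators S = sigma/2 (units with hbar = 1)
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2

def site_op(op, site, n_sites):
    """Embed a single-site operator at position `site` of an n-site chain,
    tensoring with the identity everywhere else."""
    mats = [np.eye(2, dtype=complex)] * n_sites
    mats[site] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def heisenberg_chain(n_sites, J=1.0):
    """H = J * sum_i S_i . S_{i+1}: only nearest-neighbor 2-particle terms."""
    dim = 2 ** n_sites
    H = np.zeros((dim, dim), dtype=complex)
    for i in range(n_sites - 1):
        for s in (sx, sy, sz):
            H += J * site_op(s, i, n_sites) @ site_op(s, i + 1, n_sites)
    return H

H = heisenberg_chain(4)
print(H.shape)                        # (16, 16): dimension grows as 2^n
print(np.allclose(H, H.conj().T))     # True: the Hamiltonian is Hermitian
```

Note the point made in the comment below: the number of particles per interaction term is bounded (here, two), but nothing bounds the norm of H as the lattice grows.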

For a fancier example involving a hexagonal lattice, see page 4 of the slides from a talk about the AKLT state.

But anyway: yes, it would be easier to understand the details if there were a paper to read.

Shoot, I don’t understand the slides of David Poulin’s talk, and there is no reference to any published paper, but I’m pretty sure the talk is not about Hamiltonians that are bounded from above: Poulin talks about n-particle Hamiltonians containing interaction terms with at most k interacting particles, so the number of interacting particles is bounded, but not the norm of the Hamiltonian.
