Entropy in the Universe

If you click on this picture, you’ll see a zoomable image of the Milky Way with 84 million stars:



But stars contribute only a tiny fraction of the total entropy in the observable Universe. If it’s random information you want, look elsewhere!

First: what’s the ‘observable Universe’, exactly?

The further you look out into the Universe, the further you look back in time. You can’t see through the hot gas from 380,000 years after the Big Bang. That ‘wall of fire’ marks the limits of the observable Universe.

But as the Universe expands, the distant ancient stars and gas we see have moved even farther away, so they’re no longer observable. Thus, the so-called ‘observable Universe’ is really the ‘formerly observable Universe’. Its edge is 46.5 billion light years away now!

This is true even though the Universe is only 13.8 billion years old. A standard challenge in understanding general relativity is to figure out how this is possible, given that nothing can move faster than light.
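
For readers who want to check that 46.5-billion-light-year figure, here is a minimal numerical sketch (my own, not part of any cited paper): it integrates the standard flat ΛCDM expression for the comoving distance light has had time to travel since the Big Bang, using round, assumed values of the Hubble constant and density parameters.

```python
import numpy as np
from scipy.integrate import quad

# Assumed round cosmological parameters (roughly Planck-like, for illustration only)
H0 = 67.7 * 1000 / 3.086e22        # Hubble constant, converted from km/s/Mpc to 1/s
c = 2.998e8                        # speed of light, m/s
Om, Orad, OL = 0.31, 9e-5, 0.69    # matter, radiation and dark-energy fractions

# Comoving particle horizon: chi = c * integral_0^1 da / (a^2 H(a)),
# written so the integrand stays finite as the scale factor a -> 0.
integrand = lambda a: 1.0 / np.sqrt(Orad + Om * a + OL * a**4)
I, _ = quad(integrand, 0.0, 1.0)

light_year = 9.461e15              # metres per light year
chi = c * I / H0                   # comoving distance in metres
print(chi / light_year / 1e9)      # roughly 46 billion light years
```

The point is that the integral runs over the whole expansion history, so light emitted long ago has been carried along by the stretching of space, and the present distance to the farthest matter we can see comes out much larger than 13.8 billion light years.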

What’s the total number of stars in the observable Universe? Estimates go up as telescopes improve. Right now people think there are between 100 and 400 billion stars in the Milky Way. They think there are between 170 billion and 2 trillion galaxies in the Universe.

In 2009, Chas Egan and Charles Lineweaver estimated the total entropy of all the stars in the observable Universe at 10^81 bits. You should think of these as qubits: it’s the amount of information needed to describe the quantum state of everything in all these stars.

But the entropy of interstellar and intergalactic gas and dust is about ten times the entropy of stars! It’s about 10^82 bits.

The entropy in all the photons in the Universe is even more! The Universe is full of radiation left over from the Big Bang. The photons in the observable Universe left over from the Big Bang have a total entropy of about 10^90 bits. It’s called the ‘cosmic microwave background radiation’.
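
To see roughly where that 10^90 comes from, here is a back-of-the-envelope check (my own sketch, not Egan and Lineweaver’s calculation): the entropy of blackbody photons at 2.725 kelvin filling a ball whose radius is the 46.5 billion light years mentioned above, using the textbook photon-gas entropy density s = (4π²/45) k (kT/ħc)³.

```python
import numpy as np

k, hbar, c = 1.381e-23, 1.055e-34, 2.998e8   # SI values (rounded)
T = 2.725                                    # CMB temperature, kelvin
R = 46.5e9 * 9.461e15                        # radius of the observable Universe, metres

s = (4 * np.pi**2 / 45) * k * (k * T / (hbar * c))**3   # entropy density, J/(K m^3)
S = s * (4/3) * np.pi * R**3                            # total entropy, J/K
print(S / (k * np.log(2)))                              # close to 10^90 bits
```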

The neutrinos from the Big Bang also carry about 10^90 bits—a bit less than the photons. The gravitons carry much less, about 10^88 bits. That’s because they decoupled from other matter and radiation very early, and have been cooling ever since. On the other hand, photons in the cosmic microwave background radiation were formed by annihilating electron-positron pairs until about 10 seconds after the Big Bang. Thus the graviton radiation is expected to be cooler than the microwave background radiation: about 0.6 kelvin as compared to 2.7 kelvin.

Black holes have immensely more entropy than anything listed so far. Egan and Lineweaver estimate the entropy of stellar-mass black holes in the observable Universe at 10^98 bits. This is connected to why black holes are so stable: the Second Law says entropy likes to increase.

But the entropy of black holes grows quadratically with mass! So black holes tend to merge and form bigger black holes — ultimately forming the ‘supermassive’ black holes at the centers of most galaxies. These dominate the entropy of the observable Universe: about 10^104 bits.
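
To see the quadratic growth explicitly, here is a small sketch using the Bekenstein–Hawking formula S = 4πGkM²/(ħc); it is only meant to illustrate the scaling, not to reproduce Egan and Lineweaver’s census.

```python
import numpy as np

G, k, hbar, c = 6.674e-11, 1.381e-23, 1.055e-34, 2.998e8   # SI values (rounded)
M_sun = 1.989e30                                           # solar mass, kg

def bh_entropy_bits(M):
    """Bekenstein-Hawking entropy of a black hole of mass M (kg), in bits."""
    S = 4 * np.pi * G * k * M**2 / (hbar * c)              # entropy in J/K
    return S / (k * np.log(2))

print(bh_entropy_bits(M_sun))                              # roughly 10^77 bits for one solar mass
print(bh_entropy_bits(10 * M_sun) / bh_entropy_bits(M_sun))  # 100: entropy goes as M^2
```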

Hawking predicted that black holes slowly radiate away their mass when they’re in a cold enough environment. But the Universe is much too hot for supermassive black holes to be losing mass now. Instead, they very slowly grow by eating the cosmic microwave background, even when they’re not eating stars, gas and dust.

So, only in the far future will the Universe cool down enough for large black holes to start slowly decaying via Hawking radiation. Entropy will continue to increase… going mainly into photons and gravitons! This process will take a very long time. Assuming nothing is falling into it and no unknown effects intervene, a solar-mass black hole takes about 10^67 years to evaporate due to Hawking radiation — while a really big one, comparable to the mass of a galaxy, should take about 10^99 years.
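
The 10^67-year figure can be reproduced from the standard order-of-magnitude evaporation-time formula t ≈ 5120π G²M³/(ħc⁴); the sketch below assumes emission into photon-like modes only and nothing falling in, so it is a rough estimate rather than a precise prediction.

```python
import numpy as np

G, hbar, c = 6.674e-11, 1.055e-34, 2.998e8   # SI values (rounded)
M_sun = 1.989e30                             # solar mass, kg
year = 3.156e7                               # seconds per year

def evaporation_time_years(M):
    """Rough Hawking evaporation time for a black hole of mass M (kg), in years."""
    return 5120 * np.pi * G**2 * M**3 / (hbar * c**4) / year

print(evaporation_time_years(M_sun))         # about 10^67 years; the time scales as M^3
```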

If our current most popular ideas on dark energy are correct, the Universe will continue to expand exponentially. Thanks to this, there will be a cosmological event horizon surrounding each observer, which will radiate Hawking radiation at a temperature of roughly 10^-30 kelvin.
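
Here is a rough sketch of where that temperature comes from: the Gibbons–Hawking temperature of a de Sitter horizon, T = ħH/(2πk), evaluated at an assumed late-time Hubble rate H∞ = H0√Ω_Λ (the values of H0 and Ω_Λ below are round numbers, not taken from any particular paper).

```python
import numpy as np

k, hbar = 1.381e-23, 1.055e-34          # SI values (rounded)
H0 = 67.7 * 1000 / 3.086e22             # Hubble constant, 1/s (assumed 67.7 km/s/Mpc)
H_inf = H0 * np.sqrt(0.69)              # late-time Hubble rate if dark energy is constant

T = hbar * H_inf / (2 * np.pi * k)      # Gibbons-Hawking temperature of the horizon
print(T)                                # a few times 10^-30 kelvin
```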

In this scenario the Universe in the very far future will mainly consist of massless particles produced as Hawking radiation at this temperature: photons and gravitons. The entropy within the exponentially expanding ball of space that is today our ‘observable Universe’ will continue to increase exponentially… but more to the point, the entropy density will approach that of a gas of photons and gravitons in thermal equilibrium at 10^-30 kelvin.

Of course, it’s quite likely that some new physics will turn up, between now and then, that changes the story! I hope so: this would be a rather dull ending to the Universe.

For more details, go here:

• Chas A. Egan and Charles H. Lineweaver, A larger estimate of the entropy of the universe, The Astrophysical Journal 710 (2010), 1825.

Also read my page on information.

19 Responses to Entropy in the Universe

  1. Toby Bartels says:

    You should think of these as qubits

    Surely they're using ‘bit’ here as a unit of measurement (equal to \ln 2 \approx 0.6931 in natural units, about 9.57 \times 10^{-24} joules per kelvin). They're not claiming that the stars actually consist of so many independent classical two-state systems, and you surely don't mean that they consist of so many independent quantum two-state systems either. (And a qubit has the same amount of entropy as a classical bit, \ln 2, so it's not a different unit of measurement.)

    • Toby Bartels says:

      (In fact, a bit is exactly 1\,380\,649 \times 10^{-29} \times \ln 2 joules per kelvin since May 20.)

    • John Baez says:

      Surely they’re using ‘bit’ here as a unit of measurement […]

      It’s me who is using bits as a unit of measurement; they used k, meaning Boltzmann’s constant, and I converted the answers to bits for popularization purposes. But in my series of tweets on this issue, I found people worrying about information versus quantum information, so I thought this remark about qubits would help put some people at ease. As always, when you make someone happy you make someone else unhappy (though you are probably happy to find a nuance like this to talk about).
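
      To make that conversion concrete, here is a minimal sketch (my own illustration; the 10^81 input is just the figure quoted in the post): an entropy expressed in units of Boltzmann's constant k, divided by ln 2, gives the same entropy in bits.

      ```python
      import numpy as np

      k = 1.380649e-23                 # J/K, exact since the 2019 SI redefinition

      def k_units_to_bits(S_over_k):
          """Convert an entropy given in units of k into bits."""
          return S_over_k / np.log(2)

      print(k_units_to_bits(1e81))     # about 1.4 x 10^81 bits
      print(k * np.log(2))             # one bit is about 9.57e-24 J/K, as Toby says
      ```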

  2. Ishi Crew says:

    I’m sort of in a physics discussion group (not high level–anyone can go, though it has 2 professional physicists in it: one cosmologist who is a professor and runs an observatory, and one more into laser astronomy and health physics), which holds its meetings at an ‘ethical society church’ associated with Unitarians.

    The next topic is entropy, so I shared this post with this group. The main question they seem to have raised is whether the accepted view that the universe was initially in a low-entropy state is still accepted, and also, if it’s true, then why? (Feynman seemed to accept this view–see the quote in https://arxiv.org/abs/1903.11870.)
    Most of the people in this group likely would not be familiar with things like relative or nonequilibrium entropy.

    I wonder if you have any views on the entropy of the initial state of the universe (I would guess most would say it was 0).

    The people in this group want to know if that is so, and also why.

    (As an aside: since I gather you studied under Irving Segal, who had his own views on the big bang (‘borel universe’), and since time, entropy, and cosmology are all connected ideas, I wonder what your views are on this. In some other groups the ideas of J. Barbour seem to be widely discussed.)

    Also as an aside, I came across a reference to a paper by Baez and Gilliam from 1994 in https://arxiv.org/abs/nlin/0611058 (ref 1) on ‘action principles’. (It’s also on your website, I later realized.) I sort of collect papers on ‘action principles’—including ones for biological and economic/social systems.

    There is an old paper by C. Lumsden and L. E. H. Trainor on ‘Hamiltonian Biodynamics’ from around 1980 which attempted to reformulate mathematical biology in terms of a Hamiltonian action principle. Lumsden also wrote a book, ‘Genes, Mind and Culture’, with the famous biologist/ecologist E. O. Wilson around the same time.

    There is this concept called the ‘Hamiltonian dream’, which goes back to ‘Hilbert’s program’ for the axiomatization of the sciences–Hilbert developed the Einstein–Hilbert action, and others later tried to take the idea into other sciences: find a Hamiltonian for everything. I just wonder what the state of the program or idea is. There are plenty of papers on this. (Morse and Feshbach, in their 1953 book Methods of Theoretical Physics, wrote a Lagrangian for the diffusion or heat equation–in the basic physics I took, this idea was not mentioned; the view was that Lagrangians and Hamiltonians only applied to conservative systems. From reading papers I found similar formalism applied to open systems.)

    
    • John Baez says:

      Ishi wrote:

      The main question they seem to have raised is whether the accepted view that the universe was initially in a low-entropy state is still accepted, and also, if it’s true, then why?

      Yes, it’s still accepted. Nobody knows why; it’s just an observed fact.

      Note that shortly after the Big Bang the radiation and matter in the Universe was thermalized and thus in a fairly high-entropy state, but spacetime was rather smooth so the gravitational field was in a low-entropy state… and the low entropy of the gravitational field is so significant that overall the Universe was in a much lower entropy state than it is now.

      You can puzzle your group by asking whether entropy increases as clouds of gas and dust in the early Universe collapse and form stars and planets. If it doesn’t increase, this process violates the Second Law. So it must increase. But why is it increasing? It seems that order is increasing… so doesn’t that mean entropy is decreasing?

      If someone can’t answer this puzzle correctly, they don’t understand how thermodynamics works in our Universe.

      • Ishi Crew says:

        Thanks for your reply, John Baez. I will try to share this post and your reply with my group. (I also shared the post on Einstein and Schrödinger, since the previous discussion was on quantum theory–Adam Becker’s book ‘What Is Real’, about Bohmian mechanics–and with another group I’m in I shared ones on climate change issues. I view this blog as somewhat like an encyclopedia.)

        Much of what I know about entropy and nonequilibrium thermodynamics derives from books and papers by Ilya Prigogine, so I think more in terms like ‘dissipative structures in chemical reactions and ecosystems’. Gravity is a bit beyond my reach, though it’s the same principle.

        (I had to look up ‘étale’ and ‘sheaf’ in your course on topos theory–this is beyond me, but I like the formalism and ideas.)

        I think https://en.wikipedia.org/wiki/Crooks_fluctuation_theorem sort of gives the same answer you suggested.

  3. Toby Bartels says:

    A comment on your page on information linked at the bottom: If you're not going to decide whether a zettabyte is 10^{21} bytes or 2^{70}, then you can't use such precision as 33 or 42 zettabytes. In particular, 33 \times 2^{70} and 42 \times 10^{21} are closer to each other than either is to 33 \times 10^{21} or 42 \times 2^{70}. (The difference between the two kinds of zettabyte is much more significant than the difference between the two kinds of kilobyte.)
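
    A quick arithmetic check of that closeness claim, taking 10^21 and 2^70 as the two candidate meanings of ‘zettabyte’:

    ```python
    decimal = 10**21                  # metric zettabyte, in bytes
    binary = 2**70                    # binary 'zettabyte' (really a zebibyte), in bytes

    a, b = 33 * binary, 42 * decimal
    print(abs(a - b))                 # about 3.0e21 bytes apart...
    print(abs(a - 33 * decimal))      # ...versus about 6.0e21 from 33 * 10^21
    print(abs(b - 42 * binary))       # ...and about 7.6e21 from 42 * 2^70
    ```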

    • John Baez says:

      In my page I’m always using the standard metric-system meaning of prefixes like giga-, tera-, peta-, exa-, zetta-, yotta-: they refer to powers of ten. I should make this clear.

      Quoting my page:

      Not everyone agrees on the definition of ‘kilobyte’, ‘megabyte’, ‘gigabyte’, and so on! Originally people used ‘kilobyte’ to mean not 1000 but 10^24 bytes, since 10^24 = 2^10, and powers of 2 are nicer when you’re using binary. The difference is just 2.4%. But by the time you get to ‘zettabytes’ it makes a bigger difference: 2^80 is about 20.8% more than 10^24.

      There have, in fact, been lawsuits over hard drives that contained only 10^9 bytes per gigabyte, instead of 2^30. And at some point, defenders of the metric system tried to crack down on the evil practice of using prefixes like ‘kilo-’, ‘mega-’, ‘giga-’ to mean something other than powers of ten. 2^10 bytes is now officially a kibibyte, 2^20 bytes is a mebibyte, and so on. But I haven’t heard people actually say these words.

      • Pranav says:

        Not everyone agrees on the definition of ‘kilobyte’, ‘megabyte’, ‘gigabyte’, and so on! Originally people used ‘kilobyte’ to mean not 1000 but 10^24 bytes, since 10^24 = 2^10, and powers of 2 are nicer when you’re using binary.

        I’ve noticed a typo in your comment.
        1 kibibyte is 1024 bytes, not 10^24 bytes. And 10^24 bytes is not 2^10 bytes.

        2^10 bytes is now officially a kibibyte, 2^20 bytes is a mebibyte, and so on. But I haven’t heard people actually say these words.

        Many FOSS utilities use the terms KiB, MiB and GiB to refer to kibibytes, mebibytes and gibibytes. Some use K, M and G without the ‘B’ to refer to kibibytes and the others. I’ve only seen developers using these terms. This might be because in schools and many other places a kilobyte is used to mean 1024 bytes instead of 1000 bytes.

        • John Baez says:

          Pranav:

          1 kibibyte is 1024 bytes, not 10^24 bytes

          Ha! The first time I tried to post my comment WordPress destroyed all the scientific notation, changing things like 2^10 to 210. The second time I tried to post it, WordPress did it again. The third time I figured out a way around the problem—but I was tired by then, and I changed 1024 to 10^24.

  4. Bruce Smith says:

    That big image has a bright dot which is about 1/3 of the way to the right, and 1/5 of the way up. If I zoom way in there, I see at least three kinds of artifacts: radiating lines (some colored), overlapping disks made of “ripples”, and some mostly-transparent rectangles below the main bright spot (whose top edges are their most definite features). Do you happen to know the explanation for all those? (BTW, by “artifacts” I mean in the data collection or image processing, rather than extraterrestrial engineering!)

    • John Baez says:

      It’s definitely fun to look at that bright dot close up!

      I don’t know the reason for those artifacts. I can just quote you a bit from the ESO website of how this image was made, and we can try to imagine what’s going on:

      “Observations of the bulge of the Milky Way are very hard because it is obscured by dust,” says Dante Minniti (Pontificia Universidad Catolica de Chile, Chile), co-author of the study. “To peer into the heart of the galaxy, we need to observe in infrared light, which is less affected by the dust.”

      The large mirror, wide field of view and very sensitive infrared detectors of ESO’s 4.1-metre Visible and Infrared Survey Telescope for Astronomy (VISTA) make it by far the best tool for this job. The team of astronomers is using data from the VISTA Variables in the Via Lactea programme (VVV) [1], one of six public surveys carried out with VISTA. The data have been used to create a monumental 108 200 by 81 500 pixel colour image containing nearly nine billion pixels. This is one of the biggest astronomical images ever produced. The team has now used these data to compile the largest catalogue of the central concentration of stars in the Milky Way ever created [2].

      and:

      [1] The VISTA Variables in the Via Lactea (VVV) survey is an ESO public survey dedicated to scanning the southern plane and bulge of the Milky Way through five near-infrared filters. It started in 2010 and was granted a total of 1929 hours of observing time over a five-year period. Via Lactea is the Latin name for the Milky Way.

      [2] The image used in this work covers about 315 square degrees of the sky (a bit less than 1% of the entire sky) and observations were carried out using three different infrared filters. The catalogue lists the positions of the stars along with their measured brightnesses through the different filters. It contains about 173 million objects, of which about 84 million have been confirmed as stars. The other objects were either too faint or blended with their neighbours or affected by other artefacts, so that accurate measurements were not possible. Others were extended objects such as distant galaxies.

      The image used here required a huge amount of data processing, which was performed by Ignacio Toledo at the ALMA OSF. It corresponds to a pixel scale of 0.6 arcseconds per pixel, down-sampled from the original pixel scale of 0.34 arcseconds per pixel.

  5. “The entropy in all the photons in the Universe is even more! The Universe is full of radiation left over from the Big Bang. The photons in the observable Universe left over from the Big Bang have a total entropy of about 10^90 bits. It’s called the ‘cosmic microwave background radiation’.”

    The energy of photons in the CMB is also much larger than that of all other photons.

  6. Giampiero Campa says:

    This is a timeline of (a possible) future with speed doubling every few seconds. I’m sure it’s inaccurate but it’s well done:

  7. Shaz Jan says:

    A truly amazing article on the entropy of the Universe. Thank you.
