Thermodynamics and Wick Rotation

6 August, 2010

Having two blogs is a bit confusing. My student Mike Stay has some deep puzzles about physics, which I posted over at the n-Category Café:

• Mike Stay, Thermodynamics and Wick Rotation.

But maybe this blog already has some of its own readers, who don’t usually read the n-Café, but are interested in physics? I don’t know.

Anyway: if you’re interested in the mysterious notion of temperature as imaginary time, please click the link and help us figure it out. This should keep us entertained until I’m done with “week300” — the last issue of This Week’s Finds in Mathematical Physics.

No comments here, please — that would get really confusing.


Introduction to Climate Change

4 August, 2010

No, this post is not an introduction to climate change. It’s a question from Alex Hoffnung, who recently got his Ph.D. from U.C. Riverside after working with me on categorified Hecke algebras. Now he’s headed for a postdoc at the University of Ottawa. He’s a cool dude:

And he has a question that I’m sure many mathematicians and other scientists share, so I’ll make it a guest post here:


Have you come across anything like “Intro to Climate Change”? The big problem I have in following the issues surrounding climate change is getting a handle on what the issues are. How hard is it to objectively state some of the more important foundational issues without running into controversy?


How Hot Is Too Hot?

30 July, 2010

How hot is too hot? This interesting paper tackles that question:

• Steven C. Sherwood and Matthew Huber, An adaptability limit to climate change due to heat stress, Proceedings of the National Academy of Sciences, early edition 2010.

Abstract: Despite the uncertainty in future climate-change impacts, it is often assumed that humans would be able to adapt to any possible warming. Here we argue that heat stress imposes a robust upper limit to such adaptation. Peak heat stress, quantified by the wetbulb temperature TW, is surprisingly similar across diverse climates today. TW never exceeds 31 °C. Any exceedence of 35 °C for extended periods should induce hyperthermia in humans and other mammals, as dissipation of metabolic heat becomes impossible. While this never happens now, it would begin to occur with global-mean warming of about 7 °C, calling the habitability of some regions into question. With 11–12 °C warming, such regions would spread to encompass the majority of the human population as currently distributed. Eventual warmings of 12 °C are possible from fossil fuel burning. One implication is that recent estimates of the costs of unmitigated climate change are too low unless the range of possible warming can somehow be narrowed. Heat stress also may help explain trends in the mammalian fossil record.

Huh? Temperatures going up by 12 degrees Celsius??? Well, this is a worst-case scenario — the sort of thing that’s only likely to kick in if we keep up ‘business as usual’ for a long, long time:

Recent studies have highlighted the possibility of large global warmings in the absence of strong mitigation measures, for example the possibility of over 7 °C of warming this century alone. Warming will not stop in 2100 if emissions continue. Each doubling of carbon dioxide is expected to produce 1.9–4.5 °C of warming at equilibrium, but this is poorly constrained on the high side and according to one new estimate has a 5% chance of exceeding 7.1 °C per doubling. Because combustion of all available fossil fuels could produce 2.75 doublings of CO2 by 2300, even a 4.5 °C sensitivity could eventually produce 12 °C of warming. Degassing of various natural stores of methane and/or CO2 in a warmer climate could increase warming further. Thus while central estimates of business-as-usual warming by 2100 are 3–4 °C, eventual warmings of 10 °C are quite feasible and even 20 °C is theoretically possible.
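The arithmetic behind those headline numbers is easy to reproduce. Here's a sketch, assuming (as the quote does) that equilibrium warming scales linearly with the number of CO2 doublings; the function name is mine, not the paper's:

```python
def eventual_warming(doublings, sensitivity_per_doubling):
    """Equilibrium warming (in deg C) from a given number of CO2 doublings,
    assuming warming scales linearly with doublings (i.e. with log CO2)."""
    return doublings * sensitivity_per_doubling

# 2.75 doublings at the 4.5 deg C high-end central sensitivity:
print(eventual_warming(2.75, 4.5))   # 12.375 -- the "12 deg C" figure
# 2.75 doublings at the 5%-tail sensitivity of 7.1 deg C per doubling:
print(eventual_warming(2.75, 7.1))   # 19.525 -- near the "20 deg C" bound
```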

A key notion in Sherwood and Huber’s paper is the concept of wet-bulb temperature. Apparently this term has several meanings, but Sherwood and Huber use it to mean “the temperature as measured by covering a standard thermometer bulb with a wetted cloth and fully ventilating it”.

This can be lower than the ‘dry-bulb temperature’, thanks to evaporative cooling. And that’s important, because we sweat to stay cool.

Indeed, this is the big difference between Riverside California (my permanent home) and Singapore (where I’m living now). It’s dry there, and humid here, so my sweat doesn’t evaporate so nicely here — so the wet-bulb temperature tends to be higher. In Riverside air conditioning seems like a bit of an indulgence much of the time, though it’s quite common for shops to let it run blasting until the air is downright frigid. In Singapore I’m afraid I really like it, though when I’m in control, I keep it set at 28 °C — perhaps more for dehumidification than cooling?

Sherwood and Huber write:

A resting human body generates ∼100 W of metabolic heat that (in addition to any absorbed solar heating) must be carried away via a combination of heat conduction, evaporative cooling, and net infrared radiative cooling. Net conductive and evaporative cooling can occur only if an object is warmer than the environmental wet-bulb temperature TW, measured by covering a standard thermometer bulb with a wetted cloth and fully ventilating it. The second law of thermodynamics does not allow an object to lose heat to an environment whose TW exceeds the object’s temperature, no matter how wet or well-ventilated. Infrared radiation under conditions of interest here will usually produce a small additional heating.

[…]

Humans maintain a core body temperature near 37 °C that varies slightly among individuals but does not adapt to local climate. Human skin temperature is strongly regulated at 35 °C or below under normal conditions, because the skin must be cooler than body core in order for metabolic heat to be conducted to the skin. Sustained skin temperatures above 35 °C imply elevated core body temperatures (hyperthermia), which reach lethal values (42–43 °C) for skin temperatures of 37–38 °C even for acclimated and fit individuals. We would thus expect sufficiently long periods of TW > 35 °C to be intolerable.

Now, temperatures of 35 °C (we say 95 degrees Fahrenheit) are entirely routine during the day in Riverside. Of course, it’s much cooler in my un-air-conditioned home because we leave open the windows when it gets cool at night, and the concrete slab under the floor stays cool, and the house has great insulation. Still, after a few years of getting acclimated, walking around in 35 °C weather seems like no big deal. We only think it’s seriously hot when it reaches 40 °C.

But these are not wet-bulb temperatures: the humidity is usually really low! So what’s the wet-bulb temperature when it’s 35 °C and the relative humidity is, say, 20%? I should look it up… but maybe you know where to look?
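For a rough answer, there are empirical fits relating wet-bulb temperature to dry-bulb temperature and relative humidity at standard sea-level pressure. Here's a sketch using one widely quoted fit (due to Stull); it's an approximation, not the psychrometric standard, and the coefficients below are the fit's, not mine:

```python
from math import atan, sqrt

def wet_bulb_stull(t_c, rh_pct):
    """Approximate wet-bulb temperature (deg C) from dry-bulb temperature
    (deg C) and relative humidity (%), via Stull's empirical fit.
    Valid roughly for RH 5-99% and T from -20 to 50 deg C at sea level."""
    return (t_c * atan(0.151977 * sqrt(rh_pct + 8.313659))
            + atan(t_c + rh_pct)
            - atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct**1.5 * atan(0.023101 * rh_pct)
            - 4.686035)

# 35 deg C at 20% relative humidity, roughly a hot Riverside afternoon:
print(round(wet_bulb_stull(35.0, 20.0), 1))  # about 19 deg C -- far below 35
```

So a dry 35 °C day has a wet-bulb temperature in the vicinity of 19 °C, which is why it's survivable.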

If you look on page 2 of Sherwood and Huber’s paper you’ll see three graphs. The top graph is the world today. You’ll see histograms of the average temperature (in black), the average annual maximum temperature (in blue), and the average annual maximum wet-bulb temperature (in red). The interesting thing is how the red curve is sharply peaked between 15 °C and 30 °C, dropping off steeply above 31 °C.

The bottom graph shows an imagined world that’s about 12 °C warmer. It’s too hot.

As the authors note:

The highest instantaneous TW anywhere on Earth today is about 30 °C (with a tiny fraction of values reaching 31 °C). The most-common TW, max is 26–27 °C, only a few degrees lower. Thus, peak potential heat stress is surprisingly similar across many regions on Earth. Even though the hottest temperatures occur in subtropical deserts, relative humidity there is so low that TW, max is no higher than in the deep tropics. Likewise, humid mid-latitude regions such as the Eastern United States, China, southern Brazil, and Argentina experience TW, max during summer heat waves comparable to tropical ones, even though annual mean temperatures are significantly lower. The highest values of T in any given region also tend to coincide with low relative humidity.

But what if it gets a lot hotter?

Could humans survive > 35 °C? Periods of net heat storage can be endured, though only for a few hours, and with ample time needed for recovery. Unfortunately, observed extreme-TW events (TW 26 °C) are long-lived: Adjacent nighttime minima of TW are typically within 2–3 °C of the daytime peak, and adjacent daily maxima are typically within 1 °C. Conditions would thus prove intolerable if the peak TW exceeded, by more than 1–2 °C, the highest value that could be sustained for at least a full day. Furthermore, heat dissipation would be very inefficient unless TW were at least 1–2 °C below skin temperature, so to sustain heat loss without dangerously elevated body temperature would require TW of 34 °C or lower. Taking both of these factors into account, we estimate that the survivability limit for peak six-hourly TW is probably close to 35 °C for humans, though this could be a degree or two off. Similar limits would apply to other mammals but at various thresholds depending on their core body temperature and mass.

I find the statement “Adjacent nighttime minima of TW are typically within 2–3 °C of the daytime peak” quite puzzling. Maybe it’s true in extremely humid climates, but in dry climates it tends to cool down significantly at night. Even here in Singapore there seems to be typically a 5 °C difference between day and night. But maybe it’s less during a heat wave.

The paper does not discuss behavioral adaptations, and that makes it a bit misleading. Even without fossil fuels people can do things like living underground during the day and using windcatchers to bring cool underground air into the house. Here’s a windcatcher that my friend Greg Egan photographed in Yazd during his trip to Iran:

But, of course, this sort of world would support far fewer people than live here now!

Another obvious doubt concerns the distant past, when it was a lot warmer than now. I’m talking about the Paleogene, which ended 23 million years ago. If you haven’t heard of the Paleogene — which is a term that came into play after I learned my geological time periods back in grade school — maybe you’ll be interested to hear that it’s the beginning of the Cenozoic, consisting of the Paleocene, Eocene, and Oligocene. Since then the Earth has been in a cooling phase:

How did mammals manage back then?

Mammals have survived past warm climates; does this contradict our conclusions? The last time temperatures approached values considered here is the Paleogene, when global-mean temperature was perhaps 10 °C and tropical temperature perhaps 5–6 °C warmer than modern, implying TW of up to 36 °C with a most-common TW, max of 32–33 °C. This would still leave room for the survival of mammals in most locations, especially if their core body temperatures were near the high end of those of today’s mammals (near 39 °C). Transient temperature spikes, such as during the PETM or Paleocene-Eocene Thermal Maximum, might imply intolerable conditions over much broader areas, but tropical terrestrial mammalian records are too sparse to directly test this. We thus find no inconsistency with our conclusions, but this should be revisited when more evidence is available.


High Temperature Superconductivity

29 July, 2010

Here at the physics department of the National University of Singapore, Tony Leggett is about to speak on “Cuprate superconductivity: the current state of play”. I’ll take notes and throw them on this blog in a rough form. As always, my goal is to start some interesting conversations. So, go ahead and ask questions, or fill in some more details. Not everything I write here is something I understand!

Certain copper oxide compounds can be superconductive at relatively high temperatures — for example, above the boiling point of liquid nitrogen, 77 kelvin. These compounds consist of checkerboard layers with four oxygen atoms at the corners of each square and one copper in the middle. It’s believed that the electrons move around in these layers in an essentially two-dimensional way. Two-dimensional physics allows for all sorts of exotic possibilities! But nobody is sure how these superconductors work. The topic has been around for about 25 years, but according to Leggett, there’s no one theory that commands the assent of more than 20% of the theorists.

Here’s the outline of Leggett’s talk:

1. What is superconductivity?

2. Brief overview of cuprate structure and properties.

3. What do we know for sure about high-temperature superconductivity (HTS) in the cuprates? That is, what do we know without relying on any microscopic model, since these models are all controversial?

4. Some existing models.

5. Are we asking the right questions?

1. What is superconductivity?

For starters, he asked: what is superconductivity? It involves at least two phenomena that don’t necessarily need to go together, but seem to always go together in practice, and are typically considered together. One: perfect diamagnetism — in the “Meissner effect”, the medium completely excludes magnetic fields. This is an equilibrium effect. Two: persistent currents — this is an incredibly stable metastable effect.

Note the difference: if we start with a ball of stuff in magnetic field and slowly lower its temperature, once it becomes superconductive it will exclude the magnetic field. There are never any currents present, since we’re in thermodynamic equilibrium at any stage.

On the other hand, a ring of stuff with a current flowing around it is not in thermal equilibrium. It’s just a metastable state.

The London-Landau-Ginzburg theory of superconductivity is a ‘phenomenological’ theory: it doesn’t try to describe the underlying microscopic cause, just what seems to happen. Among other things, it says that a superconductor is characterized by a ‘macroscopic wave function’: a complex function \psi(r) = |\psi(r)| \exp(i \phi(r)). The current is given by

J(r) \propto |\psi(r)|^2 (\nabla \phi(r) - e A(r))

where e is a charge (in fact the charge of an electron pair, as was later realized).

This theory explains the Meissner effect and also persistent currents, and it’s probably good for cuprate superconductors.
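One cute consequence of the macroscopic wave function, worth noting here: since \psi must be single-valued, the phase \phi can only change by a multiple of 2\pi as you go around a superconducting ring, and plugging that into the current formula forces the trapped magnetic flux to be quantized in units of h/2e (the 2 because the charge is that of an electron pair). A quick numerical check of that flux quantum:

```python
# Planck's constant and the elementary charge (exact 2018 SI values):
h = 6.62607015e-34   # J*s
e = 1.602176634e-19  # C

# Flux quantum for paired electrons (charge 2e), in webers:
flux_quantum = h / (2 * e)
print(flux_quantum)  # roughly 2.07e-15 Wb
```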

2. The structure and behavior of cuprate superconductors

The structure of a typical cuprate: there are n planes made of CuO2 and other atoms (typically alkaline earth), and then, between these, a material that serves as a ‘charge reservoir’.

He showed us the phase diagram for a typical cuprate as a function of temperature and the ‘doping’: that is, the number of extra ‘holes’ – missing electrons – per CuO2. No cuprate has yet been shown to have this phase diagram in its entirety! But different ones have been seen to have different parts, so we may guess the story is like this:

There’s an antiferromagnetic insulator phase at low doping. At higher doping there’s a strange ‘pseudogap’ phase. Nobody knows if this ‘pseudogap’ phase extends to zero temperature. At still higher dopings we see a superconductive phase at low temperature and a ‘strange metal’ phase above some temperature. This temperature reaches a max at a doping of about 0.16 — a more or less universal figure — but the value of this maximum temperature depends a lot on the material. At higher dopings the superconductive phase goes away.

There are over 200 superconducting cuprates, but there are some cuprates that can never be made superconducting — those with multilayers spaced by strontium or barium.

Both ‘normal’ and superconducting states are highly anisotropic. But the ‘normal’ states are actually very anomalous — hence the term ‘strange metal’. The temperature-dependence of various properties are very unusual. By comparison the behaviour of the superconducting phase is less strange!

Most (but not all) properties are approximately consistent with the hypothesis that at a given doping, the properties are universal.

The superconducting phase is highly sensitive to doping and pressure.

3. What do we know for sure about superconductivity in the cuprates?

There’s strong evidence that cuprate superconductivity is due to the formation of Cooper pairs, just as for ordinary superconductors.

The ‘universality’ of high-temperature superconductivity in cuprates with very different chemical compositions suggests that the main actors are the electrons in the CuO2 planes. Most researchers believe this.

There are a lot of NMR experiments suggesting that the spins of the electrons in the Cooper pairs are in the ‘singlet’ state:

up ⊗ down – down ⊗ up

Absence of substantial far-infrared absorption above the gap edge suggests that pairs are formed from time-reversed states (despite the work of Tahir–Kheli).

The ‘radius’ of the Cooper pairs is very small: only 3-10 angstroms, instead of thousands as in an ordinary superconductor!

In an ordinary superconductor the wave function of a Cooper pair is in an s state (spherically symmetric state). In a cuprate superconductor it seems to have the symmetry of x^2 - y^2: that is, a d state that’s odd under 90 degree rotation in the plane of the cuprate (the x y plane), but even under reflection in either the x or y axis.
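Those two symmetry claims are easy to verify directly on the angular factor x^2 - y^2 itself. A minimal sketch (toy check, not a physics calculation): a 90 degree rotation sends (x, y) to (-y, x), which flips the sign, while reflection in the x axis sends (x, y) to (x, -y), which leaves it unchanged:

```python
import random

def d_wave(x, y):
    """Toy d_{x^2 - y^2} pair wave function (angular part only)."""
    return x * x - y * y

for _ in range(100):
    x, y = random.uniform(-1, 1), random.uniform(-1, 1)
    # Rotate 90 degrees in the plane: (x, y) -> (-y, x). Sign flips:
    assert abs(d_wave(-y, x) + d_wave(x, y)) < 1e-12
    # Reflect in the x axis: (x, y) -> (x, -y). Unchanged:
    assert abs(d_wave(x, -y) - d_wave(x, y)) < 1e-12

print("d-wave symmetries check out")
```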

There’s good evidence that the pairs in different multilayers are effectively independent (despite the Anderson Interlayer Tunnelling Theory).

There isn’t a substantial dependence on the isotopes used to make the stuff, so it’s believed that phonons don’t play a major role.

At least 95% of the literature makes all of the above assumptions and a lot more. Most models are specific Hamiltonians that obey all these assumptions.

4. Models of high-temperature superconductivity in cuprates

How will we know when we have a ‘satisfactory’ theory? We should either be able to:

A) give a blueprint for building a room-temperature superconductor using cuprates, or

B) assert with confidence that we will never be able to do this, or at least

C) say exactly why we cannot do either A) or B).

No model can yet do this!

Here are some classes of models, from conservative to exotic:

1. Phonon-induced attraction – the good old BCS mechanism, which explains ordinary superconductors. These models have lots of problems when applied to cuprates, e.g. the fact that we don’t see an isotope effect.

2. Attraction induced by the exchange of some other boson: spin fluctuations, excitons, fluctuations of ‘stripes’ or still more exotic objects.

3. Theories starting from the single-band Hubbard model. These include theories based on the postulate of ‘exotic ordering’ in the ground state, e.g. charge-spin separation.

5. What are the right questions to ask?

The energy is the sum of 3 terms: the kinetic energy, the potential energy of the interaction between the conduction electrons and the static lattice, and the potential energy of the interactions of the conduction electrons with each other (both intra-plane and inter-plane). One of these must go down when Cooper pairs form! The third term is the obvious suspect.

Then Leggett wrote a lot of equations which I cannot copy fast enough… and concluded that there are two basic possibilities, “Eliashberg” and “overscreening”. The first is that electrons with opposite momentum and spin attract each other in the normal phase. The second is that there’s no attraction required in the normal phase, but the interaction is modified by pairing: pairing can cause “screening” of the Coulomb repulsion. Which one is it?

Another good question: Why does the critical temperature depend on the number of layers in a multilayer? There are various possible explanations. The “boring” explanation is that superconductivity is a single-plane phenomenon, but multi-layering affects properties of individual planes. The “interesting” explanations say that inter-plane effects are essential: for example, as in the Anderson inter-layer tunnelling model, or due to a Kosterlitz-Thouless effect, or due to inter-plane Coulomb interactions.

Leggett clearly likes the last possibility, with the energy savings coming from increased screening, predominantly at long wavelengths and mid-infrared frequencies. This gives a natural explanation of why all known high-temperature superconductors are strongly two-dimensional, and it explains many more of their properties, too. Moreover it’s unambiguously falsifiable in electron energy-loss spectroscopy experiments. He has proposed an experimental test, which will be carried out soon.

He bets there’s at least a 50% chance that some of the younger members of the audience will live to see room-temperature superconductors.


Overfishing

28 July, 2010

While climate change is the 800-pound gorilla of ecological issues, I don’t want it to completely dominate the conversation here. There are a lot of other issues to think about. For example, overfishing!

My friend the mathematician John Terilla says that ever since we had dinner together at a friend’s house, he can’t help thinking about overfishing — especially when he eats fish. I’m afraid I have that effect on people these days.

(In case you’re wondering, we didn’t have fish for dinner.)

Anyway, John just pointed out this book review:

• Elizabeth Kolbert, The scales fall: is there any hope for our overfished oceans?, New Yorker, August 2, 2010.

It’s short and very readable. It starts out talking about tuna. In the last 40 years, the numbers of bluefin tuna have dropped by roughly 80 percent. A big part of the problem is ICCAT, which either means the International Commission for the Conservation of Atlantic Tunas, or else the International Conspiracy to Catch All Tunas, depending on whom you ask. In 2008, ICCAT scientists recommended that the bluefin catch in the eastern Atlantic and the Mediterranean be limited to 8,500–15,000 tons. ICCAT went ahead and adopted a quota of 22,000 tons! So it’s no surprise that we’re in trouble now.

But it’s not just tuna. Look at what happened to cod off the east coast of Newfoundland:



In fact, there’s evidence that the population of all kinds of big predatory fish has dropped 90% since 1950:

• Ransom A. Myers and Boris Worm, Rapid worldwide depletion of predatory fish communities, Nature 423 May 15, 2003.

Of course you’d expect someone with the name “Worm” to be against fishing, but Myers agrees: “From giant blue marlin to mighty bluefin tuna, and from tropical groupers to Antarctic cod, industrial fishing has scoured the global ocean. There is no blue frontier left. Since 1950, with the onset of industrialized fisheries, we have rapidly reduced the resource base to less than 10 percent—not just in some areas, not just for some stocks, but for entire communities of these large fish species from the tropics to the poles.”

In fact, we’re “fishing down the food chain”: now that the big fish are gone, we’re going after larger and larger numbers of smaller and smaller species, with former “trash fish” now available at your local market. It’s a classic tragedy of the commons: with nobody able to own fish, everyone is motivated to break agreements to limit fishing. Here’s a case where I think some intelligent applications of economics and game theory could work wonders. But who has the muscle to forge and enforce agreements? Clearly ICCAT and other existing bodies do not!
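The logic of collapse can be seen already in the simplest textbook model: logistic stock growth with a fixed catch quota (a Schaefer-style sketch with made-up parameters, not data from any real fishery). Any quota above the maximum sustainable yield rK/4 has no equilibrium at all, so the stock is driven to zero:

```python
def simulate_stock(quota, r=0.5, k=1.0, n0=0.5, dt=0.1, steps=2000):
    """Euler-step a logistic fish stock dN/dt = r N (1 - N/K) - quota.
    All parameter values here are illustrative, not from any real fishery."""
    n = n0
    for _ in range(steps):
        n = max(0.0, n + dt * (r * n * (1 - n / k) - quota))
    return n

msy = 0.5 * 1.0 / 4          # maximum sustainable yield r*K/4 = 0.125
print(simulate_stock(0.10))  # quota below MSY: stock settles near 0.72
print(simulate_stock(0.15))  # quota above MSY: stock collapses to 0
```

The point of the toy model: the difference between a sustainable quota and a collapse is small (here 0.10 versus 0.15), which is exactly why a quota of 22,000 tons against a recommended 8,500–15,000 is so dangerous.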

But there’s still hope. For starters, learn which fish to avoid eating. And think about this:

It is almost as though we use our military to fight the animals in the ocean. We are gradually winning this war to exterminate them. And to see this destruction happen, for nothing really – for no reason – that is a bit frustrating. Strangely enough, these effects are all reversible, all the animals that have disappeared would reappear, all the animals that were small would grow, all the relationships that you can’t see any more would re-establish themselves, and the system would re-emerge. So that’s one thing to be optimistic about. The oceans, much more so than the land, are reversible…

– Daniel Pauly


Technology for Azimuth (Part 1)

26 July, 2010

A bunch of us want to set up a wiki associated to Azimuth, so we can more effectively gather and distribute scientific knowledge related to the overarching theme of “how to save the planet”.

It may also be good to have a “discussion forum” associated to that wiki, in addition to the blog here, where I — and other people, once I find some good co-bloggers — hold forth. A blog is a good place for a few people to lead discussions. A different sort of discussion forum, more like the nForum, would be a more democratic environment, good for developing a wiki.

But what exactly should we do?

Let’s discuss that question here… I’m going to copy or move some comments from the welcome page to here, to get things going.


Climate Stabilization Targets

25 July, 2010

I thank Walter Blackstock at the Institute of Molecular & Cell Biology here in Singapore for pointing this out:

The most distinguished group of scientists in the United States has released an important report on climate change. You can get the whole thing for free, here:

• National Research Council, National Academy of Science, Climate Stabilization Targets: Emissions, Concentrations, and Impacts over Decades to Millennia, 2010.

But here’s the executive summary, for you executives too busy to read the whole thing. It’s clearly written, short, and earth-shakingly important. I’ve put a few key passages in boldface.

EXECUTIVE SUMMARY

Emissions of carbon dioxide from the burning of fossil fuels have ushered in a new epoch where human activities will largely determine the evolution of Earth’s climate. Because carbon dioxide in the atmosphere is long lived, it can effectively lock the Earth and future generations into a range of impacts, some of which could become very severe. Therefore, emissions reductions choices made today matter in determining impacts experienced not just over the next few decades, but in the coming centuries and millennia. Policy choices can be informed by recent advances in climate science that quantify the relationships between increases in carbon dioxide and global warming, related climate changes, and resulting impacts, such as changes in streamflow, wildfires, crop productivity, extreme hot summers, and sea level rise.

Since the beginning of the industrial revolution, concentrations of greenhouse gases from human activities have risen substantially. Evidence now shows that the increases in these gases very likely (>90 percent chance) account for most of the Earth’s warming over the past 50 years. Carbon dioxide is the greenhouse gas produced in the largest quantities, accounting for more than half of the current impact on Earth’s climate. Its atmospheric concentration has risen about 35% since 1750 and is now at about 390 parts per million by volume, the highest level in at least 800,000 years. Depending on emissions rates, carbon dioxide concentrations could double or nearly triple from today’s level by the end of the century, greatly amplifying future human impacts on climate.

Society is beginning to make important choices regarding future greenhouse gas emissions. One way to inform these choices is to consider the projected climate changes and impacts that would occur if greenhouse gases in the atmosphere were stabilized at a particular concentration level. The information needed to understand such targets is multifaceted: how do emissions affect global atmospheric concentrations and in turn global warming and its impacts?

This report quantifies, insofar as possible, the outcomes of different stabilization targets for greenhouse gas concentrations using analyses and information drawn from the scientific literature. It does not recommend or justify any particular stabilization target. It does provide important scientific insights about the relationships among emissions, greenhouse gas concentrations, temperatures, and impacts.

CLIMATE CHANGE DUE TO CARBON DIOXIDE WILL PERSIST MANY CENTURIES

Carbon dioxide flows into and out of the ocean and biosphere in the natural breathing of the planet, but the uptake of added human emissions depends on the net change between flows, occurring over decades to millennia. This means that climate changes caused by carbon dioxide are expected to persist for many centuries even if emissions were to be halted at any point in time.

Such extreme persistence is unique to carbon dioxide among major agents that warm the planet. Choices regarding emissions of other warming agents, such as methane, black carbon on ice/snow, and aerosols, can affect global warming over coming decades but have little effect on longer-term warming of the Earth over centuries and millennia. Thus, long-term effects are primarily controlled by carbon dioxide.

The report concludes that the world is entering a new geologic epoch, sometimes called the Anthropocene, in which human activities will largely control the evolution of Earth’s environment. Carbon emissions during this century will essentially determine the magnitude of eventual impacts and whether the Anthropocene is a short-term, relatively minor change from the current climate or an extreme deviation that lasts thousands of years. The higher the total, or cumulative, carbon dioxide emitted and the resulting atmospheric concentration, the higher the peak warming that will be experienced and the longer the duration of that warming. Duration is critical; longer warming periods allow more time for key, but slow, components of the Earth system to act as amplifiers of impacts, for example, warming of the deep ocean that releases carbon stored in deep-sea sediments. Warming sustained over thousands of years could lead to even bigger impacts.

IMPACTS CAN BE LINKED TO GLOBAL MEAN TEMPERATURES

To date, climate stabilization goals have been most often discussed in terms of stabilizing atmospheric concentrations of carbon dioxide (e.g., 350 ppmv, 450 ppmv, etc.). This report concludes that, for a variety of conceptual and practical reasons, it is more effective to assess climate stabilization goals by using global mean temperature change as the primary metric. Global temperature change can in turn be linked both to concentrations of atmospheric carbon dioxide and to accumulated carbon emissions.

An important reason for using warming as a reference is that scientific research suggests many key impacts can be quantified for given temperature increases. This is done by scaling local to global warming and by “coupled linkages” that show how other climate changes, such as alterations in the water cycle, scale with temperature. There is now increased confidence in how global warming levels of 1°C, 2°C, 3°C etc. would relate to certain future impacts. This report lists some of these effects per degree (°C) of global warming, including:

• 5-10 percent changes in precipitation in a number of regions

• 3-10 percent increases in heavy rainfall

• 5-15 percent yield reductions of a number of crops

• 5-10 percent changes in streamflow in many river basins worldwide

• About 15 percent and 25 percent decreases in the extent of annually averaged and September Arctic sea ice, respectively

For warming of 2°C to 3°C, summers that are among the warmest recorded or the warmest experienced in people’s lifetimes, would become frequent. For warming levels of 1°C to 2°C, the area burned by wildfire in parts of western North America is expected to increase by 2 to 4 times for each degree (°C) of global warming. Many other important impacts of climate change are difficult to quantify for a given change in global average temperature, in part because temperature is not the only driver of change for some impacts; multiple environmental and other human factors come into play. It is clear from scientific studies, however, that a number of projected impacts scale approximately with temperature. Examples include shifts in the range and abundance of some terrestrial and marine species, increased risk of heat-related human health impacts, and loss of infrastructure in the coastal regions and the Arctic.

STABILIZATION REQUIRES DEEP EMISSIONS REDUCTIONS

The report demonstrates that stabilizing atmospheric carbon dioxide concentrations will require deep reductions in the amount of carbon dioxide emitted. Because human carbon dioxide emissions exceed removal rates through natural carbon “sinks,” keeping emission rates the same will not lead to stabilization of carbon dioxide. Emissions reductions larger than about 80 percent, relative to whatever peak global emissions rate may be reached, are required to approximately stabilize carbon concentrations for a century or so at any chosen target level.

But stabilizing atmospheric concentrations does not mean that temperatures will stabilize immediately. Because of time-lags inherent in the Earth’s climate, warming that occurs in response to a given increase in the concentration of carbon dioxide (“transient climate change”) reflects only about half the eventual total warming (“equilibrium climate change”) that would occur for stabilization at the same concentration. For example, if concentrations reached 550 ppmv, transient warming would be about 1.6°C, but holding concentrations at 550 ppmv would mean that warming would continue over the next several centuries, reaching a best estimate of an equilibrium warming of about 3°C. Estimates of warming are based on models that incorporate ‘climate sensitivities’—the amount of warming expected at different atmospheric concentrations of carbon dioxide. Because there are many factors that shape climate, uncertainty in the climate sensitivity is large; the possibility of greater warming, implying additional risk, cannot be ruled out, and smaller warmings are also possible. In the example given above, choosing a concentration target of 550 ppmv could produce a likely global warming at equilibrium as low as 2.1°C, but warming could be as high as 4.3°C, increasing the severity of impacts. Thus, choices about stabilization targets will depend upon value judgments regarding the degree of acceptable risk.
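The numbers in that paragraph follow from the standard logarithmic dependence of equilibrium warming on CO2 concentration. A sketch (the 280 ppmv preindustrial baseline and the three sensitivity values are my assumptions, chosen to bracket the report’s range):

```python
from math import log2

def equilibrium_warming(conc_ppmv, sensitivity, preindustrial=280.0):
    """Equilibrium warming (deg C) for a stabilized CO2 concentration,
    using the standard logarithmic relation dT = S * log2(C / C0).
    Baseline and sensitivities are illustrative assumptions."""
    return sensitivity * log2(conc_ppmv / preindustrial)

# Sensitivities (deg C per doubling) roughly spanning the likely range:
for s in (2.2, 3.0, 4.4):
    print(round(equilibrium_warming(550.0, s), 1))  # ~2.1, ~2.9, ~4.3
```

With the central 3 °C-per-doubling sensitivity, 550 ppmv gives an equilibrium warming of about 2.9 °C, matching the report's “about 3 °C”, while the low and high sensitivities reproduce the 2.1–4.3 °C spread.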

CONCLUSION

This report provides a scientific evaluation of the implications of various climate stabilization targets. The report concludes that certain levels of warming associated with carbon dioxide emissions could lock the Earth and many future generations of humans into very large impacts; similarly, some targets could avoid such changes. It makes clear the importance of 21st century choices regarding long-term climate stabilization.

