## Carbon Dioxide Puzzles

4 February, 2011

I like it when people do interesting calculations and help me put their results on this blog. Renato Iturriaga has plotted a graph that raises some interesting questions about carbon dioxide in the Earth’s atmosphere. Maybe you can help us out!

The atmospheric CO2 concentration, as measured at Mauna Loa in Hawaii, looks like it’s rising quite smoothly apart from seasonal variations:

However, if you take the annual averages from here:

• NOAA Earth System Laboratory, Global Monitoring Division, Recent Mauna Loa CO2.

and plot how much the average rises each year, the graph is pretty bumpy. You’ll see what I mean in a minute.

In comparison, if you plot the carbon dioxide emissions produced by burning fossil fuels, you get a rather smooth curve, at least according to these numbers:

• U. S. Energy Information Administration Total carbon dioxide emissions from the consumption of energy, 1980-2008.

Renato decided to plot both of these curves and their difference. Here’s his result:

The blue curve shows how much CO2 we put into the atmosphere each year by burning fossil fuels, measured in parts per million.

The red curve shows the observed increase in atmospheric CO2.

The green curve is the difference.

The puzzle is to explain this graph. Why is the red curve roughly 40% lower than the blue one? Why is the red curve so jagged?

Of course, a lot of research has already been done on these issues. There are a lot of subtleties! So if you like, think of our puzzle as an invitation to read the existing literature and tell us how well it does at explaining this graph. You might start here, and then read the references, and then keep digging.

But first, let me explain exactly how Renato Iturriaga created this graph! If he’s making a mistake, maybe you can catch it.

The red curve is straightforward: he took the annual mean growth rate of CO2 from the NOAA website I mentioned above, and graphed it. Let me do a spot check to see if he did it correctly. I see a big spike in the red curve around 1998: it looks like the CO2 went up around 2.75 ppm that year. But then the next year it seems to have gone up just about 1 ppm. On the website it says 2.97 ppm for 1998, and 0.91 for 1999. So that looks roughly right, though I’m not completely happy about 1998.

[Note added later: as you'll see below, he actually got his data from here; this explains the small discrepancy.]
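Just to be concrete about how a red curve like this gets computed, here's a sketch: difference consecutive annual means. The concentrations below are illustrative stand-ins, not NOAA's actual data.

```python
# Differencing annual means to get a growth-rate curve like the red
# one. These concentrations are illustrative stand-ins, not NOAA data.

annual_mean_ppm = {1997: 363.8, 1998: 366.7, 1999: 368.3}

growth = {year: round(annual_mean_ppm[year] - annual_mean_ppm[year - 1], 2)
          for year in sorted(annual_mean_ppm)[1:]}
print(growth)
```

Note that NOAA's published annual growth rate is defined slightly differently (from concentrations at the start and end of each year, estimated using multi-month averages), so differences of annual means won't match it exactly — plausibly the source of small discrepancies like the one above.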

Renato got the blue curve by taking the US Energy Information Administration numbers and converting them from gigatons of CO2 to parts per million moles. He assumed that the atmosphere weighs 5 × 10¹⁵ tons and that CO2 gets well mixed with the whole atmosphere each year. Given this, we can simply say that one gigaton is 0.2 parts per million of the atmosphere’s mass.

But people usually measure CO2 in parts per million volume. Now, a mole is just a certain large number of molecules. Furthermore, the volume of a gas at fixed pressure is almost exactly proportional to the number of molecules, regardless of its composition. So parts per million volume is essentially the same as parts per million moles.

So we just need to do a little conversion. Remember:

• The molecular mass of N2 is 28, and about 79% of the atmosphere’s volume is nitrogen.

• The molecular mass of O2 is 32, and about 21% of the atmosphere’s volume is oxygen.

• By comparison, there’s very little of the other gases.

So, the average molecular mass of air is

28 × .79 + 32 × .21 = 28.84

On the other hand, the molecular mass of CO2 is 44. So one ppm mass of CO2 is less than one ppm volume: it’s just

28.84/44 = 0.655

parts per million volume. So, a gigaton of CO2 is about 0.2 ppm mass, but only about

0.2 × 0.655 = 0.13

parts per million volume (or moles).

So to get the blue curve, Renato took gigatons of CO2 and multiplied by 0.13 to get ppm volume. Let me do another spot check! The blue curve reaches about 4 ppm in 2008. Dividing 4 by 0.13 we get about 30, and that’s good, because energy consumption put about 30 gigatons of CO2 into the atmosphere in 2008.
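Renato's whole conversion fits in a few lines of Python. This sketch uses the post's estimate of 5 × 10¹⁵ tons for the atmosphere's mass, which is itself in question below:

```python
# Convert gigatons of CO2 emitted into the ppm (by volume) it adds to
# the atmosphere, assuming CO2 mixes evenly through the whole atmosphere.

ATMOSPHERE_TONS = 5e15            # the estimate used in this post
M_AIR = 28 * 0.79 + 32 * 0.21     # mean molecular mass of air ≈ 28.84
M_CO2 = 44                        # molecular mass of CO2

def gigatons_co2_to_ppmv(gigatons):
    ppm_mass = gigatons * 1e9 / ATMOSPHERE_TONS * 1e6   # 0.2 ppm mass per Gt
    return ppm_mass * M_AIR / M_CO2                      # ≈ 0.131 ppm volume

print(round(gigatons_co2_to_ppmv(1), 3))    # one gigaton of CO2
print(round(gigatons_co2_to_ppmv(30), 2))   # roughly 2008's emissions
```

The second print comes out just under 4 ppm, matching the spot check of the blue curve.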

And then, of course, the green curve is the blue one minus the red one:

One puzzle is why the red curve is so much lower than the blue one. The atmospheric CO2 concentration is only going up by about 60% of the CO2 emitted, on average — though the fluctuations are huge. So, you might ask, where’s the rest of the CO2 going?

Probably into the ocean, plants, and soil:

But at first glance, the fact that only 60% stays in the atmosphere seems to contradict this famous graph:

This shows it taking many years for a dose of CO2 added to the atmosphere to decrease to 60% of its original level!

Is the famous graph wrong? There are other possible explanations!

Here’s a non-explanation. Humans are putting CO2 into the atmosphere in other ways besides burning fossil fuels. For example, deforestation and other changes in land use put somewhere between 0.5 and 2.7 gigatons of carbon into the atmosphere each year. There’s a lot of uncertainty here. But this doesn’t help solve our puzzle: it means there’s more carbon to account for.

Here’s a possible explanation. Maybe my estimate of 5 × 10¹⁵ tons for the mass of the atmosphere is too high! That would change everything. I got my estimate off the internet somewhere — does anyone know a really accurate figure?

Renato came up with a more interesting possible explanation. It’s very important, and very well-known, that CO2 doesn’t leave the atmosphere in a simple exponential decay process. Imagine for simplicity that carbon stays in three boxes:

• Box A: the atmosphere.

• Box B: places that exchange carbon with the atmosphere quite rapidly.

• Box C: places that exchange carbon with the atmosphere and box B quite slowly.

As we pump CO2 into box A, a lot of it quickly flows into box B. It then slowly flows from boxes A and B into box C.

The quick flow from box A to box B accounts for the large amounts of ‘missing’ CO2 in Renato’s graph. But if we stop putting CO2 into box A, it will soon come into equilibrium with box B. At that point, we will not see the CO2 level continue to quickly drop. Instead, CO2 will continue to slowly flow from boxes A and B into box C. So, it can take many years for the atmospheric CO2 concentration to drop to 60% of its original level — as the famous graph suggests.

This makes sense to me. It shows that the red curve can be a lot lower than the blue one even if the famous graph is right.
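To see that this story actually produces the claimed behavior, here's a minimal simulation. Everything about it is made up — the pulse size and rate constants are chosen only to separate a fast and a slow timescale, not fitted to any carbon-cycle data:

```python
# A minimal sketch of the three-box model. Box A exchanges quickly
# with box B; both leak slowly into box C. The rate constants are
# illustrative, not fitted to real data.

def simulate(pulse=100.0, years=100, dt=0.01, k_fast=2.0, k_slow=0.01):
    A, B, C = pulse, 0.0, 0.0      # put a pulse of carbon into box A
    yearly_A = []
    for step in range(int(years / dt)):
        flow_ab = k_fast * (A - B)  # fast exchange between A and B
        leak_a = k_slow * A         # slow flow from A into C
        leak_b = k_slow * B         # slow flow from B into C
        A += dt * (-flow_ab - leak_a)
        B += dt * (flow_ab - leak_b)
        C += dt * (leak_a + leak_b)
        if step % int(1 / dt) == 0:
            yearly_A.append(A)      # record box A once per year
    return yearly_A

levels = simulate()
# Box A drops fast at first (equilibration with B), then declines slowly:
print(round(levels[2] / levels[0], 2))    # after ~2 years
print(round(levels[99] / levels[0], 2))   # after ~99 years
```

With these made-up constants, box A quickly loses about half its pulse to box B, then takes many decades to decline much further — qualitatively the behavior described above.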

But I’m still puzzled by the dramatic fluctuations in the red curve! That’s the other puzzle.

## Stabilization Wedges (Part 4)

16 January, 2011

Okay, here are the last two of Pacala and Socolow’s stabilization wedges. Remember, these wedges are ways to reduce carbon emissions. Each one is supposed to ramp up from 2004 to 2054, so that by the end it reduces carbon emissions by 1 gigaton per year. They claimed that seven wedges would be enough to keep emissions flat:

In Part 1 of this series we talked about four wedges involving increased efficiency and conservation. In Part 2 we covered one about shifting from coal to natural gas, and three about carbon capture and storage. In Part 3 we discussed five involving nuclear power and renewable energy. The last two wedges involve forests and agriculture:

14. Stop deforestation, start reforestation. They say we could stop half a gigaton of carbon emissions per year if we completely stopped clear-cutting tropical forests over 50 years, instead of just halving the rate at which they’re getting cut down. For another half gigaton, plant 250 million hectares of new forests in the tropics, or 400 million hectares in the temperate zone!

To get a sense of the magnitude here, note that current areas of tropical and temperate forests are 1500 and 700 million hectares, respectively.

Pacala and Socolow also say that another half gigaton of carbon emissions could be prevented by establishing approximately 300 million hectares of plantations on nonforested land.

15. Soil management. When forest or grassland is converted to cropland, up to one-half of the soil carbon gets converted to CO2, mainly because tilling increases the rate of decomposition by aerating undecomposed organic matter. Over the course of history, they claim, 55 gigatons of carbon has gone into the atmosphere this way. That’s the equivalent of two wedges. (Note that one wedge, ramping up linearly to 1 gigaton/year for 50 years, adds up to 25 gigatons of carbon by 2054.)

However, good agricultural practices like no-till farming can reverse these losses — and reduce erosion too! By 1995, these practices had been adopted on 110 million of the world’s 1600 million hectares of cropland. If this could be extended to all cropland, accompanied by a verification program that enforces practices that actually work as advertised, somewhere between half and one gigaton of carbon per year could be stored in this way. So: maybe half a wedge, maybe a whole wedge!
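The parenthetical wedge arithmetic above is just the area of a triangle:

```python
# A wedge ramps linearly from 0 to 1 gigaton/year over 50 years, so
# the total carbon it avoids is the area of a triangle.

years = 50
final_rate = 1.0                 # gigatons of carbon per year in 2054
wedge_total = 0.5 * years * final_rate
print(wedge_total)               # 25.0 gigatons of carbon
print(55 / wedge_total)          # historical soil losses, in wedges
```

So the 55 gigatons of historical soil-carbon losses is a bit over two wedges' worth.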

I’ve seen a lot of argument about both these topics, and I’d love to learn more facts. Some of the controversy concerns the UN’s REDD+ program, which got a big boost in Cancún. “REDD” stands for Reducing Emissions from Deforestation and Forest Degradation — while the plus sign hints at the role of conservation, sustainable management of forests, and enhancement of forest carbon stocks.

Some people think REDD+ is great, while others think it could actually hurt. The Wikipedia article says, among other things:

REDD is presented as an “offset” scheme of the carbon markets and thus, will produce carbon credits. Carbon offsets are “emissions-saving projects” that in theory “compensate” for the polluters’ emissions. Offsets allow polluting governments and corporations, which have the historical responsibility to clean up the atmosphere, to buy their way out of the problem with cheap projects that exacerbate social and environmental conflicts in the South. Moreover, it delays any real domestic action where a historical responsibility lies and allows the expansion of more fossil fuel explorations and extractions. The “carbon credits” generated by these projects can be used by industrialised governments and corporations to meet their targets and/or to be traded within the carbon markets.

There’s also a lot of argument about just how much long-term impact on atmospheric CO2 a standing forest has, though everyone seems to agree that cutting one down releases a lot of CO2.

For a bit more, try:

REDD+: Reducing Emissions from Deforestation and Forest Degradation, Center for International Forestry Research.

Next time I’ll give you an update on the stabilization wedges from Stephen Pacala himself, based on a talk he gave in 2008. It’s a bit scary…

## Cap and Trade in California

20 December, 2010

It’s for real!

Or — to be more cautious — it might soon be for real!

On Thursday December 16th, 2010, California’s Air Resources Board began a cap and trade system for carbon. This system will implement the state’s law mandating that carbon emissions be reduced back to 1990 levels by 2020.

This will amount to a 15% decrease from current emissions.

The system will let greenhouse gas emitters buy and sell emission allowances. It covers everyone who emits more than 25,000 tons of carbon dioxide per year. That’s about 360 businesses, which taken together emit about 85% of the CO2.

At first these businesses will receive free allowances that cover most of their emissions, but as time passes, they’ll have to buy those allowances through quarterly auctions. According to the plan, there will be two phases. By 2012, all major industrial sources and utilities will be covered. By 2015, distributors of fuels and natural gas will also be included.

The chair of the Air Resources Board, Mary Nichols, gave a speech. Among other things, she said:

This program is the capstone of our climate policy, and will accelerate California’s progress toward a clean energy economy. It rewards efficiency and provides companies with the greatest flexibility to find innovative solutions that drive green jobs, clean our environment, increase our energy security and ensure that California stands ready to compete in the booming global market for clean and renewable energy.

The governor also showed up at this historic board meeting, and gave a speech.

But I can guess what you’re wondering, or at least one of the many things you should be wondering.

“How much can California do by itself?”

Luckily, California is not doing this by itself. By the time the program gets rolling in 2012, California plans to have built a framework for carbon trading with New Mexico, British Columbia, Ontario and Quebec — some of its partners in the Western Climate Initiative.

The green guys are the ‘partners’; the other guys, blue because they’re watching carefully but sadly not taking part, are the ‘observers’.

Furthermore, ten states of the US — New York, New Jersey, Delaware, Maryland and the New England states — have started up another system, the Regional Greenhouse Gas Initiative, which covers only electric utilities. They are already doing auctions.

So, while in theory it might make sense to institute carbon trading on a national basis, political realities have pushed North America down a different path, where smaller regions take the lead in groupings that may transcend national boundaries! And that is very interesting in itself.

## Stabilization Wedges (Part 3)

17 December, 2010

I bet you thought I’d never get back to this! Sorry, I like to do lots of things.

Remember the idea: in 2004, Stephen Pacala and Robert Socolow wrote a now-famous paper on how we could hold atmospheric carbon dioxide below 500 parts per million. They said that to do this, it would be enough to find 7 ways to reduce carbon emissions, each one ramping up linearly to the point of reducing carbon emissions by 1 gigaton per year by 2054.

They called these stabilization wedges, for the obvious reason:

Their paper listed 15 of these wedges. The idea here is to go through them and critique them. In Part 1 of this series we talked about four wedges involving increased efficiency and conservation. In Part 2 we covered one about shifting from coal to natural gas, and three about carbon capture and storage.

Now let’s do nuclear power and renewable energy!

9. Nuclear power. As Pacala and Socolow already argued in wedge 5, replacing 700 gigawatts of efficient coal-fired power plants with some carbon-neutral form of power would save us a gigaton of carbon per year. This would require 700 gigawatts of nuclear power plants running at 90% capacity (just as assumed for the coal plants). This means doubling the world production of nuclear power. The global pace of nuclear power plant construction from 1975 to 1990 could do this! So, this is one of the few wedges that doesn’t seem to require heroic technical feats. But of course, there’s still a downside: we can only substantially boost the use of nuclear power if people become confident about all aspects of its safety.

10. Wind power. Wind power is intermittent: Pacala and Socolow estimate that the ‘peak’ capacity (the amount you get under ideal circumstances) is about 3 times the ‘baseload’ capacity (the amount you can count on). So, to save a gigaton of carbon per year by replacing 700 gigawatts of coal-fired power plants, we need roughly 2000 gigawatts of peak wind power. Wind power was growing at about 30% per year when they wrote their paper, and it had reached a world total of 40 gigawatts. So, getting to 2000 gigawatts would mean multiplying the world production of wind power by a factor of 50. The wind turbines would “occupy” about 30 million hectares, or about 30-45 square meters per person — some on land and some offshore. But because windmills are widely spaced, land with windmills can have multiple uses.

11. Photovoltaic solar power. This too is intermittent, so to save a gigaton of carbon per year we need 2000 gigawatts of peak photovoltaic solar power to replace coal. Like wind, photovoltaic solar was growing at 30% per year when Pacala and Socolow wrote their paper. However, only 3 gigawatts had been installed worldwide. So, getting to 2000 gigawatts would require multiplying the world production of photovoltaic solar power by a factor of 700. See what I mean about ‘heroic feats’? In terms of land, this would take about 2 million hectares, or 2-3 square meters per person.

12. Renewable hydrogen. You’ve probably heard about hydrogen-powered cars. Of course you’ve got to make the hydrogen. Renewable electricity can produce hydrogen for vehicle fuel. 4000 gigawatts of peak wind power, for example, used in high-efficiency fuel-cell cars, could keep us from burning a gigaton of carbon each year in the form of gasoline or diesel fuel. Unfortunately, this is twice as much wind power as we’d need in wedge 10, where we use wind to eliminate the need for burning some coal. Why? Gasoline and diesel have less carbon per unit of energy than coal does.

13. Biofuels. Fossil-carbon fuels can also be replaced by biofuels such as ethanol. To save a gigaton per year of carbon, we could make 5.4 gigaliters per day of ethanol as a replacement for gasoline — provided the process of making this ethanol didn’t burn fossil fuels! Doing this would require multiplying the world production of bioethanol by a factor of 50. It would require 250 million hectares committed to high-yield plantations, or 250-375 square meters per person. That’s an area equal to about one-sixth of the world’s cropland. An even larger area would be required to the extent that the biofuels require fossil-fuel inputs. Clearly this could cut into the land used for growing food.
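The per-person land figures quoted in wedges 10, 11 and 13 can be spot-checked. The world population of 6.5 billion here is my own assumption for the mid-2000s:

```python
# Spot check of the hectares-to-square-meters-per-person figures.
# Assumes a world population of 6.5 billion (roughly mid-2000s).

POPULATION = 6.5e9
M2_PER_HECTARE = 1e4

def m2_per_person(million_hectares):
    return million_hectares * 1e6 * M2_PER_HECTARE / POPULATION

print(round(m2_per_person(30)))    # wind, wedge 10
print(round(m2_per_person(2)))     # solar, wedge 11
print(round(m2_per_person(250)))   # biofuels, wedge 13
```

These land at or just above the top of each quoted range; the spread in the quoted ranges presumably reflects different population assumptions.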

There you go… let me hear your critique! Which of these measures seem best to you? Which seem worst? But more importantly: why?

Remember: it takes a total of 7 wedges to save the world, according to this paper by Pacala and Socolow.

Next time I’ll tell you about the final two stabilization wedges… and then I’ll give you an update on their idea.

## Cancún

12 December, 2010

What happened at the United Nations Climate Change Conference in Cancún this year? I’m trying to figure that out, and I could use your help.

But if you’re just as confused as I am, this is an easy place to start:

Climate talks wrap with hope for developing nations, Weekend Edition Saturday, National Public Radio.

Here’s what I’ve learned so far.

The good news is, first, that the negotiations didn’t completely collapse. That was a real fear.

Second, 190 countries agreed to start a Green Climate Fund to raise and disburse \$100 billion per year to help developing countries deal with climate change… starting in 2020.

A good idea, but maybe too late. The World Bank estimates that the cost of adapting to a world that’s 2 °C warmer by 2050 will be about \$75-100 billion per year. The International Energy Agency estimates that the cost of supporting clean energy technology in developing countries is \$110 billion per year if we’re going to keep the temperature rise below 2 °C. But these organizations say we need to start now, not a decade from now!

And how to raise the money? The Prime Minister of Norway, Jens Stoltenberg, leads the UN committee that’s supposed to answer this question. He told the BBC that the best approach would be a price on carbon that begins to reflect the damage it does:

Carbon pricing has a double climate effect — it’s a huge source for revenue, but also gives the right incentives for reducing emissions by making it expensive to pollute. The more ambitious we are, the higher the price will be – so there’s a very close link between the ceiling we set for emissions and the price. We estimate that we need a price of about \$20/25 per tonne to mobilise the \$100bn.

Third, our leaders made some steps towards saving the world’s forests. Every year, forests equal to the area of England get cut down. This has got to stop, for all sorts of reasons. For one thing, it causes 20% of the world’s greenhouse gas emissions — about the same as transportation worldwide!

Cancún set up a framework called REDD+, which stands for Reducing Emissions from Deforestation and Forest Degradation, with the cute little + standing for broader ecosystem conservation. This is supposed to create incentives to keep forests standing. But there’s a lot of work left. For example, while a \$4.1 billion start-up fund is already in place, there’s no long-term plan for financing REDD+ yet.

The bad news? Well, the main bad news is that there’s still a gap between what countries have pledged to do to reduce carbon emissions, and what they’d need to do to keep the expected rise in temperature below 2 °C — or if you want a clearer goal, keeping CO2 concentrations below 450 parts per million.

But it’s not as bad as you might think… at least if you believe this chart put out by the Center for American Progress. They say:

We found that even prior to the Copenhagen climate summit, if all parties did everything they claimed they would do at the time, the world was only five gigatons of annual emissions shy of the estimated 17 gigatons of carbon dioxide or CO2 equivalent annual reductions needed to put us on a reasonable 2°C pathway. Since three gigatons of the projected reductions came from the economic downturn and improved projections on deforestation and peat emissions, the actual pledges of countries for additional reductions were slightly less than two-thirds of what was needed. But they were still not sufficient for the 2°C target.

and then:

After the Copenhagen Accord was finalized at the December 2009 climate summit, a January 2010 deadline was established for countries to submit pledges for actions by 2020 consistent with the accord’s 2°C goal. Two breakdowns of the pledges in February, and later in March, by Project Catalyst estimated that the five-gigaton gap had shrunk somewhat and more pledges had come in from developing countries. Part of the reason that pledges increased from developing countries was that the Copenhagen Accord had finally made a significant step forward on establishing a system of cooperation between developed and developing countries that had a chance at providing incentives for additional reductions.

And now, they say, the gap is down to 4 gigatons per year. This chart details it… click to make it bigger:

That 4-gigaton gap doesn’t sound so bad. But of course, this estimate assumes that pledges translate into reality!

So, the fingernail-biting saga of our planet continues…

## Stabilization Wedges (Part 2)

23 November, 2010

Okay. We’re going through this paper, which you can read yourself:

• Stephen Pacala and Robert Socolow, Stabilization wedges: solving the climate problem for the next 50 years using current technologies, Science 305 (2004), 968-972.

The paper lists 15 ‘wedges’, each of which could ramp up to reducing carbon emissions by 1 gigaton/year by 2054. We’re going through all these wedges and discussing them. And the Azimuth Project is lucky to have a new member on board — Frederik De Roo — who is summarizing our discussion here:

• Azimuth Project, Stabilization wedges.

So, let’s get going!

Last time we covered four wedges related to energy conservation and increased efficiency. Wedge 5 is in a category of its own:

5. Shifting from coal to natural gas. Natural gas puts out half as much CO2 as coal when burned to make a given amount of electricity. After all, it’s mainly methane, which contains hydrogen as well as carbon. Suppose by 2054 we have coal power plants working at 90% of capacity with an efficiency of 50%. 700 gigawatts worth of coal plants like this emit 1 gigaton of carbon per year. So, we can reduce carbon emissions by one ‘wedge’ if we replace 1400 gigawatts of such plants with gas-burning plants. That’s four times the 2004 worldwide total of gas-burning plants.

Wedges 6-8 involve carbon capture and storage:

6. Capturing CO2 at power plants. Carbon capture and storage at power plants can stop about 90% of the carbon from reaching the atmosphere, so we can get a wedge by doing this for 800 GW of coal-burning power plants or 1600 GW of gas-burning power plants by 2054. One way to do carbon capture and storage is to make hydrogen and CO2 from fossil fuels, burn the hydrogen in a power plant, and inject the CO2 into the ground. So, from one viewpoint, building a wedge’s worth of carbon capture and storage would resemble a tenfold expansion of the plants that were manufacturing hydrogen in 2004. But it would also require multiplying by 100 the amount of CO2 injected into the ground.

7. Capturing CO2 at plants that make hydrogen for fuel. You’ve probably heard people dream of a hydrogen economy. But it takes energy to make hydrogen. One way is to copy wedge 6, but then ship the hydrogen off for use as fuel instead of burning it to make electricity at power plants. To capture a wedge’s worth of carbon this way, we’d have to make 250 megatons of hydrogen per year from coal, or 500 megatons per year from natural gas. This would require a substantial scale-up from the 2004 total of 40 megatons of hydrogen manufactured by all methods. There would also be the task of building the infrastructure for a hydrogen economy. The challenge of injecting CO2 into the ground would be the same as in wedge 6.

8. Capturing CO2 at plants that turn coal into synthetic fuels. As the world starts running out of oil, people may start turning coal into synfuels, via a process called coal liquefaction. Of course burning these synfuels will release carbon. But suppose only half of the carbon entering a synfuels plant leaves as fuel, while the other half can be captured as CO2 and injected underground. Then we can capture a wedge’s worth of CO2 from coal synfuels plants that produce 1.8 teraliters of synfuels per year. For comparison, total yearly world oil production in 2004 was 4.7 teraliters.

Now: What are the pros and cons of these four wedges? What is the biggest thing that Pacala and Socolow overlooked?

I’m puzzled about the last wedge. Pacala and Socolow say 1 gigaton carbon/year is the flow of carbon in 24 million barrels/day, or 1.4 teraliters/year. They assume the same value for synfuels and allow for imperfect capture, which leads them to conclude that carbon capture at synfuels plants producing 1.8 teraliters/year of synfuel can catch 1 GtC/year. But this calculation doesn’t make sense to me. If we’re catching just half the carbon, and 1 GtC/year = 1.4 teraliters oil/year, don’t we need to generate at least twice that — 2.8 teraliters of synfuel per year — to catch a wedge’s worth of carbon?
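Here's the arithmetic behind my confusion, using the standard 159 liters per oil barrel:

```python
# Checking the flow figures: Pacala and Socolow equate a carbon flow
# of 1 GtC/year with 24 million barrels of oil per day.

barrels_per_day = 24e6
liters_per_barrel = 159        # a standard oil barrel is about 159 liters
teraliters_per_year = barrels_per_day * liters_per_barrel * 365 / 1e12
print(round(teraliters_per_year, 2))       # matches their 1.4 TL/year

# If synfuel carries the same carbon per liter, and we capture only
# half the carbon entering the plant, a wedge seems to need twice this:
print(round(2 * teraliters_per_year, 1))
```

The second figure is 2.8 teraliters/year, not the 1.8 they quote — which is exactly the puzzle.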

I’m also unclear what percentage of the carbon you can actually capture while turning coal into synfuels. Can you really capture half of it?

There’s also another funny feature of this last wedge. If we assume people are already committed to making synfuels from coal, then I guess it’s true, they’ll emit less carbon if they use carbon capture and storage as part of the manufacturing process. But compared to making electricity or hydrogen as in wedges 6 and 7, turning coal into synfuels seems bound to emit more carbon, even with the help of carbon capture and storage.

In general, it only makes sense to talk about how much carbon emission some action prevents when we compare it to some alternative action. That’s pretty obvious, but it gets a bit confusing when some of Pacala and Socolow’s wedges look like plausible alternatives to other ones.

Another question: how does carbon capture and storage work, actually? Summarizing Pacala and Socolow, I wrote:

One way to do carbon capture and storage is to make hydrogen and CO2 from fossil fuels, burn the hydrogen in a power plant, and inject the CO2 into the ground.

But I’d like to learn the details!

## Carbon Emissions in 2009

22 November, 2010

A news item relayed to us from David Roberts:

• Richard Black, 2009 carbon emissions fall smaller than expected, BBC News, 21 November 2010.

“What we find is a drop in emissions from fossil fuels in 2009 of 1.3%, which is not dramatic,” said lead researcher Pierre Friedlingstein from the UK’s University of Exeter.

“Based on GDP projections last year, we were expecting much more.

“If you think about it, it’s like four days’ worth of emissions; it’s peanuts,” he told BBC News.

The headline figure masked big differences between trends in different groups of countries.

Broadly, developed nations saw emissions fall – Japan fell by 11.8%, the UK by 8.6%, and Germany by 7% – whereas they continued to rise in developing countries with significant industrial output.

China’s emissions grew by 8%, and India’s by 6.2% – connected to the fact that during the recession, it was the industrialised world that really felt the pinch.

The news story is based on this article, which is apparently not freely available:

• P. Friedlingstein et al., Update on CO2 emissions, Nature Geoscience, 21 November 2010.

By the way: how come I can afford to create a link to the original article, while the BBC and other mass media cannot? Is it really so bloody difficult? Isn’t it just basic good journalism?

Also by the way: I really like getting good suggestions for environmental news stories to blog about… but I love it when people join the Azimuth Forum and post links to these news articles under News and Information.

Some puzzles. Guess before you google:

1) Which nation has the highest carbon emissions per person? In 2007 its per capita carbon emissions were almost 3 times that of the USA. I bet it’s still the champion today.

2) Say I make some round-trip flights from Los Angeles to Singapore, with one stop each way. How many flights would it take to burn as much carbon as an average US citizen does in a year? A rough estimate, please!

3) How many such flights would equal the yearly carbon emissions of an average world citizen?

(I am calculating the footprint of a flight using Terrapass. I have no idea how accurate it is or how it works. Also: all my figures only count carbon emissions from burning fossil fuels.)

## Stabilization Wedges (Part 1)

16 November, 2010

Okay, let’s look at some plans for combating global warming! And let’s start with this paper:

• Stephen Pacala and Robert Socolow, Stabilization wedges: solving the climate problem for the next 50 years using current technologies, Science 305 (2004), 968-972.

I won’t try to summarize it all today, just a bit.

Stephen Pacala and Robert Socolow wrote this now-famous paper in 2004. Back then we were emitting about 6.2 gigatons of carbon per year, there were 375 ppm of carbon dioxide in the atmosphere, and many proposals to limit global warming urged that we keep the concentration below 500 ppm. Their paper outlined some strategies for keeping it below 500 ppm.

They estimated that to do this, it would be enough to hold emissions flat at 7 gigatons of carbon per year for 50 years, and then lower them to nothing. On the other hand, in a “business as usual” scenario, they estimated that emissions would ramp up to 14 gigatons per year by 2054. That’s 7 too many.

So, to keep emissions flat it would be enough to find 7 ways to reduce carbon emissions, each one of which ramps up linearly to the point of reducing carbon emissions by 1 gigaton/year in 2054. They called these stabilization wedges, because if you graph them, they look like wedges:

Their paper listed 15 possible stabilization wedges, each one with the potential to reduce carbon emissions by 1 gigaton/year by 2054. This is a nice way to start thinking about a very big problem, so many people have adopted it and modified it and criticized it in various ways, which I hope to discuss later. Right now I’ll only tell you about the original paper.
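Here's the wedge picture as a tiny computation — business-as-usual emissions ramping from 7 to 14 gigatons/year over 50 years, with seven wedges, each ramping linearly to 1 gigaton/year, closing the gap:

```python
# The stabilization-triangle arithmetic from Pacala and Socolow's setup.

def bau(t):                  # business-as-usual emissions at year t, GtC/year
    return 7 + 7 * t / 50

def one_wedge(t):            # reduction from a single wedge at year t
    return t / 50

for t in (0, 25, 50):
    print(t, bau(t), bau(t) - 7 * one_wedge(t))   # last column stays at 7
```

The seven wedges exactly cancel the projected growth, holding emissions flat at 7 gigatons/year through 2054.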

But before I even list any of their stabilization wedges, I should emphasize: stabilizing emissions at 7 gigatons is not enough to stay below 500 ppm forever! Carbon dioxide stays in the atmosphere a very long time. So, as Pacala and Socolow note:

Stabilization at any level requires that net emissions do not simply remain constant, but eventually drop to zero. For example, in one simple model that begins with the stabilization triangle but looks beyond 2054, 500-ppm stabilization is achieved by 50 years of flat emissions, followed by a linear decline of about two-thirds in the following 50 years, and a very slow decline thereafter that matches the declining ocean sink. To develop the revolutionary technologies required for such large emissions reductions in the second half of the century, enhanced research and development would have to begin immediately.

What’s the “declining ocean sink”? Right now the ocean is absorbing a lot of CO2, temporarily saving us from the full brunt of our carbon emissions — while coral reefs, shellfish and certain forms of plankton suffer from increased acidity. But this won’t go on forever; the ocean has limited capacity.

Pacala and Socolow consider several categories of climate wedges:

• efficiency and conservation
• shifting from coal to gas
• carbon capture and storage
• nuclear fission
• renewable energy sources
• forests and agriculture

Today let me just go through the first category. Here they list four wedges:

1. Efficient vehicles: increase the fuel economy for 2 billion cars from 30 to 60 miles per gallon. Or, for those of you who don’t have the incredible good luck of living in the USA: increasing it from 13 to 26 kilometers per liter. When they wrote their paper, there were 500 million cars on the planet. They expected that by 2054 this number would quadruple. When they wrote their paper, average fuel efficiency was 13 kilometers/liter. To achieve this wedge, we’d need that to double.

2. Reduced use of vehicles: decrease car travel for 2 billion 30-mpg cars from 10,000 to 5000 miles per year. In other words: decreasing the average travel from 16,000 to 8000 kilometers per year. (Clearly this wedge and the previous one are not additive: if we do them both, we don’t save 2 gigatons of carbon per year.)

3. Efficient buildings: cut carbon emissions by one-fourth in buildings and appliances. This could be done by following “known and established approaches” to energy efficient space heating and cooling, water heating, lighting, and refrigeration. Half the potential savings are in the buildings in developing countries.

4. Efficient coal plants: raise the efficiency of coal power plants to 60%. In 2004, when they wrote their paper, “coal plants, operating on average at 32% efficiency, produced about one fourth of all carbon emissions: 1.7 GtC/year out of 6.2 GtC/year.” They expected coal power plants to double their output by 2054. To achieve this wedge, we’d need their average efficiency to reach 60%.
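As a side note, the unit conversion in wedge 1 is easy to check (a quick sketch of mine; the paper just quotes the rounded figures):

```python
# Convert fuel economy from miles per US gallon to kilometers per liter.
KM_PER_MILE = 1.609344
LITERS_PER_US_GALLON = 3.785411784

def mpg_to_kmpl(mpg):
    """Miles per US gallon -> kilometers per liter."""
    return mpg * KM_PER_MILE / LITERS_PER_US_GALLON

print(round(mpg_to_kmpl(30), 1))  # 12.8, which rounds to the 13 quoted above
print(round(mpg_to_kmpl(60), 1))  # 25.5, i.e. the quoted 26
```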

There are a lot of questions to ask! Which do you think are the most easily achieved of these wedges? What are the biggest problems with their reasoning so far? And so on…

I would love any interesting information you have on: 1) ways to make vehicles more efficient, 2) ways to coax people to drive less, 3) ways to make buildings more efficient, and 4) ways to make coal power plants more efficient. Please post it here, with references if you can!

I’ll conclude for now with a couple of tiny points. First, they seem to vacillate a bit between saying there were 6.2 and 7 gigatons of carbon emitted in 2004, which is a bit odd, but perhaps just a way of giving the world a bit of slack before levelling off emissions at 7 GtC/year. I guess it’s not really a big deal.

Second, they aren’t idiots: despite the above graph, they don’t really think carbon emissions will increase linearly in a business-as-usual scenario. This is just a deliberate simplification on their part. They also show this supposedly more accurate graph:

They say the top curve is “a representative business as usual emissions path” for global carbon emissions in the form of CO2 from fossil fuel combustion and cement manufacture, assuming 1.5% per year growth starting from 7.0 GtC/year in 2004. Note this ignores carbon emissions from deforestation, other greenhouse gases, etc. This curve is growing exponentially, not linearly.
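Compounding 1.5% per year for 50 years does indeed roughly double the 2004 figure, as a quick check confirms (my calculation, not the paper's actual emissions path):

```python
# Check the business-as-usual growth claim:
# 1.5%/year exponential growth from 7.0 GtC/year in 2004.
def bau_emissions(year, base=7.0, rate=0.015, start=2004):
    """Exponential emissions path, in GtC/year."""
    return base * (1 + rate) ** (year - start)

print(round(bau_emissions(2054), 1))  # about 14.7 GtC/year, roughly double
```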

Similarly, the bottom curve isn’t flat: it slopes down. They say the bottom curve is a “CO2 emissions path consistent with atmospheric CO2 stabilization at 500 ppm by 2125 akin to the Wigley, Richels, and Edmonds (WRE) family of stabilization curves described in [11], modified as described in Section 1 of the SOM text.”

Here reference [11] is:

• T. M. L. Wigley, in The Carbon Cycle, eds. T. M. L. Wigley and D. S. Schimel, Cambridge U. Press, Cambridge, 2000, pp. 258–276.

and the “SOM text” is the supporting online material for their paper, which unfortunately doesn’t seem to be available for free.

## Our Future

11 November, 2010

I want to start talking about plans for cutting back carbon emissions, and some scenarios for what may happen, depending on what we do. We’ve got to figure this stuff out!

You’ve probably heard of 350.org, the grassroots organization that’s trying to cut CO2 levels from their current level of about 390 parts per million back down to 350. That’s a noble goal. However, even stabilizing at some much higher level will require a massive effort, given how long CO2 stays in the atmosphere:

In a famous 2004 paper, Pacala and Socolow estimated that in a “business-as-usual” scenario, carbon emissions would rise to 14 gigatons per year by 2054… while to keep CO2 below 500 ppm, they’d need to be held to 7 gigatons/year.

Alas, we’ve already gone up to 8 gigatons of carbon per year! How can we possibly keep things from getting much worse? Pacala and Socolow listed 15 measures, each of which could cut 1 gigaton of carbon per year:

(Click for a bigger image.)

Each one of these measures is big. For example, if you like nuclear power: build 700 gigawatts of nuclear power plants, doubling what we have now. But if you prefer wind: build turbines with 2000 gigawatts of peak capacity, multiplying by 50 what we have now. Or: build photovoltaic solar power plants with 2000 gigawatts of peak capacity, multiplying by 700 what we have now!

Now imagine doing lots of these things…

What if we do nothing? Some MIT scientists estimate that in a business-as-usual scenario, by 2095 there will be about 890 parts per million of CO2 in the atmosphere, and a 90% chance of a temperature increase between 3.5 and 7.3 degrees Celsius. Pick your scenario! The Stern Review on the Economics of Climate Change has a chart of the choices:

(Again, click for a bigger image.)

Of course the Stern Review has its detractors. I’m not claiming any of these issues are settled: I’m just trying to get the discussion started here. In the weeks to come, I want to go through plans and assessments in more detail, to compare them and try to find the truth.

Here are some assessments and projections I want us to discuss:

• Intergovernmental Panel on Climate Change Fourth Assessment Report, Climate Change 2007.

• The Dutch Government, Assessing an IPCC Assessment.

• The Copenhagen Diagnosis. Summary on the Azimuth Project.

• National Research Council, Climate Stabilization Targets: Emissions, Concentrations, and Impacts over Decades to Millennia. Summary on the Azimuth Project.

• K. Anderson and A. Bows, Reframing the climate change challenge in light of post-2000 emission trends. Summary on the Azimuth Project.

• William D. Nordhaus, A Question of Balance: Weighing the Options on Global Warming Policies.

And here are some “plans of action”:

• World Nuclear Association, Nuclear Century Outlook. Summary and critique on the Azimuth Project.

• Mark Z. Jacobson and Mark A. Delucchi, A path to sustainable energy: how to get all energy from wind, water and solar power by 2030. Summary and critique on the Azimuth Project.

• Joe Romm, How the world can (and will) stabilize at 350 to 450 ppm: The full global warming solution. Summary on the Azimuth Project.

• Stephen Pacala and Robert Socolow, Stabilization wedges: solving the climate problem for the next 50 years using current technologies. Summary on the Azimuth Project.

• New Economics Foundation, The Great Transition: A Tale of How it Turned Out Right. Summary on the Azimuth Project.

• The Union of Concerned Scientists, Climate 2030: A National Blueprint for a Clean Energy Economy.

• The Scottish Government, Renewables Action Plan.

• Bjørn Lomborg and the Copenhagen Business School, Smart Solutions to Climate Change.

As you can see, there’s already a bit about some of these on the Azimuth Project. I want more.

What are the most important things I’m missing on this list? I want broad assessments and projections of the world-wide situation on carbon emissions and energy, and even better, global plans of action. I want us to go through these, compare them, and try to understand where we stand.

## This Week’s Finds (Week 303)

30 September, 2010

Now for the second installment of my interview with Nathan Urban, a colleague who started out in quantum gravity and now works on "global climate change from an Earth system perspective, with an emphasis on Bayesian data-model calibration, probabilistic prediction, risk assessment, and decision analysis".

But first, a word about Bali. One of the great things about living in Singapore is that it’s close to a lot of interesting places. My wife and I just spent a week in Ubud. This town is the cultural capital of Bali — full of dance, music, and crafts. It’s also surrounded by astounding terraced rice paddies:

In his book Whole Earth Discipline, Stewart Brand says "one of the finest examples of beautifully nuanced ecosystem engineering is the thousand-year-old terraced rice irrigation complex in Bali".

Indeed, when we took a long hike with a local guide, Made Dadug, we learned that all the apparent "weeds" growing in luxuriant disarray near the rice paddies were in fact carefully chosen plants: cacao, coffee, taro, ornamental flowers, and so on. "See this bush? It’s citronella — people working on the fields grab a pinch and use it for mosquito repellent." When a paddy loses its nutrients they plant sweet potatoes there instead of rice, to restore the soil.

Irrigation is managed by a system of local water temples, or "subaks". It’s not a top-down hierarchy: instead, each subak makes decisions in a more or less democratic way, while paying attention to what neighboring subaks do. Brand cites the work of Steve Lansing on this subject:

• J. Stephen Lansing, Perfect Order: Recognizing Complexity in Bali, Princeton U. Press, Princeton, New Jersey, 2006.

Physicists interested in the spontaneous emergence of order will enjoy this passage:

This book began with a question posed by a colleague. In 1992 I gave a lecture at the Santa Fe Institute, a recently created research center devoted to the study of "complex systems." My talk focused on a simulation model that my colleague James Kremer and I had created to investigate the ecological role of water temples. I need to explain a little about how this model came to be built; if the reader will bear with me, the relevance will soon become clear.

Kremer is a marine scientist, a systems ecologist, and a fellow surfer. One day on a California beach I told him the story of the water temples, and of my struggles to convince the consultants that the temples played a vital role in the ecology of the rice terraces. I asked Jim if a simulation model, like the ones he uses to study coastal ecology, might help to clarify the issue. It was not hard to persuade him to come to Bali to take a look. Jim quickly saw that a model of a single water temple would not be very useful. The whole point about water temples is that they interact. Bali is a steep volcanic island, and the rivers and streams are short and fast. Irrigation systems begin high up on the volcanoes, and follow one after another at short intervals all the way to the seacoast. The amount of water each subak gets depends less on rainfall than on how much water is used by its upstream neighbors. Water temples provide a venue for the farmers to plan their irrigation schedules so as to avoid shortages when the paddies need to be flooded. If pests are a problem, they can synchronize harvests and flood a block of terraces so that there is nothing for the pests to eat. Decisions about water taken by each subak thus inevitably affect its neighbors, altering both the availability of water and potential levels of pest infestations.

Jim proposed that we build a simulation model to capture all of these processes for an entire watershed. Having recently spent the best part of a year studying just one subak, the idea of trying to model nearly two hundred of them at once struck me as rather ambitious. But as Jim pointed out, the question is not whether flooding can control pests, but rather whether the entire collection of temples in a watershed can strike an optimal balance between water sharing and pest control.

We set to work plotting the location of all 172 subaks lying between the Oos and Petanu rivers in central Bali. We mapped the rivers and irrigation systems, and gathered data on rainfall, river flows, irrigation schedules, water uptake by crops such as rice and vegetables, and the population dynamics of the major rice pests. With these data Jim constructed a simulation model. At the beginning of each year the artificial subaks in the model are given a schedule of crops to plant for the next twelve months, which defines their irrigation needs. Then, based on historic rainfall data, we simulate rainfall, river flow, crop growth, and pest damage. The model keeps track of harvest data and also shows where water shortages or pest damage occur. It is possible to simulate differences in rainfall patterns or the growth of different kinds of crops, including both native Balinese rice and the new rice promoted by the Green Revolution planners. We tested the model by simulating conditions for two cropping seasons, and compared its predictions with real data on harvest yields for about half the subaks. The model did surprisingly well, accurately predicting most of the variation in yields between subaks. Once we knew that the model’s predictions were meaningful, we used it to compare different scenarios of water management. In the Green Revolution scenario, every subak tries to plant rice as often as possible and ignores the water temples. This produces large crop losses from pest outbreaks and water shortages, much like those that were happening in the real world. In contrast, the “water temple” scenario generates the best harvests by minimizing pests and water shortages.

Back at the Santa Fe Institute, I concluded this story on a triumphant note: consultants to the Asian Development Bank charged with evaluating their irrigation development project in Bali had written a new report acknowledging our conclusions. There would be no further opposition to management by water temples. When I finished my lecture, a researcher named Walter Fontana asked a question, the one that prompted this book: could the water temple networks self-organize? At first I did not understand what he meant by this. Walter explained that if he understood me correctly, Kremer and I had programmed the water temple system into our model, and shown that it had a functional role. This was not terribly surprising. After all, the farmers had had centuries to experiment with their irrigation systems and find the right scale of coordination. But what kind of solution had they found? Was there a need for a Great Designer or an Occasional Tinkerer to get the whole watershed organized? Or could the temple network emerge spontaneously, as one subak after another came into existence and plugged in to the irrigation systems? As a problem solver, how well could the temple networks do? Should we expect 10 percent of the subaks to be victims of water shortages at any given time because of the way the temple network interacts with the physical hydrology? Thirty percent? Two percent? Would it matter if the physical layout of the rivers were different? Or the locations of the temples?

Answers to most of these questions could only be sought if we could answer Walter’s first large question: could the water temple networks self-organize? In other words, if we let the artificial subaks in our model learn a little about their worlds and make their own decisions about cooperation, would something resembling a water temple network emerge? It turned out that this idea was relatively easy to implement in our computer model. We created the simplest rule we could think of to allow the subaks to learn from experience. At the end of a year of planting and harvesting, each artificial subak compares its aggregate harvests with those of its four closest neighbors. If any of them did better, copy their behavior. Otherwise, make no changes. After every subak has made its decision, simulate another year and compare the next round of harvests. The first time we ran the program with this simple learning algorithm, we expected chaos. It seemed likely that the subaks would keep flipping back and forth, copying first one neighbor and then another as local conditions changed. But instead, within a decade the subaks organized themselves into cooperative networks that closely resembled the real ones.
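The copy-your-best-neighbor rule in that last paragraph is simple enough to sketch as a toy simulation. To be clear: this is my own illustration with a made-up harvest function that rewards synchronization, not Lansing and Kremer's actual model.

```python
import random

random.seed(1)

N = 50          # subaks arranged on a ring
SCHEDULES = 4   # possible cropping schedules
state = [random.randrange(SCHEDULES) for _ in range(N)]

def harvest(i, s):
    # Made-up payoff: synchronizing with neighbors controls pests, so a
    # subak's harvest is the number of immediate neighbors sharing its schedule.
    return (s[(i - 1) % N] == s[i]) + (s[(i + 1) % N] == s[i])

for year in range(200):
    h = [harvest(i, state) for i in range(N)]
    new = []
    for i in range(N):
        # Copy the schedule of the best-harvesting neighbor,
        # or keep your own if no neighbor did better.
        best, best_h = state[i], h[i]
        for j in ((i - 1) % N, (i + 1) % N):
            if h[j] > best_h:
                best, best_h = state[j], h[j]
        new.append(best)
    if new == state:   # fixed point: no subak wants to change
        break
    state = new

# Count boundaries between subaks with different schedules; cooperation
# shows up as large synchronized blocks, i.e. few boundaries.
boundaries = sum(state[i] != state[(i + 1) % N] for i in range(N))
print(year, boundaries)
```

In typical runs the initially random schedules coarsen into a handful of synchronized blocks, which is the qualitative point of Lansing's result: local imitation alone produces large-scale coordination.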

Lansing describes how attempts to modernize farming in Bali in the 1970s proved problematic:

To a planner trained in the social sciences, management by water temples looks like an arcane relic from the premodern era. But to an ecologist, the bottom-up system of control has some obvious advantages. Rice paddies are artificial aquatic ecosystems, and by adjusting the flow of water farmers can exert control over many ecological processes in their fields. For example, it is possible to reduce rice pests (rodents, insects, and diseases) by synchronizing fallow periods in large contiguous blocks of rice terraces. After harvest, the fields are flooded, depriving pests of their habitat and thus causing their numbers to dwindle. This method depends on a smoothly functioning, cooperative system of water management, physically embodied in proportional irrigation dividers, which make it possible to tell at a glance how much water is flowing into each canal and so verify that the division is in accordance with the agreed-on schedule.

Modernization plans called for the replacement of these proportional dividers with devices called "Romijn gates," which use gears and screws to adjust the height of sliding metal gates inserted across the entrances to canals. The use of such devices makes it impossible to determine how much water is being diverted: a gate that is submerged to half the depth of a canal does not divert half the flow, because the velocity of the water is affected by the obstruction caused by the gate itself. The only way to accurately estimate the proportion of the flow diverted by a Romijn gate is with a calibrated gauge and a table. These were not supplied to the farmers, although $55 million was spent to install Romijn gates in Balinese irrigation canals, and to rebuild some weirs and primary canals.

The farmers coped with the Romijn gates by simply removing them or raising them out of the water and leaving them to rust.

On the other hand, Made said that the people of the village really appreciated this modern dam:

Using gears, it takes a lot less effort to open and close than the old-fashioned kind:

Later in this series of interviews we’ll hear more about sustainable agriculture from Thomas Fischbacher.

But now let’s get back to Nathan!

JB: Okay. Last time we were talking about the things that altered your attitude about climate change when you started working on it. And one of them was how carbon dioxide stays in the atmosphere a long time. Why is that so important? And is it even true? After all, any given molecule of CO2 that’s in the air now will soon get absorbed by the ocean, or taken up by plants.

NU: The longevity of atmospheric carbon dioxide is important because it determines the amount of time over which our actions now (fossil fuel emissions) will continue to have an influence on the climate, through the greenhouse effect.

You have heard correctly that a given molecule of CO2 doesn’t stay in the atmosphere for very long. I think it’s about 5 years. This is known as the residence time or turnover time of atmospheric CO2. Maybe that molecule will go into the surface ocean and come back out into the air; maybe photosynthesis will bind it in a tree, in wood, until the tree dies and decays and the molecule escapes back to the atmosphere. This is a carbon cycle, so it’s important to remember that molecules can come back into the air even after they’ve been removed from it.

But the fate of an individual CO2 molecule is not the same as how long it takes for the CO2 content of the atmosphere to decrease back to its original level after new carbon has been added. The latter is the answer that really matters for climate change. Roughly, the former depends on the magnitude of the gross carbon sink, while the latter depends on the magnitude of the net carbon sink (the gross sink minus the gross source).

As an example, suppose that every year 100 units of CO2 are emitted to the atmosphere from natural sources (organic decay, the ocean, etc.), and each year (say with a 5 year lag), 100 units are taken away by natural sinks (plants, the ocean, etc). The 5 years actually doesn’t matter here; the system is in steady-state equilibrium, and the amount of CO2 in the air is constant. Now suppose that humans add an extra 1 unit of CO2 each year. If nothing else changes, then the amount of carbon in the air will increase every year by 1 unit, indefinitely. Far from the carbon being purged in 5 years, we end up with an arbitrarily large amount of carbon in the air.

Even if you only add carbon to the atmosphere for a finite time (e.g., by running out of fossil fuels), the CO2 concentration will ultimately reach, and then perpetually remain at, a level elevated above its original value by the total amount of new carbon added. Individual CO2 molecules may still get absorbed within 5 years of entering the atmosphere, and perhaps fewer of the carbon atoms that were once in fossil fuels will ultimately remain in the atmosphere. But if natural sinks are only removing an amount of carbon equal in magnitude to natural sources, and both are fixed in time, you can see that if you add extra fossil carbon the overall atmospheric CO2 concentration can never decrease, regardless of what individual molecules are doing.
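Nathan's bookkeeping argument here can be captured in a few lines of code (a toy model with made-up units, just to illustrate that residence time plays no role):

```python
# Toy carbon bookkeeping: natural source and sink are both fixed at
# 100 units/year, and humans add 1 extra unit/year. The residence time
# of an individual molecule never enters; the stock grows without bound.

def atmosphere(years, stock=1000.0, natural_source=100.0,
               natural_sink=100.0, fossil=1.0):
    """Atmospheric carbon stock after the given number of years."""
    for _ in range(years):
        stock += natural_source + fossil - natural_sink
    return stock

print(atmosphere(0))    # 1000.0: the steady-state baseline
print(atmosphere(50))   # 1050.0: up by exactly 1 unit per year of emissions
```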

In reality, natural carbon sinks tend to grow in proportion to how much carbon is in the air, so atmospheric CO2 doesn’t remain elevated indefinitely in response to a pulse of carbon into the air. This is kind of the biogeochemical analog to the "Planck feedback" in climate dynamics: it acts to restore the system to equilibrium. To first order, atmospheric CO2 decays or "relaxes" exponentially back to the original concentration over time. But this relaxation time (variously known as a "response time", "adjustment time", "recovery time", or, confusingly, "residence time") isn’t a function of the residence time of a CO2 molecule in the atmosphere. Instead, it depends on how quickly the Earth’s carbon removal processes react to the addition of new carbon. For example, how fast plants grow, die, and decay, or how fast surface water in the ocean mixes to greater depths, where the carbon can no longer exchange freely with the atmosphere. These are slower processes.

There are actually a variety of response times, ranging from years to hundreds of thousands of years. The surface mixed layer of the ocean responds within a year or so; plants within decades to grow and take up carbon or return it to the atmosphere through rotting or burning. Deep ocean mixing and carbonate chemistry operate on longer time scales, centuries to millennia. And geologic processes like silicate weathering are even slower, tens of thousands of years. The removal dynamics are a superposition of all these processes, with a fair chunk taken out quickly by the fast processes, and slower processes removing the remainder more gradually.
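This superposition is often summarized as an impulse-response function: the fraction of an emitted CO2 pulse still airborne after t years, written as a sum of exponential decays plus a long-lived remainder. The weights and timescales below are illustrative placeholders I picked to echo the qualitative shape, not fitted values from any model:

```python
import math

# Illustrative impulse-response function for a CO2 pulse.
# Weights and timescales are made-up placeholders, not fitted values.
FRACTIONS  = [0.30, 0.25, 0.20]     # removed by fast, medium, slow processes
TIMESCALES = [5.0, 100.0, 1000.0]   # years
PERSISTENT = 0.25                   # fraction lasting "essentially forever"

def fraction_remaining(t):
    """Fraction of an emitted CO2 pulse still in the atmosphere after t years."""
    return PERSISTENT + sum(f * math.exp(-t / tau)
                            for f, tau in zip(FRACTIONS, TIMESCALES))

print(round(fraction_remaining(0), 2))      # 1.0 at the moment of emission
print(round(fraction_remaining(10000), 2))  # 0.25: the persistent tail
```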

To summarize, as David Archer put it, "The lifetime of fossil fuel CO2 in the atmosphere is a few centuries, plus 25 percent that lasts essentially forever." By "forever" he means "tens of thousands of years" — longer than the present age of human civilization. This inspired him to write this pop-sci book, taking a geologic view of anthropogenic climate change:

• David Archer, The Long Thaw: How Humans Are Changing the Next 100,000 Years of Earth’s Climate, Princeton University Press, Princeton, New Jersey, 2009.

A clear perspective piece on the lifetime of carbon is:

• Mason Inman, Carbon is forever, Nature Reports Climate Change, 20 November 2008.

which is based largely on this review article:

• David Archer, Michael Eby, Victor Brovkin, Andy Ridgwell, Long Cao, Uwe Mikolajewicz, Ken Caldeira, Katsumi Matsumoto, Guy Munhoven, Alvaro Montenegro, and Kathy Tokos, Atmospheric lifetime of fossil fuel carbon dioxide, Annual Review of Earth and Planetary Sciences 37 (2009), 117-134.

For climate implications, see:

• Susan Solomon, Gian-Kasper Plattner, Reto Knutti and Pierre Friedlingstein, Irreversible climate change due to carbon dioxide emissions, PNAS 106 (2009), 1704-1709.

• M. Eby, K. Zickfeld, A. Montenegro, D. Archer, K. J. Meissner and A. J. Weaver, Lifetime of anthropogenic climate change: millennial time scales of potential CO2 and surface temperature perturbations, Journal of Climate 22 (2009), 2501-2511.

• Long Cao and Ken Caldeira, Atmospheric carbon dioxide removal: long-term consequences and commitment, Environmental Research Letters 5 (2010), 024011.

For the very long term perspective (how CO2 may affect the glacial-interglacial cycle over geologic time), see:

• David Archer and Andrey Ganopolski, A movable trigger: Fossil fuel CO2 and the onset of the next glaciation, Geochemistry Geophysics Geosystems 6 (2005), Q05003.

JB: So, you’re telling me that even if we do something really dramatic like cut fossil fuel consumption by half in the next decade, we’re still screwed. Global warming will keep right on, though at a slower pace. Right? Doesn’t that make you feel sort of hopeless?

NU: Yes, global warming will continue even as we reduce emissions, although more slowly. That’s sobering, but not grounds for total despair. Societies can adapt, and ecosystems can adapt — up to a point. If we slow the rate of change, then there is more hope that adaptation can help. We will have to adapt to climate change, regardless, but the less we have to adapt, and the more gradual the adaptation necessary, the less costly it will be.

What’s even better than slowing the rate of change is to reduce the overall amount of it. To do that, we’d need to not only reduce carbon emissions, but to reduce them to zero before we consume all fossil fuels (or all of them that would otherwise be economically extractable). If we emit the same total amount of carbon, but more slowly, then we will get the same amount of warming, just more slowly. But if we ultimately leave some of that carbon in the ground and never burn it, then we can reduce the amount of final warming. We won’t be able to stop it dead, but even knocking a degree off the extreme scenarios would be helpful, especially if there are "tipping points" that might otherwise be crossed (like a threshold temperature above which a major ice sheet will disintegrate).

So no, I don’t feel hopeless that we can, in principle, do something useful to mitigate the worst effects of climate change, even though we can’t plausibly stop or reverse it on normal societal timescales. But sometimes I do feel hopeless that we lack the public and political will to actually do so. Or at least, that we will procrastinate until we start seeing extreme consequences, by which time it’s too late to prevent them. Well, it may not be too late to prevent future, even more extreme consequences, but the longer we wait, the harder it is to make a dent in the problem.

I suppose here I should mention the possibility of climate geoengineering, which is a proposed attempt to artificially counteract global warming through other means, such as reducing incoming sunlight with reflective particles in the atmosphere, or space mirrors. That doesn’t actually cancel all climate change, but it can negate a lot of the global warming. There are many risks involved, and I regard it as a truly last-ditch effort if we discover that we really are "screwed" and can’t bear the consequences.

There is also an extreme form of carbon cycle geoengineering, known as air capture and sequestration, which extracts CO2 from the atmosphere and sequesters it for long periods of time. There are various proposed technologies for this, but it’s highly uncertain whether this can feasibly be done on the necessary scales.

JB: Personally, I think society will procrastinate until we see extreme climate changes. Recently millions of Pakistanis were displaced by floods: a quarter of their country was covered by water. We can’t say for sure this was caused by global warming — but it’s exactly the sort of thing we should expect.

But you’ll notice, this disaster is nowhere near enough to make politicians talk about cutting fossil fuel usage! It’ll take a lot of disasters like this to really catch people’s attention. And by then we’ll be playing a desperate catch-up game, while people in many countries are struggling to survive. That won’t be easy. Just think how little attention the Pakistanis can spare for global warming right now.

Anyway, this is just my own cheery view. But I’m not hopeless, because I think there’s still a lot we can do to prevent a terrible situation from becoming even worse. Since I don’t think the human race will go extinct anytime soon, it would be silly to "give up".

Now, you’ve just started a position at the Woodrow Wilson School at Princeton. When I was an undergrad there, this school was the place for would-be diplomats. What’s a nice scientist like you doing in a place like this? I see you’re in the Program in Science, Technology and Environmental Policy, or "STEP program". Maybe it’s too early for you to give a really good answer, but could you say a bit about what they do?

NU: Let me pause to say that I don’t know whether the Pakistan floods are "exactly the sort of thing we should expect" to happen to Pakistan, specifically, as a result of climate change. Uncertainty in the attribution of individual events is one reason why people don’t pay attention to them. But it is true that major floods are examples of extreme events which could become more (or less) common in various regions of the world in response to climate change.

Returning to your question, the STEP program includes a number of scientists, but we are all focused on policy issues because the Woodrow Wilson School is for public and international affairs. There are physicists who work on nuclear policy, ecologists who study environmental policy and conservation biology, atmospheric chemists who look at ozone and air pollution, and so on. Obviously, climate change is intimately related to public and international policy. I am mostly doing policy-relevant science but may get involved in actual policy to some extent. The STEP program has ties to other departments such as Geosciences, interdisciplinary umbrella programs like the Atmospheric and Ocean Sciences program and the Princeton Environmental Institute, and NOAA’s nearby Geophysical Fluid Dynamics Laboratory, one of the world’s leading climate modeling centers.

JB: How much do you want to get into public policy issues? Your new boss, Michael Oppenheimer, used to work as chief scientist for the Environmental Defense Fund. I hadn’t known much about them, but I’ve just been reading a book called The Climate War. This book says a lot about the Environmental Defense Fund’s role in getting the US to pass cap-and-trade legislation to reduce sulfur dioxide emissions. That’s quite an inspiring story! Many of the same people then went on to push for legislation to reduce greenhouse gases, and of course that story is less inspiring, so far: no success yet. Can you imagine yourself getting into the thick of these political endeavors?

NU: No, I don’t see myself getting deep into politics. But I am interested in what we should be doing about climate change, specifically, the economic assessment of climate policy in the presence of uncertainties and learning. That is, how hard should we be trying to reduce CO2 emissions, accounting for the fact that we’re unsure what climate the future will bring, but expect to learn more over time. Michael is very interested in this question too, and the harder problem of "negative learning":

• Michael Oppenheimer, Brian C. O’Neill and Mort Webster, Negative learning, Climatic Change 89 (2008), 155-172.

"Negative learning" occurs if what we think we’re learning is actually converging on the wrong answer. How fast could we detect and correct such an error? It’s hard enough to give a solid answer to what we might expect to learn, let alone what we don’t expect to learn, so I think I’ll start with the former.

I am also interested in the value of learning. How will our policy change if we learn more? Can there be any change in near-term policy recommendations, or will we learn slowly enough that new knowledge will only affect later policies? Is it more valuable — in terms of its impact on policy — to learn more about the most likely outcomes, or should we concentrate on understanding better the risks of the worst-case scenarios? What will cause us to learn the fastest? Better surface temperature observations? Better satellites? Better ocean monitoring systems? What observables should we be looking at?

The question "How much should we reduce emissions" is, partially, an economic one. The safest course of action from the perspective of climate impacts is to immediately reduce emissions to a much lower level. But that would be ridiculously expensive. So some kind of cost-benefit approach may be helpful: what should we do, balancing the costs of emissions reductions against their climate benefits, knowing that we’re uncertain about both. I am looking at so-called "economic integrated assessment" models, which combine a simple model of the climate with an even simpler model of the world economy to understand how they influence each other. Some argue these models are too simple. I view them more as a way of getting order-of-magnitude estimates of the relative values of different uncertainty scenarios or policy options under specified assumptions, rather than something that can give us "The Answer" to what our emissions targets should be.
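The flavor of such a cost-benefit calculation can be sketched in a few lines of code. This is only a toy illustration of the idea, not any actual integrated assessment model — real models like DICE are far more elaborate, and every functional form and number below is invented:

```python
# Toy cost-benefit sketch in the spirit of integrated assessment models.
# All functional forms and numbers are hypothetical, for illustration only.

def net_cost(abatement, damage_scale=0.05, cost_scale=0.02):
    """Total cost (as a fraction of GDP) if we abate a fraction
    `abatement` of emissions: abatement costs rise steeply with
    effort, while climate damages fall with it."""
    abatement_cost = cost_scale * abatement**3           # convex: deep cuts get expensive
    climate_damage = damage_scale * (1 - abatement)**2   # damages from residual emissions
    return abatement_cost + climate_damage

# Crude grid search for the cost-minimizing abatement level.
best = min((net_cost(a / 100), a / 100) for a in range(101))
print("optimal abatement fraction: %.2f" % best[1])
print("total cost at optimum: %.4f of GDP" % best[0])
```

The point is not the particular optimum, but that the answer is an interior trade-off: neither "do nothing" nor "abate everything" minimizes total cost, and the location of the optimum depends entirely on the assumed damage and cost curves — which is exactly where the uncertainty lives.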

In a certain sense it may be moot to look at such cost-benefit analyses, since there is a huge difference between "what may be economically optimal for us to do" and "what we will actually do". We're not yet even coming close to current policy recommendations, so what's the point of generating new recommendations? That's certainly a valid argument, but I still think it's useful to have a sense of the gap between what we are doing and what we "should" be doing.

Economics can only get us so far, however (and maybe not far at all). Traditional approaches to economics have a very narrow way of viewing the world, and tend to ignore questions of ethics. How do you put an economic value on biodiversity loss? If we might wipe out polar bears, or some other species, or a whole lot of species, how much is it "worth" to prevent that? What is the Great Barrier Reef worth? Its value in tourism dollars? Its value in "ecosystem services" (the more nebulous economic activity which indirectly depends on its presence, such as fishing)? Does it have intrinsic value, and is it worth something (what?) to preserve, even if it has no quantifiable impact on the economy whatsoever?

You can continue on with questions like this. Does it make sense to apply standard economic discounting factors, which effectively value the welfare of future generations less than that of the current generation? See for example:

• John Quiggin, Stern and his critics on discounting and climate change: an editorial essay, Climatic Change 89 (2008), 195-205.
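To see why the discounting question matters so much, here is a quick numerical illustration. The discount rates below are not taken from any particular study — they are simply chosen to roughly span the range debated in the Stern exchange:

```python
# Present value of $1 trillion of climate damages incurred 100 years
# from now, under different annual discount rates. The rates are
# illustrative, roughly spanning the range debated in the literature.

def present_value(future_value, rate, years):
    """Standard exponential discounting."""
    return future_value / (1 + rate) ** years

damage = 1.0e12   # $1 trillion of damages, a century from now
for rate in (0.014, 0.03, 0.055):
    pv = present_value(damage, rate, 100)
    print("rate %4.1f%%: present value $%.0f billion" % (100 * rate, pv / 1e9))
```

Over a century, moving the discount rate from about 1.4% to about 5.5% shrinks the present value of the same future damage by a factor of roughly fifty — which is why seemingly dry assumptions about discounting dominate the policy conclusions.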

Economic models also tend to preserve present economic disparities. Otherwise, their "optimal" policy is to immediately transfer a lot of the wealth of developed countries to developing countries — and this is without any climate change — to maximize the average "well-being" of the global population, on the grounds that a dollar is worth more to a poor person than a rich person. This is not a realistic policy and arguably shouldn’t happen anyway, but you do have to be careful about hard-coding potential inequities into your models:

• Seth D. Baum and William E. Easterling, Space-time discounting in climate change adaptation, Mitigation and Adaptation Strategies for Global Change 15 (2010), 591-609.

More broadly, it’s possible for economics models to allow sea level rise to wipe out Bangladesh, or other extreme scenarios, simply because some countries have so little economic output that it doesn’t "matter" if they disappear, as long as other countries become even more wealthy. As I said, economics is a narrow lens.

After all that, it may seem silly to be thinking about economics at all. The main alternative is the "precautionary principle", which says that we shouldn’t take suspected risks unless we can prove them safe. After all, we have few geologic examples of CO2 levels rising as far and as fast as we are likely to increase them — to paraphrase Wally Broecker, we are conducting an uncontrolled and possibly unprecedented experiment on the Earth. This principle has some merits. The common argument, "We should do nothing unless we can prove the outcome is disastrous", is a strange burden of proof from a decision analytic point of view — it has little to do with the realities of risk management under uncertainty. Nobody’s going to say "You can’t prove the bridge will collapse, so let’s build it". They’re going to say "Prove it’s safe (to within a certain guarantee) before we build it". Actually, a better analogy to the common argument might be: you’re driving in the dark with broken headlights, and insist “You’ll have to prove there are no cliffs in front of me before I’ll consider slowing down.” In reality, people should slow down, even if it makes them late, unless they know there are no cliffs.

But the precautionary principle has its own problems. It can imply arbitrarily expensive actions in order to guard against arbitrarily unlikely hazards, simply because we can’t prove they’re safe, or precisely quantify their exact degree of unlikelihood. That’s why I prefer to look at quantitative cost-benefit analysis in a probabilistic framework. But it can be supplemented with other considerations. For example, you can look at stabilization scenarios: where you "draw a line in the sand" and say we can’t risk crossing that, and apply economics to find the cheapest way to avoid crossing the line. Then you can elaborate that to allow for some small but nonzero probability of crossing it, or to allow for temporary "overshoot", on the grounds that it might be okay to briefly cross the line, as long as we don’t stay on the other side indefinitely. You can tinker with discounting assumptions and the decision framework of expected utility maximization. And so on.
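The "line in the sand" idea can also be made concrete. The sketch below finds the cheapest (smallest) abatement level that keeps the probability of crossing a temperature threshold below a tolerance, given an uncertain climate response. Again, every number and distribution here is invented purely to show the structure of the calculation:

```python
# A toy "line in the sand" calculation: find the cheapest abatement
# level whose probability of exceeding a temperature threshold stays
# below some tolerance. All numbers and distributions are invented.

import random

random.seed(0)

# Uncertain warming per unit of unabated emissions (a hypothetical
# stand-in for climate sensitivity), sampled from a lognormal.
sensitivities = [random.lognormvariate(1.0, 0.4) for _ in range(10000)]

THRESHOLD = 2.0   # degrees C: the "line in the sand"
TOLERANCE = 0.05  # accept at most a 5% chance of crossing it

def crossing_probability(abatement):
    """Fraction of sensitivity samples under which residual
    emissions (1 - abatement) push warming past the threshold."""
    warming = [(1 - abatement) * s for s in sensitivities]
    return sum(w > THRESHOLD for w in warming) / len(warming)

# Cheapest policy = smallest abatement fraction meeting the constraint.
min_abatement = next(a / 100 for a in range(101)
                     if crossing_probability(a / 100) <= TOLERANCE)
print("minimum abatement fraction: %.2f" % min_abatement)
```

Note how the required abatement is driven by the upper tail of the sensitivity distribution, not its central estimate — which is one quantitative way the worst-case risks enter the policy question.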

JB: This is fascinating stuff. You’re asking a lot of really important questions — I think I see about 17 question marks up there. Playing the devil’s advocate a bit, I could respond: do you know any answers? Of course I don’t expect "ultimate" answers, especially to profound questions like how much we should allow economics to guide our decisions, versus tempering them with other ethical considerations. But it would be nice to see an example where thinking about these issues turned up new insights that actually changed people’s behavior. Cases where someone said "Oh, I hadn’t thought of that…", and then did something different that had a real effect.

You see, right now the world as it is seems so far removed from the world as it should be that one can even start to doubt the usefulness of pondering the questions you’re raising. As you said yourself, "We’re not yet even coming close to current policy recommendations, so what’s the point of generating new recommendations?"

I think the cap-and-trade idea is a good example, at least as far as sulfur dioxide emissions go: the Clean Air Act Amendments of 1990 managed to reduce SO2 emissions in the US from about 19 million tons in 1980 to about 7.6 million tons in 2007. Of course this idea is actually a bunch of different ideas that need to work together in a certain way… but anyway, some example related to global warming would be a bit more reassuring, given our current problems with that.

NU: Climate change economics has been very influential in generating momentum for putting a price on carbon (through cap-and-trade or otherwise), in Europe and the U.S., in showing that such policy had the potential to be a net benefit considering the risks of climate change. SO2 emissions markets are one relevant piece of this body of research, although the CO2 problem is much bigger in scope and presents more problems for such approaches. Climate economics has been an important synthesis of decision analysis and scientific uncertainty quantification, which I think we need more of. But to be honest, I’m not sure what immediate impact additional economic work may have on mitigation policy, unless we begin approaching current emissions targets. So from the perspective of immediate applications, I also ponder the usefulness of answering these questions.

That, however, is not the only perspective I think about. I’m also interested in how what we should do is related to what we might learn — if not today, then in the future. There are still important open questions about how well we can see something potentially bad coming, the answers to which could influence policies. For example, if a major ice sheet begins to substantially disintegrate within the next few centuries, would we be able to see that coming soon enough to step up our mitigation efforts in time to prevent it? In reality that’s a probabilistic question, but let’s pretend it’s a binary outcome. If the answer is "yes", that could call for increased investment in "early warning" observation systems, and a closer coupling of policy to the data produced by such systems. (Well, we should be investing more in those anyway, but people might get the point more strongly, especially if research shows that we’d only see it coming if we get those systems in place and tested soon.) If the answer is "no", that could go at least three ways. One way it could go is that the precautionary principle wins: if we think that we could put coastal cities under water, and we wouldn’t see it coming in time to prevent it, that might finally prompt more preemptive mitigation action. Another is that we start looking more seriously at last-ditch geoengineering approaches, or carbon air capture and sequestration. Or, if people give up on modifying the climate altogether, then it could prompt more research and development into adaptation. All of those outcomes raise new policy questions, concerning how much of what policy response we should aim for.
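The "would we see it coming?" question above is essentially a value-of-information problem, and its skeleton fits in a few lines. All the numbers below are invented, and the hypothetical "perfect early warning" is an idealization real monitoring systems won't achieve:

```python
# A minimal expected-value-of-information sketch for the "would we
# see it coming?" question. All numbers are invented: suppose a
# major ice-sheet collapse has probability p, costs `damage` if it
# happens unmitigated, and can be averted by preemptive mitigation
# costing `mitigation` (all in arbitrary welfare units).

p = 0.1           # probability of collapse (hypothetical)
damage = 30.0     # loss if collapse happens and we did nothing
mitigation = 2.0  # cost of preemptive mitigation

# Without early warning we must commit now: mitigate always, or never.
cost_without_info = min(mitigation, p * damage)

# With a perfect early-warning system we mitigate only when warned.
cost_with_info = p * mitigation

value_of_information = cost_without_info - cost_with_info
print("expected cost, no warning system:", cost_without_info)
print("expected cost, perfect warning:  ", cost_with_info)
print("value of perfect early warning:  ", value_of_information)
```

In this toy setup the warning system is worth paying for up to its value of information — and if detection turns out to be impossible in time (the "no" answer), that value collapses to zero, which is what pushes the argument toward precaution, geoengineering, or adaptation instead.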

Which brings me to the next policy option. The U.S. presidential science advisor, John Holdren, has said that we have three choices for climate change: mitigate, adapt, or suffer. Regardless of what we do about the first, people will likely be doing some of the other two; the question is how much. If you’re interested in research that has a higher likelihood of influencing policy in the near term, adaptation is probably what you should work on. (That, or technological approaches like climate/carbon geoengineering, energy systems, etc.) People are already looking very seriously at adaptation (and in some cases are already putting plans into place). For example, the Port Authority of Los Angeles needs to know whether, or when, to fortify their docks against sea level rise, and whether a big chunk of their business could disappear if the Northwest Passage through the Arctic Ocean opens permanently. They have to make these investment decisions regardless of what may happen with respect to geopolitical emissions reduction negotiations. The same kinds of learning questions I’m interested in come into play here: what will we know, and when, and how should current decisions be structured knowing that we will be able to periodically adjust those decisions?

So, why am I not working on adaptation? Well, I expect that I will be, in the future. But right now, I’m still interested in a bigger question, which is how well we can bound the large risks and our ability to prevent disasters, rather than just finding the best way to survive them. What is the best and the worst that can happen, in principle? Also, I’m concerned that right now there is too much pressure to develop adaptation policies to a level of detail which we don’t yet have the scientific capability to develop. While global temperature projections are probably reasonable within their stated uncertainty ranges, we have a very limited ability to predict, for example, how precipitation may change over a particular city. But that’s what people want to know. So scientists are trying to give them an answer. But it’s very hard to say whether some of those answers right now are actionably credible. You have to choose your problems carefully when you work in adaptation. Right now I’m opting to look at sea level rise, partly because it is less affected by some of the details of local meteorology.

JB: Interesting. I think I’m going to cut our conversation here, because at this point it took a turn that will really force me to do some reading! And it’s going to take a while. But it should be fun!

The climatic impacts of releasing fossil fuel CO2 to the atmosphere will last longer than Stonehenge, longer than time capsules, longer than nuclear waste, far longer than the age of human civilization so far. – David Archer