Climate Reality Project

14 September, 2011

The Climate Reality Project is planning a presentation called “24 Hours of Reality” beginning at 7 pm Central Time on September 14th, arguing for the connection between more extreme weather and climate change. “There will be a full-on assault on climate skeptics, exploring where they get their funding from.”

The Washington Post has an interview with Al Gore about this project:

• Brad Plumer, Al Gore: ‘The message still has to be about the reality we’re facing’, Washington Post, 12 September 2011.

I’ll quote a bit:

Brad Plumer: “An Inconvenient Truth” was basically a primer on global warming—the causes, the problems it creates, the ways we can avert it. So what more is there to add? How will this new presentation be different?

Al Gore: It’s very different—a few of the images are the same, but 95 percent of the slides are completely new. The science linking the increased frequency and severity of extreme weather to the climate crisis has matured tremendously in the last couple of years. Think about the last year, we’ve had floods in Pakistan displacing 20 million people and further destabilizing a nuclear-armed country. We’ve had drought and wildfires in Russia. In Australia you’ve got floods the size of France and Germany combined. Then there’s drought in Texas—out of 254 counties in Texas, 252 are on fire. I’m talking to you from Nashville, where the city lost the equivalent of an entire year’s budget from recent floods—the area has never been flooded like this before, so no one had flood insurance.

That’s the reality we’ve got to focus on. This presentation is a defense of the science and the scientists, against the timeworn claims by deniers.

BP: Now, whenever a natural disaster happens—say, a flood or a wildfire—you typically see scientists quoted in the press saying, “Well, it’s hard to attribute any single event to global warming, although this is the sort of event we should see more of as the planet warms.” As I understand it, this sort of extra-careful hedge is becoming outdated. Scientists actually are making tighter connections between current disasters and climate change, correct?

AG: Yes, that shift in the way scientists describe the linkage is one of the elements of this new slideshow. It’s a subtle but extremely important shift. They used to say that the climate crisis changes the odds of extreme weather events—this was the old metaphor of “loading the dice.” Now, they say there’s not only a greater likelihood of rolling 12s, but we’re actually loading 13s and could soon be rolling 15s and 16s. As scientists like James Hansen [of NASA’s Goddard Institute for Space Studies] and Kevin Trenberth [of the National Center for Atmospheric Research] point out, the changes brought about by man-made global-warming pollution have reached the stage that every event is now being affected by it in some way.

In the last 30 years, for instance, we’ve seen water vapor above the oceans increase by 4 percent, and many storms reach as far as 2,000 miles out to collect water vapor. So when you have a 4 percent increase over such a large area, the storms are now fueled with more water vapor than was the case 30 years ago. That means we’re getting larger downpours. And in drought-prone areas, we’re seeing increasing intervals between downpours, which is one of several reasons why we’re seeing extreme droughts.

BP: Now, you’re talking about presenting the stark facts as a way of persuading people that climate change is a problem. Yet when you look at polls on climate belief, one thing that stands out is that the people most dismissive of global warming tend to be the most confident that they have all the information they need. Doesn’t that suggest there’s a point at which more information doesn’t actually persuade anyone?

AG: Well, that logic hasn’t led deniers to stop pressing the inaccurate disinformation about climate science. And the fact is that quite a few of the large carbon polluters and their allies in the ideological right wing have been spending hundreds of millions of dollars per year to mislead people. Have you read Naomi Oreskes’ book Merchants of Doubt? The tobacco companies a few decades ago pioneered this organized disinformation technique to create artificial doubt about the science of their product—they hired actors to dress up as doctors and had them say, “I’m a doctor, there’s nothing wrong with smoking cigarettes; in fact, it’ll make you feel better.” And some of the same people who took money from tobacco companies to lie about tobacco science are now taking money from large carbon polluters to lie about the reality of the climate crisis.

BP: Okay, but taking that opposition as a given, there’s been a lot of discussion about whether something more is needed to fight it than yet another recital of climate science facts.

AG: Right, you hear a lot of people giving advice on how to talk about climate science—how you need to dress differently or stand on your head and deliver the message in rhyme. And I respect all that, and I hope a lot of people will present the message in their own way. But my message is about presenting the reality. I have faith in the United States and our ability to make good decisions based on the facts. And I believe Mother Nature is speaking very loudly and clearly. We’ve had ten disasters in the United States this year alone costing more than $1 billion and which were climate-related. It’s only a matter of time before reality sinks in, and we need both parties involved. And the only way to get the right answer is to understand the question.

Australian Carbon Tax

13 July, 2011

Australians burn a lot of carbon. Per person, they’re right up there with Americans:

The map here is based on data from 2000. In 2008, Australians spewed out 18.9 tonnes of CO2 per person in the process of burning fossil fuels and making cement. Americans spewed 17.5 tonnes per person. The world average was just 4.4.

Australians also mine a lot of coal. It’s their biggest export! On top of that, coal exports have more than doubled in recent years:

Last Sunday, however, Prime Minister Julia Gillard announced a tax on carbon!

In this scheme, the 500 biggest polluters in Australia will be taxed at AU$23 per tonne of carbon emissions starting in July 2012. The price will rise 2.5% each year until 2015, when a carbon trading scheme will be introduced. The hope is that by 2020, Australian carbon emissions will have dropped 5% below 2000 levels.
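Just to spell out the arithmetic, here's a quick sketch of the fixed-price phase described above. (The figures come from the paragraph above; the exact indexation rules of the scheme are an assumption here.)

```python
# Fixed-price phase: AU$23 per tonne from July 2012, rising 2.5%
# each year until trading starts in 2015.
def carbon_price(year, start=23.0, growth=0.025, first_year=2012):
    """Carbon price (AU$ per tonne) in a given fiscal year."""
    return start * (1 + growth) ** (year - first_year)

for y in (2012, 2013, 2014):
    print(y, round(carbon_price(y), 2))
```

So the price creeps up to a bit over AU$24 by the time trading begins.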

Of course, the further we go into the future, the less sure we can be of anything. What if Gillard’s party gets voted out of power? There’s already considerable dissatisfaction with Gillard’s plan, in part because she had earlier said:

There will be no carbon tax under the Government I lead.

but mainly, of course, because taxes are unpopular and the coal lobby is very strong in Australia. There’s been a lot of talk about how the carbon tax will hurt the economy.

These objections are to be expected, and thus not terribly interesting (even if they’re valid). However, some more interesting objections are posed here:

• Annabel Crabb, Australia’s diabolical carbon pricing scheme, ABC News, 13 July 2011.

First, it seems that Prime Minister Gillard favors continuing to sell lots of coal to other countries. As she recently said:

Tony Abbott was predicting Armageddon for the coal mining industry. But the future of coal mining in Australia is bright.

But coal mining can’t really have a ‘bright future’ in a decarbonized world unless we capture and store the carbon dioxide emitted by coal-burning plants.

Second, in the planned carbon trading scheme beginning in 2015, Australian companies will be allowed to account for half of their emissions reductions by simply buying permits from overseas. I’m not sure this is bad: it could simply be efficient. However, Annabel Crabb points out that it has some seemingly paradoxical effects. She quotes a Treasury document saying:

In a world where other countries pursue more ambitious abatement targets, the carbon price will be higher, and this increases the cost in terms of domestic production and income foregone.

Is this really bad? I’m not sure. I hope, however, that the Australian carbon tax goes forward to the point where we can see its effects rather than merely speculate about them.

Outsourcing Carbon Emissions

25 May, 2011

George Monbiot points out that Britain is accomplishing some of its reductions in carbon emissions by the simple expedient of outsourcing them to other countries:

• George Monbiot, Pass the Parcel, 23 May, 2011.

This gets around the spirit, though not the letter, of the Kyoto Protocol, since some of these other countries, notably China, aren’t required to limit their carbon emissions! He writes:

It could have been worse. After the Treasury and the business department tried to scupper the UK’s long-term carbon targets, David Cameron stepped in to rescue them. The government has now promised to cut greenhouse gases by 50% by 2027, which means that, with a following wind, the UK could meet its legally-binding target of 80% by 2050. For this we should be grateful. But the coalition has resolved the tension between green and growth in a less than convincing fashion: by dumping responsibility for the environmental impacts on someone else.

The carbon cut we have made so far, and the carbon cut we are likely to make by 2027, have been achieved by means of a simple device: allowing other countries, principally China, to run polluting industries on our behalf.

Officially, the UK’s greenhouse gas emissions have fallen from 788 million tonnes in 1990 to 566mt in 2009. Unofficially, another 253 megatonnes should be added to our account. That’s the difference between the greenhouse gases released when manufacturing the goods we export and those released when manufacturing the goods we import. The reason why our official figures look better than those of most other nations is that so much of our manufacturing industry has moved overseas. It is this which allows the government to meet its targets. If the stuff we buy is made in China, China gets the blame.

This would be less of an issue if China were obliged to restrict its emissions. But under the only global treaty in force at the moment—the Kyoto Protocol—developing countries have no need to reduce their impacts. That suits the governments of both rich and poorer nations. Governments like ours can pretend that there is no conflict between green and growth. They avoid unpopular decisions, allowing people to consume whatever they fancy, and they keep business sweet by promising endless expansion. Governments like China’s can keep supplying us with the goods we couldn’t produce at home without breaking our obligations.

The “unofficial” calculation of 253 extra megatonnes of CO2 comes from here:

• Steven J. Davis and Ken Caldeira, Consumption-based accounting of CO2 emissions, Proceedings of the National Academy of Sciences, 8 March 2011.

This paper claims that in wealthy countries such as Switzerland, Sweden, Austria, the United Kingdom, and France, more than 30% of consumption-related CO2 emissions were “imported”. In other words, a lot of these countries’ CO2 emissions didn’t actually occur on their own soil: they happened during the production and shipping of goods that were imported into those countries!

You can see a bit of what’s going on from this picture (click to enlarge):

But be careful! For example: see the big fat arrow pointing from China to the US, with the number ‘395’ next to it? As far as I can tell, they got that number by working out how many megatonnes of CO2 were created by manufacturing goods in China and shipping them to the US during the year 2004… but then subtracting the megatonnes of CO2 created by manufacturing goods in the US and shipping them to China during that year.

So if I understand this correctly, there’s a lot of ‘cancellation’ going on in this picture. And that could fool the casual reader. After all, it’s not like CO2 produced in the US while making goods for export to China really helps cancel out the CO2 produced in China while making goods for export to the US! So, I’d prefer to see a picture that had labelled arrows pointing both ways between China and the US, and similarly for other countries or groups of countries.

(By the way, the EU is counted as one lump for the purposes of this picture.)
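To see how this netting can mislead, here's a toy calculation. The gross flows below are made-up numbers, chosen only to illustrate the point; they are not taken from Davis and Caldeira's data.

```python
# Hypothetical gross embodied-CO2 trade flows, in megatonnes per year.
gross = {("China", "US"): 500, ("US", "China"): 105}

def net_flow(a, b, flows):
    """Net embodied-CO2 flow from a to b: gross a->b minus gross b->a."""
    return flows.get((a, b), 0) - flows.get((b, a), 0)

print(net_flow("China", "US", gross))  # 395: what a single net arrow shows
print(sum(gross.values()))             # 605: what two gross arrows would show
```

With these made-up numbers the single arrow would read ‘395’, while the total emissions embodied in trade between the two countries is 605: the net picture quietly swallows 210 megatonnes.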

But that’s a small nitpick: this article is full of interesting things. For example, the authors say that the surge of carbon emissions since 2000 has been driven

not only by growth of the global population and per-capita GDP, but also by unanticipated global increases in the energy intensity of GDP (energy per unit GDP) and the carbon intensity of energy (emissions per unit energy).

And, they say that in 2004, 23% of world-wide CO2 emissions, or 6.2 gigatonnes of carbon dioxide, were associated with international trade, primarily exports from China and other developing countries to rich countries.

(As you can see, the numbers labelling those arrows in the picture above don’t add up to anything like 6,200. That’s what made me suspect that there’s a lot of ‘cancellation’ going on in that picture.)

Stabilization Wedges (Part 5)

21 April, 2011

In 2004, Pacala and Socolow laid out a list of ways we can battle global warming using current technologies. They said that to avoid serious trouble, we need to choose seven ‘stabilization wedges’: that is, seven strategies that would each cut carbon emissions by 1 gigatonne per year within 50 years. They listed 15 wedges to choose from, and I’ve told you about them here:

Part 1 – efficiency and conservation.

Part 2 – shifting from coal to natural gas, carbon capture and storage.

Part 3 – nuclear power and renewable energy.

Part 4 – reforestation, good soil management.

According to Pacala:

The message was a very positive one: “gee, we can solve this problem: there are lots of ways to solve it, and lots of ways for the marketplace to solve it.”

I find that interesting, because to me each wedge seems like a gargantuan enterprise—and taken together, they seem like the Seven Labors of Hercules. They’re technically feasible, but who has the stomach for them? I fear things need to get worse before we come to our senses and take action at the scale that’s required.

Anyway, that’s just me. But three years ago, Pacala publicly reconsidered his ideas for a very different reason. Based on new evidence, he gave a talk at Stanford where he said:

It’s at least possible that we’ve already let this thing go too far, and that the biosphere may start to fall apart on us, even if we do all this. We may have to fall back on some sort of dramatic Plan B. We have to stay vigilant as a species.

You can watch his talk here:

It’s pretty damned interesting: he’s a good speaker.

Here’s a dry summary of a few key points. I won’t try to add caveats: I’m sure he would add some himself in print, but I’d rather keep the message simple. I also won’t try to update his information! Not in this blog entry, anyway. But I’ll ask some questions, and I’ll be delighted if you help me out on those.

Emissions targets

First, Pacala’s review of different carbon emissions targets.

The old scientific view, circa 1998: if we could keep the CO2 from doubling from its preindustrial level of 280 parts per million, that would count as a success. Namely, most of the ‘monsters behind the door’ would not come out: continental ice sheets falling into the sea and swamping coastal cities, the collapse of the Atlantic ocean circulation, a drought in the Sahel region of Africa, etcetera.

Many experts say we’d be lucky to get away with CO2 merely doubling. At current burn rates we’ll double it by 2050, and quadruple it by the end of this century. We’ve got enough fossil fuels to send it to seven times its preindustrial levels.

Doubling it would take us to 560 parts per million. A lot of people think that’s too high to be safe. But going for lower levels gets harder:

• In Pacala and Socolow’s original paper, they talked about keeping CO2 below 500 ppm. This would require keeping CO2 emissions constant until 2050. This could be achieved by a radical decarbonization of the economies of rich countries, while allowing carbon emissions in poor countries to grow almost freely until that time.

• For a long time the IPCC and many organizations advocated keeping CO2 below 450 ppm. This would require cutting CO2 emissions by 50% by 2050, which could be achieved by a radical decarbonization in rich countries, and moderate decarbonization in poor countries.

• But by 2008 the IPCC and many groups wanted a cap of 2°C global warming, or keeping CO2 below 430 ppm. This would mean cutting CO2 emissions by 80% by 2050, which would require a radical decarbonization in both rich and poor countries.

The difference here is what poor countries have to do. The rich countries need to radically cut carbon emissions in all these scenarios. In the USA, the Lieberman-Warner bill would have forced the complete decarbonization of the economy by 2050.

Then, Pacala spoke about three things that make him nervous:

1. Faster emissions growth

A 2007 paper by Canadell et al pointed out that around 2000, fossil fuel emissions began growing at 3% per year, up from the earlier rate of 1.5%. This could be due to China’s industrialization. Will this keep up in years to come? If so, the original Pacala-Socolow plan won’t work.

(How much, exactly, did the economic recession change this story?)
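The gap between those two growth rates compounds dramatically over the 50-year wedge horizon. A quick sketch, using the rates from the text above:

```python
# Multiplicative growth in emissions after compounding a constant
# annual rate over the 50-year wedge horizon.
def growth_factor(rate, years=50):
    return (1 + rate) ** years

print(round(growth_factor(0.015), 1))  # 2.1: emissions roughly double
print(round(growth_factor(0.03), 1))   # 4.4: emissions more than quadruple
```

So a plan sized for a doubling of annual emissions falls badly short if they actually quadruple.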

2. The ocean sink

Each year fossil fuel burning puts about 8 gigatonnes of carbon into the atmosphere. The ocean absorbs about 2 gigatonnes and the land absorbs about 2, leaving about 4 gigatonnes in the atmosphere.
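In code, the budget in the paragraph above is just bookkeeping (the ratio at the end is what climate scientists call the ‘airborne fraction’):

```python
# Rough annual carbon budget, in gigatonnes of carbon per year,
# using the round numbers from the text above.
emissions = 8.0
ocean_sink = 2.0
land_sink = 2.0

airborne = emissions - ocean_sink - land_sink
print(airborne)              # 4.0 GtC/yr stays in the atmosphere
print(airborne / emissions)  # 0.5: the airborne fraction
```

The worry in what follows is that the two sink terms are not fixed: if either shrinks, the airborne fraction grows even with emissions held constant.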

However, as CO2 emissions rise, the oceanic CO2 sink has been growing less than anticipated. This seems to be due to a change in wind patterns, itself a consequence of global warming.

(What’s the latest story here?)

3. The land sink

As CO2 levels go up, people expected plants to grow better and suck up more CO2. In the third IPCC report, models predicted that by 2050, plants would be drawing down 6 gigatonnes more carbon per year than they do now! The fourth IPCC report was similar.

This is huge: remember that right now we emit about 8 gigatonnes per year. Indeed, this effect, called CO2 fertilization, could be the difference between the land being a big carbon sink and a big carbon source. Why a carbon source? For one thing, without the plants sucking up CO2, temperatures will rise faster, the Amazon rainforest may start to die, and Arctic permafrost may release more greenhouse gases (especially methane) as it thaws.

In a simulation run by Pacala, where he deliberately assumed that plants fail to suck up more carbon dioxide, these effects happened and the biosphere dumped a huge amount of extra CO2 into the atmosphere: the equivalent of 26 stabilization wedges.

So, plans based on the IPCC models are essentially counting on plants to save us from ourselves.

But is there any reason to think plants might not suck up CO2 at the predicted rates?

Maybe. First, people have actually grown forests in doubled CO2 conditions to see how much faster plants grow then. But the classic experiment along these lines used young trees. In 2005, Körner et al did an experiment using mature trees… and they didn’t see them growing any faster!

Second, models in the third IPCC report assumed that as plants grew faster, they’d have no trouble getting all the nitrogen they need. But Hungate et al have argued otherwise. On the other hand, Alexander Barron discovered that some tropical plants were unexpectedly good at ramping up the rate at which they grab ahold of nitrogen from the atmosphere. But on the third hand, that only applies to the tropics. And on the fourth hand—a complicated problem like this requires one of those Indian gods with lots of hands—nitrogen isn’t the only limiting factor to worry about: there’s also phosphorus, for example.

Pacala goes on to discuss even more complicating factors. But his main point is simple. The details of CO2 fertilization matter a lot. They could make the difference between the original plan being roughly good enough… and being nowhere near good enough!

(What’s the latest story here?)

Energy, the Environment, and What Mathematicians Can Do (Part 2)

20 March, 2011

A couple of days ago I begged for help with a math colloquium talk I’m giving this Wednesday at Hong Kong University.

The response was immediate and wonderfully useful. Thanks, everyone! If my actual audience is as knowledgeable and critical as you folks, I’ll be shocked and delighted.

But I only showed you the first part of the talk… because I hadn’t written the second part yet! And the second part is the hard part: it’s about “what mathematicians can do”.

Here’s a version including the second part:

Energy, the Environment, and What Mathematicians Can Do.

I include just one example of what you’re probably dying to see: a mathematician proving theorems that are relevant to environmental and energy problems. And you’ll notice that this guy is not doing work that will directly help solve these problems.

That’s sort of on purpose: I think we mathematicians sit sort of near the edge of the big conversation about these problems. We do important things, now and then, but their importance tends to be indirect. And I think that’s okay.

But it’s also a bit unsatisfying. What’s your most impressive example of a mathematically exciting result that also directly impacts environmental and energy issues?

I have a bunch of my own examples, but I’d like to hear yours. I want to start creating a list.

(By the way: research is just part of the story! One of the easier ways mathematicians can help save the planet is to teach well. And I do discuss that.)

Mathematics of Planet Earth

20 March, 2011

While struggling to prepare my talk on “what mathematicians can do”, I remembered this website pointed out by Tom Leinster:

Mathematics of Planet Earth 2013.

The idea is to get lots of mathematicians involved in programs on these topics:

• Weather, climate, and environment
• Health, human and social services
• Planetary resources
• Population dynamics, ecology and genomics of species
• Energy utilization and efficiency
• Connecting the planet together
• Geophysical processes
• Global economics, safety and stability

There are already a lot of partner societies (including the American Mathematical Society) and partner institutes. I would love to see more details, but this website seems directed mainly at getting more organizations involved, rather than saying what any of them are going to do.

There is a call for proposals, but it’s a bit sketchy. It says:

A call to join is sent to the planet.

which makes me want to ask “From where?”

(That must be why I’m sitting here blogging instead of heading an institute somewhere. I never fully grew up.)

I guess the details will eventually become clearer. Does anyone know some activities that have been planned?

Energy, the Environment, and What Mathematicians Can Do (Part 1)

18 March, 2011

I’m preparing a talk to give at Hong Kong University next week. It’s only half done, but I could use your feedback on this part while I work on the rest:

Energy, The Environment, and What Mathematicians Can Do.

So far it makes a case for why mathematicians should get involved in these issues… but doesn’t say what they can do to help! That’ll be the second part. So, you’ll just have to bear the suspense for now.

By the way, all the facts and graphs should have clickable links that lead you to online references. The links aren’t easy to see, but if you hover the cursor over a fact or graph, and click, it should work.

Guess Who Wrote This?

3 March, 2011

Guess who wrote this report. I’ll quote a bunch of it:

The climate change crisis is far from over. The decade 2000-2010 is the hottest ever recorded and data reveals each decade over the last 50 years to be hotter than the previous one. The planet is enduring more and more heat waves and rain levels—high and low—that test the outer bounds of meteorological study.

The failure of the USA, Australia and Japan to implement relevant legislation after the Copenhagen Accord, as well as general global inaction, might lead people to shrug off the climate issue. Many are quick to doubt the science. Amid such ambiguity a discontinuity is building as expert and public opinion diverge.

This divergence is not sustainable!

Society continues to face a dilemma posed here: a failure to reduce emissions now will mean considerably greater cost in the future. But concerted global action is still too far off given the extreme urgency required.

CO2 price transparency needed

Some countries forge ahead with national and local measures but many are moving away from market-based solutions and are punishing traditional energy sources. Cap-and-trade systems risk being discredited. The EU-Emissions Trading System (EU-ETS) has failed to deliver an adequate CO2 price. Industry lobbying for free allowance allocations is driving demands for CO2 taxes to eliminate perceived industry windfalls. In some cases this has led to political stalemate.

The transparency of a CO2 price is central to delivering least-cost emission reductions, but it also contributes to growing political resistance to cap-and-trade systems. Policy makers are looking to instruments – like mandates – where emissions value is opaque. This includes emission performance standards (EPSs) for electricity plants and other large fixed sources. Unfortunately, policies aimed at building renewable energy capacity are also displacing more natural gas than coal where the CO2 price is low or absent. This is counter-productive when it comes to reducing emissions. Sometimes the scale of renewables capacity also imposes very high system costs. At other times, policy support for specific renewables is maintained even after the technology reaches its efficient scale, as is the case in the US.

The recession has raised a significant issue for the EU-ETS: how to design cap-and-trade systems in the face of economic and technological uncertainty? Phase III of the ETS risks delivering a structurally low CO2 price due to the impact of the recession on EU emissions. A balanced resetting of the cap should be considered. It is more credible to introduce a CO2 price floor ahead of such shocks than engage in the ad hoc recalibration of the cap in response to them. This would signal to investors that unexpected shortfalls in emissions would be used in part to step up reductions and reduce uncertainty in investments associated with the CO2 price. This is an important issue for the design of Phase IV of the ETS.

Climate too low a priority

Structural climate policy problems aside, the global recession has moved climate concerns far down the hierarchy of government objectives. The financial crisis and Gulf of Mexico oil spill have also hurt trust in the private sector, spawning tighter regulation and leading to increased risk aversion. This hits funding and political support for new technologies, in particular Carbon Capture and Sequestration (CCS) where industry needs indemnification from some risk. Recent moves by the EU and the US regarding long-term liabilities show this support is far from secured. Government support for technology development may also be hit as they work to cut deficits.

In this environment of policy drift and increasing challenge to market-based solutions, it is important to remain strongly focused on least-cost solutions today and advances in new technologies for the future. Even if more pragmatic policy choices prevail, it is important that they are consistent with, and facilitate the eventual implementation of market-based solutions.

Interdependent ecosystems approach

Global policy around environmental sustainability focuses almost exclusively on climate change and CO2 emissions reduction. But since 2008, an approach which considers interdependent ecosystems has emerged and gradually gained influence.

This approach argues that targeting climate change and CO2 alone is insufficient. The planet is a system of inextricably inter-related environmental processes and each must be managed in balance with the others to sustain stability.

Research published by the Stockholm Resilience Centre in early 2009 consolidates this thinking and proposes a framework based on ‘biophysical environmental subsystems’. The Nine Planetary Boundaries collectively define a safe operating space for humanity where social and economic development does not create lasting and catastrophic environmental change.

According to the framework, planetary boundaries collectively determine ecological stability. So far, limits have been quantified for seven boundaries which, if surpassed, could result in more ecological volatility and potentially disastrous consequences. As Table 1 shows, three boundaries have already been exceeded. Based on current trends, the limits of others are fast approaching.

For the energy industry, CO2 management and reduction is the chief concern and the focus of much research and investment. But the interdependence of the other systems means that if one limit is reached, others come under intense pressure. The climate-change boundary relies on careful management of freshwater, land use, atmospheric aerosol concentration, nitrogen–phosphorus, ocean and stratospheric boundaries. Continuing to pursue an environmental policy centered on climate change will fail to preserve the planet’s environmental stability unless the other defined boundaries are addressed with equal vigour.

This Week’s Finds (Week 310)

28 February, 2011

I first encountered Gregory Benford through his science fiction novels: my favorite is probably In the Ocean of Night.

Later I learned that he’s an astrophysicist at U.C. Irvine, not too far from Riverside where I teach. But I only actually met him through my wife. She sometimes teaches courses on science fiction, and like Benford, she has some involvement with the Eaton Collection at U.C. Riverside—the largest publicly accessible SF library in the world. So, I was bound to eventually bump into him.

When I did, I learned about his work on electromagnetic filaments near the center of our galaxy—see “week252” for more. I also learned he was seriously interested in climate change, and that he was going to the Asilomar International Conference on Climate Intervention Technologies—a controversial get-together designed to hammer out some policies for research on geoengineering.

Benford is a friendly but no-nonsense guy. Recently he sent me an email mentioning my blog, and said: "Your discussions on what to do are good, though general, while what we need is specifics NOW." Since I’d been meaning to interview him for a while, this gave me the perfect opening.

JB: You’ve been thinking about the future for a long time, since that’s part of your job as a science fiction writer.  For example, you’ve written a whole series about the expansion of human life through the galaxy.  From this grand perspective, global warming might seem like an annoying little road-bump before the ride even gets started.  How did you get interested in global warming? 

GB: I liked writing about the far horizons of our human prospect; it’s fun. But to get even above the envelope of our atmosphere in a sustained way, we have to stabilize the planet. Before we take on the galaxy, let’s do a smaller problem.

JB: Good point. We can’t all ship on out of here, and the way it’s going now, maybe none of us will, unless we get our act together.

Can you remember something that made you think "Wow, global warming is a really serious problem"?  As you know, not everyone is convinced yet.

GB: I looked at the migration of animals and then the steadily northward march of trees. They don’t read newspapers—the trees become newspapers—so their opinion matters more. Plus the retreat of the Arctic Sea ice in summer, the region of the world most endangered by the changes coming. I first focused on carbon capture using the CROPS method. I’m the guy who first proposed screening the Arctic with aerosols to cool it in summer.

JB: Let’s talk about each in turn. "CROPS" stands for Crop Residue Oceanic Permanent Sequestration. The idea sounds pretty simple: dump a lot of crop residues—stalks, leaves and stuff—on the deep ocean floor. That way, we’d be letting plants suck CO2 out of the atmosphere for us.

GB: Agriculture is the world’s biggest industry; we should take advantage of it. That’s what gave Bob Metzger and me the idea: collect farm waste and sink it to the bottom of the ocean, whence it shall not return for 1000 years. Cheap, easy, doable right now.

JB: But we have to think about what’ll happen if we dump all that stuff into the ocean, right? After all, the USA alone creates half a gigatonne of crop residues each year, and world-wide it’s ten times that. I’m getting these numbers from your papers:

• Robert A. Metzger and Gregory Benford, Sequestering of atmospheric carbon through permanent disposal of crop residue, Climatic Change 49 (2001), 11-19.

• Stuart E. Strand and Gregory Benford, Ocean sequestration of crop residue carbon: recycling fossil fuel carbon back to deep sediments, Environmental Science and Technology 43 (2009), 1000-1007.

Since we’re burning over 7 gigatonnes of carbon each year, burying 5 gigatonnes of crop waste is just enough to make a serious dent in our carbon footprint. But what’ll that much junk do at the bottom of the ocean?

GB: We’re testing the chemistry of how farm waste interacts with deep ocean sites offshore Monterey Bay right now. Here’s a picture of a bale 3.2 km down:

JB: I’m sure our audience will have more questions about this… but the answers to some are in your papers, and I want to spend a bit more time on your proposal to screen the Arctic. There’s a good summary here:

• Gregory Benford, Climate controls, Reason Magazine, November 1997.

But in brief, it sounds like you want to test the results of spraying a lot of micron-sized dust into the atmosphere above the Arctic Sea during the summer. You suggest diatomaceous earth as an option, because it’s chemically inert: just silica. How would the test work, exactly, and what would you hope to learn?

GB: The US has inflight refueling aircraft such as the KC-10 Extender that, with minor changes, could spread aerosols at relevant altitudes, and pilots who know how to fly big sausages filled with fluids.

Rather than diatomaceous earth, I now think ordinary SO2 or H2S will work, if there’s enough water at the relevant altitudes. Turns out the pollutant issue is minor, since it would be only a percent or so of the SO2 already in the Arctic troposphere. The point is to spread aerosols to diminish sunlight and look for signals of less sunlight on the ground, changes in sea ice loss rates in summer, etc. It’s hard to do a weak experiment and be sure you see a signal. Doing regional experiments helps, so you can see a signal before the aerosols spread much. It’s a first step, an in-principle experiment.

Simulations show it can stop the sea ice retreat. Many fear if we lose the sea ice in summer ocean currents may alter; nobody really knows. We do know that the tundra is softening as it thaws, making roads impassable and shifting many wildlife patterns, with unforeseen long-term effects. Cooling the Arctic back to, say, the 1950 summer temperature range would cost maybe $300 million/year, i.e., nothing. Simulations show to do this globally, offsetting say CO2 at 500 ppm, might cost a few billion dollars per year. That doesn’t help ocean acidification, but it’s a start on the temperature problem.

JB: There’s an interesting blog on Arctic political, military and business developments:

• Anatoly Karlin, Arctic Progress.

Here’s the overview:

Today, global warming is kick-starting Arctic history. The accelerating melting of Arctic sea ice promises to open up circumpolar shipping routes, halving the time needed for container ships and tankers to travel between Europe and East Asia. As the ice and permafrost retreat, the physical infrastructure of industrial civilization will overspread the region [...]. The four major populated regions encircling the Arctic Ocean—Alaska, Russia, Canada, Scandinavia (ARCS)—are all set for massive economic expansion in the decades ahead. But the flowering of industrial civilization’s fruit in the thawing Far North carries within it the seeds of its perils. The opening of the Arctic is making border disputes more serious and spurring Russian and Canadian military buildups in the region. The warming of the Arctic could also accelerate global warming—and not just through the increased economic activity and hydrocarbons production. One disturbing possibility is that the melting of the Siberian permafrost will release vast amounts of methane, a greenhouse gas that is far more potent than CO2, into the atmosphere, and tip the world into runaway climate change.

But anyway, unlike many people, I’m not mentioning risks associated with geoengineering in order to instantly foreclose discussion of it, because I know there are also risks associated with not doing it. If we rule out doing anything really new because it’s too expensive or too risky, we might wind up locking ourselves in a "business as usual" scenario. And that could be even more risky—and perhaps ultimately more expensive as well.

GB: Yes, no end of problems. Most impressive is how they look like a descending spiral, self-reinforcing.

Certainly countries now scramble for Arctic resources, trade routes opened by thawing—all likely to become hotly contested strategic assets. So too melting Himalayan glaciers can perhaps trigger "water wars" in Asia—especially India and China, two vast lands of very different cultures. Then, coming on later, come rising sea levels. Florida starts to go away. The list is endless and therefore uninteresting. We all saturate.

So droughts, floods, desertification, hammering weather events—they draw ever less attention as they grow more common. Maybe Darfur is the first "climate war." It’s plausible.

The Arctic is the canary in the climate coalmine. Cutting CO2 emissions will take far too long to significantly affect the sea ice. Permafrost melts there, giving additional positive feedback. Methane release from the not-so-perma-frost is the most dangerous amplifying feedback in the entire carbon cycle. As John Nissen has repeatedly called attention to, the permafrost permamelt holds a staggering 1.5 trillion tons of frozen carbon, about twice as much carbon as is in the atmosphere. Much would emerge as methane. Methane is 25 times as potent a heat-trapping gas as CO2 over a century, and 72 times as potent over the first 20 years! The carbon is locked in a freezer. Yet that’s the part of the planet warming up the fastest. Really bad news:

• Kevin Schaefer, Tingjun Zhang, Lori Bruhwiler and Andrew P. Barrett, Amount and timing of permafrost carbon release in response to climate warming, Tellus, 15 February 2011.

Abstract: The thaw and release of carbon currently frozen in permafrost will increase atmospheric CO2 concentrations and amplify surface warming to initiate a positive permafrost carbon feedback (PCF) on climate. We use surface weather from three global climate models based on the moderate warming, A1B Intergovernmental Panel on Climate Change emissions scenario and the SiBCASA land surface model to estimate the strength and timing of the PCF and associated uncertainty. By 2200, we predict a 29-59% decrease in permafrost area and a 53-97 cm increase in active layer thickness. By 2200, the PCF strength in terms of cumulative permafrost carbon flux to the atmosphere is 190±64 gigatonnes of carbon. This estimate may be low because it does not account for amplified surface warming due to the PCF itself and excludes some discontinuous permafrost regions where SiBCASA did not simulate permafrost. We predict that the PCF will change the arctic from a carbon sink to a source after the mid-2020s and is strong enough to cancel 42-88% of the total global land sink. The thaw and decay of permafrost carbon is irreversible and accounting for the PCF will require larger reductions in fossil fuel emissions to reach a target atmospheric CO2 concentration.

Particularly interesting is the slowing of thermohaline circulation.  In John Nissen’s "two scenarios" work there’s an uncomfortably cool future—if the Gulf Stream were to be diverted by meltwater flowing into the NW Atlantic. There’s also an unbearably hot future, if methane from the not-so-permafrost causes global warming to spiral out of control. So we have a terrifying menu.

JB: I recently interviewed Nathan Urban here. He explained a paper where he estimated the chance that the Atlantic current you’re talking about could collapse. (Technically, it’s the Atlantic meridional overturning circulation, not quite the same as the Gulf Stream.) They got a 10% chance of it happening in two centuries, assuming a business as usual scenario. But there are a lot of uncertainties in the modeling here.

Back to geoengineering. I want to talk about some ways it could go wrong, how soon we’d find out if it did, and what we could do then.

For example, you say we’ll put sulfur dioxide in the atmosphere below 15 kilometers, and most of the ozone is above 20 kilometers. That’s good, but then I wonder how much sulfur dioxide will diffuse upwards. As the name suggests, the stratosphere is "stratified" —there’s not much turbulence. That’s reassuring. But I guess one reason to do experiments is to see exactly what really happens.

GB: It’s really the only way to go forward. I fear we are now in the Decade of Dithering that will end with the deadly 2020s. Only then will experiments get done and issues engaged. All else, as tempting as ideas and simulations are, spells delay if it does not couple with real field experiments—from nozzle sizes on up to albedo measures—which finally decide.

JB: Okay. But what are some other things that could go wrong with this sulfur dioxide scheme? I know you’re not eager to focus on the dangers, but you must be able to imagine some plausible ones: you’re an SF writer, after all. If you say you can’t think of any, I won’t believe you! And part of good design is looking for possible failure modes.

GB: Plenty can go wrong with so vast an idea. But we can learn from volcanoes, that give us useful experiments, though sloppy and noisy ones, about putting aerosols into the air. Monitoring those can teach us a lot with little expense.

We can fail to get the aerosols to avoid clumping, so they fall out too fast. Or we can somehow trigger a big shift in rainfall patterns—a special danger in a system already loaded with surplus energy, as is already displaying anomalies like the bitter winters in Europe, floods in Pakistan, drought in Darfur. Indeed, some of Alan Robock’s simulations of Arctic aerosol use show a several percent decline in monsoon rain—though that may be a plus, since flooding is the #1 cause of death and destruction during the Indian monsoon.

Mostly, it might just plain fail to work. Guessing outcomes is useless, though.  Here’s where experiment rules, not simulations. This is engineering, which learns from mistakes. Consider the early days of aviation. Having more time to develop and test a system gives more time to learn how to avoid unwanted impacts. Of course, having a system ready also increases the probability of premature deployment; life is about choices and dangers.

More important right now than developing capability, is understanding the consequences of deployment of that capability by doing field experiments. One thing we know: both science and engineering advance most quickly by using the dance of theory with experiment. Neglecting this, preferring only experiment, is a fundamental mistake.

JB: Switching gears slightly: in March last year you went to the Asilomar Conference on climate intervention technologies. I’ve read the report:

• Asilomar Scientific Organizing Committee, The Asilomar Conference Recommendations on Principles for Research into Climate Engineering Techniques, Climate Institute, Washington DC, 2010.

It seems unobjectionable and a bit bland, no doubt deliberately so, with recommendations like this:

"Public participation and consultation in research planning and oversight, assessments, and development of decision-making mechanisms and processes must be provided."

What were some interesting things that you learned there? And what’ll happen next?

GB: It was the Woodstock of the policy wonks. I found it depressing. Not much actual science got discussed, and most just fearlessly called for more research, forming of panels and committees, etc. This is how bureaucracy digests a problem, turning it quite often into fertilizer.

I’m a physicist who does both theory and experiment. I want to see work that combines those to give us real information and paths to follow. I don’t see that anywhere now. Congress might hand out money for it but after the GAO report on geoengineering last September there seems little movement.

I did see some people pushing their carbon capture companies, to widespread disbelief. The simple things we could do right now like our CROPS carbon capture proposal are neglected, while entrepreneur companies hope for a government scheme to pay for sucking CO2 from the air. That’ll be the day!—far into the crisis, I think, maybe several decades from now. I also saw fine ideas pushed aside in favor of policy wonk initiatives. It was a classic triumph of process over results. As in many areas dominated by social scientists, people seemed to be saying, “Nobody can blame us if we go through the motions.”

That Decade of Dithering is upon us now. The great danger is that tipping points may not be obvious, even as we cross them. They may present as small events that nonetheless take us over a horizon from which we can never return.

For example, the loss of Greenland ice. Once the ice sheet melts down to an altitude below that needed to maintain it, we’ve lost it. The melt lubricates the glacier base and starts a slide we cannot stop. There are proposals of how to block that—essentially, draw the water out from the base as fast as it appears—but nobody’s funding such studies.

A reasonable, ongoing climate control program might cost $100 million annually. That includes small field experiments, trials with spraying aerosols, etc. We now spend about $5 billion per year globally studying the problem, so climate control studies would be 1/50 of that.

Even now, we may already be too late for a tipping point—we still barely glimpse the horrors we could be visiting on our children and their grandchildren’s grandchildren.

JB: I think a lot of young people are eager to do something. What would be your advice, especially to future scientists and engineers? What should they do? The problems seem so huge, and most so-called "adults" are shirking their responsibilities—perhaps hoping they’ll be dead before things get too bad.

GB: One reason people are paralyzed is simple: major interests would get hurt—coal, oil, etc. The fossil fuel industry is the second largest in the world; #1 is agriculture. We have ~50 trillion dollars of infrastructure invested in it. That and inertia—we’ve made the crucial fuel of our world a Bad Thing, and prohibition never works with free people. Look at the War on Drugs, now nearing its 40th anniversary.

That’s why I think adaptation—dikes, water conservation, reflecting roofs and blacktop to cool cities and lower their heating costs, etc.— is a smart way to prepare. We should also fund research in mineral weathering as a way to lock up CO2, which not only consumes CO2 but can also generate ocean alkalinity. The acidification of the oceans is undeniable, easily measured, and accelerating. Plus geoengineering, which is probably the only fairly cheap, quick way to damp the coming chaos for a while. A stopgap, but we’re going to need plenty of those.

JB: And finally, what about you? What are you doing these days? Science fiction? Science? A bit of both?

GB: Both, plus. Last year I published a look at how we viewed the future in the 20th Century, The Wonderful Future We Never Had, and have a novel in progress now cowritten with Larry Niven—about a Really Big Object. Plus some short stories and journalism.

My identical twin brother Jim & I published several papers looking at SETI from the perspective of those who would pay the bills for a SETI beacon, and reached conclusions opposite from what the SETI searches of the last half century have sought. Instead of steady, narrowband signals near 1 GHz, it is orders of magnitude cheaper to radiate pulsed, broadband beacon signals nearer 10 GHz. This suggests a new way to look for pulsed signals, which some are trying to find. We may have been looking for the wrong thing all along. The papers are on the arXiv:

• James Benford, Gregory Benford and Dominic Benford, Messaging with cost optimized interstellar beacons.

• Gregory Benford, James Benford and Dominic Benford, Searching for cost optimized interstellar beacons.

For math types, David Wolpert and I have shown that Newcomb’s paradox arises from confusions in the statement, so is not a paradox:

• David H. Wolpert and Gregory Benford, What does Newcomb’s paradox teach us?

JB: The next guest on this show, Eliezer Yudkowsky, has also written about Newcomb’s paradox. I should probably say what it is, just for folks who haven’t heard yet. I’ll quote Yudkowsky’s formulation, since it’s nice and snappy:

A superintelligence from another galaxy, whom we shall call Omega, comes to Earth and sets about playing a strange little game. In this game, Omega selects a human being, sets down two boxes in front of them, and flies away.

Box A is transparent and contains a thousand dollars.
Box B is opaque, and contains either a million dollars, or nothing.

You can take both boxes, or take only box B.

And the twist is that Omega has put a million dollars in box B if and only if Omega has predicted that you will take only box B.

Omega has been correct on each of 100 observed occasions so far—everyone who took both boxes has found box B empty and received only a thousand dollars; everyone who took only box B has found B containing a million dollars. (We assume that box A vanishes in a puff of smoke if you take only box B; no one else can take box A afterward.)

Before you make your choice, Omega has flown off and moved on to its next game. Box B is already empty or already full.

Omega drops two boxes on the ground in front of you and flies off.

Do you take both boxes, or only box B?

If you say you’d take both boxes, I’ll argue that’s stupid: everyone who did that so far got just a thousand dollars, while the folks who took only box B got a million!

If you say you’d take only box B, I’ll argue that’s stupid: there has got to be more money in both boxes than in just one of them!

So, this puzzle has a kind of demonic attraction. Lots of people have written about it, though personally I’m waiting until a superintelligence from another galaxy actually shows up and performs this stunt.

Hmm—I see your paper uses Bayesian networks! I’ve been starting to think about those lately.

But I know that’s not all you’ve been doing.

GB: I also started several biotech companies 5 years ago, spurred in part by the agonizing experience of watching my wife die of cancer for decades, ending in 2002. They’re genomics companies devoted to extending human longevity by upregulating genes we know confer some defenses against cardio, neurological and other diseases. Our first product just came out, StemCell100, and did well in animal and human trials.

So I’m staying busy. The world gets more interesting all the time. Compared with growing up in the farm country of Alabama, this is a fine way to live.

JB: It’s been great to hear what you’re up to. Best of luck on all these projects, and thanks for answering my questions!

Few doubt that our climate stands in a class by itself in terms of complexity. Though much is made of how wondrous our minds are, perhaps the most complex entity known is our biosphere, in which we are mere mayflies. Absent a remotely useful theory of complexity in systems, we must proceed cautiously. – Gregory Benford

Carbon Dioxide Puzzles

4 February, 2011

I like it when people do interesting calculations and help me put their results on this blog. Renato Iturriaga has plotted a graph that raises some interesting questions about carbon dioxide in the Earth’s atmosphere. Maybe you can help us out!

The atmospheric CO2 concentration, as measured at Mauna Loa in Hawaii, looks like it’s rising quite smoothly apart from seasonal variations:

However, if you take the annual averages from here:

• NOAA Earth System Laboratory, Global Monitoring Division, Recent Mauna Loa CO2.

and plot how much the average rises each year, the graph is pretty bumpy. You’ll see what I mean in a minute.

In comparison, if you plot the carbon dioxide emissions produced by burning fossil fuels, you get a rather smooth curve, at least according to these numbers:

• U. S. Energy Information Administration Total carbon dioxide emissions from the consumption of energy, 1980-2008.

Renato decided to plot both of these curves and their difference. Here’s his result:

The blue curve shows how much CO2 we put into the atmosphere each year by burning fossil fuels, measured in parts per million.

The red curve shows the observed increase in atmospheric CO2.

The green curve is the difference.

The puzzle is to explain this graph. Why is the red curve roughly 40% lower than the blue one? Why is the red curve so jagged?

Of course, a lot of research has already been done on these issues. There are a lot of subtleties! So if you like, think of our puzzle as an invitation to read the existing literature and tell us how well it does at explaining this graph. You might start here, and then read the references, and then keep digging.

But first, let me explain exactly how Renato Iturriaga created this graph! If he’s making a mistake, maybe you can catch it.

The red curve is straightforward: he took the annual mean growth rate of CO2 from the NOAA website I mentioned above, and graphed it. Let me do a spot check to see if he did it correctly. I see a big spike in the red curve around 1998: it looks like the CO2 went up around 2.75 ppm that year. But then the next year it seems to have gone up just about 1 ppm. On the website it says 2.97 ppm for 1998, and 0.91 for 1999. So that looks roughly right, though I’m not completely happy about 1998.

[Note added later: as you'll see below, he actually got his data from here; this explains the small discrepancy.]

Renato got the blue curve by taking the US Energy Information Administration numbers and converting them from gigatons of CO2 to parts per million moles. He assumed that the atmosphere weighs 5 × 10^15 tons and that CO2 gets well mixed with the whole atmosphere each year. Given this, we can simply say that one gigaton is 0.2 parts per million of the atmosphere’s mass.

But people usually measure CO2 in parts per million volume. Now, a mole is just a certain large number of molecules. Furthermore, the volume of a gas at fixed temperature and pressure is almost exactly proportional to the number of molecules, regardless of its composition. So parts per million volume is essentially the same as parts per million moles.

So we just need to do a little conversion. Remember:

• The molecular mass of N2 is 28, and about 79% of the atmosphere’s volume is nitrogen.

• The molecular mass of O2 is 32, and about 21% of the atmosphere’s volume is oxygen.

• By comparison, there’s very little of the other gases.

So, the average molecular mass of air is

28 × .79 + 32 × .21 = 28.84

On the other hand, the molecular mass of CO2 is 44. So one ppm mass of CO2 is less than one ppm volume: it’s just

28.84/44 = 0.655

parts per million volume. So, a gigaton of CO2 is about 0.2 ppm mass, but only about

0.2 × 0.655 = 0.13

parts per million volume (or moles).

So to get the blue curve, Renato took gigatons of CO2 and multiplied by 0.13 to get ppm volume. Let me do another spot check! The blue curve reaches about 4 ppm in 2008. Dividing 4 by 0.13 we get about 30, and that’s good, because energy consumption put about 30 gigatons of CO2 into the atmosphere in 2008.
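The whole conversion is small enough to check in a few lines of code. Here is a sketch in Python that takes the stated assumptions at face value: a 5 × 10^15-tonne atmosphere and instantaneous mixing.

```python
# Renato's conversion from gigatonnes of CO2 to ppm by volume,
# assuming (as above) an atmosphere of 5e15 tonnes and full mixing.

ATMOSPHERE_GT = 5.0e6            # 5e15 tonnes = 5e6 gigatonnes
M_AIR = 0.79 * 28 + 0.21 * 32    # mean molecular mass of air = 28.84
M_CO2 = 44.0                     # molecular mass of CO2

def gt_co2_to_ppmv(gt):
    """Convert gigatonnes of CO2 to parts per million by volume."""
    ppm_mass = (gt / ATMOSPHERE_GT) * 1e6    # ppm by mass: 0.2 per gigatonne
    return ppm_mass * (M_AIR / M_CO2)        # ppm by volume: about 0.13 per gigatonne

print(gt_co2_to_ppmv(1.0))     # about 0.13
print(gt_co2_to_ppmv(30.0))    # about 3.9 -- the 2008 spot check
```

Running this reproduces both numbers above: about 0.13 ppm volume per gigatonne, and about 3.9 ppm for the roughly 30 gigatonnes emitted in 2008.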

And then, of course, the green curve is the blue one minus the red one:

Now, more about the puzzles.

One puzzle is why the red curve is so much lower than the blue one. The atmospheric CO2 concentration is only going up by about 60% of the CO2 emitted, on average — though the fluctuations are huge. So, you might ask, where’s the rest of the CO2 going?

Probably into the ocean, plants, and soil:

But at first glance, the fact that only 60% stays in the atmosphere seems to contradict this famous graph:

This shows it taking many years for a dose of CO2 added to the atmosphere to decrease to 60% of its original level!

Is the famous graph wrong? There are other possible explanations!

Here’s a non-explanation. Humans are putting CO2 into the atmosphere in other ways besides burning fossil fuels. For example, deforestation and other changes in land use put somewhere between 0.5 and 2.7 gigatons of carbon into the atmosphere each year. There’s a lot of uncertainty here. But this doesn’t help solve our puzzle: it means there’s more carbon to account for.

Here’s a possible explanation. Maybe my estimate of 5 × 10^15 tons for the mass of the atmosphere is too high! That would change everything. I got my estimate off the internet somewhere — does anyone know a really accurate figure?
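One quick sanity check: the atmosphere’s mass follows from mean surface pressure, since pressure is just the weight of the overlying air column per unit area. Here is a rough Python version; it ignores topography (land surfaces sit above sea level), which is why the accepted figure, about 5.15 × 10^18 kg, comes out a few percent lower.

```python
import math

# Estimate the atmosphere's mass from surface pressure: M = P * A / g.
P = 101325.0   # mean sea-level pressure, Pa
g = 9.81       # gravitational acceleration, m/s^2
R = 6.371e6    # mean Earth radius, m

A = 4 * math.pi * R**2   # Earth's surface area, m^2
M = P * A / g            # mass of the atmosphere, kg

print(M / 1000)   # about 5.3e15 tonnes, close to the 5e15 used above
```

So the 5 × 10^15-tonne estimate looks about right, maybe a few percent low rather than too high.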

Renato came up with a more interesting possible explanation. It’s very important, and very well-known, that CO2 doesn’t leave the atmosphere in a simple exponential decay process. Imagine for simplicity that carbon stays in three boxes:

• Box A: the atmosphere.

• Box B: places that exchange carbon with the atmosphere quite rapidly.

• Box C: places that exchange carbon with the atmosphere and box B quite slowly.

As we pump CO2 into box A, a lot of it quickly flows into box B. It then slowly flows from boxes A and B into box C.

The quick flow from box A to box B accounts for the large amounts of ‘missing’ CO2 in Renato’s graph. But if we stop putting CO2 into box A, it will soon come into equilibrium with box B. At that point, we will not see the CO2 level continue to quickly drop. Instead, CO2 will continue to slowly flow from boxes A and B into box C. So, it can take many years for the atmospheric CO2 concentration to drop to 60% of its original level — as the famous graph suggests.

This makes sense to me. It shows that the red curve can be a lot lower than the blue one even if the famous graph is right.

But I’m still puzzled by the dramatic fluctuations in the red curve! That’s the other puzzle.

