Stabilization Wedges (Part 5)

21 April, 2011

In 2004, Pacala and Socolow laid out a list of ways we can battle global warming using current technologies. They said that to avoid serious trouble, we need to choose seven ‘stabilization wedges’: that is, seven ways of cutting carbon emissions, each ramping up until it keeps 1 gigatonne of carbon per year out of the atmosphere 50 years from now. They listed 15 wedges to choose from, and I’ve told you about them here:

Part 1 – efficiency and conservation.

Part 2 – shifting from coal to natural gas, carbon capture and storage.

Part 3 – nuclear power and renewable energy.

Part 4 – reforestation, good soil management.

According to Pacala:

The message was a very positive one: “gee, we can solve this problem: there are lots of ways to solve it, and lots of ways for the marketplace to solve it.”

I find that interesting, because to me each wedge seems like a gargantuan enterprise—and taken together, they seem like the Seven Labors of Hercules. They’re technically feasible, but who has the stomach for them? I fear things need to get worse before we come to our senses and take action at the scale that’s required.

Anyway, that’s just me. But three years ago, Pacala publicly reconsidered his ideas for a very different reason. Based on new evidence, he gave a talk at Stanford where he said:

It’s at least possible that we’ve already let this thing go too far, and that the biosphere may start to fall apart on us, even if we do all this. We may have to fall back on some sort of dramatic Plan B. We have to stay vigilant as a species.

You can watch his talk here:

It’s pretty damned interesting: he’s a good speaker.

Here’s a dry summary of a few key points. I won’t try to add caveats: I’m sure he would add some himself in print, but I’d rather keep the message simple. I also won’t try to update his information! Not in this blog entry, anyway. But I’ll ask some questions, and I’ll be delighted if you help me out on those.

Emissions targets

First, Pacala’s review of different carbon emissions targets.

The old scientific view, circa 1998: if we could keep the CO2 from doubling from its preindustrial level of 280 parts per million, that would count as a success. Namely, most of the ‘monsters behind the door’ would not come out: continental ice sheets falling into the sea and swamping coastal cities, the collapse of the Atlantic ocean circulation, a drought in the Sahel region of Africa, etcetera.

Many experts say we’d be lucky to get away with CO2 merely doubling. At current burn rates we’ll double it by 2050, and quadruple it by the end of this century. We’ve got enough fossil fuels to send it to seven times its preindustrial levels.

Doubling it would take us to 560 parts per million. A lot of people think that’s too high to be safe. But going for lower levels gets harder:

• In Pacala and Socolow’s original paper, they talked about keeping CO2 below 500 ppm. This would require keeping CO2 emissions constant until 2050. This could be achieved by a radical decarbonization of the economies of rich countries, while allowing carbon emissions in poor countries to grow almost freely until that time.

• For a long time the IPCC and many organizations advocated keeping CO2 below 450 ppm. This would require cutting CO2 emissions by 50% by 2050, which could be achieved by a radical decarbonization in rich countries, and moderate decarbonization in poor countries.

• But by 2008 the IPCC and many groups wanted a cap of 2°C global warming, or keeping CO2 below 430 ppm. This would mean cutting CO2 emissions by 80% by 2050, which would require a radical decarbonization in both rich and poor countries.

The difference between these scenarios lies mainly in what the poor countries have to do. The rich countries need to radically cut carbon emissions in all of them. In the USA, the Lieberman-Warner bill would have forced the complete decarbonization of the economy by 2050.

Then, Pacala spoke about 3 things that make him nervous:

1. Faster emissions growth

A 2007 paper by Canadell et al pointed out that starting in 2000, fossil fuel emissions started growing at 3% per year instead of the earlier figure of 1.5%. This could be due to China’s industrialization. Will this keep up in years to come? If so, the original Pacala-Socolow plan won’t work.
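Just to see why the difference between 1.5% and 3% matters, here is a quick compound-growth check of my own (the 8 gigatons per year starting figure is the rough one quoted later in this post):

from math import log

baseline = 8.0  # GtC/yr of fossil fuel emissions, a rough starting figure

for rate in (0.015, 0.03):
    doubling_time = log(2) / log(1 + rate)        # years for emissions to double
    after_50_years = baseline * (1 + rate) ** 50  # GtC/yr half a century from now
    print(f"{rate:.1%}: doubles in {doubling_time:.0f} years, "
          f"about {after_50_years:.0f} GtC/yr after 50 years")

At 1.5% per year, emissions roughly double over 50 years, which is the business-as-usual path the original seven wedges were designed to flatten; at 3% they more than quadruple, so far more wedges would be needed.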

(How much, exactly, did the economic recession change this story?)

2. The ocean sink

Each year fossil fuel burning puts about 8 gigatons of carbon in the atmosphere. The ocean absorbs about 2 gigatons and the land absorbs about 2, leaving about 4 gigatons in the atmosphere.
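Here is a minimal sketch of that budget, using the standard conversion factor of roughly 2.1 gigatons of carbon per ppm of atmospheric CO2 (that conversion and the round numbers above are the only inputs; this is my own bookkeeping, not Pacala's):

emissions  = 8.0  # GtC/yr put into the air by fossil fuel burning (rough figure)
ocean_sink = 2.0  # GtC/yr absorbed by the ocean
land_sink  = 2.0  # GtC/yr absorbed by the land

airborne = emissions - ocean_sink - land_sink   # 4 GtC/yr left in the atmosphere
airborne_fraction = airborne / emissions        # 0.5: about half of what we emit stays up
ppm_per_year = airborne / 2.1                   # ~1.9 ppm/yr rise in CO2 concentration

print(airborne, airborne_fraction, round(ppm_per_year, 1))

That 1.9 ppm per year is close to the observed growth rate of atmospheric CO2 in recent years, which is a nice consistency check. The worry in this item and the next is that the two sinks may not keep pace as emissions grow.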

However, as CO2 emissions rise, the oceanic CO2 sink has been growing less than anticipated. This seems to be due to a change in wind patterns, itself a consequence of global warming.

(What’s the latest story here?)

3. The land sink

As the CO2 levels go up, people expected plants to grow better and suck up more CO2. In the third IPCC report, models predicted that by 2050, plants would be drawing down 6 gigatonnes more carbon per year than they do now! The fourth IPCC report was similar.

This is huge: remember that right now we emit about 8 gigatonnes per year. Indeed, this effect, called CO2 fertilization, could be the difference between the land being a big carbon sink and a big carbon source. Why a carbon source? For one thing, without the plants sucking up CO2, temperatures will rise faster, and the Amazon rainforest may start to die, and permafrost in the Arctic may release more greenhouse gases (especially methane) as it melts.

In a simulation run by Pacala, where he deliberately assumed that plants fail to suck up more carbon dioxide, these effects happened and the biosphere dumped a huge amount of extra CO2 into the atmosphere: the equivalent of 26 stabilization wedges.

So, plans based on the IPCC models are essentially counting on plants to save us from ourselves.

But is there any reason to think plants might not suck up CO2 at the predicted rates?

Maybe. First, people have actually grown forests in doubled CO2 conditions to see how much faster plants grow then. But the classic experiment along these lines used young trees. In 2005, Körner et al did an experiment using mature trees… and they didn’t see them growing any faster!

Second, models in the third IPCC report assumed that as plants grew faster, they’d have no trouble getting all the nitrogen they need. But Hungate et al have argued otherwise. On the other hand, Alexander Barron discovered that some tropical plants were unexpectedly good at ramping up the rate at which they fix nitrogen from the atmosphere. But on the third hand, that only applies to the tropics. And on the fourth hand—a complicated problem like this requires one of those Indian gods with lots of hands—nitrogen isn’t the only limiting factor to worry about: there’s also phosphorus, for example.

Pacala goes on and discusses even more complicating factors. But his main point is simple. The details of CO2 fertilization matter a lot. It could make the difference between their original plan being roughly good enough… and being nowhere near good enough!

(What’s the latest story here?)


Energy, the Environment, and What Mathematicians Can Do (Part 2)

20 March, 2011

A couple of days ago I begged for help with a math colloquium talk I’m giving this Wednesday at Hong Kong University.

The response was immediate and wonderfully useful. Thanks, everyone! If my actual audience is as knowledgeable and critical as you folks, I’ll be shocked and delighted.

But I only showed you the first part of the talk… because I hadn’t written the second part yet! And the second part is the hard part: it’s about “what mathematicians can do”.

Here’s a version including the second part:

Energy, the Environment, and What Mathematicians Can Do.

I include just one example of what you’re probably dying to see: a mathematician proving theorems that are relevant to environmental and energy problems. And you’ll notice that this guy is not doing work that will directly help solve these problems.

That’s sort of on purpose: I think we mathematicians sit sort of near the edge of the big conversation about these problems. We do important things, now and then, but their importance tends to be indirect. And I think that’s okay.

But it’s also a bit unsatisfying. What’s your most impressive example of a mathematically exciting result that also directly impacts environmental and energy issues?

I have a bunch of my own examples, but I’d like to hear yours. I want to start creating a list.

(By the way: research is just part of the story! One of the easier ways mathematicians can help save the planet is to teach well. And I do discuss that.)


Mathematics of Planet Earth

20 March, 2011

While struggling to prepare my talk on “what mathematicians can do”, I remembered this website pointed out by Tom Leinster:

Mathematics of Planet Earth 2013.

The idea is to get lots of mathematicians involved in programs on these topics:

• Weather, climate, and environment
• Health, human and social services
• Planetary resources
• Population dynamics, ecology and genomics of species
• Energy utilization and efficiency
• Connecting the planet together
• Geophysical processes
• Global economics, safety and stability

There are already a lot of partner societies (including the American Mathematical Society) and partner institutes. I would love to see more details, but this website seems directed mainly at getting more organizations involved, rather than saying what any of them are going to do.

There is a call for proposals, but it’s a bit sketchy. It says:

A call to join is sent to the planet.

which makes me want to ask “From where?”

(That must be why I’m sitting here blogging instead of heading an institute somewhere. I never fully grew up.)

I guess the details will eventually become clearer. Does anyone know some activities that have been planned?


Energy, the Environment, and What Mathematicians Can Do (Part 1)

18 March, 2011

I’m preparing a talk to give at Hong Kong University next week. It’s only half done, but I could use your feedback on this part while I work on the rest:

Energy, The Environment, and What Mathematicians Can Do.

So far it makes a case for why mathematicians should get involved in these issues… but doesn’t say what they can do to help! That’ll be the second part. So, you’ll just have to bear with the suspense for now.

By the way, all the facts and graphs should have clickable links that lead you to online references. The links aren’t easy to see, but if you hover the cursor over a fact or graph, and click, it should work.


Guess Who Wrote This?

3 March, 2011

Guess who wrote this report. I’ll quote a bunch of it:

The climate change crisis is far from over. The decade 2000-2010 is the hottest ever recorded and data reveals each decade over the last 50 years to be hotter than the previous one. The planet is enduring more and more heat waves and rain levels—high and low—that test the outer bounds of meteorological study.

The failure of the USA, Australia and Japan to implement relevant legislation after the Copenhagen Accord, as well as general global inaction, might lead people to shrug off the climate issue. Many are quick to doubt the science. Amid such ambiguity a discontinuity is building as expert and public opinion diverge.

This divergence is not sustainable!

Society continues to face a dilemma posed here: a failure to reduce emissions now will mean considerably greater cost in the future. But concerted global action is still too far off given the extreme urgency required.

CO2 price transparency needed

Some countries forge ahead with national and local measures but many are moving away from market-based solutions and are punishing traditional energy sources. Cap-and-trade systems risk being discredited. The EU-Emissions Trading System (EU-ETS) has failed to deliver an adequate CO2 price. Industry lobbying for free allowance allocations is driving demands for CO2 taxes to eliminate perceived industry windfalls. In some cases this has led to political stalemate.

The transparency of a CO2 price is central to delivering least-cost emission reductions, but it also contributes to growing political resistance to cap-and-trade systems. Policy makers are looking to instruments – like mandates – where emissions value is opaque. This includes emission performance standards (EPSs) for electricity plants and other large fixed sources. Unfortunately, policies aimed at building renewable energy capacity are also displacing more natural gas than coal where the CO2 price is low or absent. This is counter-productive when it comes to reducing emissions. Sometimes the scale of renewables capacity also imposes very high system costs. At other times, policy support for specific renewables is maintained even after the technology reaches its efficient scale, as is the case in the US.

The recession has raised a significant issue for the EU-ETS: how to design cap-and-trade systems in the face of economic and technological uncertainty? Phase III of the ETS risks delivering a structurally low CO2 price due to the impact of the recession on EU emissions. A balanced resetting of the cap should be considered. It is more credible to introduce a CO2 price floor ahead of such shocks than engage in the ad hoc recalibration of the cap in response to them. This would signal to investors that unexpected shortfalls in emissions would be used in part to step up reductions and reduce uncertainty in investments associated with the CO2 price. This is an important issue for the design of Phase IV of the ETS.

Climate too low a priority

Structural climate policy problems aside, the global recession has moved climate concerns far down the hierarchy of government objectives. The financial crisis and Gulf of Mexico oil spill have also hurt trust in the private sector, spawning tighter regulation and leading to increased risk aversion. This hits funding and political support for new technologies, in particular Carbon Capture and Sequestration (CCS) where industry needs indemnification from some risk. Recent moves by the EU and the US regarding long-term liabilities show this support is far from secured. Government support for technology development may also be hit as they work to cut deficits.

In this environment of policy drift and increasing challenge to market-based solutions, it is important to remain strongly focused on least-cost solutions today and advances in new technologies for the future. Even if more pragmatic policy choices prevail, it is important that they are consistent with, and facilitate the eventual implementation of market-based solutions.

Interdependent ecosystems approach

Global policy around environmental sustainability focuses almost exclusively on climate change and CO2 emissions reduction. But since 2008, an approach which considers interdependent ecosystems has emerged and gradually gained influence.

This approach argues that targeting climate change and CO2 alone is insufficient. The planet is a system of inextricably inter-related environmental processes and each must be managed in balance with the others to sustain stability.

Research published by the Stockholm Resilience Centre in early 2009 consolidates this thinking and proposes a framework based on ‘biophysical environmental subsystems’. The Nine Planetary Boundaries collectively define a safe operating space for humanity where social and economic development does not create lasting and catastrophic environmental change.

According to the framework, planetary boundaries collectively determine ecological stability. So far, limits have been quantified for seven boundaries which, if surpassed, could result in more ecological volatility and potentially disastrous consequences. As Table 1 shows, three boundaries have already been exceeded. Based on current trends, the limits of others are fast approaching.

For the energy industry, CO2 management and reduction is the chief concern and the focus of much research and investment. But the interdependence of the other systems means that if one limit is reached, others come under intense pressure. The climate-change boundary relies on careful management of freshwater, land use, atmospheric aerosol concentration, nitrogen–phosphorus, ocean and stratospheric boundaries. Continuing to pursue an environmental policy centered on climate change will fail to preserve the planet’s environmental stability unless the other defined boundaries are addressed with equal vigour.


Stabilization Wedges (Part 3)

17 December, 2010

I bet you thought I’d never get back to this! Sorry, I like to do lots of things.

Remember the idea: in 2004, Stephen Pacala and Robert Socolow wrote a now-famous paper on how we could hold atmospheric carbon dioxide below 500 parts per million. They said that to do this, it would be enough to find 7 ways to reduce carbon emissions, each one ramping up linearly to the point of reducing carbon emissions by 1 gigaton per year by 2054.
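In case the geometry isn’t obvious: each wedge ramps linearly from zero up to 1 gigaton of carbon per year over 50 years, so the total carbon it keeps out of the atmosphere is the area of a triangle. Spelled out (these are the paper’s own figures, just rearranged):

years = 50       # length of the ramp
final_cut = 1.0  # GtC/yr avoided at the end of the ramp

one_wedge = 0.5 * years * final_cut  # area of the triangle: 25 GtC avoided per wedge
all_seven = 7 * one_wedge            # 175 GtC: the whole 'stabilization triangle'

print(one_wedge, all_seven)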

They called these stabilization wedges, for the obvious reason:

[Figure: the ‘stabilization triangle’ between the rising business-as-usual emissions path and a flat emissions path, sliced into seven wedges.]
Their paper listed 15 of these wedges. The idea here is to go through them and critique them. In Part 1 of this series we talked about four wedges involving increased efficiency and conservation. In Part 2 we covered one about shifting from coal to natural gas, and three about carbon capture and storage.

Now let’s do nuclear power and renewable energy!

9. Nuclear power. As Pacala and Socolow already argued in wedge 5, replacing 700 gigawatts of efficient coal-fired power plants with some carbon-neutral form of power would save us a gigaton of carbon per year. This would require 700 gigawatts of nuclear power plants running at 90% capacity (just as assumed for the coal plants). This means doubling the world production of nuclear power. The global pace of nuclear power plant construction from 1975 to 1990 could do this! So, this is one of the few wedges that doesn’t seem to require heroic technical feats. But of course, there’s still a downside: we can only substantially boost the use of nuclear power if people become confident about all aspects of its safety. (The arithmetic behind the capacity figures in wedges 9, 10 and 11 is spelled out in the sketch after this list.)

10. Wind power. Wind power is intermittent: Pacala and Socolow estimate that the ‘peak’ capacity (the amount you get under ideal circumstances) is about 3 times the ‘baseload’ capacity (the amount you can count on). So, to save a gigaton of carbon per year by replacing 700 gigawatts of coal-fired power plants, we need roughly 2000 gigawatts of peak wind power. Wind power was growing at about 30% per year when they wrote their paper, and it had reached a world total of 40 gigawatts. So, getting to 2000 gigawatts would mean multiplying the world production of wind power by a factor of 50. The wind turbines would “occupy” about 30 million hectares, or about 30-45 square meters per person — some on land and some offshore. But because windmills are widely spaced, land with windmills can have multiple uses.

11. Photovoltaic solar power. This too is intermittent, so to save a gigaton of carbon per year we need 2000 gigawatts of peak photovoltaic solar power to replace coal. Like wind, photovoltaic solar was growing at 30% per year when Pacala and Socolow wrote their paper. However, only 3 gigawatts had been installed worldwide. So, getting to 2000 gigawatts would require multiplying the world production of photovoltaic solar power by a factor of 700. See what I mean about ‘heroic feats’? In terms of land, this would take about 2 million hectares, or 2-3 square meters per person.

12. Renewable hydrogen. You’ve probably heard about hydrogen-powered cars. Of course you’ve got to make the hydrogen. Renewable electricity can produce hydrogen for vehicle fuel. 4000 gigawatts of peak wind power, for example, used in high-efficiency fuel-cell cars, could keep us from burning a gigaton of carbon each year in the form of gasoline or diesel fuel. Unfortunately, this is twice as much wind power as we’d need in wedge 10, where we use wind to eliminate the need for burning some coal. Why? Gasoline and diesel have less carbon per unit of energy than coal does.

13. Biofuels. Fossil-carbon fuels can also be replaced by biofuels such as ethanol. To save a gigaton per year of carbon, we could make 5.4 gigaliters per day of ethanol as a replacement for gasoline — provided the process of making this ethanol didn’t burn fossil fuels! Doing this would require multiplying the world production of bioethanol by a factor of 50. It would require 250 million hectares committed to high-yield plantations, or 250-375 square meters per person. That’s an area equal to about one-sixth of the world’s cropland. An even larger area would be required to the extent that the biofuels require fossil-fuel inputs. Clearly this could cut into the land used for growing food.
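As promised above, here is the arithmetic behind the capacity figures in wedges 9, 10 and 11, in a few lines. The factor of 3 between peak and baseload capacity for intermittent sources is Pacala and Socolow’s rough number, as are the circa-2004 installed capacities of 40 gigawatts of wind and 3 gigawatts of photovoltaics:

coal_displaced = 700.0  # GW of coal-fired plants to displace for one wedge
peak_per_baseload = 3   # peak GW of wind or solar needed per GW of baseload replaced

nuclear_needed = coal_displaced                         # 700 GW, at the same 90% capacity factor as coal
wind_peak_needed = coal_displaced * peak_per_baseload   # ~2000 GW of peak wind capacity
solar_peak_needed = coal_displaced * peak_per_baseload  # ~2000 GW of peak photovoltaic capacity

wind_scaleup = wind_peak_needed / 40.0   # world wind capacity was about 40 GW in 2004
solar_scaleup = solar_peak_needed / 3.0  # world PV capacity was about 3 GW in 2004

print(round(wind_scaleup), round(solar_scaleup))  # roughly 50-fold and 700-fold expansions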

There you go… let me hear your critique! Which of these measures seem best to you? Which seem worst? But more importantly: why?

Remember: it takes a total of 7 wedges to save the world, according to this paper by Pacala and Socolow.

Next time I’ll tell you about the final two stabilization wedges… and then I’ll give you an update on their idea.


Archimede

3 December, 2010

You may have heard the legend of how in 212 BC, Archimedes defended the port city of Syracuse against the invading Romans by setting their ships afire with the help of mirrors that concentrated the sun’s light. It sounds a bit implausible…

However, maybe you’ve heard of the Comte de Buffon — the guy who figured out how to compute the number pi by dropping needles on the floor. According to Michael Lahans, Buffon also did an experiment to see if Archimedes’ idea was practical. He got a lot of mirrors, each 8 × 10 inches in size, adjusted to focus their light at a distance of 150 feet. And according to Lahans:

The array turned out to be a formidable weapon. At 66 feet 40 mirrors ignited a creosoted plank and at 150 feet, 128 mirrors ignited a pine plank instantly. In another experiment 45 mirrors melted six pounds of tin at 20 feet.

Should we believe this? I don’t know. Some calculations could probably settle it. Or you could try the experiment yourself. If you do, tell us how it goes.
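Here is a very rough calculation of my own, just to get the order of magnitude. Everything in it is an assumption: about 700 W/m² of direct sunlight, 90% mirror reflectivity, a spot smeared out by the sun’s half-degree angular size, and an ignition threshold for wood somewhere around 10 to 30 kW/m²:

SUN_ANGLE = 0.0093               # angular diameter of the sun in radians (about half a degree)
SOLAR_FLUX = 700.0               # W/m^2 of direct sunlight (assumed)
REFLECTIVITY = 0.9               # assumed mirror reflectivity
MIRROR_W, MIRROR_H = 0.20, 0.25  # meters, roughly an 8 x 10 inch mirror

def flux_at_target(n_mirrors, distance_m):
    """Rough flux (W/m^2) delivered by n flat mirrors all aimed at one spot."""
    power = n_mirrors * MIRROR_W * MIRROR_H * SOLAR_FLUX * REFLECTIVITY
    blur = distance_m * SUN_ANGLE  # the sun's finite size smears each mirror's image out
    spot_area = (MIRROR_W + blur) * (MIRROR_H + blur)
    return power / spot_area

for n, feet in [(40, 66), (128, 150), (45, 20)]:
    print(f"{n} mirrors at {feet} ft: about {flux_at_target(n, feet * 0.3048) / 1000:.0f} kW/m^2")

That gives fluxes on the order of 7 to 18 kW/m², which sits right around the range where wood can be coaxed into catching fire, especially creosoted wood. So Buffon’s report is at least not absurd, though whether 128 mirrors can really all be aimed at the same small spot is another question. If you redo this with better assumptions, let us know.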

But there are also non-military uses of concentrated solar power. For example, the new power plant named after Archimedes, located in Sicily, fairly near Syracuse:

Archimede website.

Archimede solar power plant, Wikipedia.

It started operations on July 14th of this year. It produces 5 megawatts of electricity, enough for 4,500 families. That’s not much compared to the 1 gigawatt from a typical coal- or gas-powered plant. But it’s an interesting experiment.

It consists of about 50 parabolic trough mirrors, each 100 meters long, with a total area of around 30,000 square meters. They concentrate sunlight onto 5,400 meters of pipe. This pipe carries molten salts — potassium nitrate and sodium nitrate — at a temperature of up to 550 °C. This goes on to produce steam, which powers an electrical generator.

The news is the use of molten salt instead of oil to carry the heat. Molten salt works at higher temperatures than oils, which only go up to about 390 °C. So, the system is more efficient. The higher temperature also lets you use steam turbines of the sort already common in gas-fired power plants. That could make it easier to replace conventional power plants with solar ones.
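A quick sanity check on those numbers, with my own assumption of roughly 1 kilowatt per square meter of direct sunlight at full sun:

mirror_area = 30_000.0    # m^2 of parabolic trough mirrors
direct_sunlight = 1000.0  # W/m^2 at full sun (assumed round number)
electric_output = 5.0e6   # W, the plant's rated electrical output

collected = mirror_area * direct_sunlight      # ~30 MW of sunlight falling on the mirrors at peak
sun_to_electric = electric_output / collected  # ~17% overall peak conversion efficiency
power_per_family = electric_output / 4500      # ~1.1 kW per family, at peak output

print(f"{sun_to_electric:.0%}", f"{power_per_family:.0f} W per family")

About 17% sunlight-to-electricity at peak is plausible for trough-style concentrated solar power. Note that 5 megawatts is a peak rating: averaged over nights and cloudy days, each of those 4,500 families gets considerably less than 1.1 kilowatts.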

The project is being run by Enel, Europe’s third-largest energy provider. It was developed with the help of ENEA, an Italian agency that deals with new technologies, energy and sustainable economic development. At the Guardian, Carlo Ombello writes:

So why hasn’t this technology come before? There are both political and technical issues behind this. Let’s start with politics. The concept dates back to 2001, when Italian nuclear physicist and Nobel prize winner Carlo Rubbia, ENEA’s President at the time, first started Research & Development on molten salt technology in Italy. Rubbia has been a pre-eminent CSP [concentrated solar power] advocate for a long time, and was forced to leave ENEA in 2005 after strong disagreements with the Italian Government and its lack of convincing R&D policies. He then moved to CIEMAT, the Spanish equivalent of ENEA. Under his guidance, Spain has now become world leader in the CSP industry. Luckily for the Italian industry, the Archimede project was not abandoned and ENEA continued its development till completion.

There are also various technical reasons that have prevented an earlier development of this new technology. Salts tend to solidify at temperatures around 220°C, which is a serious issue for the continuous operation of a plant. ENEA and Archimede Solar Energy, a private company focusing on receiver pipes, developed several patents in order to improve the pipes’ ability to absorb heat, and the parabolic mirrors’ reflectivity, therefore maximising the heat transfer to the fluid carrier. The result of these and several other technological improvements is a top-notch world’s first power plant with a price tag of around 60 million euros. It’s a hefty price for a 5 MW power plant, even compared to other CSP plants, but there is overwhelming scope for a massive roll-out of this new technology at utility scale in sunny regions like Northern Africa, the Middle East, Australia, the US.

The last sentence is probably a reference to DESERTEC. We’ll have to talk about that sometime, too.

If you know anything about Archimede, or DESERTEC, or concentrated solar power, or you have any questions, let us know!

