Stabilization Wedges (Part 5)

In 2004, Pacala and Socolow laid out a list of ways we can battle global warming using current technologies. They said that to avoid serious trouble, we need to choose seven ‘stabilization wedges’: that is, seven strategies, each of which ramps up to cut carbon emissions by 1 gigatonne of carbon per year 50 years from now. They listed 15 wedges to choose from, and I’ve told you about them here:

Part 1 – efficiency and conservation.

Part 2 – shifting from coal to natural gas, carbon capture and storage.

Part 3 – nuclear power and renewable energy.

Part 4 – reforestation, good soil management.

According to Pacala:

The message was a very positive one: “gee, we can solve this problem: there are lots of ways to solve it, and lots of ways for the marketplace to solve it.”

I find that interesting, because to me each wedge seems like a gargantuan enterprise—and taken together, they seem like the Seven Labors of Hercules. They’re technically feasible, but who has the stomach for them? I fear things need to get worse before we come to our senses and take action at the scale that’s required.
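
Just to see how gargantuan, here’s a back-of-the-envelope sketch of the arithmetic of a single wedge. The triangular ramp is the shape from Pacala and Socolow’s paper; the code is mine:

```python
# Back-of-the-envelope: the size of one stabilization wedge.
# A wedge ramps linearly from 0 to 1 GtC/yr of avoided emissions over
# 50 years, so the total carbon it avoids is the area of a triangle.

years = 50        # time horizon of the Pacala-Socolow plan
final_rate = 1.0  # GtC/yr avoided at year 50

carbon_per_wedge = 0.5 * years * final_rate  # triangle area
print(f"One wedge avoids {carbon_per_wedge:.0f} GtC over {years} years")
print(f"Seven wedges avoid {7 * carbon_per_wedge:.0f} GtC in total")
# -> 25 GtC per wedge, 175 GtC for all seven
```

So the seven wedges together are supposed to keep 175 gigatonnes of carbon out of the air over five decades.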

Anyway, that’s just me. But three years ago, Pacala publicly reconsidered his ideas for a very different reason. Based on new evidence, he gave a talk at Stanford where he said:

It’s at least possible that we’ve already let this thing go too far, and that the biosphere may start to fall apart on us, even if we do all this. We may have to fall back on some sort of dramatic Plan B. We have to stay vigilant as a species.

You can watch his talk here:

It’s pretty damned interesting: he’s a good speaker.

Here’s a dry summary of a few key points. I won’t try to add caveats: I’m sure he would add some himself in print, but I’d rather keep the message simple. I also won’t try to update his information! Not in this blog entry, anyway. But I’ll ask some questions, and I’ll be delighted if you help me out on those.

Emissions targets

First, Pacala’s review of different carbon emissions targets.

The old scientific view, circa 1998: if we could keep the CO2 from doubling from its preindustrial level of 280 parts per million, that would count as a success. Namely, most of the ‘monsters behind the door’ would not come out: continental ice sheets falling into the sea and swamping coastal cities, the collapse of the Atlantic ocean circulation, a drought in the Sahel region of Africa, etcetera.

Many experts say we’d be lucky to get away with CO2 merely doubling. At current burn rates we’ll double it by 2050, and quadruple it by the end of this century. We’ve got enough fossil fuels to send it to seven times its preindustrial levels.

Doubling it would take us to 560 parts per million. A lot of people think that’s too high to be safe. But going for lower levels gets harder:

• In Pacala and Socolow’s original paper, they talked about keeping CO2 below 500 ppm. This would require keeping CO2 emissions constant until 2050. This could be achieved by a radical decarbonization of the economies of rich countries, while allowing carbon emissions in poor countries to grow almost freely until that time.

• For a long time the IPCC and many organizations advocated keeping CO2 below 450 ppm. This would require cutting CO2 emissions by 50% by 2050, which could be achieved by a radical decarbonization in rich countries, and moderate decarbonization in poor countries.

• But by 2008 the IPCC and many groups wanted a cap of 2°C global warming, or keeping CO2 below 430 ppm. This would mean cutting CO2 emissions by 80% by 2050, which would require a radical decarbonization in both rich and poor countries.

The difference here is what the poor countries have to do. The rich countries need to radically cut carbon emissions in all these scenarios. In the USA, the Lieberman-Warner bill would have forced the complete decarbonization of the economy by 2050.

Then, Pacala spoke about 3 things that make him nervous:

1. Faster emissions growth

A 2007 paper by Canadell et al pointed out that starting in 2000, fossil fuel emissions started growing at 3% per year instead of the earlier figure of 1.5%. This could be due to China’s industrialization. Will this keep up in years to come? If so, the original Pacala-Socolow plan won’t work.
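
To see how much that difference in growth rates matters, here’s a minimal compound-growth sketch. The 8 GtC/yr starting figure is the rough emissions rate quoted in the next section:

```python
# Compound emissions growth: 1.5%/yr (pre-2000) vs 3%/yr (post-2000).
base = 8.0  # GtC/yr, rough current fossil-fuel emissions (see below)

for rate in (0.015, 0.03):
    after_50_years = base * (1 + rate) ** 50
    print(f"{rate:.1%}/yr: {base:.0f} GtC/yr grows to {after_50_years:.0f} GtC/yr")
# 1.5%/yr roughly doubles emissions in 50 years;
# 3%/yr more than quadruples them.
```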

(How much, exactly, did the economic recession change this story?)

2. The ocean sink

Each year fossil fuel burning puts about 8 gigatonnes of carbon into the atmosphere. The ocean absorbs about 2 gigatonnes and the land absorbs about 2, leaving about 4 gigatonnes in the atmosphere.
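
As a sanity check, here’s that bookkeeping in code, together with the standard conversion factor of about 2.13 gigatonnes of carbon per ppm of CO2 (a textbook figure, not something from Pacala’s talk):

```python
# Rough annual carbon budget (GtC/yr), as quoted above.
emissions  = 8.0  # fossil fuel burning
ocean_sink = 2.0  # absorbed by the ocean
land_sink  = 2.0  # absorbed by land plants and soils
airborne   = emissions - ocean_sink - land_sink  # stays in the atmosphere

GTC_PER_PPM = 2.13  # ~2.13 GtC of airborne carbon raises CO2 by 1 ppm

print(f"Airborne fraction: {airborne / emissions:.0%}")
print(f"CO2 rise: about {airborne / GTC_PER_PPM:.1f} ppm/year")
# -> airborne fraction 50%, CO2 rising ~1.9 ppm/yr,
#    close to the observed rise of roughly 2 ppm per year
```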

However, as CO2 emissions rise, the oceanic CO2 sink has been growing less than anticipated. This seems to be due to a change in wind patterns, itself a consequence of global warming.

(What’s the latest story here?)

3. The land sink

As the CO2 levels go up, people expected plants to grow better and suck up more CO2. In the third IPCC report, models predicted that by 2050, plants would be drawing down 6 gigatonnes more carbon per year than they do now! The fourth IPCC report was similar.

This is huge: remember that right now we emit about 8 gigatonnes per year. Indeed, this effect, called CO2 fertilization, could be the difference between the land being a big carbon sink and a big carbon source. Why a carbon source? For one thing, without the plants sucking up CO2, temperatures will rise faster, and the Amazon rainforest may start to die, and permafrost in the Arctic may release more greenhouse gases (especially methane) as it melts.

In a simulation run by Pacala, where he deliberately assumed that plants fail to suck up more carbon dioxide, these effects happened and the biosphere dumped a huge amount of extra CO2 into the atmosphere: the equivalent of 26 stabilization wedges.

So, plans based on the IPCC models are essentially counting on plants to save us from ourselves.

But is there any reason to think plants might not suck up CO2 at the predicted rates?

Maybe. First, people have actually grown forests in doubled-CO2 conditions to see how much faster plants grow under those circumstances. But the classic experiment along these lines used young trees. In 2005, Körner et al did an experiment using mature trees… and they didn’t see them growing any faster!

Second, models in the third IPCC report assumed that as plants grew faster, they’d have no trouble getting all the nitrogen they need. But Hungate et al have argued otherwise. On the other hand, Alexander Barron discovered that some tropical plants were unexpectedly good at ramping up the rate at which they grab ahold of nitrogen from the atmosphere. But on the third hand, that only applies to the tropics. And on the fourth hand—a complicated problem like this requires one of those Indian gods with lots of hands—nitrogen isn’t the only limiting factor to worry about: there’s also phosphorus, for example.

Pacala goes on and discusses even more complicating factors. But his main point is simple. The details of CO2 fertilization matter a lot. It could make the difference between their original plan being roughly good enough… and being nowhere near good enough!

(What’s the latest story here?)

42 Responses to Stabilization Wedges (Part 5)

  1. Nathan Urban says:

    The recession did cut into the global emissions growth rate, as described in Friedlingstein et al. (2010) (considering emissions through 2009). The 3%/year acceleration is largely due to China, from what I’ve read. They can’t keep that growth up forever, but they don’t have to in order to require more wedges.

    The story on the ocean carbon sink, particularly the wind-driven mixing in the Southern Ocean, is still ambiguous, as is the story on terrestrial CO2 fertilization. For the ocean sink, you could start with Le Quéré et al. (2007) and scan forward through the papers that cite it. I haven’t yet sorted out the CO2 fertilization debate for myself.

    By the way, your last link points to co2science.org (also here), which is one of those slick climate-skeptic dissemination organizations you sometimes hear about. Of course the papers they cite speak for themselves, but you can’t always trust the interpretation (or selection of papers) chosen by that site to be impartial. I can’t really comment on this further since, as I said, I haven’t managed to sort out the literature for myself in this area.

    • John Baez says:

      Whoops! I was actually suspicious of that ‘co2science.org’ site, because I’d never heard of it, and because of the sidebar asking ‘do plants like more CO2?’. But their mission statement did not seem suspicious to me, so with some mild misgivings I linked to their page.

      I’ve changed the link to one describing Alexander Barron’s research – he’s the guy whose work on nitrogen-fixing tropical plants Pacala mentioned. I can’t instantly figure out what paper Pacala was talking about.

  2. I still think the ‘wedges’ idea is a great way to get people to think about the problem, as it decomposes a seemingly intractable challenge into a smaller number of difficult, but doable challenges. The idea that the wedges start small and build year by year is particularly attractive.

    The problem now is that we seem to need about twice as many wedges as Pacala and Socolow originally envisaged. So, exercises based on choosing wedges (such as this one) now seem misguided – the problem isn’t to select wedges, it’s how to implement all the ones that have been identified so far, and how to think up some more to use as backup.

    The other problem is that none of the wedges are really unitary – all of them could be scaled up or down to some extent to give a larger (or smaller) contribution. So some of the discussion ought to be on what the limiting factors are that might prevent various wedges from being expanded.

    All of this pre-supposes that we can get to the point where political leaders even understand the urgency of the problem. If we can solve *that* problem, implementing the wedges will probably seem easy by comparison.

    • John Baez says:

      Great to see you here, Steve!

      I agree that our focus should not be on choosing wedges, as if choosing among items on a menu: what would taste better, a slice of pumpkin pie or a slice of cherry pie? A better rough rule of thumb would be: we need to slow the carbon dioxide buildup every way we can, as much as we can, as fast as we can.

      The image of a ‘menu’ is especially destructive when it leads proponents of different promising strategies to battle with each other instead of the common enemy. Here I’m especially thinking about solar versus nuclear.

      The main good thing about this wedge business is that it starts people thinking quantitatively about how much we need to do and what various actions would accomplish. I’ve been working through the Pacala-Socolow paper as a way to start building up a repository of numbers and facts on the Azimuth Wiki.

      All of this pre-supposes that we can get to the point where political leaders even understand the urgency of the problem. If we can solve *that* problem, implementing the wedges will probably seem easy by comparison.

      Right.

  3. Phil Henshaw says:

    There’s a long list of problems with this strategy, but the main one is that we have yet to make sense of the main purpose, which as it stands is ill-conceived.

    The purpose of preventing climate change is, effectively, to continue doubling all our other environmental impacts every 20 years or so… There is just a huge long list of other systemic threats from other environmental impacts. Perhaps the biggest “sleeper” in the bunch is that the world economy has started responding over the past 10 years as if the rapidly growing resource demand cannot be met by increasing supply. The food and fuel resource markets are all panicking, *together*, demonstrating a systemic global market response to being pushed beyond their point of adaptive resilience, and reacting rigidly to demand by escalating price instead of creating supply… “A decisive moment for Investing in Sustainability”

    That’s a really serious unacknowledged problem like the 50 other really serious unacknowledged problems we should be dealing with but generally are not. If we don’t solve them all together, we’re just making them all worse, is the combined problem.

    Hope that makes one or the other of those understandable. Ask for clarification if not.

    • John Baez says:

      Phil wrote:

      The purpose of preventing climate change is, effectively, to continue doubling all our other environmental impacts every 20 years or so…

      Who thinks that? I suppose plenty of people implicitly expect exponential economic growth, but surely many people who think seriously about the future must hope that:

      1) world population peaks around 2050 and then drops down to some considerably lower level, perhaps less than a billion.

      2) energy production shifts from fossil fuels to renewables, with nuclear serving as a bridge.

      3) energy consumption per capita does not continue to increase but instead converges to some sustainable level.

      4) similarly, other use of resources converges to some sustainable level.

      This is what I hope, anyway. People who want to use a lot more energy and other stuff should leave this planet. Some actually will, I hope. They may mess up the rest of the universe, but I hope some planets including Earth stay nice for a long time.

      • Phil Henshaw says:

        You’re overlooking economic growth, and that the precondition for climate mitigation, and other environmental action, is to not reduce the rate of growth appreciably. The problem is that to stabilize our financial system, compound growth in resource use is absolutely necessary, and that’s why it’s a precondition for financing climate mitigation. That’s also why climate mitigation will perpetuate the compound growth in all other impacts.

        Lots of people fictionalize perpetual motion and multiplying machines of all kinds, but the data and thermodynamics both point to real financial earnings and real resource use being connected.

        If the world economy doubles its real product every 20 years or so, it is then necessarily also doubling its impacts. One could imagine the impacts growing more slowly, but they would also hit more and more sensitive environments, etc. Even applying every trick you can pull, though, it won’t change the best straight-line approximation to the scale of impacts being as close to a vertical line as you can draw, starting at the time you ask the question…

        That’s the math, right?

      • John Baez says:

        Phil wrote:

        You’re overlooking economic growth…

        No, I’m not. The idea of 1)-4) is to move towards a steady state of resource consumption, where resources are consumed no faster than the Earth produces them. This means the end of economic growth as we know it, which means of course the end of the financial-economic system as we know it.

        It might not spell the end to growth of some subtler kind, but economic growth as we know it involves increased use of resources.

        • Phil Henshaw says:

          John, I think it’s that you consider the economy to be a collection of trends, like data, rather than a collection of mechanisms that operate on their own. That, or something like it, keeps you from identifying the right mechanism that drives growth and makes it necessary for financial stability. It’s not just a social belief that drives it: it’s built into the rules of finance, and the defined purposes of regulation. They are arranged to direct the management of the physical world to allow the financial world to produce continually multiplying real returns for money saved in the past.

          When you start studying the active mechanisms you start finding what really must change in our rules, which then guides you to discovering what really can change (because the necessary change is a real problem). That would be real system steering. Just using social policy to favor desirable trends, while part of it of course, will have quite the opposite of the intended effect by itself, is the catch.

      • DavidTweed says:

        John wrote

        I suppose plenty of people implicitly expect exponential economic growth, but surely many people who think seriously about the future…

        The impression that I have is that a sizeable proportion of people who think seriously about the future expect technological developments that will change things in ways that allow economic growth to continue, e.g., some energy breakthrough that will make currently non-viable recycling viable, etc. The biggest issue is that a lot of the people who say that aren’t actively trying to create these amazing new technologies; they expect “human ingenuity”, which presumably means “other people”, to do that.

        • Phil Henshaw says:

          David, Yes indeed, a majority of the well paid “problem solvers” are only solving them hypothetically!! …and not paying attention as the earth is making every single step more and more costly and complex, the way nature signals the limits of things…

          What we have is a shortage of “problem finders” for discovering what problems are not worth trying to solve, and a waste of time and money taken from others that are!

          This article of mine will appear in the UK next week I’m promised. http://www.synapse9.com/pub/ASustInvestMoment.pdf

  4. Frederik De Roo says:

    Somewhat off topic:

    Hercules did twelve labours.

  5. jack fuller says:

    Interesting introduction to the recent PBS NOVA program and Pacala interview illustrating the modern human reversal of the original vegetative scrubbing of primeval CO2. And now we refuse to ‘let sleeping carbon lie’.

    The nuclear option presentation seemed to me a bit iffy. From what one hears about all the underground facilities these days, what might we think about constructing all future nuclear reactor vessels underground?

  6. In the third IPCC report, models predicted that by 2050, plants will be drawing down 6 gigatonnes more carbon per year than they do now! The fourth IPCC report was similar.

    That would be a 10% increase! (According to this, global terrestrial NPP is 48–69 Gt.)

    Currently it looks like we can completely forget about this:

    From satellite observations, Zhao & Running (2010) estimate a 0.55 Gt (ca. 1%) decline in global terrestrial NPP (net primary production) from 2000 to 2009. Between 1982 and 1999 the increase was up to 6%.

    3 other observations:

    • Shilong Piao, Xuhui Wang, Philippe Ciais, Biao Zhu, Tao Wang, and Jie Liu, Changes in satellite-derived vegetation growth trend in temperate and boreal Eurasia from 1982 to 2006, Global Change Biology, preview 31 March 2011.

    Abstract (…) although a statistically significant positive trend of average growing season NDVI is observed (0.5 × 10⁻³ year⁻¹, P = 0.03) during the entire study period, there are two distinct periods with opposite trends in growing season NDVI. Growing season NDVI has first significantly increased from 1982 to 1997 (1.8 × 10⁻³ year⁻¹, P < 0.001), and then decreased from 1997 to 2006 (−1.3 × 10⁻³ year⁻¹, P = 0.055). (…)

    • NASA Earth Observatory, April 22, 2006: Northern Forest Affected by Global Warming.

    reporting on:

    Scott J. Goetz, Andrew G. Bunn, Gregory J. Fiske, R. A. Houghton, Satellite-observed photosynthetic trends across boreal North America associated with climate and fire disturbance, PNAS September 20, 2005 vol. 102.

    For parts of the region, growth has not changed (gray), but in interior Alaska and a wide swath of Canada, growth has declined (brown). Only in the far north, regions of tundra, has growth increased (green).

    • Jofre Carnicer, Marta Coll, Miquel Ninyerola, Xavier Pons, Gerardo Sánchez, Josep Peñuelas, Widespread crown condition decline, food web disruption, and amplified tree mortality with increased climate change-type drought, PNAS January 25, 2011 vol. 108.

    Abstract. Climate change is progressively increasing severe drought events in the Northern Hemisphere, causing regional tree die-off events and contributing to the global reduction of the carbon sink efficiency of forests. (…) Here we report a generalized increase in crown defoliation in southern European forests occurring during 1987–2007. Forest tree species have consistently and significantly altered their crown leaf structures, with increased percentages of defoliation in the drier parts of their distributions in response to increased water deficit. We assessed (…) Our results reveal a complex geographical mosaic of species-specific responses to climate change–driven drought pressures on the Iberian Peninsula, with an overwhelmingly predominant trend toward increased drought damage.

    • John Baez says:

      Thanks for your detailed comment, Florifulgurator!

      This passage was a bit confusing to me:

      Currently it looks like we can completely forget about this:

      From satellite observations, Zhao & Running (2010) estimate a 0.55 Gt (ca. 1%) decline in global terrestrial NPP from 2000 to 2009. Between 1982 and 1999 the increase was up to 6%.

      Forget about what? Forget about the decline mentioned here, forget about the increase mentioned here, or…?

      I actually guess you meant “forget about the massive increase in net primary production predicted by the IPCC report”. Is that what you meant?

      I didn’t know what “NPP” meant, so I added a link to the Wikipedia article. For those too lazy to click:

      Gross primary production (GPP) is the rate at which an ecosystem’s producers capture and store a given amount of chemical energy as biomass in a given length of time. Some fraction of this fixed energy is used by primary producers for cellular respiration and maintenance of existing tissues (i.e., “growth respiration” and “maintenance respiration”). The remaining fixed energy (i.e., mass of photosynthate) is referred to as net primary production (NPP).

      NPP = GPP – respiration [by plants]

      Net primary production is the rate at which all the plants in an ecosystem produce net useful chemical energy; it is equal to the difference between the rate at which the plants in an ecosystem produce useful chemical energy (GPP) and the rate at which they use some of that energy during respiration. Some net primary production goes toward growth and reproduction of primary producers, while some is consumed by herbivores.

      Both gross and net primary production are in units of mass / area / time. In terrestrial ecosystems, mass of carbon per unit area per year (g C/m²/yr) is most often used as the unit of measurement.

    • Florifulgurator says:

      If these hints weren’t enough, here’s another hint from paleoclimatology that we can quite possibly forget about the plant fertilization effect:

      • Gabriel J. Bowen, James C. Zachos. Rapid carbon sequestration at the termination of the Palaeocene–Eocene Thermal Maximum. Nature Geoscience 3 (2010), 866–869

      From an interview with Bowen in Science Daily:

      “At the beginning of the event we see a shift indicating that a lot of organic-derived carbon dioxide had been added to the atmosphere, and at the end of the event we see a shift indicating that a lot of carbon dioxide was taken up as organic carbon and thus removed from the atmosphere.”

      “Expansion of the biosphere is one plausible mechanism for the rapid recovery, but in order to take up this much carbon in forests and soils there must have first been a massive depletion of these carbon stocks,” he said. “We don’t currently know where all the carbon that caused this event came from, and our results suggest the troubling possibility that widespread decay or burning of large parts of the continental biosphere may have been involved.”

      (My emph.)

      (Thanks to commenter AR18 for the link. Contrary to his “skeptic” view, the “rapid recovery” is no excuse, since it happened on a timescale of tens of thousands of years: much longer than the entire Holocene, the epoch we just messed up.)

  7. The other article is gone with the internet winds. I posted it on the forum.

    • John Baez says:

      Please don’t keep reposting stuff – an email to me and I can easily dig it out of the spam box. When you post several slightly different versions of the same thing, it’s more work for you and me: I have to look at every one and try to figure out which one(s) to post.

      • Did you get several re-posts in your folder? I tried on my wordpress blog (where I have a test thread) – but nothing in the spam folder. Actually at the end my comment surfaced on my test blog – but not here.

        • John Baez says:

          Yes, I got a pile of re-posts in my spam folder. I suspect that everything that didn’t appear on my blog appeared in my spam folder. By now I have gotten rid of everything except your first attempt. I tried to make it a bit prettier.

          I don’t know why your test blog would act differently from my blog here!

  8. Florifulgurator says:

    Sorry for the mess. My spam folder never caught anything.

    Thanks for making stuff prettier!

    I don’t know why your test blog would act differently from my blog here!

    Well, that’s how today’s computers work: Accidentally. Programmers are not mathematicians (except exceptions). They don’t need proofs. If it works, it works. Why it works nobody cares.

  9. John F says:

    Plan B has to involve industrial consumption of atmospheric CO2. It is currently feasible to produce nontoxic polycarbonate materials from atmospheric CO2. Scaling it up to a gigatonne per year would mainly require finding a market for all that plastic.

    • John Baez says:

      Just out of curiosity, how many tonnes of plastic are produced each year now?

      What would the effect be on atmospheric CO2 if I were elected king of the planet and decreed that henceforth all plastics were to be made of recycled CO2—assuming for a second that this were possible?

      • John F says:

        I think the total production is about 1 gigatonne, but China’s numbers are all over the place.

        If we used CO2-recycled products everywhere, even for durable items – furniture, construction materials (people who live in plastic houses), etc. – I would assume multiple gigatonnes could be used. But still not enough by itself.

    • Florifulgurator says:

      My Plan C involves manual sequestration of atmospheric CO2. The plan is robust against civilization collapse, since the basic technique is pure stone age (to the insult of Homo Sapiens Colossus). (However, it is not robust against the three mind poisons, so I remain pessimistic.) Scaling it up to a gigatonne per year would mainly require the will of Homo Sapiens to survive and not starve by the billions. The “market” for CO2 (plus nitrogen) fixed in charcoal is agriculture.

      • John F says:

        I think so. There is no reason not to mix char into soil, and done properly it supposedly can sequester even more carbon by humification.

  10. AR18 says:

    How about doing nothing? The IPCC forecasts for the end-of-the-world nonsense assume that CO2 will continue to escalate at current rates, but that in turn requires the blind-faith assumption that humans will continue to burn fossil fuels at the same escalating rate they are now — which is patently absurd since we are now nearing peak oil output and the rate will decrease, not increase.

    Then we all need to remember how NASA told us to fear for our lives for Global Cooling in the 70’s, but when they had “more and better data” changed their minds and told us to fear for our lives for Global Warming. I don’t believe we are headed for a world wide end-of-the-world disaster. Those stories are as credible as “I’ve seen a UFO from outer space” stories.

    “[The current anthropomorphic global warming nonsense is based on] inherently untrustworthy climate models, similar to those that cannot accurately forecast the weather a week from now” (Dr. Richard Lindzen)

    The Earth recovered from prehistoric global warming episodes, very similar to today’s situation, much faster than today’s climate models represent. See http://www.sciencedaily.com/releases/2011/04/110421151919.htm or “Rapid carbon sequestration at the termination of the Palaeocene–Eocene Thermal Maximum”.

    We just exited the Little Ice Age, so would you expect the temperatures to fall or rise after a Little Ice Age? You can’t credibly claim that all the warming since the Little Ice Age ended is solely due to human influence.

    Unless these people know what causes Ice Ages and can reliably predict the climate more than two days in advance, don’t meddle with something you don’t understand because you can only make things worse.

    • DavidTweed says:

      I presume you’re actually aware both that oil is not the only fossil fuel and that the rate of decline in output of oil is likely to be relatively slow for the next decade or two? (The effect of oil decline may be greater than the production decline rate, because there’s now both much greater purchasing capacity and demand for oil in China, India and the Middle East, so westerners with a sense of their “entitled” level of consumption may need to adjust their views.)

      • Eric says:

        the rate of decline in output of oil is likely to be relatively slow for the next decade or two

        This is an important and underestimated point. When I mention “peak oil” in a room of finance professionals, I often need to spend the next 5-10 minutes clearing up this misperception. It’s not like anyone is saying oil is going to suddenly run out like a tap switching off. The last barrel of oil will never be extracted because the costs will be so high that people will have moved away from oil long before that happens.

        • Phil Henshaw says:

          Eric & Dave, with still growing demand in many places, though,… gradual supply reduction creates market shocks that are severe. Continued increasing demand from some will need to be supplied by raising the price to levels that are prohibitive for others, is the hitch. That’s the characteristic of “systemic demand exceeding supply”.

          It’s the real world look of “the big crunch”, I’m afraid, that so many people have been predicting for so long, but didn’t understand what it would look like when it came.

        • Eric says:

          Hi Phil,

          I agree with you 100%. That is why I talk about it to rooms full of financial professionals. My current occupation is risk manager for a large financial services firm.

          You do not need oil to run out in order to have macroeconomic consequences. The second derivative going to zero will be bad enough.

        • Phil Henshaw says:

          Eric, I’m glad to meet someone who recognizes that, and might understand why I’ve been jumping around pointing to those inflection points in all kinds of things. They are indeed when profit principles reverse sign for a great many investment strategies. There’s the inflection point in oil reserve discoveries in the 1950’s, for example… It should be profitable to help people see them and understand what to do, certainly, and I’d like to help. From a scientific approach it comes with expanding the language of science to include what business people see in the world… and that’s kind of tough. It feels like persuading both that we live in a natural world not their imagined theoretical one, and having to drag them kicking and screaming!! ;-)

    • Nathan Urban says:

      AR18: I don’t find inflammatory language like “nonsense”, “blind faith”, “patently absurd”, to be very helpful, nor your past comments such as how people who work on AI are “the stupidest people in the world”, “no one here understands elementary probability”, etc.

      “[peak oil]”

      As David pointed out, conventional oil is not the only fossil fuel. There is quite a bit of coal, as well as nonconventional oil in shales and tar sands. How close we are to peak oil, let alone peak fossil fuel production, is the subject of much debate. It certainly is not a foregone conclusion, as you seem to assume, that fossil fuel production will peak soon at some level near today’s production.

      “[NASA and global cooling in the 1970s]”

      NASA, the organization, never advanced a public position on global cooling in the 1970s. To my knowledge, neither did any individual NASA researchers (who of course do not speak for NASA). Indeed, the most prominent NASA climate scientist at that time, James Hansen, predicted the opposite.

      Setting NASA aside, while global cooling gained media attention in the 1970s, the idea of imminent cooling (as opposed to long-term glaciation over thousands or tens of thousands of years) never reached consensus in the peer-reviewed literature. Again, the literature is close to the opposite of that position (see here).

      “I don’t believe we are headed for a world wide end-of-the-world disaster.”

      What, exactly, constitutes a “world wide end-of-the-world disaster”, and who is predicting it? The IPCC, for example, is not predicting extinction of the human race or anything like that.

      “`[The current anthropomorphic global warming nonsense is based on] inherently untrustworthy climate models, similar to those that cannot accurately forecast the weather a week from now’ (Dr. Richard Lindzen)”

      I am sure Lindzen does not appreciate having elided text inserted into his quotes, especially ones that can’t even spell “anthropogenic” correctly. As for the actual argument, chaos theory places limits on how far ahead the state of the system can be predicted. However, climate prediction does not attempt to predict the state of the system, but rather statistics such as averages, which are governed more by energy balance considerations.

      “The Earth recovered from prehistoric global warming episodes, very similar to today’s situation, much faster than today’s climate models represent.”

      An atmospheric carbon half-life of 30,000-40,000 years is not really that rapid as far as impacts on a human timescale are concerned, nor would the policy-relevant impacts be much different if the half-life were, say, 120,000 years. Indeed, a 30,000 year half-life is an argument for the risks of elevated CO2, not an argument against it.

      “You can’t credibly claim that all the warming since the Little Ice Age ended is solely due to human influence.”

      Who claims that? Certainly not the IPCC. The IPCC merely claims that “most” (more than half) of the warming “since the mid-20th century” is due to human influence.

      I’m not sure why you feel the need to bring up a series of strawman arguments that no one here has advanced.

      “Unless these people know what causes Ice Ages and can reliably predict the climate more than two days in advance, don’t meddle with something you don’t understand because you can only make things worse.”

      Here you continue to confuse weather and climate prediction. As for ice ages, we know quite a bit about what causes them, but the fact that they are an ongoing subject of research does not imply that we don’t know anything about what the future climate is likely to look like.

      Finally, your claim “Don’t meddle with something you don’t understand because you can only make things worse” is not logically supported, since if we don’t understand the implications, “meddling” can a priori make things either worse or better.

      But the precautionary principle has some validity to it, which is precisely why it is unwise for us to massively increase the CO2 content of the atmosphere. Human civilization has existed within a fairly narrow range of CO2 levels, and we know that civilization can thrive in that range. We don’t know whether civilization can thrive outside that range, so how can we say it’s safe to conduct that experiment? Especially when there are many reasons to believe that the resulting climate changes will be nontrivial on the scale of human experience.

    • John Baez says:

      AR18 wrote:

      The IPCC forecasts for the end-of-the-world nonsense assume that CO2 will continue to escalate at current rates, but that in turn requires the blind-faith assumption that humans will continue to burn fossil fuels at the same escalating rate they are now…

      That’s false. The IPCC reports don’t make a “blind-faith assumption that humans will continue to burn fossil fuels at the same escalating rate they are now”. They consider a variety of scenarios, all of which take into account the limited reserves of fossil fuels, and all of which assume an increasing usage of carbon-free energy. Here’s what you want to read:

      • IPCC, Special Report on Emissions Scenarios (SRES), 2000.

      This describes the four families of scenarios, commonly called ‘SRES scenarios’, used in the fourth IPCC report (the most recent one). You probably want to focus on Section 3.4.3: Energy Resources.

      — which is patently absurd since we are now nearing peak oil output and the rate will decrease, not increase.

      It’s not clear why you focus on oil, because there’s a lot more carbon in the form of coal: roughly an order of magnitude more. If you look at Section 3.4.3: Energy Resources, you’ll see that as far as oil goes, they say:

      In terms of exploration, the oil industry is relatively mature and the quantity of additional reserves that remain to be discovered is unclear. One group argues that few new oil fields are being discovered, despite the surge in drilling activity from 1978 to 1986, and that most of the increases in reserves results from revisions of underestimated existing reserves (Ivanhoe and Leckie, 1993; Laherrere, 1994; Campbell, 1997; Hatfield, 1997). Laherrere (1994) puts ultimately recoverable oil resources at about 10 ZJ (1800 billion barrels), including production to date. Adelman and Lynch (1997), while accepting some aspects in the propositions behind the pessimistic view of reserves, point to previous pessimistic estimates that have been wrong. They argue that “there are huge amounts of hydrocarbons in the earth’s crust” and that “estimates of declining reserves and production are incurably wrong because they treat as a quantity what is really a dynamic process driven by growing knowledge.” Smith and Robinson (1997) note improvements in technology, such as 3D seismic surveys and extended reach (e.g. horizontal) drilling, that have improved recovery rates from existing reservoirs and made profitable the development of fields previously regarded as uneconomic. Both of these increase reserves and lower costs. The various arguments and assessments are reviewed in greater detail in Gregory and Rogner (1998). To include all these views and to reflect uncertainty, future reserves availability cannot be represented by single numbers. Instead, a range of values that reflect the optimistic and pessimistic assumptions on extent and success rates of exploration activities, as well as the future evolution of prices and technology, needs to be considered for a scenario approach. To this end, the estimates of Masters et al. (1994) reflect the current state of knowledge as to the uncertainties in future potentials for conventional oil resources. These estimates assess conventional oil reserves at slightly above 6 ZJ, and a corresponding range of additionally recoverable resources between 1.6 and 5.9 ZJ.

      A ‘ZJ’ is a zettajoule, i.e. 10²¹ joules. To convert this into more familiar units, I’ll guess (with help) that a barrel of fuel oil yields about 6.4 gigajoules of energy when burnt. So, they were assuming reserves of about 1 trillion barrels of oil, plus a bunch of ‘additionally recoverable resources’. This looks about right, though a little low compared to 2011 Wikipedia data taken from the Energy Information Administration.
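
      Spelling that conversion out, with my guessed 6.4 gigajoules per barrel as the only input:

      ```python
      # Converting the IPCC's zettajoules into barrels of oil.
      ZJ = 1e21              # joules per zettajoule
      GJ_PER_BARREL = 6.4e9  # my guessed energy content of a barrel of fuel oil, in joules

      reserves_zj = 6.0      # Masters et al. estimate of conventional oil reserves, in ZJ
      barrels = reserves_zj * ZJ / GJ_PER_BARREL
      print(f"{reserves_zj} ZJ is about {barrels / 1e12:.2f} trillion barrels")
      # -> 0.94 trillion barrels, i.e. roughly the 1 trillion quoted above
      ```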

      But note: this pales in comparison to coal reserves! The IPCC writes:

      Coal reserves are different in character to oil and gas – coal occurs in seams, often covers large areas, and relatively limited exploration is required to provide a reasonable estimate of coal in place. Total coal in place is estimated at about 220-280 ZJ (WEC, 1995a; Rogner, 1996; 1997; Gregory and Rogner, 1998). Of this total, about 22.9 ZJ are classified as recoverable reserves (WEC, 1995a; 1998), over 200 times current production levels. The question is the extent to which additional resources can be upgraded to reserves. WEC (1995a; 1998) estimates additional recoverable reserves at about 80 ZJ, although it is not clear under what conditions these reserves would become economically attractive. Over 90% of their estimate of total reserves occur in just six countries, with 70% in the Russian Federation alone. Further coal resources are known to exist in various countries, some of which might be exploitable in the future, perhaps at high cost. However, in some countries the environmental damage from coal mining will prevent possible additional reserves being developed. In the IPCC WGII SAR, Nakicenovic et al (1996) estimate that, in addition to today’s reserves, a further 89 ZJ could, at least in principle, be mined with technological advances, a figure in agreement with the WEC (1998) estimates.

      So, maybe 22 zettajoules of reserves and 89 zettajoules that could be mined with technological advances. The figures on reserves here are roughly in line with 2011 Wikipedia data.

      In short: nothing makes me think the IPCC is making a “blind-faith assumption” that we’ll burn carbon at rates beyond what reserves actually allow.

      I think Nathan handled your other points pretty well. I find them equally unconvincing.

    • Web Hub Tel says:

      Let’s use the power of the minds here on Azimuth to attack this from a different perspective. Besides our knowledge of CO2 as a greenhouse gas, the unusual feature of CO2 is its long residence time in the atmosphere. In a way it is so persistent as to appear inert over many decades. The time constant for CO2 sequestering has both a short period and a long period, very characteristic of a reaction controlled by a potentially fractal process.

      Now I can ask, how does this square with the excellent set of posts that John presented on modeling via Petri Nets? With that kind of model, we are dealing with homogeneous reaction kinetics, and assume uniform mixing so we can solve the equations in a more straightforward manner. But with huge amounts of disorder in the mix, what do we do? I would suggest applying a simple first-order reaction kinetics model but then using something like superstatistics (à la C. Beck et al) and integrating over all possible reaction pathways to model the actual CO2 residence time.

      What you will find is that superstatistics will in fact generate these “fat-tail” residence time curves that atmospheric CO2 demonstrates. The tails drop off as 1/sqrt(time), so it is just a matter of coming up with a reasonable model for the superstatistics, whether it is driven by some sort of disordered random walk or other diffusion barrier. (The same thing happens with the heat decay from radioactive waste dumps, where the disorder is in the half-lives of radioactive species.)
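
      Here’s a minimal numerical sketch of the idea. The gamma distribution of rate constants is one illustrative choice (mine, not necessarily Beck’s); averaging first-order decays exp(−kt) over it reproduces the 1/sqrt(time) tail exactly:

      ```python
      import numpy as np

      # Superstatistics sketch: average first-order decays exp(-k*t) over a
      # broad distribution of rate constants k (here a gamma with shape 1/2,
      # an illustrative choice). Analytically the average is (1 + k0*t)**-0.5,
      # which falls off like 1/sqrt(t): a fat tail no single exponential gives.
      rng = np.random.default_rng(0)
      k0 = 0.1                                  # illustrative rate scale, 1/yr
      k = rng.gamma(shape=0.5, scale=k0, size=100_000)

      t = np.array([1.0, 10.0, 100.0, 1000.0])  # years
      simulated = np.exp(-np.outer(t, k)).mean(axis=1)
      analytic = (1 + k0 * t) ** -0.5

      for ti, s, a in zip(t, simulated, analytic):
          print(f"t = {ti:6.0f} yr: simulated {s:.4f}, analytic {a:.4f}")
      # The Monte Carlo average matches the closed form, and both decay
      # like 1/sqrt(t) at large t.
      ```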

      I only suggest that it is a perfect applied math (eek!) problem for this crowd to think about. I have already given it a go, but would be interested to see what other people think. I challenge AR18 to also get involved in this, because it is a very objective question, completely divorced from any kind of agenda that you might imagine to be occurring.

  11. I did a rough calculation of the amount of carbon if we burn through all the remaining estimated coal, oil and natural gas reserves. There’s about a trillion tonnes of carbon, assuming no major new discoveries:
    http://www.easterbrook.ca/steve/?p=977
    And as fuel prices rise (inevitable once supply starts to decline), you can be fairly sure that every last drop will be extracted, because it becomes increasingly profitable to do so. The key point, though, is that for climate safety much of the remaining reserves has to remain buried in the ground. I can’t imagine there is ever likely to be the political will to ensure this, given how much our entire political system depends on the profits from the fossil fuel industry.
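
    Here’s the rough shape of that calculation in code. The reserve figures and carbon fractions are illustrative round numbers, not the exact inputs of the post linked above:

    ```python
    # Order-of-magnitude check on "about a trillion tonnes of carbon".
    # Reserve figures and carbon fractions are illustrative round numbers
    # of my own, not the exact inputs of the post linked above.
    coal_gt = 850     # Gt of coal in proven reserves (rough)
    oil_bbl = 1.2e12  # barrels of oil in proven reserves (rough)
    gas_m3  = 1.8e14  # cubic metres of natural gas in proven reserves (rough)

    carbon_gtc = (coal_gt * 0.7             # coal is ~70% carbon by mass
                  + oil_bbl * 0.117 / 1e9   # ~0.117 tonnes of carbon per barrel
                  + gas_m3 * 0.5 / 1e12)    # ~0.5 kgC per cubic metre of gas

    print(f"Roughly {carbon_gtc:.0f} GtC in proven reserves")
    # -> order 800-900 GtC, consistent with "about a trillion tonnes"
    #    once less-certain resources are thrown in.
    ```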

    • John Baez says:

      Thanks again, Steve!

      For those too lazy to click the link, here’s the upshot of Steve’s calculation:

      That all adds up to about 1 trillion tonnes of carbon from estimated fossil fuel reserves, the vast majority of which is coal. If we want a 50:50 chance of staying below 2ºC temperature rise, we can only burn half this much over the next few centuries. If we want better odds, say a 1-in-4 chance of exceeding 2ºC, we can only burn a quarter of it.

      Conclusion: More than one half of all remaining fossil fuel reserves must remain unused. So peak oil and peak coal won’t save us. I would even go so far as to say that the peak oil folks are only about half as worried as they should be!

      Let me also point people toward this paper, which makes a significantly higher guess about how much more carbon will be burnt:

      • Nordhaus, W. D. 2007. The challenge of global warming: Economic models and environmental policy, Technical report, http://nordhaus.econ.yale.edu/DICE2007.htm, accessed May 2, 2007, model version: DICE-2007.delta.v7.

      A book describing Nordhaus’ 2007 model is here:

      • William D. Nordhaus, A Question of Balance: Weighing the Options on Global Warming Policies, Yale University Press, New Haven, 2008.

      A version of this book is free online — just click!

      On page 127 of his book, Nordhaus estimates that a total of 6±1.2 trillion metric tons of carbon are available to be burnt. Currently we’ve burnt about 0.54 trillion tons.

      Let me also take this opportunity to quote a comment by Nathan Urban in our discussion of week 305:

      Fossil fuel emissions in the “business as usual” (BAU) scenarios that are usually considered are not driven primarily by population growth. They’re mostly driven by an assumption of continued economic growth, particularly that the rest of the developing world will eventually grow to consume energy at intensities similar to European, or U.S., consumption patterns (barring additional economic incentives to strive for low energy intensities).

      This is coupled to an assumption that there is a large amount of fossil carbon available in coal, tar sands, and oil shales (thousands of gigatons), and that eventual high energy demand will make it economically worthwhile to extract most or all of that carbon.

      This doesn’t preclude growth in alternative energy, simply that energy demand will be high enough that we’ll eventually want to dig up all that fossil carbon anyway, in addition to whatever alternative energy we deploy.

      Nordhaus’s DICE model is one way to turn these assumptions into an emissions trajectory. I should also point people toward the “Representative Concentration Pathway” (RCP) scenarios (overview here), which is what the IPCC will be using in its next assessment report.

      You can view (preliminary versions of?) these scenarios with this browser. The emissions projections go out to 2100, and they have “extension scenarios” (ECPs) for CO2 concentrations (not emissions) out to 2300.

      Their BAU scenario is called RCP8.5, and it is based on (but not identical to?) work in this paper, which outlines growth scenarios. (They don’t explicitly discuss fossil carbon constraints because in this scenario they implicitly assume that there is enough carbon to avoid peaking before 2100, the last date they consider.)

      All the other RCP scenarios are “stabilization” or mitigation scenarios where society opts to stabilize at below-BAU CO2 concentrations, or reduce emissions even further.

      The RCPs and ECPs look like this:

      RCP8.5 appears to have a slightly larger and sooner peak than the DICE BAU scenario, but is fairly comparable. Looking at the ECP concentrations, they seem to be assuming a similar total fossil resource constraint (~5000 GtC). The other ECPs assume stabilization at some CO2 concentration around 2150, or even a decline.

      I don’t know that I personally believe that we will go after every last scrap of carbon we think may be in the ground. As I said in the interview, we intentionally considered a “worst case” scenario.

      I do think there’s a serious risk that we’ll extract, say, half that amount, and reach quadrupled (from pre-industrial) CO2 levels some time in the century after this one. That would require extracting what are currently low grade and unprofitable reserves, but they will become more profitable as other sources are depleted.

      Eventually fossil prices will rise, and alternative energy prices drop, to the point that it’s more profitable to switch completely to non-fossil energy. Absent price controls on carbon, I am not convinced this will happen fast enough to avoid some pretty high CO2 levels.

      There are other scenarios that lead to lower fossil fuel consumption, such as a global economic collapse leading to permanently depressed economic growth, or otherwise lower continued growth than we’ve seen historically based on our fossil energy economy. (There are also scenarios of enhanced economic growth…) Even so, a lowered rate of growth doesn’t necessarily imply a lowered final CO2 concentration, just that we’d hit it at a later date.

      I can’t venture my own estimate as to what I think will come to pass. For my policy work I choose to use what appears in the mainstream climate economic literature, and if those estimates change, so will my projections.

  12. Web Hub Tel says:

    This might also be a time to request a post or two on how mathematical convolution relates to the observed CO2 response. I googled Azimuth and only found a reference from that interview with Nathan Urban, week #304:

    The impulse response model makes its prediction by summing up lots of copies of the impulse response curve, with different sizes and at different times. (Technically, this is a convolution of the impulse response curve, or Green’s function, with the emissions trajectory curve.)

    Upthread, I made the observation of the CO2 impulse response showing a significant fat-tail (what kind of Green’s function is that?). With convolution you can readily map out how the CO2 emission forcing function interacts with the impulse response, thus giving that loooong lag between shutting down CO2 emissions and seeing a significant change in atmospheric CO2.
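
    Here’s a toy version of that convolution. The fat-tailed kernel is purely illustrative, not a fitted carbon-cycle response:

    ```python
    import numpy as np

    # Toy convolution: atmospheric CO2 anomaly = emissions history convolved
    # with an impulse response (Green's function) G. The kernel below is an
    # illustrative fat-tailed form, not a fitted carbon-cycle model.
    dt = 1.0                                 # years
    t = np.arange(0, 200, dt)
    G = 1.0 / (1.0 + 0.3 * np.sqrt(t))       # fraction of a pulse still airborne

    emissions = np.where(t < 50, 8.0, 0.0)   # GtC/yr: emit for 50 years, then stop

    airborne = np.convolve(emissions, G)[:len(t)] * dt  # GtC above baseline

    print(f"Airborne anomaly when emissions stop: {airborne[50]:.0f} GtC")
    print(f"Airborne anomaly 100 years later:     {airborne[150]:.0f} GtC")
    # Because of the fat tail, roughly half the peak anomaly is still
    # airborne a full century after emissions cease.
    ```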

    So the suggestion is to explain how this comes about from a mathematical physicist’s perspective, and perhaps spur someone to think about the problem in a new way.

  13. Nathan Urban says:

    One simple approach is to use a Green’s function that is the sum of several decaying exponentials, representing contact with carbon reservoirs with different time scales.

    Some references on impulse response models are Joos and Bruno (1996), Kheshgi and White (1996), Toth et al. (2000), Hooss (2001), Enting (2007), and Li et al. (2009). The last paper segues into box models; see Tomizuka (2009) and Fano (2010), as well as the Hooss et al. paper (containing the NICCS model I mentioned in Week 304).
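
    Here’s a concrete toy version, with made-up weights and time scales that are only loosely in the spirit of the Joos-style fits cited above:

    ```python
    import numpy as np

    # A Green's function built as a sum of decaying exponentials, one per
    # carbon reservoir time scale, plus a constant term for the very long tail.
    # The weights and time scales are made up for illustration, only loosely
    # in the spirit of published Joos-style impulse-response fits.
    a   = [0.22, 0.26, 0.34, 0.18]    # weights, summing to 1
    tau = [np.inf, 300.0, 30.0, 4.0]  # time scales in years (inf = near-permanent)

    def G(t):
        """Fraction of an emitted CO2 pulse still airborne after t years."""
        return sum(ai * np.exp(-t / ti) for ai, ti in zip(a, tau))

    for t in (0.0, 10.0, 100.0, 1000.0):
        print(f"G({t:6.0f} yr) = {G(t):.2f}")
    # -> 1.00, 0.73, 0.42, 0.23: the fast reservoirs empty quickly,
    #    then the long tail takes over.
    ```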

    • Web Hub Tel says:

      That’s what I think as well: a mix of distributions. The approach I notice everyone takes is to use a distribution of varying time constants. What we actually want is a distribution of varying rate constants. The maximum entropy principle always applies to rates, because a rate is nearer to an energy concept than a time is, and we apply the mean to an energy. If you do this and assume a diffusion-limited rate, you get a curve that goes as 1/(1 + c*sqrt(time)). If you compare this curve to the one in the Joos and Bruno reference and to the IPCC model, you can see the similarity. I advocate this approach because it matches the physicist’s preferred method of getting to a simple result in as few steps as possible. The rationale is that others interested in the work could then make it more elaborate by building on this foundation.
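
      For comparison, here’s that dispersive curve next to the toy sum-of-exponentials G(t) above, with c tuned by eye rather than fitted:

      ```python
      import numpy as np

      # Compare the dispersive form 1/(1 + c*sqrt(t)) with the toy
      # sum-of-exponentials G(t) sketched above. c is tuned by eye;
      # nothing here is fitted to real carbon-cycle data.
      c = 0.15
      for t in (1.0, 10.0, 100.0, 1000.0):
          dispersive = 1.0 / (1.0 + c * np.sqrt(t))
          print(f"t = {t:6.0f} yr: 1/(1 + c*sqrt(t)) = {dispersive:.2f}")
      # -> 0.87, 0.68, 0.40, 0.17, versus 0.95, 0.73, 0.42, 0.23 for G(t):
      #    the two track loosely for a century or so, but the dispersive form
      #    keeps decaying while the exponential mix flattens at its constant term.
      ```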
