In 2004, Pacala and Socolow laid out a list of ways we can battle global warming using current technologies. They said that to avoid serious trouble, we need to choose seven ‘stabilization wedges’: that is, seven ways to cut carbon emissions by 1 gigatonne per year within 50 years. They listed 15 wedges to choose from, and I’ve told you about them here:
• Part 1 – efficiency and conservation.
• Part 2 – shifting from coal to natural gas, carbon capture and storage.
• Part 3 – nuclear power and renewable energy.
• Part 4 – reforestation, good soil management.
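By the way, it helps to spell out the arithmetic behind a ‘wedge’. Here's a minimal sketch in Python of the usual triangular-wedge picture (each wedge ramps linearly from zero to 1 gigatonne of carbon per year over 50 years; the numbers come from the definition above, the code is just my illustration):

```python
# Wedge arithmetic, assuming the standard triangular wedge:
# avoided emissions ramp linearly from 0 to 1 GtC/yr over 50 years.

YEARS = 50        # a wedge grows for 50 years...
FINAL_RATE = 1.0  # ...until it avoids 1 GtC per year

# Total carbon avoided per wedge = area of a triangle.
one_wedge = 0.5 * YEARS * FINAL_RATE   # = 25 GtC

print(f"One wedge avoids {one_wedge:.0f} GtC over {YEARS} years;")
print(f"seven wedges avoid {7 * one_wedge:.0f} GtC in total.")
```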
According to Pacala:
The message was a very positive one: “gee, we can solve this problem: there are lots of ways to solve it, and lots of ways for the marketplace to solve it.”
I find that interesting, because to me each wedge seems like a gargantuan enterprise—and taken together, they seem like the Seven Labors of Hercules. They’re technically feasible, but who has the stomach for them? I fear things need to get worse before we come to our senses and take action at the scale that’s required.
Anyway, that’s just me. But three years ago, Pacala publicly reconsidered his ideas for a very different reason. Based on new evidence, he gave a talk at Stanford where he said:
It’s at least possible that we’ve already let this thing go too far, and that the biosphere may start to fall apart on us, even if we do all this. We may have to fall back on some sort of dramatic Plan B. We have to stay vigilant as a species.
You can watch his talk here:
It’s pretty damned interesting: he’s a good speaker.
Here’s a dry summary of a few key points. I won’t try to add caveats: I’m sure he would add some himself in print, but I’d rather keep the message simple. I also won’t try to update his information! Not in this blog entry, anyway. But I’ll ask some questions, and I’ll be delighted if you help me out on those.
Emissions targets
First, Pacala’s review of different carbon emissions targets.
The old scientific view, circa 1998: if we could keep the CO2 from doubling from its preindustrial level of 280 parts per million, that would count as a success. Namely, most of the ‘monsters behind the door’ would not come out: continental ice sheets falling into the sea and swamping coastal cities, the collapse of the Atlantic ocean circulation, a drought in the Sahel region of Africa, etcetera.
Many experts say we’d be lucky to get away with CO2 merely doubling. At current burn rates we’ll double it by 2050, and quadruple it by the end of this century. We’ve got enough fossil fuels to send it to seven times its preindustrial levels.
Doubling it would take us to 560 parts per million. A lot of people think that’s too high to be safe. But going for lower levels gets harder:
• In Pacala and Socolow’s original paper, they talked about keeping CO2 below 500 ppm. This would require keeping CO2 emissions constant until 2050. This could be achieved by a radical decarbonization of the economies of rich countries, while allowing carbon emissions in poor countries to grow almost freely until that time.
• For a long time the IPCC and many organizations advocated keeping CO2 below 450 ppm. This would require cutting CO2 emissions by 50% by 2050, which could be achieved by a radical decarbonization in rich countries, and moderate decarbonization in poor countries.
• But by 2008 the IPCC and many groups wanted a cap of 2°C global warming, or keeping CO2 below 430 ppm. This would mean cutting CO2 emissions by 80% by 2050, which would require a radical decarbonization in both rich and poor countries.
The difference between these scenarios lies in what the poor countries have to do; the rich countries need to radically cut carbon emissions in all of them. In the USA, the Lieberman-Warner bill would have forced the complete decarbonization of the economy by 2050.
Then, Pacala spoke about three things that make him nervous:
1. Faster emissions growth
A 2007 paper by Canadell et al. pointed out that starting in 2000, fossil fuel emissions began growing at 3% per year instead of the earlier figure of 1.5%. This could be due to China’s industrialization. Will this keep up in years to come? If so, the original Pacala-Socolow plan won’t work.
(How much, exactly, did the economic recession change this story?)
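(To see why that growth rate matters so much, here's a rough sketch, using an illustrative 8 GtC/yr of current emissions. Since a wedge avoids 1 GtC/yr by year 50, the extra emissions are roughly the number of extra wedges needed just to hold emissions flat:)

```python
# Rough projection: business-as-usual emissions 50 years out,
# at 1.5%/yr versus 3%/yr growth. All numbers illustrative.

E0 = 8.0  # current fossil-fuel emissions, GtC/yr (approximate)

for rate in (0.015, 0.03):
    future = E0 * (1 + rate) ** 50
    extra = future - E0
    print(f"{rate:.1%}/yr growth: {future:.0f} GtC/yr in 50 years, "
          f"{extra:.0f} GtC/yr above today (~{extra:.0f} extra wedges).")
```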
2. The ocean sink
Each year fossil fuel burning puts about 8 gigatonnes of carbon into the atmosphere. The ocean absorbs about 2 gigatonnes and the land absorbs about 2 more, leaving about 4 gigatonnes in the atmosphere.
However, as CO2 emissions rise, the oceanic CO2 sink has been growing less than anticipated. This seems to be due to a change in wind patterns, itself a consequence of global warming.
(What’s the latest story here?)
3. The land sink
As the CO2 levels go up, people expected plants to grow better and suck up more CO2. In the third IPCC report, models predicted that by 2050, plants will be drawing down 6 gigatonnes more carbon per year than they do now! The fourth IPCC report was similar.
This is huge: remember that right now we emit about 8 gigatonnes per year. Indeed, this effect, called CO2 fertilization, could be the difference between the land being a big carbon sink and a big carbon source. Why a carbon source? For one thing, without the plants sucking up CO2, temperatures will rise faster, the Amazon rainforest may start to die, and permafrost in the Arctic may release more greenhouse gases (especially methane) as it thaws.
In a simulation run by Pacala, where he deliberately assumed that plants fail to suck up more carbon dioxide, these effects happened and the biosphere dumped a huge amount of extra CO2 into the atmosphere: the equivalent of 26 stabilization wedges.
So, plans based on the IPCC models are essentially counting on plants to save us from ourselves.
But is there any reason to think plants might not suck up CO2 at the predicted rates?
Maybe. First, people have actually grown forests in doubled-CO2 conditions to see how much faster the plants grow. But the classic experiment along these lines used young trees. In 2005, Körner et al. did an experiment using mature trees… and they didn’t see them growing any faster!
Second, models in the third IPCC report assumed that as plants grew faster, they’d have no trouble getting all the nitrogen they need. But Hungate et al have argued otherwise. On the other hand, Alexander Barron discovered that some tropical plants were unexpectedly good at ramping up the rate at which they grab ahold of nitrogen from the atmosphere. But on the third hand, that only applies to the tropics. And on the fourth hand—a complicated problem like this requires one of those Indian gods with lots of hands—nitrogen isn’t the only limiting factor to worry about: there’s also phosphorus, for example.
Pacala goes on and discusses even more complicating factors. But his main point is simple. The details of CO2 fertilization matter a lot. It could make the difference between their original plan being roughly good enough… and being nowhere near good enough!
(What’s the latest story here?)

The recession did cut into the global emissions growth rate, as described in Friedlingstein et al. (2010) (considering emissions through 2009). The 3%/year acceleration is largely due to China, from what I’ve read. They can’t keep that growth up forever, but they don’t have to in order to require more wedges.
The story on the ocean carbon sink, particularly the wind-driven mixing in the Southern Ocean, is still ambiguous, as is the story on terrestrial CO2 fertilization. For the ocean sink, you could start with Le Quéré et al. (2007) and scan forward through the papers that cite it. I haven’t yet sorted out the CO2 fertilization debate for myself.
By the way, your last link points to co2science.org (also here), which is one of those slick climate-skeptic dissemination organizations you sometimes hear about. Of course the papers they cite speak for themselves, but you can’t always trust the interpretation (or selection of papers) chosen by that site to be impartial. I can’t really comment on this further since, as I said, I haven’t managed to sort out the literature for myself in this area.
Whoops! I was actually suspicious of that ‘co2science.org’ site, because I’d never heard of it, and because of the sidebar asking ‘do plants like more CO2?’. But their mission statement did not seem suspicious to me, so with some mild misgivings I linked to their page.
I’ve changed the link to one describing Alexander Barron’s research – he’s the guy whose work on nitrogen-fixing tropical plants Pacala mentioned. I can’t instantly figure out what paper Pacala was talking about.
I still think the ‘wedges’ idea is a great way to get people to think about the problem, as it decomposes a seemingly intractable challenge into a number of smaller, difficult but doable challenges. The idea that the wedges start small and build year by year is particularly attractive.
The problem now is that we seem to need about twice as many wedges as Pacala and Socolow originally envisaged. So, exercises based on choosing wedges (such as this one) now seem misguided – the problem isn’t to select wedges, it’s how to implement all the ones that have been identified so far, and how to think up some more to use as backup.
The other problem is that none of the wedges are really unitary – all of them could be scaled up or down to some extent to give a larger (or smaller) contribution. So some of the discussion ought to be on what the limiting factors are that might prevent various wedges from being expanded.
All of this pre-supposes that we can get to the point where political leaders even understand the urgency of the problem. If we can solve *that* problem, implementing the wedges will probably seem easy by comparison.
Great to see you here, Steve!
I agree that our focus should not be on choosing wedges, as if choosing among items on a menu: what would taste better, a slice of pumpkin pie or a slice of cherry pie? A better rough rule of thumb would be: we need to slow the carbon dioxide buildup every way we can, as much as we can, as fast as we can.
The image of a ‘menu’ is especially destructive when it leads proponents of different promising strategies to battle with each other instead of the common enemy. Here I’m especially thinking about solar versus nuclear.
The main good thing about this wedge business is that it starts people thinking quantitatively about how much we need to do and what various actions would accomplish. I’ve been working through the Pacala-Socolow paper as a way to start building up a repository of numbers and facts on the Azimuth Wiki.
Right.
There’s a long list of problems with this strategy, but the main one is that we have yet to make sense of the main purpose, which as it stands is ill-conceived.
The purpose of preventing climate change is, effectively, to continue doubling all our other environmental impacts every 20 years or so… There is just a huge long list of other systemic threats from other environmental impacts. Perhaps the biggest “sleeper” in the bunch is that the world economy has started responding over the past 10 years as if the rapidly growing resource demand cannot be met by increasing supply. The food and fuel resource markets are all panicking, *together*, demonstrating a systemic global market response to being pushed beyond their point of adaptive resilience, and reacting rigidly to demand by escalating prices instead of creating supply… “A decisive moment for Investing in Sustainability”
That’s a really serious unacknowledged problem, like the 50 other really serious unacknowledged problems we should be dealing with but generally are not. The combined problem is that if we don’t solve them all together, we’re just making them all worse.
Hope that makes one or the other of those understandable. Ask for clarification if not.
Phil wrote:
Who thinks that? I suppose plenty of people implicitly expect exponential economic growth, but surely many people who think seriously about the future must hope that:
1) world population peaks around 2050 and then drops down to some considerably lower level, perhaps less than a billion,
2) energy production shifts from fossil fuels to renewables, with nuclear serving as a bridge,
3) energy consumption per capita does not continue to increase but instead converges to some sustainable level,
4) similarly, other use of resources converges to some sustainable level.
This is what I hope, anyway. People who want to use a lot more energy and other stuff should leave this planet. Some actually will, I hope. They may mess up the rest of the universe, but I hope some planets including Earth stay nice for a long time.
You’re overlooking economic growth, and the fact that the precondition for climate mitigation, and other environmental action, is to not reduce the rate of growth appreciably. The problem is that, to stabilize our financial system, compound growth in resource use is absolutely necessary, and that’s why it’s a precondition for financing climate mitigation. That’s also why climate mitigation will perpetuate the compound growth in all other impacts.
Lots of people fictionalize perpetual motion and multiplying machines of all kinds, but the data and thermodynamics both point to real financial earnings and real resource use being connected.
If the world economy expands its real product every 20 years or so, it is necessarily also doubling its impacts. One could imagine the impacts growing more slowly, but they would also be hitting more and more sensitive environments, etc. Even applying every trick you can pull, though, it won’t change the best straight-line approximation to the scale of impacts being as close to a vertical line as you can draw, starting at the time you ask the question…
That’s the math, right?
Phil wrote:
No, I’m not. The idea of 1)-4) is to move towards a steady state of resource consumption, where resources are consumed no faster than the Earth produces them. This means the end of economic growth as we know it, which of course means the end of the financial-economic system as we know it.
It might not spell the end to growth of some subtler kind, but economic growth as we know it involves increased use of resources.
John, I think it’s that you consider the economy to be a collection of trends, like data, rather than a collection of mechanisms that operate on their own. That, or something like it, keeps you from identifying the right mechanism driving growth and making it necessary for financial stability. It’s not just a social belief that drives it: it’s built into the rules of finance, and the defined purposes of regulation. They are arranged to direct the management of the physical world so as to let the financial world produce continually multiplying real returns on money saved in the past.
When you start studying the active mechanisms, you start finding what really must change in our rules, and that guides you to discovering what really can change (because the necessary change is a real problem). That would be real system steering. The catch is that just using social policy to favor desirable trends, while part of it of course, will by itself have quite the opposite of the intended effect.
John wrote
The impression that I have is that a sizeable proportion of people who think seriously about the future expect technological developments that will change things in ways which will allow economic growth to keep increasing, e.g., some energy breakthrough that will make currently non-viable recycling viable, etc. The biggest issue is that a lot of the people who say that aren’t actively trying to create these amazing new technologies; they expect “human ingenuity”, which presumably means “other people”, to do that.
David: yes indeed, a majority of the well-paid “problem solvers” are only solving them hypothetically! And they’re not paying attention as the Earth makes every single step more and more costly and complex, the way nature signals the limits of things…
What we have is a shortage of “problem finders” for discovering what problems are not worth trying to solve, and a waste of time and money taken from others that are!
This article of mine will appear in the UK next week I’m promised. http://www.synapse9.com/pub/ASustInvestMoment.pdf
Somewhat off topic:
Hercules did twelve labours.
Whoops! Well, at the rate we’re going, we’ll need 12 wedges.
Interesting introduction to the recent PBS NOVA program and Pacala interview illustrating the modern human reversal of the original vegetative scrubbing of primeval CO2. And now we refuse to ‘let sleeping carbon lie’.
The nuclear option presentation seemed to me a bit iffy. From what one hears about all the underground facilities these days, what might we think about constructing all future nuclear reactor vessels underground?
That would be a 10% increase! (According to this, global terrestrial NPP is 48–69 Gt.)
Currently it looks like we can completely forget about this:
From satellite observations, Zhao & Running (2010) estimate a 0.55 Gt (ca. 1%) decline in global terrestrial NPP (net primary production) from 2000 to 2009. Between 1982 and 1999 the increase was up to 6%.
3 other observations:
• Shilong Piao, Xuhui Wang, Philippe Ciais, Biao Zhu, Tao Wang, and Jie Liu, Changes in satellite-derived vegetation growth trend in temperate and boreal Eurasia from 1982 to 2006, Global Change Biology, preview 31 March 2011.
• NASA Earth Observatory, April 22, 2006: Northern Forest Affected by Global Warming.
reporting on:
Scott J. Goetz, Andrew G. Bunn, Gregory J. Fiske, R. A. Houghton, Satellite-observed photosynthetic trends across boreal North America associated with climate and fire disturbance, PNAS September 20, 2005 vol. 102.
• Jofre Carnicer, Marta Coll, Miquel Ninyerola, Xavier Pons, Gerardo Sánchez, and Josep Peñuelas, Widespread crown condition decline, food web disruption, and amplified tree mortality with increased climate change-type drought, PNAS January 25, 2011, vol. 108.
Thanks for your detailed comment, Florifulgurator!
This passage was a bit confusing to me:
Forget about what? Forget about the decline mentioned here, forget about the increase mentioned here, or…?
I actually guess you meant “forget about the massive increase in net primary production predicted by the IPCC report”. Is that what you meant?
I didn’t know what “NPP” meant, so I added a link to the Wikipedia article. For those too lazy to click:
Yes.
If these hints weren’t enough, here’s another hint from paleoclimatology that we can quite possibly forget about the plant fertilization effect:
• Gabriel J. Bowen, James C. Zachos. Rapid carbon sequestration at the termination of the Palaeocene–Eocene Thermal Maximum. Nature Geoscience 3 (2010), 866–869
From an interview with Bowen in Science Daily:
(My emph.)
(Thanks to commenter AR18 for the link. Contrary to his “skeptic” view, the “rapid recovery” is no excuse, since it happened on a timescale of tens of thousands of years: much longer than the entire Holocene, the epoch we just messed up, took to grow old.)
The other article is gone with the internet wind. I’ve posted it on the forum.
Please don’t keep reposting stuff – an email to me and I can easily dig it out of the spam box. When you post several slightly different versions of the same thing, it’s more work for you and me: I have to look at every one and try to figure out which one(s) to post.
Did you get several re-posts in your folder? I tried on my wordpress blog (where I have a test thread) – but nothing in the spam folder. Actually at the end my comment surfaced on my test blog – but not here.
Yes, I got a pile of re-posts in my spam folder. I suspect that everything that didn’t appear on my blog appeared in my spam folder. By now I have gotten rid of everything except your first attempt. I tried to make it a bit prettier.
I don’t know why your test blog would act differently from my blog here!
Sorry for the mess. My spam folder never caught anything.
Thanks for making stuff prettier!
Well, that’s how today’s computers work: Accidentally. Programmers are not mathematicians (except exceptions). They don’t need proofs. If it works, it works. Why it works nobody cares.
Plan B has to involve industrial consumption of atmospheric CO2. It is currently feasible to produce nontoxic polycarbonate materials from atmospheric CO2. Scaling it up to a gigatonne per year would mainly require finding a market for all that plastic.
Just out of curiosity, how many tonnes of plastic are produced each year now?
What would the effect be on atmospheric CO2 if I were elected king of the planet and decreed that henceforth all plastics were to be made of recycled CO2—assuming for a second that this were possible?
I think the total production is about 1 gigatonne, but China’s numbers are all over the place.
If we used CO2 recycled products everywhere even for durable items – furniture, construction materials (people who live in plastic houses), etc. – I would assume multiple gigatonnes could be used. But still not enough by itself.
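For a rough sense of scale, here's a sketch of that arithmetic, assuming plastic that is about 75% carbon by mass (roughly right for polycarbonate) and the 1 gigatonne per year production figure guessed above:

```python
# Scale check: how much carbon would 1 Gt/yr of CO2-derived
# plastic lock up? The numbers here are rough assumptions.

plastic_per_year = 1.0   # Gt of plastic per year (guess from above)
carbon_fraction = 0.75   # approximate carbon mass fraction of polycarbonate
emissions = 8.0          # GtC/yr emitted by fossil fuel burning

carbon_locked = plastic_per_year * carbon_fraction  # GtC/yr
print(f"Carbon locked up: ~{carbon_locked:.2f} GtC/yr, "
      f"about {carbon_locked / emissions:.0%} of fossil fuel emissions.")
```

So even on these optimistic assumptions, that's somewhat less than one wedge.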
My Plan C involves manual sequestration of atmospheric CO2. The plan is robust against civilization collapse, since the basic technique is pure stone-age (to the insult of Homo Sapiens Colossus). (However, it is not robust against the three mind poisons, so I remain pessimistic.) Scaling it up to a gigatonne per year would mainly require the will of Homo Sapiens to survive and not starve by the billions. The “market” for CO2 (plus nitrogen) fixed in charcoal is agriculture.
I think so. There is no reason not to mix char into soil, and done properly it can supposedly sequester even more carbon through humification.
How about doing nothing? The IPCC forecasts for the end-of-the-world nonsense assume that CO2 will continue to escalate at current rates, but that in turn requires the blind-faith assumption that humans will continue to burn fossil fuels at same escalating rate they are now — which is patently absurd since we are now nearing peak oil output and the rate will decrease, not increase.
Then we all need to remember how NASA told us to fear for our lives for Global Cooling in the 70’s, but when they had “more and better data” changed their minds and told us to fear for our lives for Global Warming. I don’t believe we are headed for a world wide end-of-the-world disaster. Those stories are as credible as “I’ve seen a UFO from outer space” stories.
“[The current anthropomorphic global warming nonsense is based on] inherently untrustworthy climate models, similar to those that cannot accurately forecast the weather a week from now” (Dr. Richard Lindzen)
The Earth recovered from prehistoric global warming episodes, very similar to today’s situation, much faster than today’s climate models represent. See http://www.sciencedaily.com/releases/2011/04/110421151919.htm or “Rapid carbon sequestration at the termination of the Palaeocene–Eocene Thermal Maximum”.
We just exited the Little Ice Age, so would you expect the temperatures to fall or rise after a Little Ice Age? You can’t credibly claim that all the warming since the Little Ice Age ended is solely due to human influence.
Unless these people know what causes Ice Ages and can reliably predict the climate more than two days in advance, don’t meddle with something you don’t understand because you can only make things worse.
I presume you’re actually aware both that oil is not the only fossil fuel and that the rate of decline in oil output is likely to be relatively slow for the next decade or two? (The effect of the oil decline may be greater than the production decline rate alone suggests, because there’s now both much greater purchasing capacity and demand for oil in China, India and the Middle East, so westerners with a sense of their “entitled” level of consumption may need to adjust their views.)
This is an important and underestimated point. When I mention “peak oil” in a room of finance professionals, I often need to spend the next 5-10 minutes clearing up this misperception. It’s not like anyone is saying oil is going to suddenly run out like a tap switching off. The last barrel of oil will never be extracted because the costs will be so high that people will have moved away from oil long before that happens.
Eric and Dave: with still-growing demand in many places, though, gradual supply reduction creates market shocks that are severe. The hitch is that continued increasing demand from some will need to be supplied by raising the price to levels that are prohibitive for others. That’s the characteristic of “systemic demand exceeding supply”.
It’s the real world look of “the big crunch”, I’m afraid, that so many people have been predicting for so long, but didn’t understand what it would look like when it came.
Hi Phil,
I agree with you 100%. That is why I talk about it to rooms full of financial professionals. My current occupation is risk manager for a large financial services firm.
You do not need oil to run out in order to have macroeconomic consequences. The second derivative going to zero will be bad enough.
Eric, I’m glad to meet someone who recognizes that, and might understand why I’ve been jumping around pointing to those inflection points in all kinds of things. They are indeed when profit principles reverse sign for a great many investment strategies. There’s the inflection point in oil reserve discoveries in the 1950’s, for example… It should be profitable to help people see them and understand what to do, certainly, and I’d like to help. From a scientific approach it comes with expanding the language of science to include what business people see in the world… and that’s kind of tough. It feels like persuading both that we live in a natural world not their imagined theoretical one, and having to drag them kicking and screaming!! ;-)
AR18: I don’t find inflammatory language like “nonsense”, “blind faith”, “patently absurd”, to be very helpful, nor your past comments such as how people who work on AI are “the stupidest people in the world”, “no one here understands elementary probability”, etc.
As David pointed out, conventional oil is not the only fossil fuel. There is quite a bit of coal, as well as nonconventional oil in shales and tar sands. How close we are to peak oil, let alone peak fossil fuel production, is the subject of much debate. It certainly is not a foregone conclusion, as you seem to assume, that fossil fuel production will peak soon at some level near today’s production.
NASA, the organization, never advanced a public position on global cooling in the 1970s. To my knowledge, neither did any individual NASA researchers (who of course do not speak for NASA). Indeed, the most prominent NASA climate scientist at that time, James Hansen, predicted the opposite.
Setting NASA aside: while global cooling gained media attention in the 1970s, the idea of imminent cooling (as opposed to long-term glaciation over thousands or tens of thousands of years) never reached consensus in the peer-reviewed literature. Again, the literature is close to the opposite of that position (see here).
What, exactly, constitutes a “world wide end-of-the-world disaster”, and who is predicting it? The IPCC, for example, is not predicting extinction of the human race or anything like that.
I am sure Lindzen does not appreciate having elided text inserted into his quotes, especially insertions that can’t even spell “anthropogenic” correctly. As for the actual argument, chaos theory places limits on how far ahead the state of the system can be predicted. However, climate prediction does not attempt to predict the state of the system, but rather statistics such as averages, which are governed more by energy balance considerations.
An atmospheric carbon half life of 30,000-40,000 years is not really that rapid as far as impacts on a human timescale are concerned, nor would the policy-relevant impacts be much different if the half life were, say, 120,000 years. Indeed, a 30,000 year half life is an argument for the risks of elevated CO2, not an argument against it.
Who claims that? Certainly not the IPCC. The IPCC merely claims that “most” (more than half) of the warming “since the mid-20th century” is due to human influence.
I’m not sure why you feel the need to bring up a series of strawman arguments that no one here has advanced.
Here you continue to confuse weather and climate prediction. As for ice ages, we know quite a bit about what causes them, but the fact that they are an ongoing subject of research does not imply that we don’t know anything about what the future climate is likely to look like.
Finally, your claim “Don’t meddle with something you don’t understand because you can only make things worse” is not logically supported: if we don’t understand the implications, “meddling” can a priori make things either worse or better.
But the precautionary principle has some validity to it, which is precisely why it is unwise for us to massively increase the CO2 content of the atmosphere. Human civilization has existed within a fairly narrow range of CO2 levels, and we know that civilization can thrive in that range. We don’t know whether civilization can thrive outside that range, so how can we say it’s safe to conduct that experiment? Especially when there are many reasons to believe that the resulting climate changes will be nontrivial on the scale of human experience.
AR18 wrote:
That’s false. The IPCC reports don’t make a “blind-faith assumption that humans will continue to burn fossil fuels at same escalating rate they are now”. They consider a variety of scenarios, all of which take into account the limited reserves of fossil fuels, and all of which assume an increasing usage of carbon-free energy. Here’s what you want to read:
• IPCC, Special Report on Emissions Scenarios (SRES), 2000.
This describes the four scenarios, commonly called ‘SRES scenarios’, used in the fourth IPCC report (the most recent one). You probably want to focus on Section 3.4.3: Energy Resources.
It’s not clear why you focus on oil, because there’s a lot more carbon in the form of coal: roughly an order of magnitude more. If you look at Section 3.4.3: Energy Resources, you’ll see that as far as oil goes, they say:
A ‘ZJ’ is a zettajoule, i.e. 10²¹ joules. To convert this into more familiar units, I’ll guess (with help) that a barrel of fuel oil yields about 6.4 gigajoules of energy when burnt. So, they were assuming reserves of about 1 trillion barrels of oil, plus a bunch of ‘additionally recoverable resources’. This looks about right, though a little low compared to 2011 Wikipedia data taken from the Energy Information Administration.
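Here's the conversion spelled out, using the guessed 6.4 GJ per barrel:

```python
# Converting zettajoules of oil into barrels.
# 1 ZJ = 1e21 J; assume ~6.4 GJ per barrel of fuel oil (guess above).

joules_per_zj = 1e21
joules_per_barrel = 6.4e9

reserves_zj = 6.4   # illustrative reserve figure, in ZJ
barrels = reserves_zj * joules_per_zj / joules_per_barrel
print(f"{reserves_zj} ZJ of oil is about {barrels:.1e} barrels")  # ~1e12
```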
But note: this pales in comparison to coal reserves! The IPCC writes:
So, maybe 22 zettajoules of reserves and 89 zettajoules that could be mined with technological advances. The figures on reserves here are roughly in line with 2011 Wikipedia data.
In short: nothing makes me think the IPCC is making a “blind-faith assumption” that we’ll burn carbon at rates beyond what reserves actually allow.
I think Nathan handled your other points pretty well. I find them equally unconvincing.
Let’s use the power of the minds here on Azimuth to attack this from a different perspective. Besides our knowledge of CO2 as a greenhouse gas, the unusual feature of CO2 is its long residence time in the atmosphere. In a way it is so persistent as to appear inert over many decades. The time constant for CO2 sequestering has both a short period and a long period, very characteristic of a reaction controlled by a potentially fractal process.
Now I can ask: how does this square with the excellent set of posts that John presented on modeling via Petri nets? With that kind of model, we are dealing with homogeneous reaction kinetics, and assume uniform mixing so we can solve the equations in a more straightforward manner. But with huge amounts of disorder in the mix, what do we do? I would suggest applying a simple first-order reaction kinetics model, but then using something like superstatistics (à la C. Beck et al.) and integrating over all possible reaction pathways to model the actual CO2 residence time.
What you will find is that superstatistics will in fact generate these “fat-tail” residence time curves that atmospheric CO2 demonstrates. The tails drop off as 1/sqrt(time), so it is just a matter of coming up with a reasonable model for the superstatistics, whether it is driven by some sort of disordered random walk or other diffusion barrier. (The same thing happens with the heat decay from radioactive waste dumps, where the disorder is in the half-lives of the radioactive species.)
I only suggest that it is a perfect applied-math (eek!) problem for this crowd to think about. I have already given it a go, but would be interested to see what other people think.
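Here's a minimal numerical sketch of the idea just described: average first-order decays exp(-kt) over a maximum-entropy (exponential) distribution of rate constants, and compare the tail with a single-rate decay. The mean rate below is an illustrative choice, not a fitted value:

```python
import numpy as np

# Superstatistics sketch: average exp(-k*t) over an exponential
# (maximum-entropy) distribution of rate constants k. The exact
# mixture average is 1/(1 + kbar*t), a fat tail.

rng = np.random.default_rng(42)
kbar = 0.05                          # mean rate constant, 1/yr (illustrative)
ks = rng.exponential(kbar, 100_000)  # sampled rate constants

for t in (10, 50, 200, 1000):
    dispersed = np.mean(np.exp(-ks * t))   # superstatistical average
    analytic = 1.0 / (1.0 + kbar * t)      # exact result of the mixture
    single = np.exp(-kbar * t)             # single-rate decay, for contrast
    print(f"t={t:5d} yr: dispersed={dispersed:.4f} "
          f"(analytic {analytic:.4f}), single rate={single:.2e}")
```

The dispersed curve still retains a few percent of the pulse after centuries, while the single exponential has vanished: that's the fat tail.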
I challenge AR18 to also get involved in this, because it is a very objective question, completely divorced from any kind of agenda that you might imagine to be occurring.
I did a rough calculation of the amount of carbon if we burn through all the remaining estimated coal, oil and natural gas reserves. There’s about a trillion tonnes of carbon, assuming no major new discoveries:
http://www.easterbrook.ca/steve/?p=977
And it’s fairly clear that as fuel prices rise (which is inevitable once supply starts to decline), you can be fairly sure that every last drop will be extracted, because it becomes increasingly profitable to do so. But the key point is that, to avoid the worst outcomes, much of the remaining reserves have to remain buried in the ground. I can’t imagine there is ever likely to be the political will to ensure this, given how much our entire political system depends on the profits from the fossil fuel industry.
Thanks again, Steve!
For those too lazy to click the link, here’s the upshot of Steve’s calculation:
Let me also point people toward this paper, which makes a significantly higher guess about how much more carbon will be burnt:
• William D. Nordhaus, The challenge of global warming: economic models and environmental policy, technical report, model version DICE-2007.delta.v7, http://nordhaus.econ.yale.edu/DICE2007.htm, accessed May 2, 2007.
A book describing Nordhaus’ 2007 model is here:
• William D. Nordhaus, A Question of Balance: Weighing the Options on Global Warming Policies, Yale University Press, New Haven, 2008.
A version of this book is free online — just click!
On page 127 of his book, Nordhaus estimates that a total of 6±1.2 trillion metric tons of carbon are available to be burnt. Currently we’ve burnt about 0.54 trillion tons.
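To translate these totals into atmospheric terms, here's a rough sketch using the standard conversion of about 2.13 GtC per ppm of CO2, together with an illustrative airborne fraction of one half:

```python
# Rough conversion of burnt carbon into an atmospheric CO2 rise.
# ~2.13 GtC corresponds to 1 ppm of CO2; roughly half of emitted
# CO2 has historically stayed in the air (illustrative assumption).

GTC_PER_PPM = 2.13
AIRBORNE_FRACTION = 0.5

for label, trillion_tonnes in [("Steve's figure", 1.0),
                               ("Nordhaus's figure", 6.0)]:
    gtc = trillion_tonnes * 1000  # trillion tonnes of carbon -> GtC
    rise = gtc * AIRBORNE_FRACTION / GTC_PER_PPM
    print(f"{label}: {gtc:.0f} GtC burnt -> ~{rise:.0f} ppm rise in CO2")
```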
Let me also take this opportunity to quote a comment by Nathan Urban in our discussion of week 305:
This might also be a time to request a post or two on how mathematical convolution relates to the observed CO2 response.
I googled Azimuth and only found a reference from that interview with Nathan Urban, week #304
Upthread, I made the observation of the CO2 impulse response showing a significant fat-tail (what kind of Green’s function is that?). With convolution you can readily map out how the CO2 emission forcing function interacts with the impulse response, thus giving that loooong lag between shutting down CO2 emissions and seeing a significant change in atmospheric CO2.
So the suggestion is to educate us on how this comes about from a mathematical physicist’s perspective, and perhaps spur someone to think about the problem in a new way.
One simple approach is to use a Green’s function that is the sum of several decaying exponentials, representing contact with carbon reservoirs with different time scales.
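Here's a minimal sketch of that approach: an impulse response built from a constant term plus decaying exponentials (the weights and time scales below are illustrative placeholders, not fitted values), convolved with a constant emissions history:

```python
import numpy as np

# Impulse-response sketch: the airborne fraction of a carbon pulse,
# modeled as a constant plus decaying exponentials, one term per
# carbon reservoir. Coefficients are illustrative placeholders.

def greens_function(t):
    a0 = 0.2                  # effectively permanent fraction
    a = [0.3, 0.3, 0.2]       # reservoir weights (sum with a0 to 1)
    tau = [300.0, 30.0, 3.0]  # reservoir time scales, in years
    return a0 + sum(ai * np.exp(-t / ti) for ai, ti in zip(a, tau))

years = np.arange(200, dtype=float)
emissions = np.full(200, 8.0)   # constant 8 GtC/yr, illustrative

# Convolve emissions with the impulse response to get the carbon
# from past emissions that is still airborne in each year.
airborne = np.convolve(emissions, greens_function(years))[:200]

print(f"Airborne carbon after 200 years of constant emissions: "
      f"{airborne[-1]:.0f} GtC (~{airborne[-1] / 2.13:.0f} ppm)")
```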
Some references on impulse response models are Joos and Bruno (1996), Kheshgi and White (1996), Toth et al. (2000), Hooss (2001), Enting (2007), and Li et al. (2009). The last paper segues into box models; see Tomizuka (2009) and Fano (2010), as well as the Hooss et al. paper (containing the NICCS model I mentioned in week 304).
That’s what I think as well: a mix of distributions. The approach I notice everyone takes is to use a distribution of varying time constants. What we actually want is a distribution of varying rate constants. The maximum entropy principle always applies to rates, because a rate is nearer to an energy concept than a time is, and we apply the mean to an energy. If you do this and assume a diffusion-limited rate, you get a curve that goes as 1/(1 + c*sqrt(time)). If you compare this curve to the one in the Joos and Bruno reference and to the IPCC model, you can see the similarity. I advocate this approach because it matches the physicist’s preferred method of getting to a simple result in as few steps as possible. The rationale is that others interested in the work could then make it more elaborate by building on this foundation.
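For concreteness, here's a small sketch comparing the dispersed-rate form 1/(1 + c*sqrt(t)) against the constant-plus-exponentials response from the earlier sketch; the constant c is chosen by eye, and all values are illustrative:

```python
import numpy as np

# Compare 1/(1 + c*sqrt(t)) with a constant-plus-exponentials
# impulse response. Coefficients chosen by eye, for illustration.

def sqrt_form(t, c=0.1):
    return 1.0 / (1.0 + c * np.sqrt(t))

def exp_sum(t):
    a0, a, tau = 0.2, [0.3, 0.3, 0.2], [300.0, 30.0, 3.0]
    return a0 + sum(ai * np.exp(-t / ti) for ai, ti in zip(a, tau))

for t in (1.0, 10.0, 100.0, 500.0):
    print(f"t={t:5.0f} yr: sqrt form={sqrt_form(t):.3f}, "
          f"exp sum={exp_sum(t):.3f}")
```

The two curves are broadly similar over a few centuries, which is the kind of resemblance being claimed here.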