## 2014 on Azimuth

31 December, 2013

Happy New Year! We’ve got some fun guest posts lined up for next year, including:

Marc Harper, Relative entropy in evolutionary dynamics.

Marc Harper uses ideas from information theory in his work on bioinformatics and evolutionary game theory. This article explains some of his new work. And as a warmup, it explains how relative entropy can serve as a Lyapunov function in evolution!

“What is a Lyapunov function, and why should I care?”

The brief answer, in case you’re eager to know, is this. A Lyapunov function is something that always increases—or always decreases—as time goes on. Examples include entropy and free energy. So, a Lyapunov function can be a way of making the 2nd law of thermodynamics mathematically precise! It’s also a way to show things are approaching equilibrium.

The overall goal here is applying entropy and information theory to better understand the behavior of biological and ecological systems. And in April 2015, Marc Harper and I are helping run a workshop on this topic! We’re doing this with John Harte, an ecologist who uses maximum entropy methods to predict the distribution, abundance and energy usage of species. It should be really interesting!

But back to blog articles:

Manoj Gopalkrishnan, Lyapunov functions for complex-balanced systems.

Manoj Gopalkrishnan is a mathematician at the Tata Institute of Fundamental Research in Mumbai who works on problems coming from chemistry and biology. This post will explain his recent paper on a Lyapunov function for chemical reactions. This function is closely related to free energy, a concept from thermodynamics. So again, one of the overall goals is to apply entropy to better understand living systems.

Since some evolutionary games are isomorphic to chemical reaction networks, this post should be connected to Marc’s. But there’s some mental work left to make the connection—for me, at least. It should be really cool when it all fits together!

Alastair Jamieson-Lane, Stochastic cross impact balance analysis.

Alastair Jamieson-Lane is a mathematician in the master’s program at the University of British Columbia. Very roughly, this post is about a method for determining which economic scenarios are more likely. The likely scenarios get fed into things like the IPCC climate models, so this is important.

This blog article has an interesting origin. Vanessa Schweizer has a bachelor’s degree in physics, a master’s in environmental studies, and a PhD in engineering and public policy. She now works at the University of Waterloo on long-term decision-making problems.

A while back, I met Vanessa at a workshop called What Is Climate Change and What To Do About It?, at the Balsillie School of International Affairs, which is in Waterloo. She described her work with Alastair Jamieson-Lane and the physicist Matteo Smerlak on stochastic cross impact balance analysis. It sounded really interesting, something I’d like to work on. So I solicited some blog articles from them. I hope this is just the first!

So: Happy New Year, and good reading!

Also: we’re always looking for good guest posts here on Azimuth, and we have a system for helping you write them. So, if you know something interesting about environmental or energy issues, ecology, biology or chemistry, consider giving it a try!

If you read some posts here, especially guest posts, you’ll get an idea of what we’re looking for. David Tanzer, a software developer in New York who is very active in the Azimuth Project these days, made an organized list of Azimuth blog posts here:

You can see the guest posts listed by author. This overview is also great for catching up on old posts!

## Who is Bankrolling the Climate Change Counter-Movement?

23 December, 2013

It’s mostly secret. A new refereed paper by Robert Brulle of Drexel University looks into it.

“The climate change countermovement has had a real political and ecological impact on the failure of the world to act on the issue of global warming,” said Brulle. “Like a play on Broadway, the countermovement has stars in the spotlight—often prominent contrarian scientists or conservative politicians—but behind the stars is an organizational structure of directors, script writers and producers, in the form of conservative foundations. If you want to understand what’s driving this movement, you have to look at what’s going on behind the scenes.”

So he looked, and he found this:

• The biggest known funders of organizations downplaying the importance of man-made climate change are foundations such as the Searle Freedom Trust, the John William Pope Foundation, the Howard Charitable Foundation and the Sarah Scaife Foundation.

• Koch and ExxonMobil have pulled back from publicly visible funding. From 2003 to 2007, the Koch Affiliated Foundations and the ExxonMobil Foundation were heavily involved in funding the climate change countermovement. But since 2008 they have made no publicly traceable contributions.

• Funding has shifted to pass through untraceable sources. As traceable funding drops, the amount of funding given to the countermovement by the Donors Trust has risen dramatically. Donors Trust is a donor-directed foundation whose funders cannot be traced. This one foundation now provides about 25% of all traceable foundation funding used by organizations engaged in promoting systematic denial of human-caused climate change.

• Most funding for denial efforts is untraceable. Despite extensive digging, only a small part of the hundreds of millions of dollars in contributions to climate change denying organizations can be traced through public records. A group of 91 climate change denial organizations has a total income of $900 million per year, but only $64 million of that is identifiable foundation support!

All this is from the original paper:

• Robert J. Brulle, Institutionalizing delay: foundation funding and the creation of U.S. climate change counter-movement organizations, Climatic Change, 2013.

and this summary:

• Not just Koch brothers: new study reveals funders behind climate change denial effort, ScienceDaily, 20 December 2013.

Ironically, the original paper would probably be hidden behind a paywall if someone hadn’t liberated it. Get your copy now, while you can!

You can also get 120 pages of details—names and dollar amounts—here:

• Robert J. Brulle, Supplementary online material.

Here is Brulle’s pie chart of who is doing the (traceable!) funding—click to enlarge:

And here is his chart of the organizations who are getting funded:

Here is his graph showing the rise of Donors Trust:

And here is his picture of the social network involved in climate change denial. He calls this the Climate Change Counter Movement. You really need to enlarge this one to see anything:

## Life’s Struggle to Survive

19 December, 2013

Here’s the talk I gave at the SETI Institute:

When pondering the number of extraterrestrial civilizations, it is worth noting that even after it got started, the success of life on Earth was not a foregone conclusion. In this talk, I recount some thrilling episodes from the history of our planet, some well-documented but others merely theorized: our collision with the planet Theia, the oxygen catastrophe, the snowball Earth events, the Permian-Triassic mass extinction event, the asteroid that hit Chicxulub, and more, including the massive environmental changes we are causing now. All of these hold lessons for what may happen on other planets!

To watch the talk, click on the video above.

Here’s a mistake in my talk that doesn’t appear in the slides: I suggested that Theia started at the Lagrange point in Earth’s orbit. After my talk, an expert said that at that time, the Solar System had lots of objects with orbits of high eccentricity, and Theia was probably one of these. He said the Lagrange point theory is an idiosyncratic theory, not widely accepted, that somehow found its way onto Wikipedia.

Another issue was brought up in the questions. In a paper in Science, Sherwood and Huber argued that:

Any exceedence of 35 °C for extended periods should
induce hyperthermia in humans and other mammals, as dissipation of metabolic heat becomes impossible. While this never happens now, it would begin to occur with global-mean warming of about 7 °C, calling the habitability of some regions into question. With 11-12 °C warming, such regions would spread to encompass the majority of the human population as currently distributed. Eventual warmings of 12 °C are
possible from fossil fuel burning.

However, the Paleocene-Eocene Thermal Maximum seems to have been even hotter:

So, the question is: where did mammals live during this period, which mammals went extinct, if any, and does the survival of other mammals call into question Sherwood and Huber’s conclusion?

## Global Climate Change Negotiations

28 October, 2013

There were many interesting talks at the Interdisciplinary Climate Change Workshop last week—too many for me to describe them all in detail. But I really must describe the talks by Radoslav Dimitrov. They were full of important things I didn’t know. Some are quite promising.

Radoslav S. Dimitrov is a professor at the Department of Political Science at Western University. What’s interesting is that he’s also been a delegate for the European Union at the UN climate change negotiations since 1990! His work documents the history of climate negotiations from behind closed doors.

Here are some things he said:

• In international diplomacy, there is no questioning the reality and importance of human-caused climate change. The question is just what to do about it.

• Governments go through every line of the IPCC reports twice. They cannot add anything to what the scientists have written, but they can delete things. All governments have veto power. This makes the IPCC reports more conservative than they otherwise would be: “considerably diluted”.

• The climate change negotiations have surprised political scientists in many ways:

1) There is substantial cooperation even without the USA taking the lead.

2) Developing countries are accepting obligations, with many overcomplying.

3) There has been action by many countries and subnational entities without any treaty obligations.

4) There have been repeated failures of negotiation despite policy readiness.

• In 2011, China and Saudi Arabia rejected the final agreement at Durban as inadequate. Only Canada, the United States and Australia had been resisting stronger action on climate change. Canada abandoned the Kyoto Protocol the day after the collapse of negotiations at Durban. They publicly blamed China, India and Brazil, even though Brazil had accepted dramatic emissions cuts and China had, for the first time, accepted limits on emissions. Only India had taken a “hardline” attitude. Publicly blaming some other country for the collapse of negotiations is a no-no in diplomacy, so the Chinese took this move by Canada as a slap in the face. In return, they blamed Canada and “the West” for the collapse of Durban.

• Dimitrov is studying the role of persuasion in diplomacy, recording and analyzing hundreds of hours of discussions. Countries try to change each other’s minds, not just behavior.

• The global elite do not see climate change negotiations as an environmental issue. Instead, they feel they are “negotiating the future economy”. They focus on the negative economic consequences of inaction, and the economic benefits of climate action.

• In particular, the EU has managed to persuade many countries that climate change is worth tackling now. They do this with economic, not environmental arguments. For example, they argue that countries who take the initiative will have an advantage in future employment, getting most of the “green jobs”. Results include China’s latest 5-year plan, which some have called “the most progressive legislation in history”, and also Japan’s plan for a 60-80% reduction of carbon emissions. The EU itself also expects big returns on investment in climate change.

I apologize for any oversimplifications or downright errors in my notes here.

### References

You can see some slides for Dimitrov’s talks here:

• Radoslav S. Dimitrov, A climate of change.

• Radoslav S. Dimitrov, Inside Copenhagen: the state of climate governance, Global Environmental Politics 10 (2010), 18–24.

and these more recent book chapters, which are apparently not as easy to get:

• Radoslav S. Dimitrov, Environmental diplomacy, in Handbook of Global Environmental Politics, edited by Paul Harris, Routledge, forthcoming as of 2013.

• Radoslav S. Dimitrov, International negotiations, in Handbook of Global Climate and Environmental Policy, edited by Robert Falkner, Wiley-Blackwell forthcoming as of 2013.

• Radoslav S. Dimitrov, Persuasion in world politics: The UN climate change negotiations, in Handbook of Global Environmental Politics, edited by Peter Dauvergne, Edward Elgar Publishing, Cheltenham, UK, 2012.

• Radoslav S. Dimitrov, American prosperity and the high politics of climate change, in Prospects for a Post-American World, edited by Sabrina Hoque and Sean Clark, University of Toronto Press, Toronto, 2012.

## What To Do About Climate Change?

23 October, 2013

Here are the slides for my second talk in the Interdisciplinary Climate Change Workshop at the Balsillie School of International Affairs:

Like the first, it’s just 15 minutes long, so it’s very terse.

I start by noting that slowing the rate of carbon burning won’t stop global warming: most carbon dioxide stays in the air for over a century, though individual molecules come and go. Global warming is like a ratchet.

So, we will:

1) leave fossil fuels unburnt,

2) sequester carbon,

3) actively cool the Earth, and/or

4) live with a hotter climate.

Of course we may do a mix of these, though we’ll certainly do some of option 4), and we might do only this one. My goal in this short talk is not mainly to argue for a particular mix! I mainly want to present some information about the various options.

I do not say anything about the best ways to do option 4); I merely provide some arguments that we’ll wind up doing a lot of this one… because I’m afraid some of the participants in the workshop may be in denial about that.

I also argue that we should start doing research on option 3), because like it or not, I think people are going to become very interested in geoengineering, and without enough solid information about it, people are likely to make bad mistakes: for example, diving into ambitious projects out of desperation.

As usual, if you click on a phrase in blue in this talk, you can get more information.

I want to really thank everyone associated with Azimuth for helping find and compile the information used in this talk! It’s really been a team effort!

## What is Climate Change?

21 October, 2013

Here are the slides for a 15-minute talk I’m giving on Friday for the Interdisciplinary Climate Change Workshop at the Balsillie School of International Affairs:

This will be the first talk of the workshop. Many participants are focused on diplomacy and economics. None are officially biologists or ecologists. So, I want to set the stage with a broad perspective that fits humans into the biosphere as a whole.

I claim that climate change is just one aspect of something bigger: a new geological epoch, the Anthropocene.

I start with evidence that human civilization is having such a big impact on the biosphere that we’re entering a new geological epoch.

Then I point out what this implies. Climate change is not an isolated ‘problem’ of the sort routinely ‘solved’ by existing human institutions. It is part of a shift from the exponential growth phase of human impact on the biosphere to a new, uncharted phase.

In this new phase, institutions and attitudes will change dramatically, like it or not:

Before we could treat ‘nature’ as distinct from ‘civilization’. Now, there is no nature separate from civilization.

Before, we might imagine ‘economic growth’ an almost unalloyed good, with many externalities disregarded. Now, many forms of growth have reached the point where they push the biosphere toward tipping points.

In a separate talk I’ll say a bit about ‘what we can do about it’. So, nothing about that here. You can click on words in blue to see sources for the information.

## What Is Climate Change and What To Do About It?

13 October, 2013

Soon I’m going to a workshop on Interdisciplinary Perspectives on Climate Change at the Balsillie School of International Affairs, or BSIA, in Waterloo, Canada. It’s organized by Simon Dalby, who has a chair in the political economy of climate change at this school.

The plan is to gather people from many different disciplines to provide views on two questions: what is climate change, and what to do about it?

We’re giving really short talks, leaving time for discussion. But before I get there I need to write a 2000-word paper on my view of climate change—’as a mathematician’, supposedly. That’s where I want your help. I think I know roughly what I want to say, and I’ll post some drafts here as soon as I write them. But I’d like get your ideas, too.

For starters, the program looks like this:

#### Friday 25 October: What is Climate Change?

9:00 – 9:30 Introductory remarks
John Ravenhill, Director, BSIA
Dan Scott, University of Waterloo, Interdisciplinary Centre for Climate Change.
Simon Dalby, BSIA

9:30 – 10:45 Presentation Session 1
Chair: Sara Koopman, BSIA
John Baez, University of California (Mathematics)
Jean Andrey, University of Waterloo (Geography)
Byron Williston, Wilfrid Laurier University (Philosophy)

11:15 – 12:30 Presentation Session 2
Chair: Marisa Beck, BSIA
Chris Russill, Carleton University (Communications)
Mike Hulme, King’s College London (Climate Science)
Radoslav Dimitrov, Western University (Political Science)

1:30 – 2:30 Presentation Session 3
Chair: Matt Gaudreau, BSIA
Jatin Nathwani, University of Waterloo (Engineering)
Sarah Burch, University of Waterloo (New Social Media and Education)

3:00 – 5:00 Roundtable 1 (all presenters)
Chair: Lucie Edwards, BSIA
Discussant: Vanessa Schweizer, University of Waterloo

5:00 – 5:15 Wrap-up
Dan Scott and Simon Dalby

#### Saturday 26 October: What Should We Do About It?

9:00 – 10:15 Presentation Session 4
Chair: Matt Gaudreau, BSIA
Radoslav Dimitrov, Western University (Political Science)
Mike Hulme, King’s College London (Climate Science)
Jean Andrey, University of Waterloo (Geography)

10:45 – 12:00 Presentation Session 5
Chair: Lucie Edwards, BSIA
Jatin Nathwani, University of Waterloo (Engineering)
Sarah Burch, University of Waterloo (Environmental Education)
Chris Russill, Carleton University (Communications)

1:00 – 2:00 Presentation Session 6
Chair: Marisa Beck, BSIA
Byron Williston, Wilfrid Laurier University (Philosophy)
John Baez, University of California (Mathematics)

2:30 – 4:30 Roundtable 2 (all presenters)
Chair: Sara Koopman, BSIA
Discussant: James Orbinski, CIGI Chair in Global Health

4:30 – 5:00 Wrap-up
Dan Scott and Simon Dalby

### Some thoughts

Though I’m playing a designated role in this workshop—the “mathematician”—I don’t think it makes sense to focus on mathematical models of climate change, or the math projects I’m working on now.

I will probably seem strange and “mathematical” enough just saying what I think about climate change! Most of the other people come from fields quite different than mine: they seem much more in tune with the nitty-gritty details of politics and economics. So, perhaps my proper role is to mention some facts and numbers that they probably know already, to remind them of the magnitude, scope and urgency of the problem.

It may also be useful to emphasize that with very high probability, we won’t do enough soon enough, so we need to study a series of fallback positions, not just an ‘optimal’ response to climate change. And these fallback positions should go as far as thinking about what happens if we burn all the available carbon. What to do then?

When I talked about this workshop with the mathematician Sasha Beilinson, he wickedly suggested that the best solution to global warming might be a global economic collapse… and he asked if anyone was looking into this.

Of course this solution comes along with huge problems, and anyone who actually advocates it is branded as a nut and excluded from the ‘serious’ discourse on global warming. But the funny thing is, a global economic collapse could be just as probable as some more optimistic scenarios, for example those that require a massive outbreak of altruism worldwide.

So it’s worth thinking about economic collapse scenarios, and ‘burn carbon until there’s none left’ scenarios, even if we don’t want them. And these are the sort of things that only the mathematician in the room may be brave—or foolish—enough to mention.

What else?

## The North Pole Was, Briefly, a Lake

22 August, 2013

It happened over a month ago. The picture above was taken on 22 July 2013. It shows a buoy anchored near a remote webcam at the North Pole, surrounded by a lake of melted ice:

• Becky Oskin, North Pole now a lake, LiveScience, 23 July 2013.

Instead of snow and ice whirling on the wind, a foot-deep aquamarine lake now sloshes around a webcam stationed at the North Pole. The meltwater lake started forming July 13, following two weeks of warm weather in the high Arctic. In early July, temperatures were 2 to 5 degrees Fahrenheit (1 to 3 degrees Celsius) higher than average over much of the Arctic Ocean, according to the National Snow & Ice Data Center.

Meltwater ponds sprout more easily on young, thin ice, which now accounts for more than half of the Arctic’s sea ice. The ponds link up across the smooth surface of the ice, creating a network that traps heat from the sun. Thick and wrinkly multi-year ice, which has survived more than one freeze-thaw season, is less likely to sport a polka-dot network of ponds because of its rough, uneven surface.

This particular meltwater pond was “just over 2 feet deep and a few hundred feet wide”, according to this article:

• Hannah Hickey, Santa’s workshop not flooded—but lots of melting in the Arctic, 30 July 2013.

The pond drained out through cracks in the ice late July 27.

More important is the overall trend in the total sea ice volume as estimated by the Pan-Arctic Ice Ocean Modeling and Assimilation System (PIOMAS).

The trend line from 1979 to 2011 shows that Arctic sea ice is melting at an average rate of roughly 3,000 cubic kilometers per decade.

In 2010, 2011 and 2012, so much ice melted that the volume fell more than 2 standard deviations below the trend line—that’s why the jagged curve falls below the shaded region at the far right of the graph. At the end of July this year, it was just about 2 standard deviations below the trend line. The ice volume seems unlikely to break last year’s record low.

As usual, click the picture for more details.

## Monte Carlo Methods in Climate Science

23 July, 2013

joint with David Tweed

One way the Azimuth Project can help save the planet is to get bright young students interested in ecology, climate science, green technology, and stuff like that. So, we are writing an article for Math Horizons, an American magazine for undergraduate math majors. This blog article is a draft of that. You can also see it in PDF form here.

We’d really like to hear your comments! There are severe limits on including more detail, since the article should be easy to read and short. So please don’t ask us to explain more stuff: we’re most interested to know if you sincerely don’t understand something, or feel that students would have trouble understanding something. For comparison, you can see sample Math Horizons articles here.

### Introduction

They look placid lapping against the beach on a calm day, but the oceans are actually quite dynamic. The ocean currents act as ‘conveyor belts’, transporting heat both vertically between the water’s surface and the depths and laterally from one area of the globe to another. This effect is so significant that the temperature and precipitation patterns can change dramatically when currents do.

For example: shortly after the last ice age, northern Europe experienced a shocking change in climate from 10,800 to 9,500 BC. At the start of this period temperatures plummeted in a matter of decades. It became 7° Celsius colder, and glaciers started forming in England! The cold spell lasted for over a thousand years, but it ended as suddenly as it had begun.

Why? The most popular theory is that a huge lake in North America formed by melting glaciers burst its banks—and in a massive torrent lasting for years, the water from this lake rushed out to the northern Atlantic ocean. By floating atop the denser salt water, this fresh water blocked a major current: the Atlantic Meridional Overturning Circulation. This current brings warm water north and helps keep northern Europe warm. So, when it shut down, northern Europe was plunged into a deep freeze.

Right now global warming is causing ice sheets in Greenland to melt and release fresh water into the North Atlantic. Could this shut down the Atlantic Meridional Overturning Circulation and make the climate of Northern Europe much colder? In 2010, Keller and Urban [KU] tackled this question using a simple climate model, historical data, probability theory, and lots of computing power. Their goal was to understand the spectrum of possible futures compatible with what we know today.

Let us look at some of the ideas underlying their work.

### Box models

The earth’s physical behaviour, including the climate, is far too complex to simulate from the bottom up using basic physical principles, at least for now. The most detailed models today can take days to run on very powerful computers. So, to make reasonable predictions on a laptop in a tractable time-frame, geophysical modellers use some tricks.

First, it is possible to split geophysical phenomena into ‘boxes’ containing strongly related things. For example: atmospheric gases, particulate levels and clouds all affect each other strongly; likewise the heat content, currents and salinity of the oceans all interact strongly. However, the interactions between the atmosphere and the oceans are weaker, and we can approximately describe them using just a few settings, such as the amount of atmospheric CO2 entering or leaving the oceans. Clearly these interactions must be consistent—for example, the amount of CO2 leaving the atmosphere box must equal the amount entering the ocean box—but breaking a complicated system into parts lets different specialists focus on different aspects; then we can combine these parts and get an approximate model of the entire planet. The box model used by Keller and Urban is shown in Figure 1.

1. The box model used by Keller and Urban.

Second, it turns out that simple but effective box models can be distilled from the complicated physics in terms of forcings and feedbacks. Essentially a forcing is a measured input to the system, such as solar radiation or CO2 released by burning fossil fuels. As an analogy, consider a child on a swing: the adult’s push every so often is a forcing. Similarly a feedback describes how the current ‘box variables’ influence future ones. In the swing analogy, one feedback is how the velocity will influence the future height. Specifying feedbacks typically uses knowledge of the detailed low-level physics to derive simple, tractable functional relationships between groups of large-scale observables, a bit like how we derive the physics of a gas by thinking about collisions of lots of particles.

However, it is often not feasible to get actual settings for the parameters in our model starting from first principles. In other words, often we can get the general form of the equations in our model, but they contain a lot of constants that we can estimate only by looking at historical data.
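To make the box idea concrete, here is a deliberately tiny two-box sketch in Python. All the constants and initial values are invented for illustration—they are not taken from Keller and Urban’s model. Carbon moves between an ‘atmosphere’ box and an ‘ocean’ box: emissions act as a forcing, the exchange term is a feedback depending on the current box variables, and the consistency requirement shows up as conservation, since whatever leaves one box enters the other.

```python
def step(atm, ocean, emissions, k=0.02, dt=1.0):
    """Advance the two-box system by one time step.

    atm, ocean -- carbon content of each box (made-up units)
    emissions  -- the forcing: carbon added to the atmosphere per step
    k          -- invented exchange coefficient (a simple feedback)
    """
    exchange = k * (atm - ocean)           # net flux: atmosphere -> ocean
    atm_new = atm + dt * (emissions - exchange)
    ocean_new = ocean + dt * exchange      # what leaves one box enters the other
    return atm_new, ocean_new

atm, ocean = 850.0, 1000.0                 # arbitrary starting values
for year in range(100):
    atm, ocean = step(atm, ocean, emissions=10.0)

# Apart from the emissions forcing, total carbon is conserved:
# atm + ocean stays equal to 850 + 1000 + 100*10, up to rounding.
```

The point is only the structure: one forcing, one feedback, and an exchange that appears with opposite signs in the two boxes.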

### Probability modeling

Suppose we have a box model that depends on some settings $S.$ For example, in Keller and Urban’s model, $S$ is a list of 18 numbers. To keep things simple, suppose the settings are elements of some finite set. Suppose we also have a huge hard disc full of historical measurements, and we want to use this to find the best estimate of $S.$ Because our data is full of ‘noise’ from other, unmodeled phenomena, we generally cannot unambiguously deduce a single set of settings. Instead we have to look at things in terms of probabilities. More precisely, we need to study the probability that $S$ takes some value $s$ given that the measurements take some value. Let’s call the measurements $M$, and again let’s keep things simple by saying $M$ takes values in some finite set of possible measurements.

The probability that $S = s$ given that $M$ takes some value $m$ is called the conditional probability $P(S=s | M=m).$ How can we compute this conditional probability? This is a somewhat tricky problem.

One thing we can more easily do is repeatedly run our model with randomly chosen settings and see what measurements it predicts. By doing this, we can compute the probability that given setting values $S = s,$ the model predicts measurements $M=m.$ This again is a conditional probability, but now it is called $P(M=m|S=s).$

This is not what we want: it’s backwards! But here Bayes’ rule comes to the rescue, relating what we want to what we can more easily compute:

$\displaystyle{ P(S = s | M = m) = P(M = m| S = s) \frac{P(S = s)}{P(M = m)} }$

Here $P(S = s)$ is the probability that the settings take a specific value $s,$ and similarly for $P(M = m).$ Bayes’ rule is quite easy to prove, and it is actually a general rule that applies to any random variables, not just the settings and the measurements in our problem [Y]. It underpins most methods of figuring out hidden quantities from observed ones. For this reason, it is widely used in modern statistics and data analysis [K].

How does Bayes’ rule help us here? When we repeatedly run our model with randomly chosen settings, we have control over $P(S = s).$ As mentioned, we can compute $P(M=m| S=s).$ Finally, $P(M = m)$ is independent of our choice of settings. So, we can use Bayes’ rule to compute $P(S = s | M = m)$ up to a constant factor. And since probabilities must sum to 1, we can figure out this constant.

This lets us do many things. It lets us find the most likely values of the settings for our model, given our hard disc full of observed data. It also lets us find the probability that the settings lie within some set. This is important: if we’re facing the possibility of a climate disaster, we don’t just want to know the most likely outcome. We would like to know that with 95% probability, the outcome will lie in some range.
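As a sanity check on the mechanics, here is Bayes’ rule worked through in Python on a made-up discrete problem with two possible settings and two possible measurements. All the probabilities here are invented for illustration:

```python
# Prior P(S = s): our control over how often we try each setting.
P_S = {"s1": 0.5, "s2": 0.5}

# Likelihood P(M = m | S = s), estimated by running the model repeatedly.
P_M_given_S = {
    ("m1", "s1"): 0.8, ("m2", "s1"): 0.2,
    ("m1", "s2"): 0.3, ("m2", "s2"): 0.7,
}

m = "m1"  # the observed measurement

# Bayes' rule, up to the constant factor P(M = m):
unnorm = {s: P_M_given_S[(m, s)] * P_S[s] for s in P_S}

# Since probabilities must sum to 1, the constant is just the total.
P_M = sum(unnorm.values())
posterior = {s: p / P_M for s, p in unnorm.items()}
# posterior is P(S = s | M = m): here s1 comes out about 0.727.
```

Note that the awkward factor $P(M = m)$ never has to be computed directly: summing the unnormalized values and dividing recovers it, which is exactly the trick described above.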

### An example

Let us look at an example much simpler than that considered by Keller and Urban. Suppose our measurements are real numbers $m_0,\dots, m_T$ related by

$m_{t+1} = s m_t - m_{t-1} + N_t$

Here $s,$ a real constant, is our ‘setting’, while $N_t$ is some ‘noise’: an independent Gaussian random variable for each time $t,$ each with mean zero and some fixed standard deviation. Then the measurements $m_t$ will have roughly sinusoidal behavior but with irregularity added by the noise at each time step, as illustrated in Figure 2.

2. The example system: red are predicted measurements for a given value of the settings, green is another simulation for the same $s$ value and blue is a simulation for a slightly different $s.$

Note how there is no clear signal from either the curves or the differences that the green curve is at the correct setting value while the blue one has the wrong one: the noise makes it nontrivial to estimate $s.$ This is a baby version of the problem faced by Keller and Urban.
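A minimal simulation of this toy system is easy to write. In this Python sketch the value of $s,$ the noise level and the initial conditions are all arbitrary illustrative choices:

```python
import random

def simulate(s, T=100, sigma=0.1, seed=0):
    """Simulate m_{t+1} = s*m_t - m_{t-1} + N_t.

    N_t is independent Gaussian noise with mean 0 and standard
    deviation sigma; m_0 and m_1 are arbitrary starting values.
    """
    rng = random.Random(seed)
    m = [0.0, 1.0]
    for t in range(1, T):
        m.append(s * m[t] - m[t - 1] + rng.gauss(0.0, sigma))
    return m

series = simulate(s=1.9)   # roughly sinusoidal, roughened by the noise
```

Without the noise, the recurrence $m_{t+1} = s m_t - m_{t-1}$ has purely oscillatory solutions whenever $|s| < 2$ (write $s = 2\cos\theta$), which is why the measurements look roughly sinusoidal.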

### Markov Chain Monte Carlo

Having glibly said that we can compute the conditional probability $P(M=m | S=s),$ how do we actually do this? The simplest way would be to run our model many, many times with the settings set at $S=s$ and determine the fraction of times it predicts measurements equal to $m.$ This gives us an estimate of $P(M=m | S=s).$ Then we can use Bayes’ rule to work out $P(S=s|M=m),$ at least up to a constant factor.

Doing all this by hand would be incredibly time-consuming and error-prone, so computers are used for this task. Figure 3 shows the results for our example. As we keep running our model over and over, the curve showing $P(M=m |S=s)$ as a function of $s$ settles down to the right answer.

3. The estimates of $P(M=m | S=s)$ as a function of $s$ using uniform sampling, ending up with 480 samples at each point.

However, this is computationally inefficient, as the estimated probability distribution for small numbers of samples shows. It has quite a few ‘kinks’, which only disappear later. The problem is that there are lots of possible choices of $s$ to try. And this is for a very simple model!

When dealing with the 18 settings involved in the model of Keller and Urban, trying every combination would take far too long. A way to avoid this is Markov Chain Monte Carlo sampling. Monte Carlo is famous for its casinos, so a ‘Monte Carlo’ algorithm is one that uses randomness. A ‘Markov chain’ is a random walk: for example, where you repeatedly flip a coin and take one step right when you get heads, and one step left when you get tails. So, in Markov Chain Monte Carlo, we perform a random walk through the collection of all possible settings, collecting samples.

The key to making this work is that at each step on the walk a proposed modification $s'$ to the current settings $s$ is generated randomly—but it may be rejected if it does not seem to improve the estimates. The essence of the rule is:

The modification $s \mapsto s'$ is randomly accepted with a probability equal to the ratio

$\displaystyle{ \frac{P(M=m | S=s')}{ P(M=m | S=s)} }$

Otherwise the walk stays at the current position.

If the modification is better, so that the ratio is greater than 1, the new state is always accepted. With some additional tricks—such as discarding the very beginning of the walk—this gives a set of samples which can be used to estimate $P(M=m | S=s).$ Then we can compute $P(S = s | M = m)$ using Bayes’ rule.
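Here is a sketch of this procedure for the example system. It exploits the fact that with Gaussian noise the likelihood can be computed directly from the residuals of the recurrence; the noise level, step size, chain length and data-generation details are illustrative choices of ours:

```python
import math
import random

# Synthetic data from the example system with known setting s_true = 1.0.
# (The noise level sigma and series length are illustrative assumptions.)
data_rng = random.Random(0)
sigma, m = 0.1, [1.0, 1.0]
for t in range(1, 60):
    m.append(1.0 * m[t] - m[t - 1] + data_rng.gauss(0.0, sigma))

def log_likelihood(s):
    # With Gaussian noise, each residual m[t+1] - s*m[t] + m[t-1] is an
    # independent N(0, sigma) draw, so log P(M=m | S=s) is, up to a
    # constant, minus the sum of squared residuals over 2*sigma^2.
    resid = (m[t + 1] - s * m[t] + m[t - 1] for t in range(1, len(m) - 1))
    return -sum(r * r for r in resid) / (2 * sigma ** 2)

# The Metropolis random walk over the setting s.
walk_rng = random.Random(1)
s, ll = 0.0, log_likelihood(0.0)
samples = []
for _ in range(5000):
    s_new = s + walk_rng.gauss(0.0, 0.05)     # proposed modification s -> s'
    ll_new = log_likelihood(s_new)
    # Accept with probability min(1, P(M=m|S=s') / P(M=m|S=s)),
    # computed with logarithms to avoid numerical underflow.
    if math.log(walk_rng.random()) < ll_new - ll:
        s, ll = s_new, ll_new
    samples.append(s)                          # if rejected, the walk stays put

kept = samples[1000:]                # discard the very beginning of the walk
estimate = sum(kept) / len(kept)     # close to s_true = 1.0
```

The kept samples are distributed approximately according to the posterior $P(S = s | M = m)$, so a histogram of them recovers curves like those in Figure 4.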

Figure 4 shows the results of using the Markov Chain Monte Carlo procedure to figure out $P(S= s| M= m)$ in our example.

4. The estimates of $P(S = s|M = m)$ curves using Markov Chain Monte Carlo, showing the current distribution estimate at increasing intervals. The red line shows the current position of the random walk. Again the kinks are almost gone in the final distribution.

Note that the final distribution has only performed about 66 thousand simulations in total, while the full sampling performed over 1.5 million. The key advantage of Markov Chain Monte Carlo is that it avoids performing many simulations in areas where the probability is low, as we can see from the way the walk path remains under the big peak in the probability density almost all the time. What is more impressive is that it achieves this without any global view of the probability density, just by looking at how $P(M=m | S=s)$ changes when we make small changes in the settings. This becomes even more important as we move to dealing with systems with many more dimensions and settings, where it proves very effective at finding regions of high probability density whatever their shape.

Why is it worth doing so much work to estimate the probability distribution for settings for a climate model? One reason is that we can then estimate probabilities of future events, such as the collapse of the Atlantic meridional overturning circulation. And what’s the answer? According to Keller and Urban’s calculation, this current will likely weaken by about a fifth in the 21st century, but a complete collapse is unlikely before 2300. This claim needs to be checked in many ways—for example, using more detailed models. But the importance of the issue is clear, and we hope we have made the importance of good mathematical ideas for climate science clear as well.

### Exploring the topic

The Azimuth Project is a group of scientists, engineers and computer programmers interested in questions like this [A]. If you have questions, or want to help out, just email us. Versions of the computer programs we used in this paper will be made available here in a while.

Here are some projects you can try, perhaps with the help of Kruschke’s textbook [K]:

• There are other ways to do setting estimation using time series: compare some to MCMC in terms of accuracy and robustness.

• We’ve seen a 1-dimensional system with one setting. Simulate some multi-dimensional and multi-setting systems. What new issues arise?

Acknowledgements. We thank Nathan Urban and other
members of the Azimuth Project for many helpful discussions.

### References

[A] Azimuth Project, http://www.azimuthproject.org.

[KU] Klaus Keller and Nathan Urban, Probabilistic hindcasts and projections of the coupled climate, carbon cycle and Atlantic meridional overturning circulation system: a Bayesian fusion of century-scale measurements with a simple model, Tellus A 62 (2010), 737–750. Also available free online.

[K] John K. Kruschke, Doing Bayesian Data Analysis: A Tutorial with R and BUGS, Academic Press, New York, 2010.

[Y] Eliezer S. Yudkowsky, An intuitive explanation of Bayes’ theorem.

## Bridging the Greenhouse-Gas Emissions Gap

28 April, 2013

I could use some help here, finding organizations that can help cut greenhouse gas emissions. I’ll explain what I mean in a minute. But the big question is:

How can we bridge the gap between what we are doing about global warming and what we should be doing?

That’s what this paper is about:

• Kornelis Blok, Niklas Höhne, Kees van der Leun and Nicholas Harrison, Bridging the greenhouse-gas emissions gap, Nature Climate Change 2 (2012), 471-474.

According to the United Nations Environment Programme, we need to cut CO2 emissions by about 12 gigatonnes/year by 2020 to hold global warming to 2 °C.

After the UN climate conference in Copenhagen, many countries made pledges to reduce CO2 emissions. But by 2020 these pledges will cut emissions by at most 6 gigatonnes/year. Even worse, a lot of these pledges are contingent on other people meeting other pledges, and so on… so the confirmed value of all these pledges is only 3 gigatonnes/year.

The authors list 21 things that cities, large companies and individual citizens can do, which they claim will cut greenhouse gas emissions by the equivalent of 10 gigatonnes/year of CO2 by 2020. For each initiative on their list, they claim:

(1) there is a concrete starting position from which a significant up-scaling until the year 2020 is possible;

(2) there are significant additional benefits besides a reduction of greenhouse-gas emissions, so people can be driven by self-interest or internal motivation, not external pressure;

(3) there is an organization or combination of organizations that can lead the initiative;

(4) the initiative has the potential to reach an emission reduction by about 0.5 Gt CO2e by 2020.

### 21 Initiatives

Now I want to quote the paper and list the 21 initiatives. And here’s where I could use your help! For each of these, can you point me to one or more organizations that are in a good position to lead the initiative?

Some are already listed, but even for these I bet there are other good answers. I want to compile a list, and then start exploring what’s being done, and what needs to be done.

By the way, even if the UN estimate of the greenhouse-emissions gap is wrong, and even if all the numbers I’m about to quote are wrong, most of them are probably the right order of magnitude—and that’s all we need to get a sense of what needs to be done, and how we can do it.

#### Companies

1. Top 1,000 companies’ emission reductions. Many of the 1,000 largest greenhouse-gas-emitting companies already have greenhouse-gas emission-reduction goals to decrease their energy use and increase their long-term competitiveness, as well as to demonstrate their corporate social responsibility. An association such as the World Business Council for Sustainable Development could lead 30% of the top 1,000 companies to reduce energy-related emissions 10% below business as usual by 2020 and all companies to reduce their non-carbon dioxide greenhouse-gas emissions by 50%. Impact in 2020: up to 0.7 Gt CO2e.

2. Supply-chain emission reductions. Several companies already have social and environmental requirements for their suppliers, which are driven by increased competitiveness, corporate social responsibility and the ability to be a front-runner. An organization such as the Consumer Goods Forum could stimulate 30% of companies to require their supply chains to reduce emissions 10% below business as usual by 2020. Impact in 2020: up to 0.2 Gt CO2e.

3. Green financial institutions. More than 200 financial organizations are already members of the finance initiative of the United Nations Environment Programme (UNEP-FI). They are committed to environmental goals owing to corporate social responsibility, to gain investor certainty and to be placed well in emerging markets. UNEP-FI could lead the 20 largest banks to reduce the carbon footprint of 10% of their assets by 80%. Impact in 2020: up to 0.4 Gt CO2e.

4. Voluntary-offset companies. Many companies are already offsetting their greenhouse-gas emissions, mostly without explicit external pressure. A coalition between an organization with convening power, for example UNEP, and offset providers could motivate 20% of the companies in the light industry and commercial sector to calculate their greenhouse-gas emissions, apply emission-reduction measures and offset the remaining emissions (retiring the purchased credits). It is ensured that offset projects really reduce emissions by using the ‘gold standard’ for offset projects or another comparable mechanism. Governments could provide incentives by giving tax credits for offsetting, similar to those commonly given for charitable donations. Impact by 2020: up to 2.0 Gt CO2e.

#### Other actors

5. Voluntary-offset consumers. A growing number of individuals (especially with high income) already offset their greenhouse-gas emissions, mostly for flights, but also through carbon-neutral products. Environmental NGOs could motivate 10% of the 20% of richest individuals to offset their personal emissions from electricity use, heating and transport at cost to them of around US\$200 per year. Impact in 2020: up to 1.6 Gt CO2e.

6. Major cities initiative. Major cities are large emitters of greenhouse gases and many have greenhouse-gas reduction targets. Cities are intrinsically highly motivated to act so as to improve local air quality, attractiveness and local job creation. Groups like the C40 Cities Climate Leadership Group and ICLEI — Local Governments for Sustainability could lead the 40 cities in C40 or an equivalent sample to reduce emissions 20% below business as usual by 2020, building on the thousands of emission-reduction activities already implemented by the C40 cities. Impact in 2020: up to 0.7 Gt CO2e.

7. Subnational governments. Several states in the United States and provinces in Canada have already introduced support mechanisms for renewable energy, emission-trading schemes, carbon taxes and industry regulation. As a result, they expect an increase in local competitiveness, jobs and energy security. Following the example set by states such as California, these ambitious US states and Canadian provinces could accept an emission-reduction target of 15–20% below business as usual by 2020, as some states already have. Impact in 2020: up to 0.6 Gt CO2e.

#### Energy efficiency

8. Building heating and cooling. New buildings, and increasingly existing buildings, are designed to be extremely energy efficient to realize net savings and increase comfort. The UN Secretary General’s Sustainable Energy for All Initiative could bring together the relevant players to realize 30% of the full reduction potential for 2020. Impact in 2020: up to 0.6 Gt CO2e.

9. Ban of incandescent lamps. Many countries already have phase-out schedules for incandescent lamps as it provides net savings in the long term. The en.lighten initiative of UNEP and the Global Environment Facility already has a target to globally ban incandescent lamps by 2016. Impact in 2020: up to 0.2 Gt CO2e.

10. Electric appliances. Many international labelling schemes and standards already exist for energy efficiency of appliances, as efficient appliances usually give net savings in the long term. The Collaborative Labeling and Appliance Standards Program or the Super-efficient Equipment and Appliance Deployment Initiative could drive use of the most energy-efficient appliances on the market. Impact in 2020: up to 0.6 Gt CO2e.

11. Cars and trucks. All car and truck manufacturers put emphasis on developing vehicles that are more efficient. This fosters innovation and increases their long-term competitive position. The emissions of new cars in Europe fell by almost 20% in the past decade. A coalition of manufacturers and NGOs joined by the UNEP Partnership for Clean Fuels and Vehicles could agree to save one additional liter per 100 km globally by 2020 for cars, and equivalent reductions for trucks. Impact in 2020: up to 0.7 Gt CO2e.

#### Energy supply

12. Boost solar photovoltaic energy. Prices of solar photovoltaic systems have come down rapidly in recent years, and installed capacity has increased much faster than expected. It created a new industry, an export market and local value added through, for example, roof installations. A coalition of progressive governments and producers could remove barriers by introducing good grid access and net metering rules, paving the way to add another 1,600 GW by 2020 (growth consistent with recent years). Impact in 2020: up to 1.4 Gt CO2e.

13. Wind energy. Cost levels for wind energy have come down dramatically, making wind economically competitive with fossil-fuel-based power generation in many cases. The Global Wind Energy Council could foster the global introduction of arrangements that lead to risk reduction for investments in wind energy, with, for example, grid access and guarantees. This could lead to an installation of 1,070 GW by 2020, which is 650 GW over a reference scenario. Impact in 2020: up to 1.2 Gt CO2e.

14. Access to energy through low-emission options. Strong calls and actions are already underway to provide electricity access to the 1.4 billion people who are at present without it, and to fulfill development goals. The UN Secretary General’s Sustainable Energy for All Initiative could ensure that all people without access to electricity get access through low-emission options. Impact in 2020: up to 0.4 Gt CO2e.

15. Phasing out subsidies for fossil fuels. This highly recognized option to reduce emissions would improve investment in clean energy, provide other environmental, health and security benefits, and generate income. The International Energy Agency could work with countries to phase out half of all fossil-fuel subsidies. Impact in 2020: up to 0.9 Gt CO2e.

#### Special sectors

16. International aviation and maritime transport. The aviation and shipping industries are seriously considering efficiency measures and biofuels to increase their competitive advantage. Leading aircraft and ship manufacturers could agree to design their vehicles to capture half of the technical mitigation potential. Impact in 2020: up to 0.2 Gt CO2e.

17. Fluorinated gases (hydrofluorocarbons, perfluorocarbons, SF6). Recent industry-led initiatives are already underway to reduce emissions of these gases originating from refrigeration, air-conditioning and industrial processes. Industry associations, such as Refrigerants, Naturally!, could work towards meeting half of the technical mitigation potential. Impact in 2020: up to 0.3 Gt CO2e.

18. Reduce deforestation. Some countries have already shown that it is indeed possible to reduce deforestation with an integrated approach that eliminates the drivers of deforestation. This has benefits for local air pollution and biodiversity, and can support the local population. Led by an individual with convening power, for example, the United Kingdom’s Prince of Wales or the UN Secretary General, such approaches could be rolled out to all the major countries with high deforestation emissions, halving global deforestation by 2020. Impact in 2020: up to 1.8 Gt CO2e.

19. Agriculture. Options to reduce emissions from agriculture often increase efficiency. The International Federation of Agricultural Producers could help to realize 30% of the technical mitigation potential. (Well, at least it could before it collapsed, after this paper was written.) Impact in 2020: up to 0.8 Gt CO2e.

#### Air pollutants

20. Enhanced reduction of air pollutants. Reduction of classic air pollutants including black carbon has been pursued for years owing to positive impacts on health and local air quality. UNEP’s Climate and Clean Air Coalition To Reduce Short-Lived Climate Pollutants already has significant political momentum and could realize half of the technical mitigation potential. Impact in 2020: a reduction in radiative forcing impact equivalent to an emission reduction of greenhouse gases in the order of 1 Gt CO2e, but outside of the definition of the gap.

21. Efficient cook-stoves. Cooking in rural areas is a source of carbon dioxide emissions. Furthermore, there are emissions of black carbon, which also leads to global warming. Replacing these cook-stoves would also significantly increase local air quality and reduce pressure on forests from fuel-wood demand. A global development organization such as the UN Development Programme could take the lead in scaling-up the many already existing programs to eventually replace half of the existing cook-stoves. Impact in 2020: a reduction in radiative forcing impact equivalent to an emission reduction of greenhouse gases of up to 0.6 Gt CO2e, included in the effect of the above initiative and outside of the definition of the gap.

### For more

For more, see the supplementary materials to this paper. The size of the emissions gap was calculated here:

• The Emissions Gap Report 2012, United Nations Environment Programme (UNEP).

If you’re in a rush, just read the executive summary.