New IPCC Report (Part 8)

22 April, 2014

guest post by Steve Easterbrook

(8) To stay below 2°C of warming, most fossil fuels must stay buried in the ground

Perhaps the most profound advance since the previous IPCC report is a characterization of our global carbon budget. This is based on a finding that has emerged strongly from a number of studies in the last few years: the expected temperature change has a simple linear relationship with cumulative CO2 emissions since the beginning of the industrial era:

(Figure SPM.10) Global mean surface temperature increase as a function of cumulative total global CO2 emissions from various lines of evidence. Multi-model results from a hierarchy of climate-carbon cycle models for each RCP until 2100 are shown with coloured lines and decadal means (dots). Some decadal means are indicated for clarity (e.g., 2050 indicating the decade 2041−2050). Model results over the historical period (1860–2010) are indicated in black. The coloured plume illustrates the multi-model spread over the four RCP scenarios and fades with the decreasing number of available models in RCP8.5. The multi-model mean and range simulated by CMIP5 models, forced by a CO2 increase of 1% per year (1% per year CO2 simulations), is given by the thin black line and grey area. For a specific amount of cumulative CO2 emissions, the 1% per year CO2 simulations exhibit lower warming than those driven by RCPs, which include additional non-CO2 drivers. All values are given relative to the 1861−1880 base period. Decadal averages are connected by straight lines.


The chart is a little hard to follow, but the main idea should be clear: whichever experiment we carry out, the results tend to lie on a straight line on this graph. You do get a slightly different slope in one experiment, the “1% CO2 increase per year” experiment, where only CO2 rises, and much more slowly than it has over the last few decades. All the more realistic scenarios lie in the orange band, and all have about the same slope.

This linear relationship is a useful insight, because it means that for any target ceiling for temperature rise (e.g. the UN’s commitment to not allow warming to rise more than 2°C above pre-industrial levels), we can easily determine a cumulative emissions budget that corresponds to that temperature. So that brings us to the most important paragraph in the entire report, which occurs towards the end of the summary for policymakers:

Limiting the warming caused by anthropogenic CO2 emissions alone with a probability of >33%, >50%, and >66% to less than 2°C since the period 1861–1880, will require cumulative CO2 emissions from all anthropogenic sources to stay between 0 and about 1560 GtC, 0 and about 1210 GtC, and 0 and about 1000 GtC since that period respectively. These upper amounts are reduced to about 880 GtC, 840 GtC, and 800 GtC respectively, when accounting for non-CO2 forcings as in RCP2.6. An amount of 531 [446 to 616] GtC, was already emitted by 2011.

Unfortunately, this paragraph is a little hard to follow, perhaps because there was a major battle over the exact wording of it in the final few hours of inter-governmental review of the “Summary for Policymakers”. Several oil states objected to any language that put a fixed limit on our total carbon budget. The compromise was to give several different targets for different levels of risk.

Let’s unpick them. First notice that the targets in the first sentence are based on looking at CO2 emissions alone; the lower targets in the second sentence take into account other greenhouse gases and other Earth system feedbacks (e.g. release of methane from melting permafrost), and so are much lower. It’s these targets that really matter:

• To give us a one third (33%) chance of staying below 2°C of warming over pre-industrial levels, we cannot ever emit more than 880 gigatonnes of carbon.

• To give us a 50% chance, we cannot ever emit more than 840 gigatonnes of carbon.

• To give us a 66% chance, we cannot ever emit more than 800 gigatonnes of carbon.

Since the beginning of industrialization, we have already emitted a little more than 500 gigatonnes. So our remaining budget is somewhere between 300 and 400 gigatonnes of carbon. Existing known fossil fuel reserves are enough to release at least 1000 gigatonnes. New discoveries and unconventional sources will likely more than double this. That leads to one inescapable conclusion:

Most of the remaining fossil fuel reserves must stay buried in the ground.

We’ve never done that before. There is no political or economic system anywhere in the world currently that can persuade an energy company to leave a valuable fossil fuel resource untapped. There is no government in the world that has demonstrated the ability to forgo the economic wealth from natural resource extraction, for the good of the planet as a whole. We’re lacking both the political will and the political institutions to achieve this. Finding a way to achieve this presents us with a challenge far bigger than we ever imagined.
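The budget arithmetic above is simple enough to check directly. Here is a minimal sketch in Python, using the report’s central estimate of 531 GtC already emitted (the text above rounds this to “a little more than 500”) and the rough reserve figure quoted in this post:

    # Carbon budgets for 2 degrees C of warming, in gigatonnes of carbon
    # (GtC), from the AR5 Summary for Policymakers, non-CO2 forcings included.
    budgets = {"33% chance": 880, "50% chance": 840, "66% chance": 800}
    already_emitted = 531   # GtC emitted by 2011 (IPCC central estimate)
    known_reserves = 1000   # GtC in known fossil fuel reserves, at least

    for odds, total in budgets.items():
        remaining = total - already_emitted
        print(f"{odds} of staying below 2 degrees C: {remaining} GtC left to burn")

    # Known reserves alone dwarf even the loosest remaining budget:
    tightest = budgets["66% chance"] - already_emitted
    print(f"Reserves are {known_reserves / tightest:.1f} times the 66%-chance budget")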


You can download all of Climate Change 2013: The Physical Science Basis here. Click below to read any part of this series:

  1. The warming is unequivocal.
  2. Humans caused the majority of it.
  3. The warming is largely irreversible.
  4. Most of the heat is going into the oceans.
  5. Current rates of ocean acidification are unprecedented.
  6. We have to choose which future we want very soon.
  7. To stay below 2°C of warming, the world must become carbon negative.
  8. To stay below 2°C of warming, most fossil fuels must stay buried in the ground.

Climate Change 2013: The Physical Science Basis is also available chapter by chapter here:

  1. Front Matter
  2. Summary for Policymakers
  3. Technical Summary
    1. Supplementary Material

Chapters

  1. Introduction
  2. Observations: Atmosphere and Surface
    1. Supplementary Material
  3. Observations: Ocean
  4. Observations: Cryosphere
    1. Supplementary Material
  5. Information from Paleoclimate Archives
  6. Carbon and Other Biogeochemical Cycles
    1. Supplementary Material
  7. Clouds and Aerosols
    1. Supplementary Material
  8. Anthropogenic and Natural Radiative Forcing
    1. Supplementary Material
  9. Evaluation of Climate Models
  10. Detection and Attribution of Climate Change: from Global to Regional
    1. Supplementary Material
  11. Near-term Climate Change: Projections and Predictability
  12. Long-term Climate Change: Projections, Commitments and Irreversibility
  13. Sea Level Change
    1. Supplementary Material
  14. Climate Phenomena and their Relevance for Future Regional Climate Change
    1. Supplementary Material

Annexes

  1. Annex I: Atlas of Global and Regional Climate Projections
    1. Supplementary Material: RCP2.6, RCP4.5, RCP6.0, RCP8.5
  2. Annex II: Climate System Scenario Tables
  3. Annex III: Glossary
  4. Annex IV: Acronyms
  5. Annex V: Contributors to the WGI Fifth Assessment Report
  6. Annex VI: Expert Reviewers of the WGI Fifth Assessment Report

Civilizational Collapse (Part 1)

25 March, 2014

This story caught my attention, since a lot of people are passing it around:

• Nafeez Ahmed, NASA-funded study: industrial civilisation headed for ‘irreversible collapse’?, Earth Insight, blog on The Guardian, 14 March 2014.

Sounds dramatic! But notice the question mark in the title. The article says that “global industrial civilisation could collapse in coming decades due to unsustainable resource exploitation and increasingly unequal wealth distribution.” But with the word “could” in there, who could possibly argue? It’s certainly possible. What’s the actual news here?

It’s about a new paper that’s been accepted by the Elsevier journal Ecological Economics. Since this paper has not been published, and I don’t even know the title, it’s hard to get details yet. According to Nafeez Ahmed,

The research project is based on a new cross-disciplinary ‘Human And Nature DYnamical’ (HANDY) model, led by applied mathematician Safa Motesharrei of the US National Science Foundation-supported National Socio-Environmental Synthesis Center, in association with a team of natural and social scientists.

So I went to Safa Motesharrei‘s webpage. It says he’s a grad student getting his PhD at the Socio-Environmental Synthesis Center, working with a team of people including:

Eugenia Kalnay (atmospheric science)
James Yorke (mathematics)
Matthias Ruth (public policy)
Victor Yakovenko (econophysics)
Klaus Hubacek (geography)
Ning Zeng (meteorology)
Fernando Miralles-Wilhelm (hydrology).

I was able to find this paper draft:

• Safa Motesharrei, Jorge Rivas and Eugenia Kalnay, A minimal model for human and nature interaction, 13 November 2012.

I’m not sure how this is related to the paper discussed by Nafeez Ahmed, but it includes some (though not all) of the passages quoted by him, and it describes the HANDY model. It’s an extremely simple model, so I’ll explain it to you.

But first let me quote a bit more of the Guardian article, so you can see why it’s attracting attention:

By investigating the human-nature dynamics of these past cases of collapse, the project identifies the most salient interrelated factors which explain civilisational decline, and which may help determine the risk of collapse today: namely, Population, Climate, Water, Agriculture, and Energy.

These factors can lead to collapse when they converge to generate two crucial social features: “the stretching of resources due to the strain placed on the ecological carrying capacity”; and “the economic stratification of society into Elites [rich] and Masses (or “Commoners”) [poor].” These social phenomena have played “a central role in the character or in the process of the collapse,” in all such cases over “the last five thousand years.”

Currently, high levels of economic stratification are linked directly to overconsumption of resources, with “Elites” based largely in industrialised countries responsible for both:

“… accumulated surplus is not evenly distributed throughout society, but rather has been controlled by an elite. The mass of the population, while producing the wealth, is only allocated a small portion of it by elites, usually at or just above subsistence levels.”

The study challenges those who argue that technology will resolve these challenges by increasing efficiency:

“Technological change can raise the efficiency of resource use, but it also tends to raise both per capita resource consumption and the scale of resource extraction, so that, absent policy effects, the increases in consumption often compensate for the increased efficiency of resource use.”

Productivity increases in agriculture and industry over the last two centuries have come from “increased (rather than decreased) resource throughput,” despite dramatic efficiency gains over the same period.

Modelling a range of different scenarios, Motesharrei and his colleagues conclude that under conditions “closely reflecting the reality of the world today… we find that collapse is difficult to avoid.” In the first of these scenarios, civilisation:

“…. appears to be on a sustainable path for quite a long time, but even using an optimal depletion rate and starting with a very small number of Elites, the Elites eventually consume too much, resulting in a famine among Commoners that eventually causes the collapse of society. It is important to note that this Type-L collapse is due to an inequality-induced famine that causes a loss of workers, rather than a collapse of Nature.”

Another scenario focuses on the role of continued resource exploitation, finding that “with a larger depletion rate, the decline of the Commoners occurs faster, while the Elites are still thriving, but eventually the Commoners collapse completely, followed by the Elites.”

In both scenarios, Elite wealth monopolies mean that they are buffered from the most “detrimental effects of the environmental collapse until much later than the Commoners”, allowing them to “continue ‘business as usual’ despite the impending catastrophe.” The same mechanism, they argue, could explain how “historical collapses were allowed to occur by elites who appear to be oblivious to the catastrophic trajectory (most clearly apparent in the Roman and Mayan cases).”

Applying this lesson to our contemporary predicament, the study warns that:

“While some members of society might raise the alarm that the system is moving towards an impending collapse and therefore advocate structural changes to society in order to avoid it, Elites and their supporters, who opposed making these changes, could point to the long sustainable trajectory ‘so far’ in support of doing nothing.”

However, the scientists point out that the worst-case scenarios are by no means inevitable, and suggest that appropriate policy and structural changes could avoid collapse, if not pave the way toward a more stable civilisation.

The two key solutions are to reduce economic inequality so as to ensure fairer distribution of resources, and to dramatically reduce resource consumption by relying on less intensive renewable resources and reducing population growth:

“Collapse can be avoided and population can reach equilibrium if the per capita rate of depletion of nature is reduced to a sustainable level, and if resources are distributed in a reasonably equitable fashion.”

The HANDY model

So what’s the model?

It’s 4 ordinary differential equations:

\dot{x}_C = \beta_C x_C - \alpha_C x_C

\dot{x}_E = \beta_E x_E - \alpha_E x_E

\dot{y} = \gamma y (\lambda - y) - \delta x_C y

\dot{w} = \delta x_C y - C_C - C_E

where:

x_C is the population of the commoners or masses

x_E is the population of the elite

y represents natural resources

w represents wealth

The authors say that

Natural resources exist in three forms: nonrenewable stocks (fossil fuels, mineral deposits, etc), renewable stocks (forests, soils, aquifers), and flows (wind, solar radiation, rivers). In future versions of HANDY, we plan to disaggregate Nature into these three different forms, but for simplification in this version, we have adopted a single formulation intended to represent an amalgamation of the three forms.

So, it’s possible that the paper to be published in Ecological Economics treats natural resources using three variables instead of just one.

Now let’s look at the equations one by one:

\dot{x}_C = \beta_C x_C - \alpha_C x_C

This looks weird at first, but \beta_C and \alpha_C aren’t both constants, which would be redundant. \beta_C is a constant birth rate for commoners, while \alpha_C, the death rate for commoners, is a function of wealth.

Similarly, in

\dot{x}_E = \beta_E x_E - \alpha_E x_E

\beta_E is a constant birth rate for the elite, while \alpha_E, the death rate for the elite, is a function of wealth. The death rate is different for the elite and commoners:

For both the elite and commoners, the death rate drops linearly with increasing wealth from its maximum value \alpha_M to its minimum value \alpha_m. But it drops faster for the elite, of course! For the commoners it reaches its minimum when the wealth w reaches some value w_{th}, but for the elite it reaches its minimum earlier, when w = w_{th}/\kappa, where \kappa is some number bigger than 1.

Next, how do natural resources change?

\dot{y} = \gamma y (\lambda - y) - \delta x_C y

The first part of this equation:

\dot{y} = \gamma y (\lambda - y)

describes how natural resources renew themselves if left alone. This is just the logistic equation, famous in models of population growth. Here \lambda is the equilibrium level of natural resources, while \gamma is another number that helps say how fast the resources renew themselves. Solutions of the logistic equation look like this:
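They are S-shaped curves. Explicitly, separating variables gives the closed-form solution

y(t) = \frac{\lambda y_0}{y_0 + (\lambda - y_0) e^{-\gamma \lambda t}}

so any solution starting at y_0 > 0 rises (or falls) monotonically toward the equilibrium \lambda.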

But the whole equation

\dot{y} = \gamma y (\lambda - y) - \delta x_C y

has a term saying that natural resources get used up at a rate proportional to the population of commoners x_C times the amount of natural resources y. \delta is just a constant of proportionality.

It’s curious that the population of elites doesn’t affect the depletion of natural resources, and also that doubling the amount of natural resources doubles the rate at which they get used up. Regarding the first issue, the authors offer this explanation:

The depletion term includes a rate of depletion per worker, \delta, and is proportional to both Nature and the number of workers. However, the economic activity of Elites is modeled to represent executive, management, and supervisory functions, but not engagement in the direct extraction of resources, which is done by Commoners. Thus, only Commoners produce.

I didn’t notice a discussion of the second issue.

Finally, the change in the amount of wealth is described by this equation:

\dot{w} = \delta x_C y - C_C - C_E

The first term at right precisely matches the depletion of natural resources in the previous equation, but with the opposite sign: natural resources are getting turned into ‘wealth’. C_C describes consumption by commoners and C_E describes consumption by the elite. These are both functions of wealth, a bit like the death rates… but as you’d expect increasing wealth increases consumption:

For both the elite and commoners, consumption grows linearly with increasing wealth until wealth reaches the critical level w_{th}. But it grows faster for the elites, and reaches a higher level.

So, that’s the model… at least in this preliminary version of the paper.
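Since the model is just four ordinary differential equations, it’s easy to experiment with numerically. Here is a minimal sketch in Python following the functional forms described above; the parameter values and initial conditions are illustrative guesses of mine, not the ones used in the paper, so treat the output as qualitative only:

    # A minimal numerical sketch of the HANDY equations described above.
    # All parameter values here are hypothetical, chosen only to illustrate
    # how the model runs; the paper's own experiments use different values.
    import numpy as np

    beta_C, beta_E = 0.03, 0.03      # birth rates of commoners and elite
    alpha_m, alpha_M = 0.01, 0.07    # minimum and maximum death rates
    gamma, lam = 0.01, 100.0         # regrowth rate and capacity of nature
    delta = 6.7e-6                   # depletion per commoner
    s = 5.0e-4                       # subsistence consumption per person
    kappa = 10.0                     # elite consumption multiplier
    w_th = 5.0e3                     # wealth threshold (hypothetical)

    def derivatives(state):
        x_C, x_E, y, w = state
        rich = min(w / w_th, 1.0)    # 0 when destitute, 1 at the threshold
        # Death rates fall linearly with wealth, faster for the elite:
        alpha_C = alpha_M - (alpha_M - alpha_m) * rich
        alpha_E = alpha_M - (alpha_M - alpha_m) * min(kappa * rich, 1.0)
        # Consumption rises linearly with wealth, faster for the elite:
        C_C = s * x_C * rich
        C_E = kappa * s * x_E * rich
        return np.array([
            beta_C * x_C - alpha_C * x_C,             # commoners
            beta_E * x_E - alpha_E * x_E,             # elite
            gamma * y * (lam - y) - delta * x_C * y,  # nature
            delta * x_C * y - C_C - C_E,              # wealth
        ])

    # Crude forward-Euler integration, clamping all variables at zero:
    state = np.array([100.0, 1.0, lam, 100.0])
    dt = 1.0
    for step in range(5000):
        state = np.maximum(state + dt * derivatives(state), 0.0)
    print("x_C, x_E, y, w =", state)

Varying \delta or the initial elite population in a sketch like this should let you explore the kinds of scenario families discussed below.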

Some solutions of the model

There are many parameters in this model, and many different things can happen depending on their values and the initial conditions. The paper investigates many different scenarios. I don’t have the energy to describe them all, so I urge you to skim it and look at the graphs.

I’ll just show you three. Here is one that Nafeez Ahmed mentioned, where civilization

appears to be on a sustainable path for quite a long time, but even using an optimal depletion rate and starting with a very small number of Elites, the Elites eventually consume too much, resulting in a famine among Commoners that eventually causes the collapse of society.

I can see why Ahmed would like to talk about this scenario: he’s written a book called A User’s Guide to the Crisis of Civilization and How to Save It. Clearly it’s worth putting some thought into risks of this sort. But how likely is this particular scenario compared to others? For that we’d need to think hard about how well this model matches reality.

It’s obviously a crude simplification of an immensely complex and unknowable system: the whole civilization on this planet. That doesn’t mean it’s fundamentally wrong! Its predictions could still be qualitatively correct. But to gain confidence in this, we’d need supporting material that the draft paper I’ve seen doesn’t provide. It says:

The scenarios most closely reflecting the reality of our world today are found in the third group of experiments (see section 5.3), where we introduced economic stratification. Under such conditions, we find that collapse is difficult to avoid.

But it would be nice to see a more careful approach to setting model parameters, justifying the simplifications built into the model, exploring what changes when some simplifications are reduced, and so on.

Here’s a happier scenario, where the parameters are chosen differently:

The main difference is that the depletion of resources per commoner, \delta, is smaller.

And here’s yet another, featuring cycles of prosperity, overshoot and collapse:

Tentative conclusions

I hope you see that I’m neither trying to ‘shoot down’ this model nor defend it. I’m just trying to understand it.

I think it’s very important—and fun—to play around with models like this, keep refining them, comparing them against each other, and using them as tools to help our thinking. But I’m not very happy that Nafeez Ahmed called this piece of work a “highly credible wake-up call” without giving us any details about what was actually done.

I don’t expect blog articles on the Guardian to feature differential equations! But it would be great if journalists who wrote about new scientific results would provide a link to the actual work, so people who want to dig deeper can do so. Don’t make us scour the internet looking for clues.

And scientists: if your results are potentially important, let everyone actually see them! If you think civilization could be heading for collapse, burying your evidence and your recommendations for avoiding this calamity in a closed-access Elsevier journal is not the optimal strategy to deal with the problem.

There’s been a whole side-battle over whether NASA actually funded this study:

• Keith Kloor, About that popular Guardian story on the collapse of industrial civilization, Collide-A-Scape, blog on Discover, March 21, 2014.

• Nafeez Ahmed, Did NASA fund ‘civilisation collapse’ study, or not?, Earth Insight, blog on The Guardian, 21 March 2014.

But that’s very boring compared to the fun of thinking about the model used in this study… and the challenging, difficult business of trying to think clearly about the risks of civilizational collapse.

Addendum

The paper is now freely available here:

• Safa Motesharrei, Jorge Rivas and Eugenia Kalnay, Human and nature dynamics (HANDY): modeling inequality and use of resources in the collapse or sustainability of societies, Ecological Economics 101 (2014), 90–102.


Markov Models of Social Change (Part 2)

5 March, 2014

guest post by Vanessa Schweizer

This is my first post to Azimuth. It’s a companion to the one by Alastair Jamieson-Lane. I’m an assistant professor at the University of Waterloo in Canada with the Centre for Knowledge Integration, or CKI. Through our teaching and research, the CKI focuses on integrating what appear, at first blush, to be drastically different fields in order to make the world a better place. The very topics I would like to cover today, mathematics and policy design, are an example of our flavour of knowledge integration. However, before getting into that, perhaps some background on how I got here would be helpful.

The conundrum of complex systems

For about eight years, I have focused on various problems related to long-term forecasting of social and technological change (long-term meaning in excess of 10 years). I became interested in these problems because they are particularly relevant to how we understand and respond to global environmental changes such as climate change.

In case you don’t know much about global warming or what the fuss is about, part of what makes the problem particularly difficult is that the feedback from the physical climate system to human political and economic systems is exceedingly slow. It is so slow, that under traditional economic and political analyses, an optimal policy strategy may appear to be to wait before making any major decisions – that is, wait for scientific knowledge and technologies to improve, or at least wait until the next election [1]. Let somebody else make the tough (and potentially politically unpopular) decisions!

The problem with waiting is that the greenhouse gases that scientists are most concerned about stay in the atmosphere for decades or centuries. They are also churned out by the gigatonne each year. Thus the warming trends that we have experienced for the past 30 years, for instance, are the cumulative result of emissions that happened not only recently but also long ago—in the case of carbon dioxide, as far back as the turn of the 20th century. The world in the 1910s was quainter than it is now, and as more economies around the globe industrialize and modernize, it is natural to wonder: how will we manage to power it all? Will we still rely so heavily on fossil fuels, which are the primary source of our carbon dioxide emissions?

Such questions are part of what makes climate change a controversial topic. Present-day policy decisions about energy use will influence the climatic conditions of the future, so what kind of future (both near-term and long-term) do we want?

Futures studies and trying to learn from the past

Many approaches can be taken to answer the question of what kind of future we want. An approach familiar to the political world is for a leader to espouse his or her particular hopes and concerns for the future, then work to convince others that those ideas are more relevant than someone else’s. Economists do somewhat better by developing and investigating simulations of economic development over time; however, the predictive power of even these tools drops off precipitously beyond the 10-year time horizon.

The limitations of these approaches should not be too surprising, since any stockbroker will say that when making financial investments, past performance is not necessarily indicative of future results. We can expect the same problem with rhetorical appeals, or economic models, that are based on past performances or empirical (which also implies historical) relationships.

A different take on foresight

A different approach avoids the frustration of proving history to be a fickle tutor for the future. By setting aside the supposition that we must be able to explain why the future might play out a particular way (that is, to know the ‘history’ of a possible future outcome), alternative futures 20, 50, or 100 years hence can be conceptualized as different sets of conditions that may substantially diverge from what we see today and have seen before. This perspective is employed in cross-impact balance analysis, an algorithm that searches for conditions that can be demonstrated to be self-consistent [3].

Findings from cross-impact balance analyses have been informative for scientific assessments produced by the Intergovernmental Panel on Climate Change, or IPCC. To present a coherent picture of the climate change problem, the IPCC has coordinated scenario studies across economic and policy analysts as well as climate scientists since the 1990s. Prior to the development of the cross-impact balance method, these researchers had to identify appropriate ranges for rates of population growth, economic growth, energy efficiency improvements, etc. through their best judgment.

A retrospective using cross-impact balances on the first Special Report on Emissions Scenarios found that the researchers did a good job in many respects. However, they underrepresented the large number of alternative futures that would result in high greenhouse gas emissions in the absence of climate policy [4].

As part of the latest update to these coordinated scenarios, climate change researchers decided it would be useful to organize alternative futures according to socio-economic conditions that pose greater or fewer challenges to mitigation and adaptation. Mitigation refers to policy actions that decrease greenhouse gas emissions, while adaptation refers to reducing harms due to climate change or taking advantage of its benefits. Some climate change researchers argued that it would be sufficient to consider alternative futures where challenges to mitigation and adaptation co-varied, e.g. three families of futures where mitigation and adaptation challenges would be low, medium, or high.

Instead, cross-impact balances revealed that mixed-outcome futures—such as socio-economic conditions simultaneously producing fewer challenges to mitigation but greater challenges to adaptation—could not be completely ignored. This counter-intuitive finding, among others, brought the importance of quality of governance to the fore [5].

Although it is generally recognized that quality of governance—e.g. control of corruption and the rule of law—affects quality of life [6], many in the climate change research community have focused on technological improvements, such as drought-resistant crops, or economic incentives, such as carbon prices, for mitigation and adaptation. The cross-impact balance results underscored that should global patterns of quality of governance across nations take a turn for the worse, poor governance could stymie these efforts. This is because the influence of quality of governance is pervasive; where corruption is permitted at the highest levels of power, it may be permitted at other levels as well—including levels that are responsible for building schools, teaching literacy, maintaining roads, enforcing public order, and so forth.

The cross-impact balance study revealed this in the abstract, as summarized in the example matrices below. Alastair included a matrix like these in his post, where he explained that numerical judgments in such a matrix can be used to calculate the net impact of simultaneous influences on system factors. My purpose in presenting these matrices is a bit different, as the matrix structure can also explain why particular outcomes behave as system attractors.

In this example, a solid light gray square means that the row factor directly influences the column factor some amount, while white space means that there is no direct influence:

Dark gray squares along the diagonal have no meaning, since everything is perfectly correlated to itself. The pink squares highlight the rows for the factors “quality of governance” and “economy.” The importance of these rows is more apparent here; the matrix above is a truncated version of this more detailed one:


The pink rows are highlighted because of a striking property of these factors. They are the two most influential factors of the system, as you can see from how many solid squares appear in their rows. The direct influence of quality of governance is second only to the economy. (Careful observers will note that the economy directly influences quality of governance, while quality of governance directly influences the economy). Other scholars have meticulously documented similar findings through observations [7].
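To make the attractor idea concrete, here is a toy sketch in Python of the cross-impact balance consistency check, using a tiny two-factor matrix invented for illustration (real CIB matrices, like the ones pictured above, involve many more factors and states; see [3] for the method itself):

    # Toy cross-impact balance (CIB) consistency check. The factors, states
    # and judgment scores below are invented for illustration only.
    from itertools import product

    factors = {
        "governance": ["poor", "good"],
        "economy": ["weak", "strong"],
    }

    # impact[(source, source_state)][target][target_state] gives a promoting
    # (+) or restricting (-) influence, in the spirit of Weimer-Jehle [3].
    impact = {
        ("governance", "poor"): {"economy": {"weak": +2, "strong": -2}},
        ("governance", "good"): {"economy": {"weak": -1, "strong": +1}},
        ("economy", "weak"): {"governance": {"poor": +1, "good": -1}},
        ("economy", "strong"): {"governance": {"poor": -1, "good": +2}},
    }

    def is_consistent(scenario):
        # A scenario is self-consistent if each factor's chosen state scores
        # at least as high as any alternative, given all the other choices.
        for target, chosen in scenario.items():
            scores = {state: 0 for state in factors[target]}
            for source, src_state in scenario.items():
                if source != target:
                    for state, score in impact[(source, src_state)][target].items():
                        scores[state] += score
            if scores[chosen] < max(scores.values()):
                return False
        return True

    # Brute-force search over all scenarios (exponential in the number of
    # factors, which is why the search problem gets hard; see below).
    names = list(factors)
    for combo in product(*(factors[n] for n in names)):
        scenario = dict(zip(names, combo))
        if is_consistent(scenario):
            print("self-consistent scenario:", scenario)

With this invented matrix the search finds exactly two self-consistent scenarios: poor governance with a weak economy, and good governance with a strong economy; the mixed combinations are inconsistent. That is the sense in which particular outcomes behave as system attractors.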

As a method for climate policy analysis, cross-impact balances fill an important gap between genius forecasting (i.e., ideas about the far-off future espoused by one person) and scientific judgments that, in the face of deep uncertainty, are overconfident (i.e. neglecting the ‘fat’ or ‘long’ tails of a distribution).

Wanted: intrepid explorers of future possibilities

However, alternative visions of the future are only part of the information that’s needed to create the future that is desired. Descriptions of courses of action that are likely to get us there are also helpful. In this regard, the post by Jamieson-Lane describes early work on modifying cross-impact balances for studying transition scenarios rather than searching primarily for system attractors.

This is where you, as the mathematician or physicist, come in! I have been working with cross-impact balances as a policy analyst, and I can see the potential of this method to revolutionize policy discussions—not only for climate change but also for policy design in general. However, as pointed out by entrepreneurship professor Karl T. Ulrich, design problems are NP-complete. Those of us with lesser math skills can be easily intimidated by the scope of such search problems. For this reason, many analysts have resigned themselves to ad hoc explorations of the vast space of future possibilities. However, some analysts like me think it is important to develop methods that do better. I hope that some of you Azimuth readers may be up for collaborating with like-minded individuals on the challenge!

References

The graph of carbon emissions is from reference [2]; the pictures of the matrices are adapted from reference [5]:

[1] M. Granger Morgan, Milind Kandlikar, James Risbey and Hadi Dowlatabadi, Why conventional tools for policy analysis are often inadequate for problems of global change, Climatic Change 41 (1999), 271–281.

[2] T.F. Stocker et al., Technical Summary, in Climate Change 2013: The Physical Science Basis. Contribution of Working Group I to the Fifth Assessment Report of the Intergovernmental Panel on Climate Change (2013), T.F. Stocker, D. Qin, G.-K. Plattner, M. Tignor, S.K. Allen, J. Boschung, A. Nauels, Y. Xia, V. Bex, and P.M. Midgley (eds.) Cambridge University Press, New York.

[3] Wolfgang Weimer-Jehle, Cross-impact balances: a system-theoretical approach to cross-impact analysis, Technological Forecasting & Social Change 73 (2006), 334–361.

[4] Vanessa J. Schweizer and Elmar Kriegler, Improving environmental change research with systematic techniques for qualitative scenarios, Environmental Research Letters 7 (2012), 044011.

[5] Vanessa J. Schweizer and Brian C. O’Neill, Systematic construction of global socioeconomic pathways using internally consistent element combinations, Climatic Change 122 (2014), 431–445.

[6] Daniel Kaufmann, Aart Kraay and Massimo Mastruzzi, Worldwide Governance Indicators (2013), The World Bank Group.

[7] Daron Acemoglu and James Robinson, Why Nations Fail: The Origins of Power, Prosperity, and Poverty. Website.


Life’s Struggle to Survive

19 December, 2013

Here’s the talk I gave at the SETI Institute:

When pondering the number of extraterrestrial civilizations, it is worth noting that even after it got started, the success of life on Earth was not a foregone conclusion. In this talk, I recount some thrilling episodes from the history of our planet, some well-documented but others merely theorized: our collision with the planet Theia, the oxygen catastrophe, the snowball Earth events, the Permian-Triassic mass extinction event, the asteroid that hit Chicxulub, and more, including the massive environmental changes we are causing now. All of these hold lessons for what may happen on other planets!

To watch the talk, click on the video above. To see slides of the talk, click here!

Here’s a mistake in my talk that doesn’t appear in the slides: I suggested that Theia started at the Lagrange point in Earth’s orbit. After my talk, an expert said that at that time, the Solar System had lots of objects with orbits of high eccentricity, and Theia was probably one of these. He said the Lagrange point theory is an idiosyncratic theory, not widely accepted, that somehow found its way onto Wikipedia.

Another issue was brought up in the questions. In a paper in the Proceedings of the National Academy of Sciences, Sherwood and Huber argued that:

Any exceedence of 35 °C for extended periods should induce hyperthermia in humans and other mammals, as dissipation of metabolic heat becomes impossible. While this never happens now, it would begin to occur with global-mean warming of about 7 °C, calling the habitability of some regions into question. With 11–12 °C warming, such regions would spread to encompass the majority of the human population as currently distributed. Eventual warmings of 12 °C are possible from fossil fuel burning.

However, the Paleocene-Eocene Thermal Maximum seems to have been even hotter:

So, the question is: where did mammals live during this period, which mammals went extinct, if any, and does the survival of other mammals call into question Sherwood and Huber’s conclusion?


High-Speed Finance

8 August, 2012

 

These days, a lot of buying and selling of stocks is done by computers—it’s called algorithmic trading. Computers can do it much faster than people. Watch how they’ve been going wild!

The date is at lower left. In 2000 it took several seconds for computers to make a trade. By 2010 the time had dropped to milliseconds… or even microseconds. And around this year, market activity started becoming much more intense.

I can’t even see the Flash Crash on May 6 of 2010—also known as The Crash of 2:45. The Dow Jones plummeted 9% in 5 minutes, then quickly bounced back. For fifteen minutes, the economy lost a trillion dollars. Then it reappeared.

But on August 5, 2011, when the credit rating of the US got downgraded, you’ll see the activity explode! And it’s been crazy ever since.

The movie above was created by Nanex, a company that provides market data to traders. The x axis shows the time of day, from 9:30 to 16:00. The y axis… well, it’s the amount of some activity per unit time, but they don’t say what. Do you know?

The folks at Nanex have something very interesting to say about this. It’s not high frequency trading or ‘HFT’ that they’re worried about—that’s actually gone down slightly from 2008 to 2012. What’s gone up is ‘high frequency quoting’, also known as ‘quote spam’ or ‘quote stuffing’.

Over on Google+, Sergey Ten explained the idea to me:

Quote spam is a well-known tactic. It is used by high-frequency traders to get a competitive advantage over other high-frequency traders. HF traders generate high-frequency quote spam using a pseudorandom (or otherwise structured) algorithm, with his computers coded to ignore it. His competitors don’t know the generating algorithm and have to process each quote, thus increasing their load, consuming bandwidth and getting a slight delay in processing.

A quote is an offer to buy or sell stock at a given price. For a clear and entertaining explanation of how this works and why traders are locked into a race for speed, try:

• Chris Stucchio, A high frequency trader’s apology, Part 1, 16 April 2012. Part 2, 25 April 2012.

I don’t know a great introduction to quote spam, but this paper isn’t bad:

• Jared F. Egginton, Bonnie F. Van Ness, and Robert A. Van Ness, Quote stuffing, 15 March 2012.

Toward the physical limits of speed

In fact, the battle for speed is so intense that trading has run up against the speed of light.

For example, by 2013 there will be a new transatlantic cable at the bottom of the ocean, the first in a decade. Why? Just to cut the communication time between US and UK traders by 5 milliseconds. The new fiber optic line will be straighter than existing ones:

“As a rule of thumb, each 62 miles that the light has to travel takes about 1 millisecond,” Thorvardarson says. “So by straightening the route between Halifax and London, we have actually shortened the cable by 310 miles, or 5 milliseconds.”
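That rule of thumb is easy to sanity-check, assuming signals in optical fiber travel at roughly two-thirds of the vacuum speed of light:

    # Sanity check on the "1 millisecond per 62 miles" rule of thumb,
    # assuming light in fiber moves at about two-thirds its vacuum speed.
    C_VACUUM_KM_S = 299_792.458   # speed of light in vacuum, km/s
    FIBER_FACTOR = 2 / 3          # typical slowdown in optical fiber

    km = 62 * 1.609344            # 62 miles in kilometres
    one_way_ms = km / (C_VACUUM_KM_S * FIBER_FACTOR) * 1000
    print(f"one way: {one_way_ms:.2f} ms, round trip: {2 * one_way_ms:.2f} ms")

So 62 miles costs about half a millisecond one way; the quoted rule evidently counts the round trip, which also matches shortening the cable by 310 miles saving 5 milliseconds.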

Meanwhile, a London-based company called Fixnetix has developed a special computer chip that can prepare a trade in just 740 nanoseconds. But why stop at nanoseconds?

With the race for the lowest “latency” continuing, some market participants are even talking about picoseconds––trillionths of a second. At first the realm of physics and math and then computer science, the picosecond looms as the next time barrier.

Actions that take place in nanoseconds and picoseconds in some cases run up against the sheer limitations of physics, said Mark Palmer, chief executive of Lexington, Mass.-based StreamBase Systems.

Black swans and the ultrafast machine ecology

As high-frequency trading and high-frequency quoting leave slow-paced human reaction times in the dust, markets start to behave differently. Here’s a great paper about that:

• Neil Johnson, Guannan Zhao, Eric Hunsader, Jing Meng, Amith Ravindar, Spencer Carran and Brian Tivnan, Financial black swans driven by ultrafast machine ecology.

A black swan is an unexpectedly dramatic event, like a market crash or a stock bubble that bursts. But according to this paper, such events are now happening all the time at speeds beyond our perception!

Here’s one:

It’s a price spike in the stock of a company called Super Micro Computer, Inc. On October 1st, 2010, it shot up 26% and then crashed back down. But this all happened in 25 milliseconds!

These ultrafast black swans happen at least once a day. And they happen most of all to financial institutions.

Here’s a great blog article about this stuff:

• Mark Buchanan, Approaching the singularity—in global finance, The Physics of Finance, 13 February 2012.

I won’t try to outdo Buchanan’s analysis. I’ll just quote the abstract of the original paper:

Society’s drive toward ever faster socio-technical systems, means that there is an urgent need to understand the threat from ‘black swan’ extreme events that might emerge. On 6 May 2010, it took just five minutes for a spontaneous mix of human and machine interactions in the global trading cyberspace to generate an unprecedented system-wide Flash Crash. However, little is known about what lies ahead in the crucial sub-second regime where humans become unable to respond or intervene sufficiently quickly. Here we analyze a set of 18,520 ultrafast black swan events that we have uncovered in stock-price movements between 2006 and 2011. We provide empirical evidence for, and an accompanying theory of, an abrupt system-wide transition from a mixed human-machine phase to a new all-machine phase characterized by frequent black swan events with ultrafast durations (<650ms for crashes, <950ms for spikes). Our theory quantifies the systemic fluctuations in these two distinct phases in terms of the diversity of the system's internal ecology and the amount of global information being processed. Our finding that the ten most susceptible entities are major international banks, hints at a hidden relationship between these ultrafast 'fractures' and the slow 'breaking' of the global financial system post-2006. More generally, our work provides tools to help predict and mitigate the systemic risk developing in any complex socio-technical system that attempts to operate at, or beyond, the limits of human response times.

Trans-quantitative analysts?

When you get into an arms race of trying to write algorithms whose behavior other algorithms can’t predict, the math involved gets very tricky. Over on Google+, F. Lengyel pointed out something strange. In May 2010, Christian Marks claimed that financiers were hiring experts on large ordinals—crudely speaking, big infinite numbers!—to design algorithms that were hard to outwit.

I can’t confirm his account, but I can’t resist quoting it:

In an unexpected development for the depressed market for mathematical logicians, Wall Street has begun quietly and aggressively recruiting proof theorists and recursion theorists for their expertise in applying ordinal notations and ordinal collapsing functions to high-frequency algorithmic trading. Ordinal notations, which specify sequences of ordinal numbers of ever increasing complexity, are being used by elite trading operations to parameterize families of trading strategies of breathtaking sophistication.

The monetary advantage of the current strategy is rapidly exhausted after a lifetime of approximately four seconds — an eternity for a machine, but barely enough time for a human to begin to comprehend what happened. The algorithm then switches to another trading strategy of higher ordinal rank, and uses this for a few seconds on one or more electronic exchanges, and so on, while opponent algorithms attempt the same maneuvers, risking billions of dollars in the process.

The elusive and highly coveted positions for proof theorists on Wall Street, where they are known as trans-quantitative analysts, have not been advertised, to the chagrin of executive recruiters who work on commission. Elite hedge funds and bank holding companies have been discreetly approaching mathematical logicians who have programming experience and who are familiar with arcane software such as the ordinal calculator. A few logicians were offered seven figure salaries, according to a source who was not authorized to speak on the matter.

Is this for real? I like the idea of ‘trans-quantitative analysts’: it reminds me of ‘transfinite numbers’, which is another name for infinities. But it sounds a bit like a joke, and I haven’t been able to track down any references to trans-quantitative analysts, except people talking about Christian Marks’ blog article.

I understand a bit about ordinal notations, but I don’t think this is the time to go into that—not before I’m sure this stuff is for real. Instead, I’d rather reflect on a comment of Boris Borcic over on Google+:

Last week it occurred to me that LessWrong and OvercomingBias together might play a role to explain why Singularists don’t seem to worry about High Frequency Robot Trading as a possible pathway for Singularity-like developments. I mean IMO they should, the Singularity is about machines taking control, ownership is control, HFT involves slicing ownership in time-slices too narrow for humans to know themselves owners and possibly control.

The ensuing discussion got diverted to the question of whether algorithmic trading involved ‘intelligence’, but maybe intelligence is not the point. Perhaps algorithmic traders have become successful parasites on the financial system without themselves being intelligent, simply by virtue of their speed. And the financial system, in turn, seems to be a successful parasite on our economy. Regulatory capture—the control of the regulating agencies by the industry they’re supposed to be regulating—seems almost complete. Where will this lead?


Disease-Spreading Zombies

20 July, 2012

Are you a disease-spreading zombie?

You may have read about the fungus that can infect an ant and turn it into a zombie, making it climb up the stem of a plant and hang onto it, then die and release spores from a stalk that grows out of its head.

But this isn’t the only parasite that controls the behavior of its host.

If you ever got sick, had diarrhea, and thought hard about why, you’ll understand what I mean. You were helping spread the disease… especially if you were poor and didn’t have a toilet. This is why improved sanitation actually reduces the virulence of some diseases: it’s no longer such a good strategy for bacteria to cause diarrhea, so they evolve away from it!

There are plenty of other examples. Lots of diseases make you sneeze or cough, spreading the germs to other people. The rabies virus drives dogs crazy and makes them want to bite. There’s a parasitic flatworm that makes ants want to climb to the top of a blade of grass, lock their jaws onto it and wait there until they get eaten by a sheep! But the protozoan Toxoplasma gondii is more mysterious.

It causes a disease called toxoplasmosis. You can get it from cats, you can get it from eating infected meat, and you can even inherit it from your mother.

Lots of people have it: somewhere between 1/3 and 1/2 of everyone in the world!

A while back, the Czech scientist Jaroslav Flegr did some experiments. He found that people who tested positive for this parasite have slower reaction times. But even more interestingly, he claims that men with the parasite are more introverted, suspicious, oblivious to other people’s opinions of them, and inclined to disregard rules… while infected women are more outgoing, trusting, image-conscious, and rule-abiding than uninfected women!

What could explain this?

The disease is carried by both cats and mice. Cats catch it by eating mice. The disease causes behavior changes in mice: they seem to become more anxious and run around more. This may increase their chance of getting eaten by a cat and passing on the disease. But we are genetically similar to mice… so we too may become more anxious when we’re infected with this disease. And men and women may act differently when they’re anxious.

It’s just a theory so far. Nonetheless, I won’t be surprised to hear there are parasites that affect our behavior in subtle ways. I don’t know if viruses or bacteria are sophisticated enough to trigger changes in behavior more subtle than diarrhea… but there are always lots of bacteria in your body, about 10 times as many as actual human cells. Many of these belong to unidentified species. And as long as they don’t cause obvious pathologies, doctors have had little reason to study them.

As for viruses, don’t forget that about 8% of your DNA is made of viruses that once copied themselves into your ancestors’ genome. They’re called endogenous retroviruses, and I find them very spooky and fascinating. Once they get embedded in our DNA, they can’t always get back out: a lot of them are defective, containing deletions or nonsense mutations. But some may still be able to get back out. And there are hints that some are implicated in certain kinds of cancer and autoimmune disease.

Even more intriguingly, a 2004 study reported that antibodies to endogenous retroviruses were more common in people with schizophrenia! And the cerebrospinal fluid of people who’d recently gotten schizophrenia contained levels of a key enzyme used by retroviruses, reverse transcriptase, four times higher than in control subjects.

So it’s possible—just possible—that some viruses, either free-living or built into our DNA, may change our behavior in subtle ways that increase their chance of spreading.

For more on Jaroslav Flegr’s research, read this fascinating article:

• Kathleen McAuliffe, How your cat is making you crazy, The Atlantic, March 2012.

Among other things you’ll read about the parasitologists Glenn McConkey and Joanne Webster, who have shown that Toxoplasma gondii has two genes that allow it to crank up production of the neurotransmitter dopamine in the host’s brain. It seems this makes rats feel pleasure when they smell a cat!

(Do you like cats? Hmm.)

Of course, in business and politics we see many examples of ‘parasites’ that hijack organizations and change these organizations’ behavior to benefit themselves. It’s not nice. But it’s natural.

So even if you aren’t a disease-spreading zombie, it’s quite possible you’re dealing with them on a regular basis.


Five Books About Our Future

16 May, 2012

Jordan Peacock has suggested interviewing me for Five Books, a website where people talk about five books they’ve read.

It’s probably going against the point of this site to read books especially for the purpose of getting interviewed about them. But I like the idea of talking about books that paint different visions of our future, and the issues we face. And I may need to read some more to carry out this plan.

So: what are your favorite books on this subject?

I’d like to pick books with different visions, preferably focused on the relatively near-term future, and preferably somewhat plausible—though I don’t expect every book to seem convincing to all reasonable people.

Here are some options that leap to mind.

Whole Earth Discipline

• Stewart Brand, Whole Earth Discipline: An Ecopragmatist Manifesto, Viking Penguin, 2009.

I’ve been meaning to write about this one for a long time! Brand argues that changes in this century will be dominated by global warming, urbanization and biotechnology. He advocates new thinking on topics that traditional environmentalists have rather set negative opinions about, like nuclear power, genetic engineering, and the advantages of urban life. This is on my list for sure.

Limits to Growth

• Donella Meadows, Jørgen Randers, and Dennis Meadows, Limits to Growth: The 30-Year Update, Chelsea Green Publishing Company, 2004.

Sad to say, I’ve never read the original 1972 book The Limits to Growth—or the 1974 edition which among other things presented a simple computer model of world population, industrialization, pollution, food production and resource depletion. Both the book and the model (called World3) have been much criticized over the years. But recently some have argued its projections—which were intended to illustrate ideas, not predict the future—are not doing so badly:

• Graham Turner, A comparison of The Limits to Growth with thirty years of reality, Commonwealth Scientific and Industrial Research Organisation (CSIRO).

It would be interesting to delve into this highly controversial topic. By the way, the model is now available online:

• Brian Hayes, Limits to Growth.

with an engaging explanation here:

• Brian Hayes, World3, the public beta, Bit-Player: An Amateur’s Look at Computation and Mathematics, 15 April 2012.

It runs on your web-browser, and it’s easy to take a copy for yourself and play around with it.

The Ecotechnic Future

John Michael Greer believes that ‘peak oil’—or more precisely, the slow decline of fossil fuel production—will spell the end to our modern technological civilization. He spells this out here:

• John Michael Greer, The Long Descent, New Society Publishers, 2008.

I haven’t read this book, but I’ve read the sequel, which begins to imagine what comes afterwards:

• John Michael Greer, The Ecotechnic Future, New Society Publishers, 2009.

Here he argues that in the next century or three we will go through a transition through ‘scarcity economies’ to ‘salvage economies’ to sustainable economies that use much less energy than we do now.

Both these books seem to outrage everyone who envisages our future as a story of technological progress continuing more or less along the lines we’ve already staked out.

The Singularity is Near

In the opposite direction, we have:

• Ray Kurzweil, The Singularity is Near, Penguin Books, 2005.

I’ve only read bits of this. According to Wikipedia, the main premises of the book are:

• A technological-evolutionary point known as “the singularity” exists as an achievable goal for humanity. (What exactly does Kurzweil mean by “the singularity”? I think I know what other people, like Vernor Vinge and Eliezer Yudkowsky, mean by it. But what does he mean?)

• Through a law of accelerating returns, technology is progressing toward the singularity at an exponential rate. (What in the world does it mean to progress toward a singularity at an exponential rate? I know that Kurzweil provides evidence that lots of things are growing exponentially… but if they keep doing that, that’s not what I’d call a singularity.)

• The functionality of the human brain is quantifiable in terms of technology that we can build in the near future.

• Medical advances make it possible for a significant number of Kurzweil’s generation (Baby Boomers) to live long enough for the exponential growth of technology to intersect and surpass the processing of the human brain.

If you think you know a better book that advocates a roughly similar thesis, let me know.

A Prosperous Way Down

• Howard T. Odum and Elisabeth C. Odum, A Prosperous Way Down: Principles and Policies, Columbia University Press, 2001.

Howard T. Odum is the father of ‘systems ecology’, and developed an interesting graphical language for describing energy flows in ecosystems. According to George Mobus:

In this book he and Elisabeth take on the situation regarding social ecology under the conditions of diminishing energy flows. Taking principles from systems ecology involving systems suffering from the decline of energy (e.g. deciduous forests in fall), showing how such systems have adapted or respond to those conditions, they have applied these to the human social system. The Odums argued that if we humans were wise enough to apply these principles through policy decisions to ourselves, we might find similar ways to adapt with much less suffering than is potentially implied by sudden and drastic social collapse.

This seems to be a more scholarly approach to some of the same issues:

• Howard T. Odum, Environment, Power, and Society for the Twenty-First Century: The Hierarchy of Energy, Columbia U. Press, 2007.

More?

There are plenty of other candidates I know less about. These two seem to be free online:

• Lester Brown, World on the Edge: How to Prevent Environmental and Economic Collapse, W. W. Norton & Company, 2011.

• Richard Heinberg, The End of Growth: Adapting to Our New Economic Reality, New Society Publishers, 2009.

I would really like even more choices—especially books by thoughtful people who do think we can solve the problems confronting us… but do not think all problems will automatically be solved by human ingenuity and leave it to the rest of us to work out the, umm, details.


Azimuth on Google Plus (Part 4)

11 November, 2011

Again, some eye candy to start the show. Stare fixedly at the + sign here until the pink dots completely disappear:

In a semiconductor, a ‘hole’ is the absence of an electron, and it can move around as if it were a particle. If you have a hole moving to the right, in reality you have electrons moving to the left. Here pink dots moving counterclockwise look like a green dot moving clockwise!
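
Here is a toy version of that bookkeeping, in Python (my sketch, not from the original post). Every particle that moves hops to the left, yet the vacancy itself marches steadily to the right:

```python
# A ring of sites with one vacancy '.'. At each step, the particle just to
# the vacancy's right hops LEFT into it, so the vacancy itself moves RIGHT.
sites = list("xxxx.xxxxx")
for step in range(6):
    print("".join(sites))
    i = sites.index(".")
    j = (i + 1) % len(sites)                 # particle to the hole's right
    sites[i], sites[j] = sites[j], sites[i]  # it hops left; hole shifts right
```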

A related puzzle: what happens when you hold a helium balloon on a string while you’re driving in a car with the windows closed… and then you make a sharp right turn? I’ve done it, so I know from experience.

Now for the real stuff:

• Tom Murphy, a physics professor at U.C. San Diego, has a blog worth visiting: Do the Math. He uses physics and math to make informed guesses about the future of energy production. Try out his overview on ‘peak oil’.

• Hundreds of top conservation scientists took a survey, and 99.5% felt that a serious loss of biodiversity is either ‘likely’, ‘very likely’, or ‘virtually certain’. Tropical coral ecosystems were perceived as the most seriously affected. A slim majority think we need to decide on rules for ‘triage’: deciding which species to save and which to give up on.

• Climate change is causing a massive change in tree species across Western USA. “Ecosystems are always changing at the landscape level, but normally the rate of change is too slow for humans to notice,” said Steven Running, a co-author of a study on this at the University of Montana. “Now the rate of change is fast enough we can see it.” The study used remote sensing of large areas over a four-year period.

• The James Dyson Award calls on design and engineering students to create innovative, practical, elegant solutions to the challenges that face us. This year, Edward Linacre won for a self-powering device that extracts water from the air for irrigation purposes. Linacre comes from the drought-afflicted continent of Australia. But his invention borrows some tricks from the Namib beetle, which survives some of the driest deserts in Africa by harvesting the moisture that condenses on its back during the early morning. That’s called biomimicry.


• The New York Times has a great profile of Jeremy Grantham. He heads a successful firm managing $100 billion in assets, and now he’s 72. So why is he saying this?

… it’s very important to me to make a lot of money now, much more than when I was 40 or 50.

Not because he has a brand new gold-digger ‘trophy wife’ or spendthrift heirs. No, he puts all the money into the Grantham Foundation for the Protection of the Environment. He’s famous for his quarterly letters on future trends—you can read them free online! And thanks to this, he has some detailed ideas about what’s coming up, and what we should do about it:

Energy “will give us serious and sustained problems” over the next 50 years as we make the transition from hydrocarbons—oil, coal, gas—to solar, wind, nuclear and other sources, but we’ll muddle through to a solution to Peak Oil and related challenges. Peak Everything Else will prove more intractable for humanity. Metals, for instance, “are entropy at work . . . from wonderful metal ores to scattered waste,” and scarcity and higher prices “will slowly increase forever,” but if we scrimp and recycle, we can make do for another century before tight constraint kicks in.

Agriculture is more worrisome. Local water shortages will cause “persistent irritation”—wars, famines. Of the three essential macro nutrient fertilizers, nitrogen is relatively plentiful and recoverable, but we’re running out of potassium and phosphorus, finite mined resources that are “necessary for all life.” Canada has large reserves of potash (the source of potassium), which is good news for Americans, but 50 to 75 percent of the known reserves of phosphate (the source of phosphorus) are located in Morocco and the western Sahara. Assuming a 2 percent annual increase in phosphorus consumption, Grantham believes the rest of the world’s reserves won’t last more than 50 years, so he expects “gamesmanship” from the phosphate-rich.

And he rates soil erosion as the biggest threat of all. The world’s population could reach 10 billion within half a century—perhaps twice as many human beings as the planet’s overtaxed resources can sustainably support, perhaps six times too many.
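
That 2 percent growth assumption does a lot of work in the 50-year estimate. Here is a back-of-the-envelope sketch of the compounding effect; the 80-year reserve figure is my illustrative assumption, not a number from the article:

```python
import math

def depletion_time(reserve_years, growth_rate):
    """Years until a reserve runs out, if consumption starts at 1 unit/yr
    (so the reserve equals `reserve_years` units) and grows continuously at
    `growth_rate` per year: solve (e**(g*T) - 1)/g = reserve_years for T."""
    return math.log(1 + growth_rate * reserve_years) / growth_rate

# A reserve worth 80 years of *current* consumption, with 2%/yr growth:
print(round(depletion_time(80, 0.02), 1))   # -> 47.8 years, not 80
```

Steady growth in demand shortens the life of a fixed reserve well below the naive reserves-to-current-use ratio, which is presumably how a 50-year figure can emerge from apparently larger reserve numbers.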

It’s not that he doesn’t take climate change seriously. However, he seems to have almost given up on the US political establishment doing anything about it. So he’s shifted his focus:

Grantham put his own influence and money behind the climate-change bill passed by the House in 2009. “But even $100 million wouldn’t have gotten it through the Senate,” he said. “The recession more or less ruled it out. It pushed anything having to do with the environment down 10 points, across the board. Unemployment and interest in environmental issues move inversely.”

Having missed a once-in-a-generation legislative opportunity to address climate change, American environmentalists are looking for new strategies. Grantham believes that the best approach may be to recast global warming, which depresses crop yields and worsens soil erosion, as a factor contributing to resource depletion. “People are naturally much more responsive to finite resources than they are to climate change,” he said. “Global warming is bad news. Finite resources is investment advice.” He believes this shift in emphasis plays to Americans’ strength. “Americans are just about the worst at dealing with long-term problems, down there with Uzbekistan,” he said, “but they respond to a market signal better than almost anyone. They roll the dice bigger and quicker than most.”

Let’s wrap up with some more fun stuff: impressive volcanoes!

Morgan Abbou explains:

Volcanic lightning photograph by Francisco Negroni. In a scene no human could have witnessed, an apocalyptic agglomeration of lightning bolts illuminates an ash cloud above Chile’s Puyehue volcano in June 2011. The minutes-long exposure shows individual bolts as if they’d all occurred at the same moment and, due to the Earth’s rotation, renders stars (left) as streaks. Lightning to the right of the ash cloud appears to have illuminated nearby clouds, hence the apparent absence of stars on that side of the picture. After an ominous series of earthquakes on the previous day, the volcano erupted that afternoon, convincing authorities to evacuate some 3,500 area residents. Eruptions over the course of the weekend resulted in heavy ashfalls, including in Argentine towns 60 miles (a hundred kilometers) away.

Here’s another shot of the same volcano:

And here’s Mount Etna blowing out a smoke ring in March of 2000. By its shadow, this ring was estimated to be 200 meters in diameter!


Apocalypse, Retreat or Revolution?

3 November, 2011

I’ve been enjoying this book:

• Tim Lenton and Andrew Watson, Revolutions That Made the Earth, Oxford U. Press, Oxford, 2011.

It’s mainly about the history of life on Earth, and how life has affected the climate and atmosphere. For example: when photosynthesis first started pumping a deadly toxic gas into the atmosphere—oxygen—how did life evolve to avoid disaster?

Or: why did most of the Earth freeze, about 650 million years ago, and what did life do then?

Or: what made 96% of all marine species and 70% of vertebrates on land die out, around 250 million years ago?

This is the book’s strength: a detailed but readable version of the greatest story we know, complete with mysteries yet to be solved. But at the end they briefly ponder the future. They consider various scenarios, lumped into three categories: apocalypse, retreat or revolution.

Apocalypse

They begin by reviewing the familiar story: how soaring population and fossil fuel usage are making our climate ever hotter, making our oceans ever more acidic, and sucking phosphorus and other nutrients out of the ground and into the sea.

They consider different ways these trends could push the Earth into a new, inhospitable state. They use the term ‘apocalypse’. I think ‘disaster’ is better, but anyway, they write:

Even the normally cheerful and creative Jim Lovelock argues that we are already doomed, and nothing we can do now will stop the Earth system being carried by its own internal dynamics into a different and inhospitable state for us. If so, all we can do is try to adapt. We disagree—in our view the game is not yet up. As far as we can see no one has yet made a convincing scientific case that we are close to a global tipping point for ‘runaway’ climate change.

[...]

Yet even without truly ‘runaway’ change, the combination of unmitigated fossil fuel burning and positive feedbacks from within the Earth system could still produce an apocalyptic climate for humanity. We could raise global temperature by up to 6 °C this century, with more to come next century. On the way there, many parts of the Earth system could pass their own thresholds and undergo profound changes in state. These are what Tim [Lenton] and colleagues have called ‘tipping elements’ in the climate system.

They warrant a book by themselves, so we will just touch on them briefly here. The tipping elements include the great ice sheets covering Greenland and West Antarctica that are already losing mass and adding to sea level rise. In the tropics, there are already changes in atmospheric circulation, and in the pattern of El Niño events. The Amazon rainforest suffered severe drought in 2005 and might in the future face a climate drying-triggered dieback, destroying biodiversity and adding carbon to the atmosphere. Over India, an atmospheric brown cloud of pollution is already disrupting the summer monsoon, threatening food security. The monsoon in West Africa could be seriously disrupted as the neighboring ocean warms up. The boreal forests that cloak the northern high latitudes are threatened by warming, forest fires and insect infestation. The list goes on. The key point is that the Earth’s climate, being a complex feedback system, is unlikely to respond in an entirely smooth and proportional way to significant changes in energy balance caused by human activities.

Here is a map of some tipping elements. Click for more details:

Retreat

They write:

A popular answer to apocalyptic visions of the future is retreat, into a lower energy, lower material consumption, and ultimately lower population world. In this future world the objective is to minimize human effects on the Earth system and allow Gaia to reassert herself, with more room for natural ecosystems and minimal intervention in global cycles. The noble aim is long-term sustainability for people as well as the planet.

There are some good and useful things we can take from such visions of the future, especially in helping to wean ourselves off fossil fuels, achieve greater energy efficiency, promote recycling and redefine what we mean by quality of life. However, we think that visions of retreat are hopelessly at odds with current trends, and with the very nature of what drives revolutionary changes of the Earth. They lack pragmatism and ultimately they lack ambition. Moreover, a retreat sufficient to forestall the problems outlined above might be just as bad as the problems it sought to avoid.

Revolution

They write:

Our alternative vision of the future is of revolution, into a high energy, high recycling world that can support billions of people as part of a thriving and sustainable biosphere. The key to reaching this vision of the future is to learn from past revolutions: future civilizations must be fuelled from sustainable energy sources, and they must undertake a greatly enhanced recycling of resources.

And here is where the lessons of previous ‘revolutions’ are especially useful. As I said last time, they list four:

1. The origin of life, before 3.8 billion years ago.

2. The Great Oxidation, when photosynthesis put oxygen into the atmosphere between 3.4 and 2.5 billion years ago.

3. The rise of complex life (eukaryotes), roughly 2 billion years ago.

4. The rise of humanity, roughly 0 billion years ago (that is, essentially now, on the billion-year timescale of the first three).

Their book argues that all three of the earlier revolutions disrupted the Earth’s climate, pushing it out of stability. It only restabilized after reaching a fundamentally new state. This new stable state could only be born after some new feedback mechanisms had developed.

For example, in every revolution, it has been important to find ways to recycle ‘wastes’ and make them into useful ‘resources’. This was true with oxygen during the Great Oxidation… and it must be true with our waste products now!

In any sort of approximate equilibrium state, there can’t be much ‘waste’: almost everything needs to be recycled. Serious amounts of ‘waste’ can only occur for fairly short periods of time, in the grand scheme of things. For example, we are now burning fossil fuels and creating a lot of waste CO2, but this can’t go on forever: it’s only a transitional phase.
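
Here is a minimal stock-and-flow sketch of that point, with invented numbers. The closer recycling gets to 100%, the longer a fixed stock supports a given throughput, and only in that limit does the ‘waste phase’ stop being a countdown:

```python
def steps_until_empty(stock, use_per_step, recycle_frac):
    """Steps until a stock is exhausted, when each step consumes
    `use_per_step` of material but recycling returns a fraction of it."""
    steps = 0
    while stock > 0:
        stock -= use_per_step * (1 - recycle_frac)  # only virgin draw depletes
        steps += 1
    return steps

# 100 units of stock, 1 unit used per step, at various recycling rates:
for r in (0.0, 0.5, 0.9, 0.99):
    print(r, steps_until_empty(100, 1, r))   # 100, 200, 1000, 10000 steps
```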

Apocalypse and Revolution?

I should talk about all this in more detail someday. But not today.

For now, I would just like to suggest that ‘apocalypse’ and ‘revolution’ are not really diametrically opposed alternatives. All three previous revolutions destroyed the world as it had been!

For example, when the Great Oxidation occurred, this was an ‘apocalypse’ for anaerobic life forms, who now struggle to survive in specialized niches here and there. It only seems like a triumphant ‘revolution’ in retrospect, to the new life forms that comfortably survive in the new world.

So, I think we’re headed for a combination of apocalypse and revolution: the death of many old things, and the birth of new ones. At best we have a bit of influence in nudging things in a direction we like. I don’t think ‘retreat’ is a real option: nostalgic though I am about many old things, time always pushes us relentlessly into new and strange worlds.


US Weather Disasters in 2011

6 September, 2011

The US Federal Emergency Management Agency (FEMA) is running out of money!

So far this year, ten weather disasters have each caused over a billion dollars of damage in the United States. This beats the record set in 2008, when there were nine. FEMA now has less than a billion dollars in its fund:

• Brian Naylor, Costs Of Irene Add Up As FEMA Runs Out Of Cash, Morning Edition, National Public Radio, 30 August 2011.

Let’s review these disasters:

10) Hurricane Irene, August 27-28: A large and powerful Atlantic hurricane that left extensive flood and wind damage along its path through the Caribbean, the east coast of the US and as far north as Atlantic Canada. Early estimates say Irene caused $7 billion in damages in the US.

9) Upper Midwest flooding, summer: An above-average snowpack across the northern Rocky Mountains, combined with rainstorms, caused the Missouri and Souris rivers to swell beyond their banks across the Upper Midwest. An estimated 11,000 people were forced to evacuate Minot, N.D. Numerous levees were breached along the Missouri River, flooding thousands of acres of farmland. Over $2 billion in damages.

8) Mississippi River flooding, spring-summer: Persistent rainfall (nearly triple the normal amount in the Ohio Valley), combined with melting snowpack, caused historic flooding along the Mississippi River and its tributaries. At least two people died. $2 to $4 billion in damages.

7) Southern Plains/Southwest drought, heat wave and wildfires, spring and summer: Drought, heat waves, and wildfires hit Texas, Oklahoma, New Mexico, Arizona, southern Kansas, western Arkansas and Louisiana this year. Wildfire fighting costs for the region are about $1 million per day, with over 2,000 homes and structures lost by mid-August. Over $5 billion in damages so far.

6) Midwest/Southeast tornadoes, May 22-27: Central and southern states saw approximately 180 twisters and 177 deaths within a week. A tornado in Joplin, Mo., caused at least 141 deaths—the deadliest single tornado to strike the United States since modern record keeping began in 1950. Over $7 billion in damages.

5) Southeast/Ohio Valley/Midwest tornadoes, April 25-30: This outbreak of tornadoes over central and southern states led to 327 deaths. Of those fatalities, 240 occurred in Alabama. The deadliest of the estimated 305 tornadoes in the outbreak was an EF-5 that hit northern Alabama, killing 78 people. Several big cities were directly affected by strong tornadoes, including Tuscaloosa, Birmingham and Huntsville in Alabama, and Chattanooga in Tennessee. Over $9 billion in damages.

4) Midwest/Southeast tornadoes, April 14-16: An outbreak over central and southern states produced an estimated 160 tornadoes. Thirty-eight people died, 22 of them in North Carolina. Over $2 billion in damages.

3) Southeast/Midwest tornadoes, April 8-11: An outbreak of tornadoes over central and southern states saw an estimated 59 tornadoes. Over $2.2 billion in damages.

2) Midwest/Southeast tornadoes, April 4-5: An outbreak of tornadoes over central and southern states saw an estimated 46 tornadoes. Nine people died. Over $2.3 billion in damages.

1) Blizzard, Jan 29-Feb 3: A large winter storm hit many central, eastern and northeastern states. 36 people died. Over $2 billion in damages.

I got most of this information from this article, which was written before Irene pushed 2011 into the lead:

• Brett Israel, 2011 ties for most billion-dollar weather disasters, Our Amazing Planet, 18 August 2011.

We can expect more weather disasters as global warming proceeds. The National Academy of Sciences says:

• Increases of precipitation at high latitudes and drying of the already semi-arid regions are projected with increasing global warming, with seasonal changes in several regions expected to be about 5-10% per degree of warming. However, patterns of precipitation show much larger variability across models than patterns of temperature.

• Large increases in the area burned by wildfire are expected in parts of Australia, western Canada, Eurasia and the United States.

• Extreme precipitation events—that is, days with the top 15% of rainfall—are expected to increase by 3-10% per degree of warming.

• In many regions the amount of flow in streams and rivers is expected to change by 5-15% per degree of warming, with decreases in some areas and increases in others.

• The total number of tropical cyclones should decrease slightly or remain unchanged. Their wind speed is expected to increase by 1-4% per degree of warming.
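
Taking those ‘per degree’ figures at face value: with, say, 3°C of warming, the extreme-precipitation bullet suggests an increase of roughly 9-30%, and tropical cyclone wind speeds would rise roughly 3-12%. (That is just linear scaling of the quoted ranges; the actual response need not be linear.)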

Some people worry about sea level rise, but I think the bite from weather disasters and ensuing crop failures will hurt much more, much sooner.

Since it doesn’t look like politicians will do enough to cut carbon emissions, insurance companies are moving to act on their own—not to prevent weather disasters, but to minimize their effect:

Swiss Re’s global headquarters face Lake Zurich, overlooking a small yacht harbor. Bresch and a colleague, Andreas Schraft, sometimes walk the 20 minutes to the train station together after work, past more yachts, an arboretum, and a series of bridges. In September 2005, probably on one of these walks, the two began to discuss what they now call “Faktor K,” for “Kultur”: the culture factor. Losses from Hurricanes Katrina, Rita, and Wilma had been much higher than expected in ways the existing windstorm models hadn’t predicted, and it wasn’t because they were far off on wind velocities.

The problem had more to do with how people on the Gulf Coast were assessing windstorm risk as a group. Mangrove swamps on the Louisiana coast had been cut down and used as fertilizer, stripping away a barrier that could have sapped the storm of some of its energy. Levees were underbuilt, not overbuilt. Reinsurers and modeling firms had focused on technology and the natural sciences; they were missing lessons from economists and social scientists. “We can’t just add another bell and whistle to the model,” says Bresch. “It’s about how societies tolerate risk.”

“We approach a lot of things as much as we can from the point of statistics and hard data,” says David Smith, head of model development for Eqecat, a natural hazards modeling firm. “It’s not the perfect expression.” The discrepancy between the loss his firm modeled for Katrina and the ultimate claims-based loss number for his clients was the largest Smith had seen. Like others in the industry, Eqecat had failed to anticipate the extent of levee failure. Construction quality in the Gulf states before Katrina was poorer than anticipated, and Eqecat was surprised by a surge in demand after the storm that inflated prices for labor and materials to rebuild. Smith recognizes that these are questions for sociologists and economists as well as engineers, and he consults with the softer sciences to get his models right. But his own market has its demands, too. “The more we can base the model on empirical data,” he says, “the more defendable it is.”

After their walk around the lake in 2005, Swiss Re’s Bresch and Schraft began meeting with social scientists and laying out two goals. First, they wanted to better understand the culture factor and, ultimately, the risks they were underwriting. Second, they wanted to use that understanding to help the insured prevent losses before they had to be paid for.

The business of insurers and reinsurers rests on balancing a risk between two extremes. If the risk isn’t probable enough, or the potential loss isn’t expensive enough, there’s no reason for anyone to buy insurance for it. If it’s too probable and the loss too expensive, the premium will be unaffordable. This is bad for both the insured and the insurer. So the insurance industry has an interest in what it calls “loss mitigation.” It encourages potential customers to keep their property from being destroyed in the first place. If Swiss Re is trying to affect the behavior of the property owners it underwrites, it’s sending a signal: Some behavior is so risky that it’s hard to price. Keep it up, and you’ll have no insurance and we’ll have no business. That’s bad for everyone.

To that end, Swiss Re has started speaking about climate risk, not climate change. That the climate is changing has been established in the eyes of the industry. “For a long time,” says Bresch, “people thought we only needed to do detailed modeling to truly understand in a specific region how the climate will change. … You can do that forever.” In many places, he says, climate change is only part of the story. The other part is economic development. In other words, we’re building in the wrong places in the wrong way, so wrong that what we build often isn’t even insurable. In an interview published by Swiss Re, Wolf Dombrowsky, of the Disaster Research Center at Kiel University in Germany, points out that it’s wrong to say that a natural disaster destroyed something; the destruction was not nature’s fault but our own.

In 1888 the city of Sundsvall in Sweden, built of wood, burned to the ground. A group of reinsurers, Swiss Re among them, let Sweden’s insurers know there was going to be a limit in the future on losses from wooden houses, and it was going to be low. Sweden began building with stone. Reinsurance is a product, but also a carrot in the negotiation between culture and reality; it lets societies know what habits are unsustainable.

More recently, the company has been working with McKinsey & Co., the European Commission, and several environmental groups to develop a methodology it calls the “economics of climate adaptation,” a way to encourage city planners to build in a way that will be insurable in the future. A study of the U.K. port of Hull looks at potential losses by 2030 under several different climate scenarios. Even under the most extreme, losses were expected to grow by $17 million due to climate change and by $23 million due to economic growth. How Hull builds in the next two decades matters more to it than the levels of carbon dioxide in the air. A similar study for Entergy (ETR), a New Orleans-based utility, concluded that adaptations on the Gulf Coast—such as tightening building codes, restoring wetlands and barrier islands, building levees around chemical plants, and requiring that new homes in high-risk areas be elevated—could almost completely offset the predicted cost of 100-year storms happening every 40 years.
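
One way to unpack that last phrase, together with the premium logic a few paragraphs back: a ‘100-year storm’ is one with a 1% chance of striking in any given year, and the fair (‘pure’) premium scales with probability times loss. A toy calculation, with all numbers invented for illustration:

```python
# Hypothetical loss if the storm hits -- not a figure from the Entergy study.
loss = 200e6                              # dollars
for return_period in (100, 40):
    p = 1 / return_period                 # annual exceedance probability
    print(f"1-in-{return_period} storm: p = {p:.3f}, "
          f"expected annual loss = ${p * loss / 1e6:.1f}M")
# Going from a 100-year to a 40-year storm raises the expected annual loss
# (and hence the pure premium) from $2.0M to $5.0M, a factor of 2.5.
```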

I actually disagree somewhat with the statement “it’s wrong to say that a natural disaster destroyed something; the destruction was not nature’s fault but our own.” There’s some truth to this, but also some untruth. The question of “fault” or “blame” is a slippery one here, and there’s probably no way to completely settle it.

Is it the “fault” of people in Vermont that they weren’t fully prepared for a hurricane? After all, it’s rare—or at least it used to be rare—for hurricanes to make it that far north. The governor of Vermont, Peter Shumlin, recently said:

I find it extraordinary that so many political leaders won’t actually talk about the relationship between climate change, fossil fuels, our continuing irrational exuberance about burning fossil fuels, in light of these storm patterns that we’ve been experiencing.

We had storms this spring that flooded our downtowns and put us through many of the same exercises that we’re going through right now. We didn’t used to get weather patterns like this in Vermont. We didn’t get tropical storms. We didn’t get flash flooding.

We in the colder states are going to see the results of climate change first. Myself, Premier Charest up in Quebec, Governor Cuomo over in New York, we understand that the flooding and the extraordinary weather patterns that we’re seeing are a result of our burning of fossil fuels. We’ve got to get off fossil fuels as quickly as we know how, to make this planet livable for our children and our grandchildren.

On the other hand, you could say that it is the fault of Vermonters, or at least humanity as a whole, for causing global warming in the first place.

But ultimately, pinning blame on someone or something is less important than figuring out how to solve the problems we face.

