## High-Speed Finance

8 August, 2012

These days, a lot of buying and selling of stocks is done by computers—it’s called algorithmic trading. Computers can do it much faster than people. Watch how they’ve been going wild!

The date is at lower left. In 2000 it took several seconds for computers to make a trade. By 2010 the time had dropped to milliseconds… or even microseconds. And around this year, market activity started becoming much more intense.

I can’t even see the Flash Crash on May 6 of 2010—also known as The Crash of 2:45. The Dow Jones plummeted 9% in 5 minutes, then quickly bounced back. For about fifteen minutes, roughly a trillion dollars of stock market value vanished. Then it reappeared.

But on August 5, 2011, when the credit rating of the US got downgraded, you’ll see the activity explode! And it’s been crazy ever since.

The movie above was created by Nanex, a company that provides market data to traders. The x axis shows the time of day, from 9:30 to 16:00. The y axis… well, it’s the amount of some activity per unit time, but they don’t say what. Do you know?

The folks at Nanex have something very interesting to say about this. It’s not high frequency trading or ‘HFT’ that they’re worried about—that’s actually gone down slightly from 2008 to 2012. What’s gone up is ‘high frequency quoting’, also known as ‘quote spam’ or ‘quote stuffing’.

Over on Google+, Sergey Ten explained the idea to me:

Quote spam is a well-known tactic. It is used by high-frequency traders to get a competitive advantage over other high-frequency traders. An HF trader generates high-frequency quote spam using a pseudorandom (or otherwise structured) algorithm, with his computers coded to ignore it. His competitors don’t know the generating algorithm and have to process each quote, thus increasing their load, consuming bandwidth and getting a slight delay in processing.

A quote is an offer to buy or sell stock at a given price. For a clear and entertaining explanation of how this works and why traders are locked into a race for speed, try:

• Chris Stucchio, A high frequency trader’s apology, Part 1, 16 April 2012. Part 2, 25 April 2012.

I don’t know a great introduction to quote spam, but this paper isn’t bad:

• Jared F. Egginton, Bonnie F. Van Ness, and Robert A. Van Ness, Quote stuffing, 15 March 2012.
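To make Sergey Ten’s description concrete, here is a toy sketch of the filtering trick he describes: the spammer derives its decoy quotes from a secret key, so its own systems can discard them with one cheap hash check, while competitors must parse and evaluate every quote. Everything here (the quote format, the function names, the use of HMAC) is my own illustrative guess, not a description of any actual trading system.

```python
import hashlib
import hmac
import os
import random

SECRET_KEY = os.urandom(32)  # known only to the firm generating the noise

def tag(order_id: str) -> str:
    """Short tag derived from the order ID and the secret key."""
    return hmac.new(SECRET_KEY, order_id.encode(), hashlib.sha256).hexdigest()[:8]

def make_decoy_quote(symbol: str, seq: int) -> dict:
    """A quote the firm never intends to trade on; its ID carries a hidden tag."""
    order_id = f"{symbol}-{seq}"
    return {
        "id": f"{order_id}-{tag(order_id)}",
        "symbol": symbol,
        "side": random.choice(["bid", "ask"]),
        "price": round(random.uniform(99.0, 101.0), 2),
        "size": random.choice([100, 200, 500]),
    }

def is_own_decoy(quote: dict) -> bool:
    """The firm's cheap filter: one HMAC check per quote, then drop it."""
    order_id, _, seen_tag = quote["id"].rpartition("-")
    return hmac.compare_digest(seen_tag, tag(order_id))

# The firm drops its own noise instantly; a competitor without the key must
# treat every one of these quotes as a real order and pay the processing cost.
decoys = [make_decoy_quote("XYZ", i) for i in range(5)]
print(all(is_own_decoy(q) for q in decoys))  # True
```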

### Toward the physical limits of speed

In fact, the battle for speed is so intense that trading has run up against the speed of light.

For example, by 2013 there will be a new transatlantic cable at the bottom of the ocean, the first in a decade. Why? Just to cut the communication time between US and UK traders by 5 milliseconds. The new fiber optic line will be straighter than existing ones:

“As a rule of thumb, each 62 miles that the light has to travel takes about 1 millisecond,” Thorvardarson says. “So by straightening the route between Halifax and London, we have actually shortened the cable by 310 miles, or 5 milliseconds.”
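The rule of thumb is easy to check: light in optical fiber travels at roughly two-thirds of its vacuum speed, so 62 miles (about 100 km) of glass costs roughly half a millisecond one way, or about a millisecond round trip, which is what matters when you are waiting for a confirmation. Here is the back-of-the-envelope version; the refractive index of 1.5 is the only number not taken from the quote.

```python
C_VACUUM_KM_PER_S = 299_792   # speed of light in vacuum, km/s
N_FIBER = 1.5                 # typical refractive index of optical fiber (assumed)
V_FIBER = C_VACUUM_KM_PER_S / N_FIBER  # ~200,000 km/s in glass

def round_trip_ms(distance_km: float) -> float:
    """Round-trip light travel time through fiber, in milliseconds."""
    return 2 * distance_km / V_FIBER * 1000

print(round_trip_ms(100))  # ~1.0 ms per 62 miles (100 km), matching the rule of thumb
print(round_trip_ms(500))  # ~5.0 ms saved by shortening the route by 310 miles
```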

Meanwhile, a London-based company called Fixnetix has developed a special computer chip that can prepare a trade in just 740 nanoseconds. But why stop at nanoseconds?

With the race for the lowest “latency” continuing, some market participants are even talking about picoseconds—trillionths of a second. At first the realm of physics and math and then computer science, the picosecond looms as the next time barrier.

Actions that take place in nanoseconds and picoseconds in some cases run up against the sheer limitations of physics, said Mark Palmer, chief executive of Lexington, Mass.-based StreamBase Systems.

### Black swans and the ultrafast machine ecology

As high-frequency trading and high-frequency quoting leave slow-paced human reaction times in the dust, markets start to behave differently. Here’s a great paper about that:

• Neil Johnson, Guannan Zhao, Eric Hunsader, Jing Meng, Amith Ravindar, Spencer Carran and Brian Tivnan, Financial black swans driven by ultrafast machine ecology.

A black swan is an unexpectedly dramatic event, like a market crash or a stock bubble that bursts. But according to this paper, such events are now happening all the time at speeds beyond our perception!

Here’s one:

It’s a price spike in the stock of a company called Super Micro Computer, Inc. On October 1st, 2010, it shot up 26% and then crashed back down. But this all happened in 25 milliseconds!

These ultrafast black swans happen at least once a day. And they happen most of all to financial institutions. For a nice discussion of this work, see:

• Mark Buchanan, Approaching the singularity—in global finance, The Physics of Finance, 13 February 2012.

I won’t try to outdo Buchanan’s analysis. I’ll just quote the abstract of the original paper:

Society’s drive toward ever faster socio-technical systems, means that there is an urgent need to understand the threat from ‘black swan’ extreme events that might emerge. On 6 May 2010, it took just five minutes for a spontaneous mix of human and machine interactions in the global trading cyberspace to generate an unprecedented system-wide Flash Crash. However, little is known about what lies ahead in the crucial sub-second regime where humans become unable to respond or intervene sufficiently quickly. Here we analyze a set of 18,520 ultrafast black swan events that we have uncovered in stock-price movements between 2006 and 2011. We provide empirical evidence for, and an accompanying theory of, an abrupt system-wide transition from a mixed human-machine phase to a new all-machine phase characterized by frequent black swan events with ultrafast durations (<650ms for crashes, <950ms for spikes). Our theory quantifies the systemic fluctuations in these two distinct phases in terms of the diversity of the system's internal ecology and the amount of global information being processed. Our finding that the ten most susceptible entities are major international banks, hints at a hidden relationship between these ultrafast 'fractures' and the slow 'breaking' of the global financial system post-2006. More generally, our work provides tools to help predict and mitigate the systemic risk developing in any complex socio-technical system that attempts to operate at, or beyond, the limits of human response times.

### Trans-quantitative analysts?

When you get into an arms race of trying to write algorithms whose behavior other algorithms can’t predict, the math involved gets very tricky. Over on Google+, F. Lengvel pointed out something strange. In May 2010, Christian Marks claimed that financiers were hiring experts on large ordinals—crudely speaking, big infinite numbers!—to design algorithms that were hard to outwit.

I can’t confirm his account, but I can’t resist quoting it:

In an unexpected development for the depressed market for mathematical logicians, Wall Street has begun quietly and aggressively recruiting proof theorists and recursion theorists for their expertise in applying ordinal notations and ordinal collapsing functions to high-frequency algorithmic trading. Ordinal notations, which specify sequences of ordinal numbers of ever increasing complexity, are being used by elite trading operations to parameterize families of trading strategies of breathtaking sophistication.

The monetary advantage of the current strategy is rapidly exhausted after a lifetime of approximately four seconds — an eternity for a machine, but barely enough time for a human to begin to comprehend what happened. The algorithm then switches to another trading strategy of higher ordinal rank, and uses this for a few seconds on one or more electronic exchanges, and so on, while opponent algorithms attempt the same maneuvers, risking billions of dollars in the process.

The elusive and highly coveted positions for proof theorists on Wall Street, where they are known as trans-quantitative analysts, have not been advertised, to the chagrin of executive recruiters who work on commission. Elite hedge funds and bank holding companies have been discreetly approaching mathematical logicians who have programming experience and who are familiar with arcane software such as the ordinal calculator. A few logicians were offered seven figure salaries, according to a source who was not authorized to speak on the matter.

Is this for real? I like the idea of ‘trans-quantitative analysts’: it reminds me of ‘transfinite numbers’, which is another name for infinities. But it sounds a bit like a joke, and I haven’t been able to track down any references to trans-quantitative analysts, except people talking about Christian Marks’ blog article.

I understand a bit about ordinal notations, but I don’t think this is the time to go into that—not before I’m sure this stuff is for real. Instead, I’d rather reflect on a comment of Boris Borcic over on Google+:

Last week it occurred to me that LessWrong and OvercomingBias together might play a role to explain why Singularists don’t seem to worry about High Frequency Robot Trading as a possible pathway for Singularity-like developments. I mean IMO they should, the Singularity is about machines taking control, ownership is control, HFT involves slicing ownership in time-slices too narrow for humans to know themselves owners and possibly control.

The ensuing discussion got diverted to the question of whether algorithmic trading involved ‘intelligence’, but maybe intelligence is not the point. Perhaps algorithmic traders have become successful parasites on the financial system without themselves being intelligent, simply by virtue of their speed. And the financial system, in turn, seems to be a successful parasite on our economy. Regulatory capture—the control of the regulating agencies by the industry they’re supposed to be regulating—seems almost complete. Where will this lead?

## Are You a Disease-Spreading Zombie?

20 July, 2012

You may have read about the fungus that can infect an ant and turn it into a zombie, making it climb up the stem of a plant and hang onto it, then die and release spores from a stalk that grows out of its head.

But this isn’t the only parasite that controls the behavior of its host.

If you ever got sick, had diarrhea, and thought hard about why, you’ll understand what I mean. You were helping spread the disease… especially if you were poor and didn’t have a toilet. This is why improved sanitation actually reduces the virulence of some diseases: it’s no longer such a good strategy for bacteria to cause diarrhea, so they evolve away from it!

There are plenty of other examples. Lots of diseases make you sneeze or cough, spreading the germs to other people. The rabies virus drives dogs crazy and makes them want to bite. There’s a parasitic flatworm that makes ants want to climb to the top of a blade of grass, lock their jaws onto it and wait there until they get eaten by a sheep! But the protozoan Toxoplasma gondii is more mysterious.

It causes a disease called toxoplasmosis. You can get it from cats, you can get it from eating infected meat, and you can even inherit it from your mother.

Lots of people have it: somewhere between 1/3 and 1/2 of everyone in the world!

A while back, the Czech scientist Jaroslav Flegr did some experiments. He found that people who tested positive for this parasite have slower reaction times. But even more interestingly, he claims that men with the parasite are more introverted, suspicious, oblivious to other people’s opinions of them, and inclined to disregard rules… while infected women are more outgoing, trusting, image-conscious, and rule-abiding than uninfected women!

What could explain this?

The disease is carried by both cats and mice. Cats catch it by eating mice. The disease causes behavior changes in mice: they seem to become more anxious and run around more. This may increase their chance of getting eaten by a cat and passing on the disease. But we are genetically similar to mice… so we too may become more anxious when we’re infected with this disease. And men and women may act differently when they’re anxious.

It’s just a theory so far. Nonetheless, I won’t be surprised to hear there are parasites that affect our behavior in subtle ways. I don’t know if viruses or bacteria are sophisticated enough to trigger changes in behavior more subtle than diarrhea… but there are always lots of bacteria in your body, about 10 times as many as actual human cells. Many of these belong to unidentified species. And as long as they don’t cause obvious pathologies, doctors have had little reason to study them.

As for viruses, don’t forget that about 8% of your DNA is made of viruses that once copied themselves into your ancestors’ genome. They’re called endogenous retroviruses, and I find them very spooky and fascinating. Once they get embedded in our DNA, they can’t always get back out: a lot of them are defective, containing deletions or nonsense mutations. But some may still be able to get back out. And there are hints that some are implicated in certain kinds of cancer and autoimmune disease.

Even more intriguingly, a 2004 study reported that antibodies to endogenous retroviruses were more common in people with schizophrenia! And the cerebrospinal fluid of people who’d recently gotten schizophrenia contained levels of a key enzyme used by retroviruses, reverse transcriptase, four times higher than control subjects.

So it’s possible—just possible—that some viruses, either free-living or built into our DNA, may change our behavior in subtle ways that increase their chance of spreading.

For more on Jaroslav Flegr’s research, read this fascinating article:

• Kathleen McAuliffe, How your cat is making you crazy, The Atlantic, March 2012.

Among other things you’ll read about the parasitologists Glenn McConkey and Joanne Webster, who have shown that Toxoplasma gondii has two genes that allow it to crank up production of the neurotransmitter dopamine in the host’s brain. It seems this makes rats feel pleasure when they smell a cat!

(Do you like cats? Hmm.)

Of course, in business and politics we see many examples of ‘parasites’ that hijack organizations and change these organizations’ behavior to benefit themselves. It’s not nice. But it’s natural.

So even if you aren’t a disease-spreading zombie, it’s quite possible you’re dealing with them on a regular basis.

## Five Books About Our Future

16 May, 2012

Jordan Peacock has suggested interviewing me for Five Books, a website where people talk about five books they’ve read.

It’s probably going against the point of this site to read books especially for the purpose of getting interviewed about them. But I like the idea of talking about books that paint different visions of our future, and the issues we face. And I may need to read some more to carry out this plan.

So: what are your favorite books on this subject?

I’d like to pick books with different visions, preferably focused on the relatively near-term future, and preferably somewhat plausible—though I don’t expect every book to seem convincing to all reasonable people.

Here are some options that leap to mind.

### Whole Earth Discipline

• Stewart Brand, Whole Earth Discipline: An Ecopragmatist Manifesto, Viking Penguin, 2009.

I’ve been meaning to write about this one for a long time! Brand argues that changes in this century will be dominated by global warming, urbanization and biotechnology. He advocates new thinking on topics that traditional environmentalists have rather set negative opinions about, like nuclear power, genetic engineering, and the advantages of urban life. This is on my list for sure.

### Limits to Growth

• Donella Meadows, Jørgen Randers, and Dennis Meadows, Limits to Growth: The 30-Year Update, Chelsea Green Publishing Company, 2004.

Sad to say, I’ve never read the original 1972 book The Limits to Growth—or the 1974 edition which among other things presented a simple computer model of world population, industrialization, pollution, food production and resource depletion. Both the book and the model (called World3) have been much criticized over the years. But recently some have argued its projections—which were intended to illustrate ideas, not predict the future—are not doing so badly:

• Graham Turner, A comparison of The Limits to Growth with thirty years of reality, Commonwealth Scientific and Industrial Research Organisation (CSIRO).

It would be interesting to delve into this highly controversial topic. By the way, the model is now available online:

• Brian Hayes, Limits to Growth.

with an engaging explanation here:

• Brian Hayes, World3, the public beta, Bit-Player: An Amateur’s Look at Computation and Mathematics, 15 April 2012.

It runs on your web-browser, and it’s easy to take a copy for yourself and play around with it.
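If you just want the flavor of this kind of model in a few lines before opening World3 itself, here is a drastically oversimplified toy: one nonrenewable resource and one population whose growth turns negative as the resource runs down. It is emphatically not World3, which couples many more stocks through carefully calibrated table functions, and every parameter below is invented purely for illustration.

```python
# A toy "limits to growth" loop -- NOT World3, just the same flavor of feedback.
# All parameters are made up for illustration.
resource = 1000.0    # nonrenewable resource stock (arbitrary units)
population = 1.0     # population (arbitrary units)

for year in range(1, 201):
    abundance = resource / 1000.0               # 1.0 when untouched, 0.0 when exhausted
    growth_rate = 0.03 * abundance - 0.01       # growth turns negative as resource dwindles
    consumption = 0.5 * population * abundance  # richer resource means more use per capita
    population = max(population * (1 + growth_rate), 0.0)
    resource = max(resource - consumption, 0.0)
    if year % 40 == 0:
        print(f"year {year:3d}: population {population:7.2f}, resource {resource:7.1f}")
```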

### The Ecotechnic Future

John Michael Greer believes that ‘peak oil’—or more precisely, the slow decline of fossil fuel production—will spell the end of our modern technological civilization. He spells this out here:

• John Michael Greer, The Long Descent, New Society Publishers, 2008.

I haven’t read this book, but I’ve read the sequel, which begins to imagine what comes afterwards:

• John Michael Greer, The Ecotechnic Future, New Society Publishers, 2009.

Here he argues that in the next century or three we will pass through ‘scarcity economies’ and then ‘salvage economies’ on the way to sustainable economies that use much less energy than we do now.

Both these books seem to outrage everyone who envisages our future as a story of technological progress continuing more or less along the lines we’ve already staked out.

### The Singularity is Near

In the opposite direction, we have:

• Ray Kurzweil, The Singularity is Near, Penguin Books, 2005.

I’ve only read bits of this. According to Wikipedia, the main premises of the book are:

• A technological-evolutionary point known as “the singularity” exists as an achievable goal for humanity. (What exactly does Kurzweil mean by “the singularity”? I think I know what other people, like Vernor Vinge and Eliezer Yudkowsky, mean by it. But what does he mean?)

• Through a law of accelerating returns, technology is progressing toward the singularity at an exponential rate. (What in the world does it mean to progress toward a singularity at an exponential rate? I know that Kurzweil provides evidence that lots of things are growing exponentially… but if they keep doing that, that’s not what I’d call a singularity.)

• The functionality of the human brain is quantifiable in terms of technology that we can build in the near future.

• Medical advances make it possible for a significant number of Kurzweil’s generation (Baby Boomers) to live long enough for the exponential growth of technology to intersect and surpass the processing of the human brain.

If you think you know a better book that advocates a roughly similar thesis, let me know.

### A Prosperous Way Down

• Howard T. Odum and Elisabeth C. Odum, A Prosperous Way Down: Principles and Policies, Columbia University Press, 2001.

Howard T. Odum is the father of ‘systems ecology’, and developed an interesting graphical language for describing energy flows in ecosystems. According to George Mobus:

In this book he and Elisabeth take on the situation regarding social ecology under the conditions of diminishing energy flows. Taking principles from systems ecology involving systems suffering from the decline of energy (e.g. deciduous forests in fall), showing how such systems have adapted or respond to those conditions, they have applied these to the human social system. The Odums argued that if we humans were wise enough to apply these principles through policy decisions to ourselves, we might find similar ways to adapt with much less suffering than is potentially implied by sudden and drastic social collapse.

This seems to be a more scholarly approach to some of the same issues:

• Howard T. Odum, Environment, Power, and Society for the Twenty-First Century: The Hierarchy of Energy, Columbia U. Press, 2007.

### More?

There are plenty of other candidates I know less about. These two seem to be free online:

• Lester Brown, World on the Edge: How to Prevent Environmental and Economic Collapse, W. W. Norton & Company, 2011.

• Richard Heinberg, The End of Growth: Adapting to Our New Economic Reality, New Society Publishers, 2009.

I would really like even more choices—especially books by thoughtful people who do think we can solve the problems confronting us… but do not think all problems will automatically be solved by human ingenuity and leave it to the rest of us to work out the, umm, details.

## Azimuth on Google Plus (Part 4)

11 November, 2011

Again, some eye candy to start the show. Stare fixedly at the + sign here until the pink dots completely disappear:

In a semiconductor, a ‘hole’ is the absence of an electron, and it can move around as if it were a particle. If you have a hole moving to the right, in reality you have electrons moving to the left. Here pink dots moving counterclockwise look like a green dot moving clockwise!

A related puzzle: what happens when you hold a helium balloon on a string while you’re driving in a car with the windows closed… and then you make a sharp right turn? I’ve done it, so I know from experience.

Now for the real stuff:

• Tom Murphy, a physics professor at U.C. San Diego, has a blog worth visiting: Do the Math. He uses physics and math to make informed guesses about the future of energy production. Try out his overview on ‘peak oil’.

• Hundreds of top conservation scientists took a survey, and 99.5% felt that a serious loss of biodiversity is either ‘likely’, ‘very likely’, or ‘virtually certain’. Tropical coral ecosystems were perceived as the most seriously affected. A slim majority think we need to decide on rules for ‘triage’: deciding which species to save and which to give up on.

• Climate change is causing a massive change in tree species across Western USA. “Ecosystems are always changing at the landscape level, but normally the rate of change is too slow for humans to notice,” said Steven Running, a co-author of a study on this at the University of Montana. “Now the rate of change is fast enough we can see it.” The study used remote sensing of large areas over a four-year period.

• The James Dyson Award calls on design and engineering students to create innovative, practical, elegant solutions to the challenges that face us. This year, Edward Linacre won for a self-powering device that extracts water from the air for irrigation purposes. Linacre comes from the drought-afflicted continent of Australia. But his invention borrows some tricks from the Namib beetle, which survives some of the driest deserts in Africa by harvesting the moisture that condenses on its back during the early morning. That’s called biomimicry.

• The New York Times has a great profile of Jeremy Grantham. He heads a successful firm managing $100 billion in assets, and now he’s 72. So why is he saying this?

…it’s very important to me to make a lot of money now, much more than when I was 40 or 50.

Not because he has a brand new gold-digger ‘trophy wife’ or spendthrift heirs. No, he puts all the money into the Grantham Foundation for the Protection of the Environment. He’s famous for his quarterly letters on future trends—you can read them free online! And thanks to this, he has some detailed ideas about what’s coming up, and what we should do about it:

Energy “will give us serious and sustained problems” over the next 50 years as we make the transition from hydrocarbons—oil, coal, gas—to solar, wind, nuclear and other sources, but we’ll muddle through to a solution to Peak Oil and related challenges. Peak Everything Else will prove more intractable for humanity. Metals, for instance, “are entropy at work . . . from wonderful metal ores to scattered waste,” and scarcity and higher prices “will slowly increase forever,” but if we scrimp and recycle, we can make do for another century before tight constraint kicks in. Agriculture is more worrisome. Local water shortages will cause “persistent irritation”—wars, famines. Of the three essential macro nutrient fertilizers, nitrogen is relatively plentiful and recoverable, but we’re running out of potassium and phosphorus, finite mined resources that are “necessary for all life.” Canada has large reserves of potash (the source of potassium), which is good news for Americans, but 50 to 75 percent of the known reserves of phosphate (the source of phosphorus) are located in Morocco and the western Sahara. Assuming a 2 percent annual increase in phosphorus consumption, Grantham believes the rest of the world’s reserves won’t last more than 50 years, so he expects “gamesmanship” from the phosphate-rich. And he rates soil erosion as the biggest threat of all. The world’s population could reach 10 billion within half a century—perhaps twice as many human beings as the planet’s overtaxed resources can sustainably support, perhaps six times too many.

It’s not that he doesn’t take climate change seriously. However, he seems to have almost given up on the US political establishment doing anything about it. So he’s shifted his focus:

Grantham put his own influence and money behind the climate-change bill passed by the House in 2009. “But even $100 million wouldn’t have gotten it through the Senate,” he said. “The recession more or less ruled it out. It pushed anything having to do with the environment down 10 points, across the board. Unemployment and interest in environmental issues move inversely.”

Having missed a once-in-a-generation legislative opportunity to address climate change, American environmentalists are looking for new strategies. Grantham believes that the best approach may be to recast global warming, which depresses crop yields and worsens soil erosion, as a factor contributing to resource depletion. “People are naturally much more responsive to finite resources than they are to climate change,” he said. “Global warming is bad news. Finite resources is investment advice.” He believes this shift in emphasis plays to Americans’ strength. “Americans are just about the worst at dealing with long-term problems, down there with Uzbekistan,” he said, “but they respond to a market signal better than almost anyone. They roll the dice bigger and quicker than most.”
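A quick check on the phosphorus arithmetic in that passage: if consumption grows 2% a year, fifty years of use adds up to roughly 85 years’ worth of today’s consumption, so “reserves won’t last more than 50 years” corresponds to a reserves-to-current-consumption ratio of about 85. The sketch below only does that sum; the article gives no actual tonnages, so the ratio is the free parameter here.

```python
def years_until_exhausted(reserve_years: float, growth: float = 0.02) -> int:
    """Years until reserves (measured in years of *current* consumption) run out,
    if consumption grows by `growth` per year."""
    consumed, this_year, year = 0.0, 1.0, 0
    while consumed < reserve_years:
        consumed += this_year
        this_year *= 1 + growth
        year += 1
    return year

print(sum(1.02 ** t for t in range(50)))  # ~84.6 years' worth used in 50 years at 2% growth
print(years_until_exhausted(85))          # about 50 years until an 85-year reserve is gone
```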

Let’s wrap up with some more fun stuff: impressive volcanos!

Morgan Abbou explains:

Volcanic lightning photograph by Francisco Negroni. In a scene no human could have witnessed, an apocalyptic agglomeration of lightning bolts illuminates an ash cloud above Chile’s Puyehue volcano in June 2011. The minutes-long exposure shows individual bolts as if they’d all occurred at the same moment and, due to the Earth’s rotation, renders stars (left) as streaks. Lightning to the right of the ash cloud appears to have illuminated nearby clouds, hence the apparent absence of stars on that side of the picture. After an ominous series of earthquakes on the previous day, the volcano erupted that afternoon, convincing authorities to evacuate some 3,500 area residents. Eruptions over the course of the weekend resulted in heavy ashfalls, including in Argentine towns 60 miles (a hundred kilometers) away.

Here’s another shot of the same volcano:

And here’s Mount Etna blowing out a smoke ring in March of 2000. By its shadow, this ring was estimated to be 200 meters in diameter!

## Apocalypse, Retreat or Revolution?

3 November, 2011

I’ve been enjoying this book:

• Tim Lenton and Andrew Watson, Revolutions That Made the Earth, Oxford U. Press, Oxford, 2011.

It’s mainly about the history of life on Earth, and how life has affected the climate and atmosphere. For example: when photosynthesis first started pumping a deadly toxic gas into the atmosphere—oxygen—how did life evolve to avoid disaster?

Or: why did most of the Earth freeze, about 650 million years ago, and what did life do then?

Or: what made 96% of all marine species and 70% of vertebrates on land die out, around 250 million years ago?

This is the book’s strength: a detailed but readable version of the greatest story we know, complete with mysteries yet to be solved. But at the end they briefly ponder the future. They consider various scenarios, lumped into three categories: apocalypse, retreat or revolution.

#### Apocalypse

They begin by reviewing the familiar story: how soaring population and fossil fuel usage is making our climate ever hotter, making our oceans ever more acidic, and sucking phosphorus and other nutrients out of the ground and into the sea.

They consider different ways these trends could push the Earth into a new, inhospitable state. They use the term ‘apocalypse’. I think ‘disaster’ is better, but anyway, they write:

Even the normally cheerful and creative Jim Lovelock argues that we are already doomed, and nothing we can do now will stop the Earth system being carried by its own internal dynamics into a different and inhospitable state for us. If so, all we can do is try to adapt. We disagree—in our view the game is not yet up. As far as we can see no one has yet made a convincing scientific case that we are close to a global tipping point for ‘runaway’ climate change.

[...]

Yet even without truly ‘runaway’ change, the combination of unmitigated fossil fuel burning and positive feedbacks from within the Earth system could still produce an apocalyptic climate for humanity. We could raise global temperature by up to 6 °C this century, with more to come next century. On the way there, many parts of the Earth system could pass their own thresholds and undergo profound changes in state. These are what Tim [Lenton] and colleagues have called ‘tipping elements’ in the climate system.

They warrant a book by themselves, so we will just touch on them briefly here. The tipping elements include the great ice sheets covering Greenland and West Antarctica that are already losing mass and adding to sea level rise. In the tropics, there are already changes in atmospheric circulation, and in the pattern of El Niño events. The Amazon rainforest suffered severe drought in 2005 and might in the future face a climate drying-triggered dieback, destroying biodiversity and adding carbon to the atmosphere. Over India, an atmospheric brown cloud of pollution is already disrupting the summer monsoon, threatening food security. The monsoon in West Africa could be seriously disrupted as the neighboring ocean warms up. The boreal forests that cloak the northern high latitudes are threatened by warming, forest fires and insect infestation. The list goes on. The key point is that the Earth’s climate, being a complex feedback system, is unlikely to respond in an entirely smooth and proportional way to significant changes in energy balance caused by human activities.

Here is a map of some tipping elements. Click for more details:

#### Retreat

They write:

A popular answer to apocalyptic visions of the future is retreat, into a lower energy, lower material consumption, and ultimately lower population world. In this future world the objective is to minimize human effects on the Earth system and allow Gaia to reassert herself, with more room for natural ecosystems and minimal intervention in global cycles. The noble aim is long-term sustainability for people as well as the planet.

There are some good and useful things we can take from such visions of the future, especially in helping to wean ourselves off fossil fuels, achieve greater energy efficiency, promote recycling and redefine what we mean by quality of life. However, we think that visions of retreat are hopelessly at odds with current trends, and with the very nature of what drives revolutionary changes of the Earth. They lack pragmatism and ultimately they lack ambition. Moreover, a retreat sufficient to forestall the problems outlined above might be just as bad as the problems it sought to avoid.

#### Revolution

They write:

Our alternative vision of the future is of revolution, into a high energy, high recycling world that can support billions of people as part of a thriving and sustainable biosphere. The key to reaching this vision of the future is to learn from past revolutions: future civilizations must be fuelled from sustainable energy sources, and they must undertake a greatly enhanced recycling of resources.

And here is where the lessons of previous ‘revolutions’ are especially useful. As I said last time, they list four:

1. The origin of life, before 3.8 billion years ago.

2. The Great Oxidation, when photosynthesis put oxygen into the atmosphere between 3.4 and 2.5 billion years ago.

3. The rise of complex life (eukaryotes), roughly 2 billion years ago.

4. The rise of humanity, roughly 0 billion years ago.

Their book argues that all three of the earlier revolutions disrupted the Earth’s climate, pushing it out of stability. It only restabilized after reaching a fundamentally new state. This new stable state could only be born after some new feedback mechanisms had developed.

For example, in every revolution, it has been important to find ways to recycle ‘wastes’ and make them into useful ‘resources’. This was true with oxygen during the Great Oxidation… and it must be true with our waste products now!

In any sort of approximate equilibrium state, there can’t be much ‘waste’: almost everything needs to be recycled. Serious amounts of ‘waste’ can only occur for fairly short periods of time, in the grand scheme of things. For example, we are now burning fossil fuels and creating a lot of waste CO2, but this can’t go on forever: it’s only a transitional phase.

#### Apocalypse and Revolution?

I should talk about all this in more detail someday. But not today.

For now, I would just like to suggest that ‘apocalypse’ and ‘revolution’ are not really diametrically opposed alternatives. All three previous revolutions destroyed the world as it had been!

For example, when the Great Oxidation occurred, this was an ‘apocalypse’ for anaerobic life forms, who now struggle to survive in specialized niches here and there. It only seems like a triumphant ‘revolution’ in retrospect, to the new life forms that comfortably survive in the new world.

So, I think we’re headed for a combination of apocalypse and revolution: the death of many old things, and the birth of new ones. At best we have a bit of influence in nudging things in a direction we like. I don’t think ‘retreat’ is a real option: nostalgic though I am about many old things, time always pushes us relentlessly into new and strange worlds.

## US Weather Disasters in 2011

6 September, 2011

The US Federal Emergency Management Agency (FEMA) is running out of money!

So far this year, ten weather disasters have each caused over a billion dollars of damage in the United States. This beats the record set in 2008, when there were nine. FEMA now has less than a billion dollars in its fund:

• Brian Naylor, Costs Of Irene Add Up As FEMA Runs Out Of Cash, Morning Edition, National Public Radio, 30 August 2011.

Let’s review these disasters:

10) Hurricane Irene, August 27-28: A large and powerful Atlantic hurricane that left extensive flood and wind damage along its path through the Caribbean, the east coast of the US and as far north as Atlantic Canada. Early estimates say Irene caused $7 billion in damages in the US.

9) Upper Midwest flooding, summer: An above-average snowpack across the northern Rocky Mountains, combined with rainstorms, caused the Missouri and Souris rivers to swell beyond their banks across the Upper Midwest. An estimated 11,000 people were forced to evacuate Minot, N.D. Numerous levees were breached along the Missouri River, flooding thousands of acres of farmland. Over $2 billion in damages.

8) Mississippi River flooding, spring-summer: Persistent rainfall (nearly triple the normal amount in the Ohio Valley), combined with melting snowpack, caused historical flooding along the Mississippi River and its tributaries. At least two people died. $2 to$4 billion in damages.

7) Southern Plains/Southwest drought, heat wave and wildfires, spring and summer: Drought, heat waves, and wildfires hit Texas, Oklahoma, New Mexico, Arizona, southern Kansas, western Arkansas and Louisiana this year. Wildfire fighting costs for the region are about $1 million per day, with over 2,000 homes and structures lost by mid-August. Over$5 billion in damages so far.

6) Midwest/Southeast tornadoes, May 22-27: Central and southern states saw approximately 180 twisters and 177 deaths within a week. A tornado in Joplin, Mo., caused at least 141 deaths—the deadliest single tornado to strike the United States since modern record keeping began in 1950. Over $7 billion in damages.

5) Southeast/Ohio Valley/Midwest tornadoes, April 25-30: This outbreak of tornadoes over central and southern states led to 327 deaths. Of those fatalities, 240 occurred in Alabama. The deadliest of the estimated 305 tornadoes in the outbreak was an EF-5 that hit northern Alabama, killing 78 people. Several big cities were directly affected by strong tornadoes, including Tuscaloosa, Birmingham and Huntsville in Alabama, and Chattanooga in Tennessee. Over $9 billion in damages.

4) Midwest/Southeast tornadoes, April 14-16: An outbreak over central and southern states produced an estimated 160 tornadoes. Thirty-eight people died, 22 of them in North Carolina. Over $2 billion in damages.

3) Southeast/Midwest tornadoes, April 8-11: An outbreak of tornadoes over central and southern states saw an estimated 59 tornadoes. Over $2.2 billion in damages.

2) Midwest/Southeast tornadoes, April 4-5: An outbreak of tornadoes over central and southern states saw an estimated 46 tornadoes. Nine people died. Over $2.3 billion in damages.

1) Blizzard, Jan 29-Feb 3: A large winter storm hit many central, eastern and northeastern states. 36 people died. Over $2 billion in damages.

I got most of this information from this article, which was written before Irene pushed 2011 into the lead:

• Brett Israel, 2011 ties for most billion-dollar weather disasters, Our Amazing Planet, 18 August 2011.

We can expect more weather disasters as global warming proceeds. The National Academy of Sciences says:

• Increases of precipitation at high latitudes and drying of the already semi-arid regions are projected with increasing global warming, with seasonal changes in several regions expected to be about 5-10% per degree of warming. However, patterns of precipitation show much larger variability across models than patterns of temperature.

• Large increases in the area burned by wildfire are expected in parts of Australia, western Canada, Eurasia and the United States.

• Extreme precipitation events—that is, days with the top 15% of rainfall—are expected to increase by 3-10% per degree of warming.

• In many regions the amount of flow in streams and rivers is expected to change by 5-15% per degree of warming, with decreases in some areas and increases in others.

• The total number of tropical cyclones should decrease slightly or remain unchanged. Their wind speed is expected to increase by 1-4% per degree of warming.

Some people worry about sea level rise, but I think the bite from weather disasters and ensuing crop failures will hurt much more, much sooner.

Since it doesn’t look like politicians will do enough to cut carbon emissions, insurance companies are moving to act on their own—not to prevent weather disasters, but to minimize their effect:

Swiss Re’s global headquarters face Lake Zurich, overlooking a small yacht harbor. Bresch and a colleague, Andreas Schraft, sometimes walk the 20 minutes to the train station together after work, past more yachts, an arboretum, and a series of bridges. In September 2005, probably on one of these walks, the two began to discuss what they now call “Faktor K,” for “Kultur”: the culture factor. Losses from Hurricanes Katrina, Rita, and Wilma had been much higher than expected in ways the existing windstorm models hadn’t predicted, and it wasn’t because they were far off on wind velocities.

The problem had to do more with how people on the Gulf Coast were assessing windstorm risk as a group. Mangrove swamps on the Louisiana coast had been cut down and used as fertilizer, stripping away a barrier that could have sapped the storm of some of its energy. Levees were underbuilt, not overbuilt. Reinsurers and modeling firms had focused on technology and the natural sciences; they were missing lessons from economists and social scientists. “We can’t just add another bell and whistle to the model,” says Bresch, “It’s about how societies tolerate risk.”

“We approach a lot of things as much as we can from the point of statistics and hard data,” says David Smith, head of model development for Eqecat, a natural hazards modeling firm. “It’s not the perfect expression.” The discrepancy between the loss his firm modeled for Katrina and the ultimate claims-based loss number for his clients was the largest Smith had seen. Like others in the industry, Eqecat had failed to anticipate the extent of levee failure. Construction quality in the Gulf states before Katrina was poorer than anticipated, and Eqecat was surprised by a surge in demand after the storm that inflated prices for labor and materials to rebuild. Smith recognizes that these are questions for sociologists and economists as well as engineers, and he consults with the softer sciences to get his models right. But his own market has its demands, too. “The more we can base the model on empirical data,” he says, “the more defendable it is.”

After their walk around the lake in 2005, Swiss Re’s Bresch and Schraft began meeting with social scientists and laying out two goals. First, they wanted to better understand the culture factor and, ultimately, the risks they were underwriting. Second, they wanted to use that understanding to help the insured prevent losses before they had to be paid for.

The business of insurers and reinsurers rests on balancing a risk between two extremes. If the risk isn’t probable enough, or the potential loss isn’t expensive enough, there’s no reason for anyone to buy insurance for it. If it’s too probable and the loss too expensive, the premium will be unaffordable. This is bad for both the insured and the insurer. So the insurance industry has an interest in what it calls “loss mitigation.” It encourages potential customers to keep their property from being destroyed in the first place. If Swiss Re is trying to affect the behavior of the property owners it underwrites, it’s sending a signal: Some behavior is so risky that it’s hard to price. Keep it up, and you’ll have no insurance and we’ll have no business. That’s bad for everyone.

To that end, Swiss Re has started speaking about climate risk, not climate change. That the climate is changing has been established in the eyes of the industry. “For a long time,” says Bresch, “people thought we only needed to do detailed modeling to truly understand in a specific region how the climate will change. … You can do that forever.” In many places, he says, climate change is only part of the story. The other part is economic development. In other words, we’re building in the wrong places in the wrong way, so wrong that what we build often isn’t even insurable. In an interview published by Swiss Re, Wolf Dombrowsky, of the Disaster Research Center at Kiel University in Germany, points out that it’s wrong to say that a natural disaster destroyed something; the destruction was not nature’s fault but our own.

In 1888 the city of Sundsvall in Sweden, built of wood, burned to the ground. A group of reinsurers, Swiss Re among them, let Sweden’s insurers know there was going to be a limit in the future on losses from wooden houses, and it was going to be low. Sweden began building with stone. Reinsurance is a product, but also a carrot in the negotiation between culture and reality; it lets societies know what habits are unsustainable.

More recently, the company has been working with McKinsey & Co., the European Commission, and several environmental groups to develop a methodology it calls the “economics of climate adaptation,” a way to encourage city planners to build in a way that will be insurable in the future. A study of the U.K. port of Hull looks at potential losses by 2030 under several different climate scenarios. Even under the most extreme, losses were expected to grow by $17 million due to climate change and by $23 million due to economic growth. How Hull builds in the next two decades matters more to it than the levels of carbon dioxide in the air. A similar study for Entergy (ETR), a New Orleans-based utility, concluded that adaptations on the Gulf Coast—such as tightening building codes, restoring wetlands and barrier islands, building levees around chemical plants, and requiring that new homes in high-risk areas be elevated—could almost completely offset the predicted cost of 100-year storms happening every 40 years.

I actually disagree somewhat with the statement “it’s wrong to say that a natural disaster destroyed something; the destruction was not nature’s fault but our own.” There’s some truth to this, but also some untruth. The question of “fault” or “blame” is a slippery one here, and there’s probably no way to completely settle it.

Is it the “fault” of people in Vermont that they weren’t fully prepared for a hurricane? After all, it’s rare—or at least it used to be rare—for hurricanes to make it that far north. The governor of Vermont, Peter Shumlin, recently said:

I find it extraordinary that so many political leaders won’t actually talk about the relationship between climate change, fossil fuels, our continuing irrational exuberance about burning fossil fuels, in light of these storm patterns that we’ve been experiencing.

We had storms this spring that flooded our downtowns and put us through many of the same exercises that we’re going through right now. We didn’t used to get weather patterns like this in Vermont. We didn’t get tropical storms. We didn’t get flash flooding.

We in the colder states are going to see the results of climate change first. Myself, Premier Charest up in Quebec, Governor Cuomo over in New York, we understand that the flooding and the extraordinary weather patterns that we’re seeing are a result of our burnings of fossil fuel. We’ve got to get off fossil fuels as quickly as we know how, to make this planet livable for our children and our grandchildren.

On the other hand, you could say that it is the fault of Vermonters, or at least humanity as a whole, for causing global warming in the first place.

But ultimately, pinning blame on someone or something is less important than figuring out how to solve the problems we face.

## Melting Permafrost (Part 1)

1 September, 2011

Some people worry about rising sea levels due to global warming. But that will happen slowly. I worry about tipping points.

The word “tipping point” should remind you of pushing on a glass of water. If you push it a little, and then stop, it’ll right itself: no harm done. But if you push it past a certain point, it starts tipping over. Then it’s hard to stop.

So, we need to study possible tipping points in the Earth’s climate system. Here’s a list of them:

Tipping point, Azimuth Library.

Today I want to talk about one: melting permafrost. When melting permafrost in the Arctic starts releasing lots of carbon dioxide and methane—a vastly more potent greenhouse gas—the Earth will get even hotter. That, in turn, will melt even more permafrost. In theory, this feedback loop could tip the Earth over to a much hotter state. But how much should we worry about this?
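Here’s the textbook way to see why such a loop is worrying; the little calculation below is generic, not taken from any of the papers I’ll discuss. Suppose fossil fuel burning alone would eventually warm the Earth by $\Delta T_0$, and suppose each degree of warming thaws enough permafrost to cause an extra $f$ degrees of warming. Then the total warming $\Delta T$ satisfies $\Delta T = \Delta T_0 + f \Delta T$, so $\Delta T = \Delta T_0 / (1 - f)$ as long as $f < 1$: the feedback amplifies the warming, more and more steeply as $f$ approaches 1. If $f$ reached 1, this linear estimate would break down entirely, and that is the runaway scenario. The hard part, of course, is estimating $f$.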

Climate activist Joe Romm takes it very seriously:

• Joe Romm, NSIDC bombshell: Thawing permafrost feedback will turn Arctic from carbon sink to source in the 2020s, releasing 100 billion tons of carbon by 2100, Climate Progress, 17 February 2011.

If you click on just one link of mine today, let it be this! He writes in a clear, snappy way. But let me take you through some of the details in my own more pedestrian fashion.

For starters, the Arctic is melting. Here’s a graph of Arctic sea ice volume created by the Pan-Arctic Ice Ocean Modeling and Assimilation System—click to enlarge:

The blue line is the linear best fit, but you can see it’s been melting faster lately. Is this a glitch or a new trend? Time will tell.

2011 is considerably worse than 2007, the previous record-holder. Here you can clearly see the estimated total volume in thousands of cubic kilometers, and how it changes with the seasons:

As the Arctic melts, many things are changing. The fabled Northwest Passage is becoming a practical waterway, so battles are starting to heat up over who controls it. The U.S. and other nations see it as an international waterway. But Canada says it owns the passage, and has the right to regulate and protect it:

• Jackie Northam, Arctic warming unlocking a fabled waterway, Morning Edition, National Public Radio, 15 August 2011.

But the 800-pound gorilla in the room is the melting permafrost. A lot of the Arctic is covered by permafrost, and it stores a lot of carbon, both as peat and as methane. After all, peat is rotten plant material, and rotting plants make methane. Recent work estimates that between 1400 and 1700 gigatonnes of carbon is stored in permafrost soils worldwide:

• C. Tarnocai, J. G. Canadell, E. A. G. Schuur, P. Kuhry, G. Mazhitova, and S. Zimov, Soil organic carbon pools in the northern circumpolar permafrost region, Global Biogeochemical Cycles 23 (2009), GB2023.

That’s more carbon than currently resides in all living things, and twice as much carbon as held by the atmosphere!

How much of this carbon will be released as the Arctic melts—and how fast? There’s a new paper about that:

• Kevin Schaefer, Tingjun Zhang, Lori Bruhwiler, Andrew Barrett, Amount and timing of permafrost carbon release in response to climate warming, Tellus B 63 (2011), 165–180.

It’s not free, but you can read Joe Romm’s summary. Here’s their estimate of how carbon will be released by melting permafrost:

So, they’re guessing that melting permafrost will release a gigatonne of carbon per year by the mid-2030s. Moreover, they say:

We predict that the PCF [permafrost carbon feedback] will change the Arctic from a carbon sink to a source after the mid-2020s and is strong enough to cancel 42-88% of the total global land sink. The thaw and decay of permafrost carbon is irreversible and accounting for the PCF will require larger reductions in fossil fuel emissions to reach a target atmospheric CO2 concentration.

One of the authors explains more details here:

“The amount of carbon released [by 2200] is equivalent to half the amount of carbon that has been released into the atmosphere since the dawn of the industrial age,” said NSIDC scientist Kevin Schaefer. “That is a lot of carbon.”

The carbon from permanently frozen ground known as permafrost “will make its impact, not only on the climate, but also on international strategies to reduce climate change,” Schaefer said. “If we want to hit a target carbon concentration, then we have to reduce fossil fuel emissions that much lower than previously calculated to account for this additional carbon from the permafrost,” Schaefer said. “Otherwise we will end up with a warmer Earth than we want.”

The carbon comes from plant material frozen in soil during the ice age of the Pleistocene: the icy soil trapped and preserved the biomass for thousands of years. Schaefer equates the mechanism to storing broccoli in the home freezer: “As long as it stays frozen, it stays stable for many years,” he said. “But you take it out of the freezer and it will thaw out and decay.”

Now, permafrost is thawing in a warming climate and “just like the broccoli” the biomass will thaw and decay, releasing carbon into the atmosphere like any other decomposing plant material, Schaefer said. To predict how much carbon will enter the atmosphere and when, Schaefer and coauthors modeled the thaw and decay of organic matter currently frozen in permafrost under potential future warming conditions as predicted by the Intergovernmental Panel on Climate Change.

They found that between 29-59 percent of the permafrost will disappear by 2200. That permafrost took tens of thousands of years to form, but will melt in less than 200, Schaefer said.

Sound alarmist? In fact, there are three unrealistically conservative assumptions built into this paper:

1) The authors assume the ‘moderate warming’ scenario called A1B, which has atmospheric concentrations of CO2 reaching 520 ppm by 2050 and stabilizing at 700 ppm in 2100. But so far we seem to be living out the A1FI scenario, which reaches 1000 ppm by century’s end.

2) Their estimate of future temperatures neglects the effect of greenhouse gases released by melting permafrost.

3) They assume all carbon emitted by permafrost will be in the form of CO2, not methane.

Point 2) means that the whole question of a feedback loop is not explored in this paper. I understand why. To do that, you can’t use someone else’s climate model: you need to build your own! But it’s something we need to study. Do you know anyone who is? Joe Romm says:

Countless studies make clear that global warming will release vast quantities of greenhouse gases into the atmosphere this decade. Yet, no climate model currently incorporates the amplifying feedback from methane released by a defrosting tundra.

If we try to understand this feedback, point 3) becomes important. After all, while methane goes away faster than CO2, its greenhouse effect is much stronger while it lasts. For the first 20 years, methane has about 72 times the global warming potential of carbon dioxide. Over the first 100 years, it’s about 25 times as powerful.

Let’s think about that a minute. In 2008, we burnt about 8 gigatonnes of carbon. If Schaefer et al are right, we can expect 1 extra gigatonne of carbon to be released from Arctic permafrost by around 2035. If that’s almost all in the form of carbon dioxide, it makes our situation slightly worse. But if a lot of it is methane, which is—let’s roughly say—72 times as bad, then our situation will be dramatically worse.
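To put rough numbers on that, here is the comparison in CO2-equivalent terms. The 8 gigatonnes of carbon per year, the 1 extra gigatonne from permafrost, and the 20-year global warming potential of 72 all come from the discussion above; the molar-mass conversions (a tonne of carbon becomes 44/12 tonnes of CO2 or 16/12 tonnes of CH4) are standard chemistry; and the fraction released as methane is left as a free parameter.

```python
GWP_CH4_20YR = 72              # 20-year global warming potential of methane, relative to CO2
FOSSIL_GTC_PER_YEAR = 8        # carbon burned as fossil fuels in 2008, in gigatonnes
PERMAFROST_GTC_PER_YEAR = 1    # extra carbon per year from permafrost by ~2035 (Schaefer et al.)

def co2_equivalent_gt(carbon_gt: float, methane_fraction: float) -> float:
    """20-year CO2-equivalent (in Gt) of `carbon_gt` gigatonnes of carbon,
    a fraction of which escapes as methane and the rest as carbon dioxide."""
    as_co2 = carbon_gt * (1 - methane_fraction) * 44 / 12
    as_ch4 = carbon_gt * methane_fraction * 16 / 12 * GWP_CH4_20YR
    return as_co2 + as_ch4

fossil = co2_equivalent_gt(FOSSIL_GTC_PER_YEAR, 0.0)   # ~29 Gt CO2-eq per year
for fraction in (0.0, 0.1, 0.5, 1.0):
    extra = co2_equivalent_gt(PERMAFROST_GTC_PER_YEAR, fraction)
    print(f"methane fraction {fraction:.0%}: +{extra:5.1f} Gt CO2-eq/yr "
          f"= {extra / fossil:.2f} x fossil fuel emissions")
```

So on a 20-year horizon, a gigatonne of permafrost carbon released as CO2 adds about a tenth of our current fossil fuel emissions, while the same gigatonne released mostly as methane would rival or exceed them.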

But I don’t know how much of the carbon released will be in the form of methane. I also don’t know how much of the methane will turn into other organic compounds before it gets into the atmosphere. I’d really like to know!

I hope you learn more about this stuff and help me out. Here are a few good references available for free online, to get started:

• Edward A. G. Schuur et al, Vulnerability of permafrost carbon to climate change: implications for the global carbon cycle, Bioscience 58 (2008), 701-714.

• David M. Lawrence, Andrew G. Slater, Robert A. Tomas, Marika M. Holland and Clara Deser, Accelerated Arctic land warming and permafrost degradation during rapid sea ice loss, Geophysical Research Letters 35 (2008), L11506.

• Amanda Leigh Mascarelli, A sleeping giant?, Nature Reports Climate Change, 5 March 2009.

The last one discusses the rise in atmospheric methane that was observed in 2007:

It also discusses the dangers of methane being released from ice-methane crystals called methane clathrates at the bottom of the ocean—something I’m deliberately not talking about here, because it deserves its own big discussion. However, there are also clathrates in the permafrost. Here’s a picture by W. F. Kuhs, showing what methane clathrate looks like at the atomic scale:

The green guy in the middle is methane, trapped in a cage of water molecules. Click for more details.

If you know more good references, please tell me about them here or add them to:

Permafrost, Azimuth Library.

## Bayesian Computations of Expected Utility

19 August, 2011

GiveWell is an organization that rates charities. They’ve met people who argue that

charities working on reducing the risk of sudden human extinction must be the best ones to support, since the value of saving the human race is so high that “any imaginable probability of success” would lead to a higher expected value for these charities than for others.

For example, say I have a dollar to spend on charity. One charity says that with this dollar they can save the life of one child in Somalia. Another says that with this dollar they can increase by .000001% our chance of saving 1 billion people from the effects of a massive asteroid colliding with the Earth.

Naively, in terms of the expected number of lives saved, the latter course of action seems 10 times better, since

.000001% × 1 billion = 10

But is it really better?

It’s a subtle question, with all sorts of complicating factors, like why should I trust these guys?

I’m not ready to present a thorough analysis of this sort of question today. But I would like to hear what you think about it. And I’d like you to read what the founder of GiveWell has to say about it:

• Holden Karnofsky, Why we can’t take expected value estimates literally (even when they’re unbiased), 18 August 2011.

He argues against what he calls an ‘explicit expected value’ or ‘EEV’ approach:

The mistake (we believe) is estimating the “expected value” of a donation (or other action) based solely on a fully explicit, quantified formula, many of whose inputs are guesses or very rough estimates. We believe that any estimate along these lines needs to be adjusted using a “Bayesian prior”; that this adjustment can rarely be made (reasonably) using an explicit, formal calculation; and that most attempts to do the latter, even when they seem to be making very conservative downward adjustments to the expected value of an opportunity, are not making nearly large enough downward adjustments to be consistent with the proper Bayesian approach.

His focus, in short, is on the fact that anyone saying “this money can increase by .000001% our chance of saving 1 billion people from an asteroid impact” is likely to be pulling those numbers from thin air. If they can’t really back up their numbers with a lot of hard evidence, then our lack of confidence in their estimate should be taken into account somehow.

His article spends a lot of time analyzing a less complex but still very interesting example:

It seems fairly clear that a restaurant with 200 Yelp reviews, averaging 4.75 stars, ought to outrank a restaurant with 3 Yelp reviews, averaging 5 stars. Yet this ranking can’t be justified in an explicit expected utility framework, in which options are ranked by their estimated average/expected value.

This is the only question I really want to talk about today. Actually I’ll focus on a similar question that Tim van Beek posed on this blog:

You have two kinds of fertilizer, A and B. You know that of 4 trees that got A, three thrived and one died. Of 36 trees that got B, 24 thrived and 12 died. Which fertilizer would you buy?

So, 3/4 of the trees getting fertilizer A thrived, while only 2/3 of those getting fertilizer B thrived. That makes fertilizer A seem better. However, the sample size is considerably larger for fertilizer B, so we may feel more confident about the results in this case. Which should we choose?

Nathan Urban tackled the problem in an interesting way. Let me sketch what he did before showing you his detailed work.

Suppose that before doing any experiments at all, we assume the probability $\pi$ that a fertilizer will make a tree thrive is a number uniformly distributed between 0 and 1. This assumption is our “Bayesian prior”.

Note: I’m not saying this prior is “correct”. You are allowed to choose a different prior! Choosing a different prior will change your answer to this puzzle. That can’t be helped. We need to make some assumption to answer this kind of puzzle; we are simply making it explicit here.

Starting from this prior, Nathan works out the probability that $\pi$ has some value given that when we apply the fertilizer to 4 trees, 3 thrive. That’s the black curve below. He also works out the probability that $\pi$ has some value given that when we apply the fertilizer to 36 trees, 24 thrive. That’s the red curve:

The red curve, corresponding to the experiment with 36 trees, is much more sharply peaked. That makes sense. It means that when we do more experiments, we become more confident that we know what’s going on.
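
If you’d like to reproduce curves like these, here’s a little Python sketch of mine (not Nathan’s actual code), assuming the same uniform Beta(1,1) prior:

```python
# Posterior for the success probability pi of each fertilizer, starting from a
# uniform Beta(1,1) prior: seeing s successes in n trials gives Beta(s+1, n-s+1).
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import beta

pi = np.linspace(0, 1, 500)
post_A = beta(3 + 1, 4 - 3 + 1)     # fertilizer A: 3 of 4 trees thrived
post_B = beta(24 + 1, 36 - 24 + 1)  # fertilizer B: 24 of 36 trees thrived

plt.plot(pi, post_A.pdf(pi), 'k-', label='A: 3 of 4')
plt.plot(pi, post_B.pdf(pi), 'r-', label='B: 24 of 36')
plt.xlabel('success probability')
plt.ylabel('posterior density')
plt.legend()
plt.show()
```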

We still have to choose a criterion to decide which fertilizer is best! This is where ‘decision theory’ comes in. For example, suppose we want to maximize the expected number of the trees that thrive. Then Nathan shows that fertilizer A is slightly better, despite the smaller sample size.

However, he also shows that if fertilizer A succeeded 4 out of 5 times, while fertilizer B succeeded 7 out of 9 times, the same evaluation procedure would declare fertilizer B better! Its percentage success rate is less: about 78% instead of 80%. However, the sample size is larger. And in this particular case, given our particular Bayesian prior and given what we are trying to maximize, that’s enough to make fertilizer B win.

So if someone is trying to get you to contribute to a charity, there are many interesting issues involved in deciding if their arguments are valid or just a bunch of… fertilizer.

Here is Nathan’s detailed calculation:

It’s fun to work out an official ‘correct’ answer mathematically, as John suggested. Of course, this ends up being a long way of confirming the obvious—and the answer is only as good as the assumptions—but I think it’s interesting anyway. In this case, I’ll work it out by maximizing expected utility in Bayesian decision theory, for one choice of utility function. This dodges the whole risk aversion point, but it opens discussion for how the assumptions might be modified to account for more real-world considerations. Hopefully others can spot whether I’ve made mistakes in the derivations.

In Bayesian decision theory, the first thing you do is write down the data-generating process and then compute a posterior distribution for what is unknown.

In this case, we may assume the data-generating process (likelihood function) is a binomial distribution $\mathrm{Bin}(s,n|\pi)$ for $s$ successes in $n$ trials, given a probability of success $\pi$. Fertilizer A corresponds to $s=3$, $n=4$ and fertilizer B corresponds to $s=24$, $n=36$.

The probability of success $\pi$ is unknown, and we want to infer its posterior conditional on the data, $p(\pi|s,n)$. To compute a posterior we need to assume a prior on $\pi$.

It turns out that the Beta distribution is conjugate to a binomial likelihood, meaning that if we assume a Beta distributed prior, then the posterior is also Beta distributed. If the prior is $\pi \sim \mathrm{Beta}(\alpha_0,\beta_0)$ then the posterior is

$\pi \sim \mathrm{Beta}(\alpha=\alpha_0+s,\beta=\beta_0+n-s).$

One choice for a prior is a uniform prior on $[0,1]$, which corresponds to a $\mathrm{Beta}(1,1)$ distribution. There are of course other prior choices which will lead to different conclusions. For this prior, the posterior is $\mathrm{Beta}(\pi; s+1, n-s+1)$. The posterior mode is

$(\alpha-1)/(\alpha+\beta-2) = s/n$

and the posterior mean is

$\alpha/(\alpha+\beta) = (s+1)/(n+2).$

So, what is the inference for fertilizers A and B? I made a graph of the posterior distributions. You can see that the inference for fertilizer B is sharper, as expected, since there is more data. But the inference for fertilizer A tends towards higher success rates, which can be quantified.

Fertilizer A has a posterior mode of 3/4 = 0.75 and B has a mode of 2/3 = 0.667, corresponding to the sample proportions. The mode isn’t the only measure of central tendency we could use. The means are 0.667 for A and 0.658 for B; the medians are 0.686 for A and 0.661 for B. No matter which of the three statistics we choose, fertilizer A looks better than fertilizer B.
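
If you want to check these numbers, here’s a quick Python sketch (my own, not part of Nathan’s write-up), using the posteriors $\mathrm{Beta}(s+1, n-s+1)$ from above:

```python
# Mode, mean and median of the posterior Beta(s+1, n-s+1) for each fertilizer.
from scipy.stats import beta

for name, s, n in [('A', 3, 4), ('B', 24, 36)]:
    a, b = s + 1, n - s + 1
    mode = (a - 1) / (a + b - 2)   # works out to s/n
    mean = a / (a + b)             # works out to (s+1)/(n+2)
    median = beta(a, b).median()
    print(name, round(mode, 3), round(mean, 3), round(median, 3))

# A 0.75 0.667 0.686
# B 0.667 0.658 0.661
```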

But we haven’t really done “decision theory” yet. We’ve just compared point estimators. Actually, we have done a little decision theory, implicitly. It turns out that picking the mean corresponds to the estimator which minimizes the expected squared error in $\pi$, where “squared error” can be thought of formally as a loss function in decision theory. Picking the median corresponds to minimizing the expected absolute loss, and picking the mode corresponds to minimizing the 0-1 loss (where you lose nothing if you guess $\pi$ exactly and lose 1 otherwise).

Still, these don’t really correspond to a decision theoretic view of the problem. We don’t care about the quantity $\pi$ at all, let alone some point estimator of it. We only care about $\pi$ indirectly, insofar as it helps us predict something about what the fertilizer will do to new trees. For that, we have to move from the posterior distribution $p(\pi|s,n)$ to the predictive distribution

$p(y|s,n) = \int p(y|\pi,n)\,p(\pi|s,n)\,d\pi ,$

where $y$ is a random variable indicating whether a new tree will thrive under treatment. Here I assume that the success of new trees follows the same binomial distribution as in the experimental group.

For a Beta posterior, the predictive distribution is beta-binomial, and the expected number of successes for a new tree is equal to the mean of the Beta distribution for $\pi$ – i.e. the posterior mean we computed before, $(s+1)/(n+2)$. If we introduce a utility function such that we are rewarded 1 util for a thriving tree and 0 utils for a non-thriving tree, then the expected utility is equal to the expected success rate. Therefore, under these assumptions, we should choose the fertilizer that maximizes the quantity $(s+1)/(n+2)$, which, as we’ve seen, favors fertilizer A (0.667) over fertilizer B (0.658).
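
And here is a quick Monte Carlo sanity check (again a sketch of mine, not Nathan’s) that the predictive probability of a new tree thriving really is the posterior mean $(s+1)/(n+2)$:

```python
# Draw pi from the posterior, then simulate one new tree per draw; the fraction
# that thrive should approach the posterior mean (s+1)/(n+2).
import numpy as np
rng = np.random.default_rng(0)

def predictive_success_rate(s, n, draws=200_000):
    pi = rng.beta(s + 1, n - s + 1, size=draws)
    new_tree_thrives = rng.random(draws) < pi
    return new_tree_thrives.mean()

print(predictive_success_rate(3, 4), (3 + 1) / (4 + 2))      # about 0.667 vs 0.667
print(predictive_success_rate(24, 36), (24 + 1) / (36 + 2))  # about 0.658 vs 0.658
```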

An interesting mathematical question is, does this ever work out to a “non-obvious” conclusion? That is, if fertilizer A has a sample success rate which is greater than fertilizer B’s sample success rate, but expected utility maximization prefers fertilizer B? Mathematically, we’re looking for a set $\{s, s', n, n'\}$ such that $s/n > s'/n'$ but $(s+1)/(n+2) < (s'+1)/(n'+2)$. (Also there are obvious constraints on $s$ and $s'$.) The answer is yes. For example, if fertilizer A has 4 of 5 successes while fertilizer B has 7 of 9 successes.
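
Here’s a tiny brute-force search (my own sketch, not Nathan’s) over small trial counts that confirms the 4-of-5 versus 7-of-9 example and finds the other cases where the two rankings disagree:

```python
# Find pairs where A's sample success rate s/n beats B's s'/n', yet the expected
# utility (s+1)/(n+2) prefers B, for up to 10 trials per fertilizer.
disagreements = []
for n in range(1, 11):
    for s in range(n + 1):
        for n2 in range(1, 11):
            for s2 in range(n2 + 1):
                if s / n > s2 / n2 and (s + 1) / (n + 2) < (s2 + 1) / (n2 + 2):
                    disagreements.append(((s, n), (s2, n2)))

print(((4, 5), (7, 9)) in disagreements)  # True: the example given above
print(len(disagreements), "disagreeing pairs with at most 10 trials each")
```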

By the way, on a quite different note: NASA currently rates the chances of the asteroid Apophis colliding with the Earth in 2036 at $4.3 \times 10^{-6}$. It estimates that the energy of such a collision would be comparable with a 510-megatonne thermonuclear bomb. This is ten times larger than the largest bomb actually exploded, the Tsar Bomba. The Tsar Bomba, in turn, was ten times larger than all the explosives used in World War II.

There’s an interesting Chinese plan to deflect Apophis if that should prove necessary. It is, however, quite a sketchy plan. I expect people will make more detailed plans shortly before Apophis comes close to the Earth in 2029.

## Calculating Catastrophe

14 June, 2011

This book could be interesting. If you read it, could you tell us what you think?

• Gordon Woo, Calculating Catastrophe, World Scientific Press, Singapore, 2011.

Apparently Dr. Gordon Woo was trained in mathematical physics at Cambridge, MIT and Harvard, and has made his career as a ‘calculator of catastrophes’. He has consulted for the IAEA on the seismic safety of nuclear plants and for BP on offshore oil well drilling—it’ll be fun to see what he has to say about his triumphant success in preventing disasters in both those areas. He now works at a company called Risk Management Solutions, where he works on modelling catastrophes for insurance purposes, and has designed a model for terrorism risk.

According to the blurb I got:

This book has been written to explain, to a general readership, the underlying philosophical ideas and scientific principles that govern catastrophic events, both natural and man-made. Knowledge of the broad range of catastrophes deepens understanding of individual modes of disaster. This book will be of interest to anyone aspiring to understand catastrophes better, but will be of particular value to those engaged in public and corporate policy, and the financial markets.

The table of contents lists: Natural Hazards; Societal Hazards; A Sense of Scale; A Measure of Uncertainty; A Matter of Time; Catastrophe Complexity; Terrorism; Forecasting; Disaster Warning; Disaster Scenarios; Catastrophe Cover; Catastrophe Risk Securitization; Risk Horizons.

Maybe you know other good books on the same subject?

For a taste of his thinking, you can try this:

• Gordon Woo, Terrorism risk.

Terrorism sounds like a particularly difficult risk to model, since it involves intelligent agents who try to do unexpected things. But maybe there are still some guiding principles. Woo writes:

It turns out that the number of operatives involved in planning and preparing attacks has a tipping point in respect of the ease with which the dots might be joined by counter-terrorism forces. The opportunity for surveillance experts to spot a community of terrorists, and gather sufficient evidence for courtroom convictions, increases nonlinearly with the number of operatives – above a critical number, the opportunity improves dramatically. This nonlinearity emerges from analytical studies of networks, using modern graph theory methods (Derenyi et al. [21]). Below the tipping point, the pattern of terrorist links may not necessarily betray much of a signature to the counter-terrorism services. However, above the tipping point, a far more obvious signature may become apparent in the guise of a large connected network cluster of dots, which reveals the presence of a form of community. The most ambitious terrorist plans, involving numerous operatives, are thus liable to be thwarted. As exemplified by the audacious attempted replay in 2006 of the Bojinka spectacular, too many terrorists spoil the plot (Woo, [22]).

Intelligence surveillance and eavesdropping of terrorist networks thus constrain the pipeline of planned attacks that logistically might otherwise seem almost boundless. Indeed, such is the capability of the Western forces of counterterrorism, that most planned attacks, as many as 80% to 90%, are interdicted. For example, in the three years before the 7/7/05 London attack, eight plots were interdicted. Yet any non-interdicted planned attack is construed as a significant intelligence failure. The public expectation of flawless security is termed the ‘90-10 paradox.’ Even if 90% of plots are foiled, it is by the 10% which succeed that the security services are ultimately remembered.

Of course the reference to “modern graph theoretical methods” will be less intimidating or impressive to many readers here than to the average, quite possibly innumerate reader of this document. But here’s the actual reference, in case you’re curious:

• I. Derenyi, G. Palla and T. Vicsek, Clique percolation in random networks, Phys. Rev. Lett. 94 (2005), 160202.

Just for fun, let me summarize the main result, so you can think about how relevant it might be to terrorist networks.

A graph is roughly a bunch of dots connected by edges. A clique in a graph is some subset of dots each of which is connected to every other. So, if dots are people and we draw an edge when two people are friends, a clique is a bunch of people who are all friends with each other—hence the name ‘clique’. But we might also use a clique to represent a bunch of people who are all engaged in the same activity, like a terrorist plot.

We’ve talked here before about Erdős–Rényi random graphs. These are graphs formed by taking a bunch of dots and randomly connecting each pair by an edge with some fixed probability $p$. In the paper above, the authors argue that for an Erdős–Rényi random graph with $N$ vertices, the chance that most of the cliques with $k$ elements all touch each other and form one big fat ‘giant component’ shoots up suddenly when

$p \ge [(k-1) N]^{-1/(k-1)}$
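
Just to see the threshold in action, here’s a small Python sketch (mine, not from the paper) that compares the largest cluster of overlapping $k$-cliques on either side of it, using networkx:

```python
# Clique percolation in an Erdos-Renyi graph G(N, p): below the threshold
# p_c = [(k-1) N]^(-1/(k-1)) the k-clique communities stay small; above it,
# a giant one appears.
import networkx as nx
from networkx.algorithms.community import k_clique_communities

N, k = 200, 3
p_c = ((k - 1) * N) ** (-1.0 / (k - 1))
print("threshold p_c =", round(p_c, 3))

for p in (0.5 * p_c, 2.0 * p_c):
    G = nx.erdos_renyi_graph(N, p, seed=1)
    communities = list(k_clique_communities(G, k))
    largest = max((len(c) for c in communities), default=0)
    print(f"p = {p:.3f}: largest {k}-clique community has {largest} nodes")
```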

This sort of effect is familiar in many different contexts: it’s called a ‘percolation threshold’. I can guess the implications for terrorist networks that Gordon Woo is alluding to. However, I doubt the details of the math are very important here, since social networks are not well modeled by Erdős–Rényi random graphs.

In the real world, if you and I have a mutual friend, that will increase the chance that we’ll be friends. Similarly, if we share a conspirator, that increases the chance that we’re in the same conspiracy. But in a world where friendship was described by an Erdős–Rényi random graph, that would not be the case!

So, while I agree that large terrorist networks are easier to catch than small ones, I don’t think the math of Erdős–Rényi random graphs gives any quantitative insight into how much easier it is.

## How Sea Level Rise Will Affect New York

9 June, 2011

Let’s try answering this question on Quora:

How will global warming, and particularly sea level rises, affect New York City?

I doubt sea level rise will be the first way we’ll get badly hurt by global warming. I think it’ll be crop losses caused by floods, droughts and heat waves, and property damage caused by storms. But the question focuses on sea level rise, so perhaps we should think about that… along with any other ways that New York City is particularly susceptible to the effects of global warming.

Suppose you know a lot about New York, but you need an estimate of sea level rise to get started. In the Azimuth Project page on sea level rise, you’ll see a lot of discussion of this subject. Naturally, it’s complicated. But say you just want some numbers. Okay: very roughly, by the end of the century we can expect a sea level rise of at least 0.6 meters, not counting any melting from Greenland and Antarctica, and at most 2 meters, including Greenland and Antarctica. That’s roughly between 2 and 6 feet.

On the other hand, there’s at least one report saying sea levels may rise in the Northeast US at twice the average global rate. What’s the latest word on that?

Now, here’s a website that claims to show what various amounts of sea level rise would do to different areas:

• Firetree.net, Flood maps, including New York City.

Details on how these maps were made are here. One problem is that they focus too much on really big sea level rises: the smallest rise shown is 1 meter, then 2 meters… and it goes up to 60 meters!

Anyway, here’s part of New York City now:

Here it is after a 1-meter (3-foot) sea level rise:

(Click to enlarge any of these.) And here’s 2 meters, or 6 feet:

It’s a bit hard to spot the effects in Manhattan. They’re much more noticeable in the low-lying areas between Jersey City and Secaucus. What are those: parks, industrial areas, or suburbs? I’ve heard New Yorkers crack jokes about the ‘swamps of Jersey’…

But of course, a lot of the city is underground. What will happen to subways and other infrastructure, like sewage systems? And what about water supplies? On coastlines, saltwater can infiltrate into surface waters and aquifers. Where does freshwater meet saltwater near New York City? How will the effect of floods and storms change?

And of course, there are other parts of New York City these little maps don’t show: for those, go here. But watch out: at first you’ll see the effect of a 7-meter sea level rise… you’ll need to change the settings to see the effects of a more realistic rise.

If you live in a place that will be flooded, let me know!

Luckily, we don’t have to figure everything out ourselves: the state of New York has a task force devoted to this. And as task forces do, they’ve written a report:

• New York Department of Environmental Conservation, Sea Level Rise Task Force, Final Report.

New York City also has an ambitious environmental plan:

• New York City, PlaNYC 2030.

Finally, let me quote part of this:

• Jim O’Grady, Sea level rise could turn New York into Venice, experts warn, WNYC News, 9 February 2011.

Because it looks ahead 200 years, this article paints a more dire picture than my remarks above:

David Bragdon, Director of the Mayor’s Office of Long-Term Planning & Sustainability, is charged with preparing for the dangers of climate change. He said the city is taking precautions like raising the pumps at a wastewater treatment plant in the Rockaways and building the Willets Point development in Queens on six feet of landfill. The goal is to manage the risk from 100-year storms—one of the most severe. The mayor’s report says by the end of this century, 100-year storms could start arriving every 15 to 35 years.

Klaus Jacob, a Columbia University research scientist who specializes in disaster risk management, said that estimate may be too conservative. “What is now the impact of a 100-year storm will be, by the end of this century, roughly a 10-year storm,” he warned.

Back on the waterfront, oceanographer Malcolm Bowman offered what he said is a suitably outsized solution to this existential threat: storm surge barriers.

They would rise from the waters at Throgs Neck, where Long Island Sound and the East River meet, and at the opening to the lower harbor between the Rockaways and Sandy Hook, New Jersey. Like the barriers on the Thames River that protect London, they would stay open most of the time to let ships pass but close to protect the city during hurricanes and severe storms.

The structures at their highest points would be 30 feet above the harbor surface. Preliminary engineering studies put the cost at around \$11 billion.

Jacob suggested a different but equally drastic approach. He said sea level rise may force New Yorkers to pull back from vulnerable neighborhoods. “We will have to densify the high-lying areas and use the low-lying areas as parks and buffer zones,” he said.

In this scenario, New York in 200 years looks like Venice. Concentrations of greenhouse gases in the atmosphere have melted ice sheets in Greenland and Antarctica and raised our local sea level by six to eight feet. Inundating storms at certain times of year swell the harbor until it spills into the streets. Dozens of skyscrapers in Lower Manhattan have been sealed at the base and entrances added to higher floors. The streets of the financial district have become canals.

“You may have to build bridges or get Venice gondolas or your little speed boats ferrying yourself up to those buildings,” Jacob said.

David Bragdon is not comfortable with such scenarios. He’d rather talk about the concrete steps he’s taking now, like updating the city’s flood evacuation plan to show more neighborhoods at risk. That would help the people living in them be better prepared to evacuate.

He said it’s too soon to contemplate the “extreme” step of moving “two, three, four hundred thousand people out of areas they’ve occupied for generations,” and disinvesting “literally billions of dollars of infrastructure in those areas.” On the other hand: “Another extreme would be to hide our heads in the sand and say, ‘Nothing’s going to happen.’”

Bragdon said he doesn’t think New Yorkers of the future will have to retreat very far from shore, if at all, but he’s not sure. And he would neither commit to storm surge barriers nor eliminate them as an option. He said what’s needed is more study—and that he’ll have further details in April, when the city updates PlaNYC.

Jacob warned that in preparing for disaster, no matter how far off, there’s a gulf between study and action. “There’s a good intent,” he said of New York’s climate change planning to date. “But, you know, mother nature doesn’t care about intent. Mother nature wants to see resiliency. And that is questionable, whether we have that.”