The Irreversible Momentum of Clean Energy

17 January, 2017

The president of the US recently came out with an article in Science. It’s about climate change and clean energy:

• Barack Obama, The irreversible momentum of clean energy, Science, 13 January 2017.

Since it’s open-access, I’m going to take the liberty of quoting the whole thing, minus the references, which provide support for a lot of his facts and figures.

The irreversible momentum of clean energy

The release of carbon dioxide (CO2) and other greenhouse gases (GHGs) due to human activity is increasing global average surface air temperatures, disrupting weather patterns, and acidifying the ocean. Left unchecked, the continued growth of GHG emissions could cause global average temperatures to increase by another 4°C or more by 2100 and by 1.5 to 2 times as much in many midcontinent and far northern locations. Although our understanding of the impacts of climate change is increasingly and disturbingly clear, there is still debate about the proper course for U.S. policy — a debate that is very much on display during the current presidential transition. But putting near-term politics aside, the mounting economic and scientific evidence leave me confident that trends toward a clean-energy economy that have emerged during my presidency will continue and that the economic opportunity for our country to harness that trend will only grow. This Policy Forum will focus on the four reasons I believe the trend toward clean energy is irreversible.

ECONOMIES GROW, EMISSIONS FALL

The United States is showing that GHG mitigation need not conflict with economic growth. Rather, it can boost efficiency, productivity, and innovation. Since 2008, the United States has experienced the first sustained period of rapid GHG emissions reductions and simultaneous economic growth on record. Specifically, CO2 emissions from the energy sector fell by 9.5% from 2008 to 2015, while the economy grew by more than 10%. In this same period, the amount of energy consumed per dollar of real gross domestic product (GDP) fell by almost 11%, the amount of CO2 emitted per unit of energy consumed declined by 8%, and CO2 emitted per dollar of GDP declined by 18%.

The importance of this trend cannot be overstated. This “decoupling” of energy sector emissions and economic growth should put to rest the argument that combatting climate change requires accepting lower growth or a lower standard of living. In fact, although this decoupling is most pronounced in the United States, evidence that economies can grow while emissions do not is emerging around the world. The International Energy Agency’s (IEA’s) preliminary estimate of energy related CO2 emissions in 2015 reveals that emissions stayed flat compared with the year before, whereas the global economy grew. The IEA noted that “There have been only four periods in the past 40 years in which CO2 emission levels were flat or fell compared with the previous year, with three of those — the early 1980s, 1992, and 2009 — being associated with global economic weakness. By contrast, the recent halt in emissions growth comes in a period of economic growth.”

At the same time, evidence is mounting that any economic strategy that ignores carbon pollution will impose tremendous costs to the global economy and will result in fewer jobs and less economic growth over the long term. Estimates of the economic damages from warming of 4°C over preindustrial levels range from 1% to 5% of global GDP each year by 2100. One of the most frequently cited economic models pins the estimate of annual damages from warming of 4°C at ~4% of global GDP, which could lead to lost U.S. federal revenue of roughly $340 billion to $690 billion annually.

Moreover, these estimates do not include the possibility of GHG increases triggering catastrophic events, such as the accelerated shrinkage of the Greenland and Antarctic ice sheets, drastic changes in ocean currents, or sizable releases of GHGs from previously frozen soils and sediments that rapidly accelerate warming. In addition, these estimates factor in economic damages but do not address the critical question of whether the underlying rate of economic growth (rather than just the level of GDP) is affected by climate change, so these studies could substantially understate the potential damage of climate change on the global macroeconomy.

As a result, it is becoming increasingly clear that, regardless of the inherent uncertainties in predicting future climate and weather patterns, the investments needed to reduce emissions — and to increase resilience and preparedness for the changes in climate that can no longer be avoided — will be modest in comparison with the benefits from avoided climate-change damages. This means, in the coming years, states, localities, and businesses will need to continue making these critical investments, in addition to taking common-sense steps to disclose climate risk to taxpayers, homeowners, shareholders, and customers. Global insurance and reinsurance businesses are already taking such steps as their analytical models reveal growing climate risk.

PRIVATE-SECTOR EMISSIONS REDUCTIONS

Beyond the macroeconomic case, businesses are coming to the conclusion that reducing emissions is not just good for the environment — it can also boost bottom lines, cut costs for consumers, and deliver returns for shareholders.

Perhaps the most compelling example is energy efficiency. Government has played a role in encouraging this kind of investment and innovation. My Administration has put in place (i) fuel economy standards that are net beneficial and are projected to cut more than 8 billion tons of carbon pollution over the lifetime of new vehicles sold between 2012 and 2029 and (ii) 44 appliance standards and new building codes that are projected to cut 2.4 billion tons of carbon pollution and save $550 billion for consumers by 2030.

But ultimately, these investments are being made by firms that decide to cut their energy waste in order to save money and invest in other areas of their businesses. For example, Alcoa has set a goal of reducing its GHG intensity 30% by 2020 from its 2005 baseline, and General Motors is working to reduce its energy intensity from facilities by 20% from its 2011 baseline over the same timeframe. Investments like these are contributing to what we are seeing take place across the economy: Total energy consumption in 2015 was 2.5% lower than it was in 2008, whereas the economy was 10% larger.

This kind of corporate decision-making can save money, but it also has the potential to create jobs that pay well. A U.S. Department of Energy report released this week found that ~2.2 million Americans are currently employed in the design, installation, and manufacture of energy-efficiency products and services. This compares with the roughly 1.1 million Americans who are employed in the production of fossil fuels and their use for electric power generation. Policies that continue to encourage businesses to save money by cutting energy waste could pay a major employment dividend and are based on stronger economic logic than continuing the nearly $5 billion per year in federal fossil-fuel subsidies, a market distortion that should be corrected on its own or in the context of corporate tax reform.

MARKET FORCES IN THE POWER SECTOR

The American electric-power sector — the largest source of GHG emissions in our economy — is being transformed, in large part, because of market dynamics. In 2008, natural gas made up ~21% of U.S. electricity generation. Today, it makes up ~33%, an increase due almost entirely to the shift from higher-emitting coal to lower-emitting natural gas, brought about primarily by the increased availability of low-cost gas due to new production techniques. Because the cost of new electricity generation using natural gas is projected to remain low relative to coal, it is unlikely that utilities will change course and choose to build coal-fired power plants, which would be more expensive than natural gas plants, regardless of any near-term changes in federal policy. Although methane emissions from natural gas production are a serious concern, firms have an economic incentive over the long term to put in place waste-reducing measures consistent with standards my Administration has put in place, and states will continue making important progress toward addressing this issue, irrespective of near-term federal policy.

Renewable electricity costs also fell dramatically between 2008 and 2015: the cost of electricity fell 41% for wind, 54% for rooftop solar photovoltaic (PV) installations, and 64% for utility-scale PV. According to Bloomberg New Energy Finance, 2015 was a record year for clean energy investment, with those energy sources attracting twice as much global capital as fossil fuels.

Public policy — ranging from Recovery Act investments to recent tax credit extensions — has played a crucial role, but technology advances and market forces will continue to drive renewable deployment. The levelized cost of electricity from new renewables like wind and solar in some parts of the United States is already lower than that for new coal generation, without counting subsidies for renewables.

That is why American businesses are making the move toward renewable energy sources. Google, for example, announced last month that, in 2017, it plans to power 100% of its operations using renewable energy — in large part through large-scale, long-term contracts to buy renewable energy directly. Walmart, the nation’s largest retailer, has set a goal of getting 100% of its energy from renewables in the coming years. And economy-wide, solar and wind firms now employ more than 360,000 Americans, compared with around 160,000 Americans who work in coal electric generation and support.

Beyond market forces, state-level policy will continue to drive clean-energy momentum. States representing 40% of the U.S. population are continuing to move ahead with clean-energy plans, and even outside of those states, clean energy is expanding. For example, wind power alone made up 12% of Texas’s electricity production in 2015 and, at certain points in 2015, that number was >40%, and wind provided 32% of Iowa’s total electricity generation in 2015, up from 8% in 2008 (a higher fraction than in any other state).

GLOBAL MOMENTUM

Outside the United States, countries and their businesses are moving forward, seeking to reap benefits for their countries by being at the front of the clean-energy race. This has not always been the case. A short time ago, many believed that only a small number of advanced economies should be responsible for reducing GHG emissions and contributing to the fight against climate change. But nations agreed in Paris that all countries should put forward increasingly ambitious climate policies and be subject to consistent transparency and accountability requirements. This was a fundamental shift in the diplomatic landscape, which has already yielded substantial dividends. The Paris Agreement entered into force in less than a year, and, at the follow-up meeting this fall in Marrakesh, countries agreed that, with more than 110 countries representing more than 75% of global emissions having already joined the Paris Agreement, climate action “momentum is irreversible”. Although substantive action over decades will be required to realize the vision of Paris, analysis of countries’ individual contributions suggests that meeting medium-term respective targets and increasing their ambition in the years ahead — coupled with scaled-up investment in clean-energy technologies — could increase the international community’s probability of limiting warming to 2°C by as much as 50%.

Were the United States to step away from Paris, it would lose its seat at the table to hold other countries to their commitments, demand transparency, and encourage ambition. This does not mean the next Administration needs to follow identical domestic policies to my Administration’s. There are multiple paths and mechanisms by which this country can achieve — efficiently and economically — the targets we embraced in the Paris Agreement. The Paris Agreement itself is based on a nationally determined structure whereby each country sets and updates its own commitments. Regardless of U.S. domestic policies, it would undermine our economic interests to walk away from the opportunity to hold countries representing two-thirds of global emissions — including China, India, Mexico, European Union members, and others — accountable. This should not be a partisan issue. It is good business and good economics to lead a technological revolution and define market trends. And it is smart planning to set long term emission-reduction targets and give American companies, entrepreneurs, and investors certainty so they can invest and manufacture the emission-reducing technologies that we can use domestically and export to the rest of the world. That is why hundreds of major companies — including energy-related companies from ExxonMobil and Shell, to DuPont and Rio Tinto, to Berkshire Hathaway Energy, Calpine, and Pacific Gas and Electric Company — have supported the Paris process, and leading investors have committed $1 billion in patient, private capital to support clean-energy breakthroughs that could make even greater climate ambition possible.

CONCLUSION

We have long known, on the basis of a massive scientific record, that the urgency of acting to mitigate climate change is real and cannot be ignored. In recent years, we have also seen that the economic case for action — and against inaction — is just as clear, the business case for clean energy is growing, and the trend toward a cleaner power sector can be sustained regardless of near-term federal policies.

Despite the policy uncertainty that we face, I remain convinced that no country is better suited to confront the climate challenge and reap the economic benefits of a low-carbon future than the United States and that continued participation in the Paris process will yield great benefit for the American people, as well as the international community. Prudent U.S. policy over the next several decades would prioritize, among other actions, decarbonizing the U.S. energy system, storing carbon and reducing emissions within U.S. lands, and reducing non-CO2 emissions.

Of course, one of the great advantages of our system of government is that each president is able to chart his or her own policy course. And President-elect Donald Trump will have the opportunity to do so. The latest science and economics provide a helpful guide for what the future may bring, in many cases independent of near-term policy choices, when it comes to combatting climate change and transitioning to a clean energy economy.


Solar Irradiance Measurements

14 January, 2017

guest post by Nadja Kutz

This blog post is based on a thread in the Azimuth Forum.

Current theories about the Sun’s lifetime indicate that the Sun will turn into a red giant in about 5 billion years. How and when this process will become destructive to the Earth is still debated. Apparently, according to more or less current theories, there has been a quasilinear increase in luminosity. On page 3 of

• K.-P. Schröder and Robert Connon Smith, Distant future of the Sun and Earth revisited, Monthly Notices of the Royal Astronomical Society, 2008.

we read:

The present Sun is increasing its average luminosity at a rate of 1% in every 110 million years, or 10% over the next billion years.

Unfortunately I feel a bit doubtful about this, in particular after I looked at some irradiation measurements. But let’s recap a bit.

In the Azimuth Forum I asked for information about solar irradiance measurements. Why I was originally interested in how bright the Sun is shining is a longer story, which includes discussions about the global warming potential of methane. For this post I prefer to omit this lengthy historical survey about my original motivations (maybe I’ll come back to it later). Meanwhile there is also a newer reason why I am interested in solar irradiance measurements, which I want to talk about here.

Strictly speaking, I was interested not only in knowing how bright the Sun is shining overall, but in how bright each of its ‘components’ is shining. That is, I wanted to see spectrally resolved solar irradiance measurements—and in particular, measurements in the range between the wavelengths of roughly 650 and 950 nanometers.

This led me to the SORCE mission, a NASA-sponsored satellite mission whose website is hosted at the University of Colorado. The website very nicely provides an interactive interface, including a fairly clear and intuitive app called LISIRD, with which the spectral measurements of the Sun can be studied.

As a side remark I should mention that this mission belongs to NASA’s Earth Science program, which is currently threatened with being scrapped.

Using this app, I found a very strange rise in radiation in the 650–950 nanometer range between 2003 and 2016, which happened mainly in the last 2-3 years. You can see this rise here:

Spectral line at 774.5 nm, from day 132 to day 5073; day 132 begins on 24 January 2003, and day 5073 is at the end of 2016.

Now, fluctuations within certain ranges of the Sun’s spectrum are not news. Here, however, it looked as if a rather stable range had suddenly started to change rather “dramatically”.

I put the word “dramatically” in quotes for a couple of reasons.

Spectral measurements are complicated and prone to measurement errors. Subtle issues such as dirty lenses are already enough to suggest that this is no easy feat, so this strange rise might easily be due to a measurement failure. Moreover, as I said, it looked as if this had been a fairly stable range over the course of ten years. But maybe this new rise in irradiation is part of the 11-year solar cycle, i.e., a common phenomenon. In addition, although the rise looks big, it may overall still be rather subtle.

So: how subtle or non-subtle is it then?

In order to assess that, I made a quick estimate (see the Forum discussion) and found that if all the additional radiation were to reach the ground (which of course it doesn’t, due to absorption), then with that subtle change you could easily power a lawn mower on 1000 square meters: my estimate was 1200 watts for that patch of lawn. Whoa!

That was disconcerting enough that I downloaded the data, linearly interpolated it, and calculated the power of that change. I wrote a program in JavaScript to do that. The computation gave an answer of 1000 watts, i.e., my estimate was fairly close. Whoa again!
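Here is a minimal sketch, in JavaScript, of the kind of computation involved (not the actual program, and with made-up placeholder numbers rather than real LISIRD data). It integrates the difference between two spectra over wavelength with the trapezoidal rule, which is what linear interpolation amounts to, and scales the result to a 1000-square-meter patch:

// Minimal sketch (not the actual program): estimate the extra power that a
// change in spectral irradiance would deliver to a patch of ground,
// ignoring atmospheric absorption. Wavelengths in nm, spectra in W/m^2/nm.
function integrate(wavelengths, spectrum) {
  // Trapezoidal rule: integrate the linearly interpolated spectrum.
  let total = 0;
  for (let i = 1; i < wavelengths.length; i++) {
    total += 0.5 * (spectrum[i] + spectrum[i - 1])
               * (wavelengths[i] - wavelengths[i - 1]);
  }
  return total; // W/m^2
}

// Made-up placeholder values; the real inputs come from the LISIRD data.
const wavelengths  = [650, 700, 750, 800, 850, 900, 950];
const spectrum2003 = [1.580, 1.430, 1.280, 1.110, 0.970, 0.890, 0.830];
const spectrum2016 = [1.583, 1.433, 1.283, 1.113, 0.973, 0.893, 0.833];

const diff = integrate(wavelengths, spectrum2016)
           - integrate(wavelengths, spectrum2003); // W/m^2

console.log(`${(diff * 1000).toFixed(0)} W over 1000 square meters`); // 900 W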

How does this translate into overall changes in solar irradiance? Some increase had already been noticed. In 2003, NASA wrote on its webpage:

Although the inferred increase of solar irradiance in 24 years, about 0.1 percent, is not enough to cause notable climate change, the trend would be important if maintained for a century or more.

That was 13 years ago.

I now used my program to calculate the irradiance for one day in 2016 between the wavelengths of 180.5 nm and 1797.62 nm, quite a big part of the solar spectrum, and got the value 627 W/m2. I computed the difference between this and one day in 2003, approximately one solar cycle earlier, and got 0.61 W/m2, which is 0.1% in 13 years rather than 24 years. Of course this is not an average value, it is not really adjusted to the solar cycle, and fluctuations play a big role in some parts of the spectrum, but this might indicate that the overall rate of rise in solar radiation may have doubled. A similar caveat applies to the question of the Sun’s luminosity: to assess luminosity one would need to take into account the satellite’s orbit on the day of measurement, since the distance to the Sun varies. But still, at first glance this all appears disconcerting.
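For the record, the 0.1% figure is just this arithmetic, using the band-integrated values reported above:

// Back-of-the-envelope check of the figures quoted above.
const band2016 = 627.0; // W/m^2, integrated over 180.5–1797.62 nm, one day in 2016
const increase = 0.61;  // W/m^2 more than on a day roughly one solar cycle earlier
console.log(`${(100 * increase / band2016).toFixed(2)}% in 13 years`); // prints: 0.10% in 13 years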

Given that this spectral range overlaps, for example, with the absorption bands of water (clouds!), this should at least be discussed.

See how the spectrum splits into a purple and a dark red line in the lower circle?

Difference in spectrum between day 132 and day 5073.

The upper circle displays another rise, which is discussed in the forum.

In conclusion, all this looks as if it needs to be monitored a bit more closely. It is important to see whether these rises in irradiance also show up in other measurements, so I asked in the Azimuth Forum, but so far I have gotten no answer.

The Russian Wikipedia page on solar irradiance unfortunately contains no links to Russian satellite missions (unless I have overlooked something), and there is no Chinese or Indian Wikipedia page about solar irradiance at all. I also couldn’t find any publicly accessible spectral irradiance measurements on the ESA website (although they have some satellites out there). In December I wrote an email to Wolfgang Finsterle, the head of the solar radiometry section at the World Radiation Center (WRC), but I’ve had no answer yet.

In short: if you know about publicly available solar spectral irradiance measurements other than the LISIRD ones, then please let me know.


Information Processing in Chemical Networks

4 January, 2017

There’s a workshop this summer:

• Dynamics, Thermodynamics and Information Processing in Chemical Networks, 13-16 June 2017, Complex Systems and Statistical Mechanics Group, University of Luxembourg. Organized by Massimiliano Esposito and Matteo Polettini.

They write, “The idea of the workshop is to bring in contact a small number of high-profile research groups working at the frontier between physics and biochemistry, with particular emphasis on the role of Chemical Networks.”

Invited speakers include Vassily Hatzimanikatis, John Baez, Christoph Flamm, Hong Qian, Joshua D. Rabinowitz, Luca Cardelli, Erik Winfree, David Soloveichik, Stefan Schuster, David Fell and Arren Bar-Even. There will also be a session of shorter seminars by researchers from local institutions such as the Luxembourg Centre for Systems Biomedicine. I believe attendance is by invitation only, so I’ll endeavor to make some of the ideas presented available here at this blog.

Some of the people involved

I’m looking forward to this, in part because there will be a mix of speakers I’ve met, speakers I know but haven’t met, and speakers I don’t know yet. I feel like reminiscing a bit, and I hope you’ll forgive me these reminiscences; if you follow the links, you’ll get an introduction to the interface between computation and chemical reaction networks.

In part 25 of the network theory series here, I imagined an arbitrary chemical reaction network and said:

We could try to use these reactions to build a ‘chemical computer’. But how powerful can such a computer be? I don’t know the answer.

Luca Cardelli answered my question in part 26. This was just my first introduction to the wonderful world of chemical computing. Erik Winfree has a DNA and Natural Algorithms Group at Caltech, practically next door to Riverside, and the people there do a lot of great work on this subject. David Soloveichik, now at U. T. Austin, is an alumnus of this group.

In 2014 I met all three of these folks, and many other cool people working on these themes, at a workshop I tried to summarize here:

Programming with chemical reaction networks, Azimuth, 23 March 2014.

The computational power of chemical reaction networks, Azimuth, 10 June 2014.

Chemical reaction network talks, Azimuth, 26 June 2014.

I met Matteo Polettini about a year later, at a really big workshop on chemical reaction networks run by Elisenda Feliu and Carsten Wiuf:

Trends in reaction network theory (part 1), Azimuth, 27 January 2015.

Trends in reaction network theory (part 2), Azimuth, 1 July 2015.

Polettini has his own blog, very much worth visiting. For example, you can see his view of the same workshop here:

• Matteo Polettini, Mathematical trends in reaction network theory: part 1 and part 2, Out of Equilibrium, 1 July 2015.

Finally, I met Massimiliano Esposito and Christoph Flamm recently at the Santa Fe Institute, at a workshop summarized here:

Information processing and biology, Azimuth, 7 November 2016.

So, I’ve gradually become educated in this area, and I hope that by June I’ll be ready to say something interesting about the semantics of chemical reaction networks. Blake Pollard and I are writing a paper about this now.


Give the Earth a Present: Help Us Save Climate Data

28 December, 2016


We’ve been busy backing up climate data before Trump becomes President. Now you can help too, with some money to pay for servers and storage space. Please give what you can at our Kickstarter campaign here:

Azimuth Climate Data Backup Project.

If we get $5000 by the end of January, we can save this data until we convince bigger organizations to take over. If we don’t get that much, we get nothing. That’s how Kickstarter works. Also, if you donate now, you won’t be billed until January 31st.

So, please help! It’s urgent.

I will make public how we spend this money. And if we get more than $5000, I’ll make sure it’s put to good use. There’s a lot of work we could do to make sure the data is authenticated, made easily accessible, and so on.

The idea

The safety of US government climate data is at risk. Trump plans to have climate change deniers running every agency concerned with climate change. So, scientists are rushing to back up the many climate databases held by US government agencies before he takes office.

We hope he won’t be rash enough to delete these precious records. But: better safe than sorry!

The Azimuth Climate Data Backup Project is part of this effort. So far our volunteers have backed up nearly 1 terabyte of climate data from NASA and other agencies. We’ll do a lot more! We just need some funds to pay for storage space and a server until larger institutions take over this task.

The team

• Jan Galkowski is a statistician with a strong interest in climate science. He works at Akamai Technologies, a company responsible for serving at least 15% of all web traffic. He began downloading climate data on the 11th of December.

• Shortly thereafter John Baez, a mathematician and science blogger at U. C. Riverside, joined in to publicize the project. He’d already founded an organization called the Azimuth Project, which helps scientists and engineers cooperate on environmental issues.

• When Jan started running out of storage space, Scott Maxwell jumped in. He used to work for NASA—driving a Mars rover among other things—and now he works for Google. He set up a 10-terabyte account on Google Drive and started backing up data himself.

• A couple of days later Sakari Maaranen joined the team. He’s a systems architect at Ubisecure, a Finnish firm, with access to a high-bandwidth connection. He set up a server, he’s downloading lots of data, he showed us how to authenticate it with SHA-256 hashes, and he’s managing many other technical aspects of this project.

There are other people involved too. You can watch the nitty-gritty details of our progress here:

Azimuth Backup Project – Issue Tracker.

and you can learn more here:

Azimuth Climate Data Backup Project.


Saving Climate Data (Part 3)

23 December, 2016

You can back up climate data, but how can anyone be sure your backups are accurate? Let’s suppose the databases you’ve backed up have been deleted, so that there’s no way to directly compare your backup with the original. And to make things really tough, let’s suppose that faked databases are being promoted as competitors with the real ones! What can you do?

One idea is ‘safety in numbers’. If a bunch of backups all match, and they were made independently, it’s less likely that they all suffer from the same errors.

Another is ‘safety in reputation’. If a bunch of backups of climate data are held by academic institutes of climate science, and another bunch is held by climate change denying organizations (conveniently listed here), you probably know which one you trust more. (And this is true even if you’re a climate change denier, though your answer may be different from mine.)

But a third idea is to use a cryptographic hash function. In very simplified terms, this is a method of taking a database and computing a fairly short string from it, called a ‘digest’.


A good hash function makes it hard to change the database and get a new one with the same digest. So, if the person owning a database computes and publishes the digest, anyone can check that your backup is correct by computing its digest and comparing it to the original.

It’s not foolproof, but it works well enough to be helpful.

Of course, it only works if we have some trustworthy record of the original digest. But the digest is much smaller than the original database: for example, in the popular method called SHA-256, the digest is 256 bits long. So it’s much easier to make copies of the digest than to back up the original database. These copies should be stored in trustworthy ways—for example, the Internet Archive.
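For example, here is a minimal sketch, in JavaScript (Node.js), of how one can compute such a digest for a backup file; the filename is just a hypothetical example:

// Minimal sketch: compute the SHA-256 digest of a backup file, for
// comparison with the digest published by the data's custodians.
const crypto = require('crypto');
const fs = require('fs');

const hash = crypto.createHash('sha256');
fs.createReadStream('gridMET_backup.tar') // hypothetical filename
  .on('data', chunk => hash.update(chunk))
  .on('end', () => {
    // 256 bits, printed as 64 hexadecimal characters
    console.log(hash.digest('hex'));
  });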

When Sakari Maaranen made a backup of the University of Idaho Gridded Surface Meteorological Data, he asked the custodians of that data to publish a digest, or ‘hash file’. One of them responded:

Sakari and others,

I have made the checksums for the UofI METDATA/gridMET files (1979-2015) as both md5sums and sha256sums.

You can find these hash files here:

https://www.northwestknowledge.net/metdata/data/hash.md5

https://www.northwestknowledge.net/metdata/data/hash.sha256

After you download the files, you can check the sums with:

md5sum -c hash.md5

sha256sum -c hash.sha256

Please let me know if something is not ideal and we’ll fix it!

Thanks for suggesting we do this!

Sakari replied:

Thank you so much! This means everything to public mirroring efforts. If you’d like to help further promoting this Best Practice, consider getting it recognized as a standard when you do online publishing of key public information.

1. Publishing those hashes is already a major improvement on its own.

2. Publishing them on a secure website offers people further guarantees that there has not been any man-in-the-middle.

3. Digitally signing the checksum files offers the best easily achievable guarantees of data integrity by the person(s) who sign the checksum files.

Please consider having these three steps included in your science organisation’s online publishing training and standard Best Practices.

Feel free to forward this message to whom it may concern. Feel free to rephrase as necessary.

As a separate item, public mirroring instructions for how to best download your data and/or public websites would further guarantee permanence of all your uniquely valuable science data and public contributions.

Right now we should get this message viral through the government funded science publishing people. Please approach the key people directly – avoiding the delay of using official channels. We need to have all the uniquely valuable public data mirrored before possible changes in funding.

Again, thank you for your quick response!

There are probably lots of things to be careful about. Here’s one. Maybe you can think of more, and ways to deal with them.

What if the data keeps changing with time? This is especially true of climate records, where new temperatures and so on are added to a database every day, or month, or year. Then I think we need to ‘time-stamp’ everything. The owners of the original database need to keep a list of digests, with the time each one was made. And when you make a copy, you need to record the time it was made.
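To make that concrete, here is a minimal sketch, again in JavaScript, of what such a time-stamped list of digests could look like; the record format is only an illustration, not any standard:

// Minimal sketch: keep a list of digests, each with the time it was made.
const crypto = require('crypto');

const digestLog = [];

function recordDigest(databaseSnapshot) {
  const digest = crypto.createHash('sha256')
                       .update(databaseSnapshot)
                       .digest('hex');
  digestLog.push({ time: new Date().toISOString(), digest });
}

// Each update to the database gets its own time-stamped entry:
recordDigest('temperature records as of 2016-12-01'); // placeholder data
recordDigest('temperature records as of 2017-01-01'); // placeholder data
console.log(digestLog);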


Azimuth Backup Project (Part 2)

20 December, 2016

I want to list some databases that are particularly worth backing up. But to do this, we need to know what’s already been backed up. That’s what this post is about.

Azimuth backups

Here is information as of now (21:45 GMT 20 December 2016). I won’t update this information. For up-to-date information see

Azimuth Backup Project: Issue Tracker.

For up-to-date information on the progress of each individual database listed below, click on my summary of what’s happening now.

Here are the databases that we’ve backed up:

• NASA GISTEMP website at http://data.giss.nasa.gov/gistemp/ — downloaded by Jan and uploaded to Sakari’s datarefuge server.

• NOAA Carbon Dioxide Information Analysis Center (CDIAC) data at ftp.ncdc.noaa.gov/pub/data/paleo/cdiac.ornl.gov-pub — downloaded by Jan and uploaded to Sakari’s datarefuge server.

• NOAA Carbon Tracker website at http://www.esrl.noaa.gov/psd/data/gridded/data.carbontracker.html — downloaded by Jan, uploaded to Sakari’s datarefuge server.

These are still in progress, but I think we have our hands on the data:

• NOAA Precipitation Frequency Data at http://hdsc.nws.noaa.gov/hdsc/pfds/ and ftp://hdsc.nws.noaa.gov/pub — downloaded by Borislav, not yet uploaded to Sakari’s datarefuge server.

• NOAA Carbon Dioxide Information Analysis Center (CDIAC) website at http://cdiac.ornl.gov — downloaded by Jan, uploaded to Sakari’s datarefuge server, but there’s evidence that the process was incomplete.

• NOAA website at https://www.ncdc.noaa.gov — downloaded by Jan, who is now attempting to upload it to Sakari’s datarefuge server.

• NOAA National Centers for Environmental Information (NCEI) website at https://www.ncdc.noaa.gov — downloaded by Jan, who is now attempting to upload it to Sakari’s datarefuge server, but there are problems.

• Ocean and Atmospheric Research data at ftp.oar.noaa.gov — downloaded by Jan, now attempting to upload it to Sakari’s datarefuge server.

• NOAA NCEP/NCAR Reanalysis ftp site at ftp.cdc.noaa.gov/Datasets/ncep.reanalysis/ — downloaded by Jan, now attempting to upload it to Sakari’s datarefuge server.

I think we’re getting these now, more or less:

• NOAA National Centers for Environmental Information (NCEI) ftp site at ftp://eclipse.ncdc.noaa.gov/pub/ — in the process of being downloaded by Jan, “Very large. May be challenging to manage with my facilities”.

• NASA Planetary Data System (PDS) data at https://pds.nasa.gov — in the process of being downloaded by Sakari.

• NOAA tides and currents products website at https://tidesandcurrents.noaa.gov/products.html, which includes the sea level trends data at https://tidesandcurrents.noaa.gov/sltrends/sltrends.html — Jan is downloading this.

• NOAA National Centers for Environmental Information (NCEI) satellite datasets website at https://www.ncdc.noaa.gov/data-access/satellite-data/satellite-data-access-datasets — Jan is downloading this.

• NASA JASON3 sea level data at http://sealevel.jpl.nasa.gov/missions/jason3/ — Jan is downloading this.

• U.S. Forest Service Climate Change Atlas website at http://www.fs.fed.us/nrs/atlas/ — Jan is downloading this.

• NOAA Global Monitoring Division website at http://www.esrl.noaa.gov/gmd/dv/ftpdata.html — Jan is downloading this.

• NOAA Global Monitoring Division ftp data at aftp.cmdl.noaa.gov/ — Jan is downloading this.

• NOAA National Data Buoy Center website at http://www.ndbc.noaa.gov/ — Jan is downloading this.

• NASA-ESDIS Oak Ridge National Laboratory Distributed Active Archive (DAAC) on Biogeochemical Dynamics at https://daac.ornl.gov/get_data.shtml — Jan is downloading this.

• NASA-ESDIS Oak Ridge National Laboratory Distributed Active Archive (DAAC) on Biogeochemical Dynamics website at https://daac.ornl.gov/ — Jan is downloading this.

Other backups

Other backups are listed at

The Climate Mirror Project, https://climate.daknob.net/.

This nicely provides the sizes of various backups, and other useful information. Some are ‘signed and verified’ with cryptographic keys, but I’m not sure exactly what that means, and the details matter.

About 90 databases are listed here, along with some size information and some information about whether people have already backed them up or are in the process of doing so:

Gov. Climate Datasets (Archive). (Click on the tiny word “Datasets” at the bottom of the page!)




Azimuth Backup Project (Part 1)

16 December, 2016

This blog page is to help organize the Azimuth Environmental Data Backup Project, or Azimuth Backup Project for short. This is part of the larger but decentralized, frantic and somewhat disorganized project discussed elsewhere:

Saving Climate Data (Part 2), Azimuth, 15 December 2016.

Here I’ll just say what we’re doing at Azimuth.

Jan Galkowski is a statistician and engineer at Akamai Technologies, a company in Cambridge, Massachusetts, whose content delivery network is one of the world’s largest distributed computing platforms, responsible for serving at least 15% of all web traffic. He has begun copying some of the publicly accessible US government climate databases. On 11 December he wrote:

John, so I have just started trying to mirror all of CDIAC [the Carbon Dioxide Information Analysis Center]. We’ll see. I’ll put it in a tarball, and then throw it up on Google. It should keep everything intact. Using WinHTTrack. I have coordinated with Eric Holthaus via Twitter, creating, per your suggestion, a new personal account which I am using exclusively to follow the principals.

Once CDIAC is done, and checked over, I’ll move on to other sites.

There are things beyond our control, such as paper records, or records which are online but are not within visibility of the public.

Oh, and I’ve formally requested time off from work for latter half of December so I can work this on vacation. (I have a number of other projects I want to work in parallel, anyway.)

By 14 December he wanted some more storage space. He asked David Tanzer and me:

Do either of you have a large Google account, or the “unlimited storage” option at Amazon?

I’m using WebDrive, a commercial product. What I’m (now) doing is defining an FTP map at a .gov server, and then a map to my Amazon Cloud Drive. I’m using Windows 7, so these appear as standard drives (or mounts, in *nix terms). I navigate to an appropriate place on the Amazon Drive, and then I proceed to copy from .gov to Amazon.

There is no compression, and, in order to be sure I don’t abuse the .gov site, I’m deliberately passing this over a wireless network in my home, which limits the transfer rate. If necessary, and if the .gov site permits, I could hardwire the workstation to our FIOS router and get appreciably faster transfer. (I often do that for large work files.)

The nice thing is I get to work from home 3 days a week, so I can keep an eye on this. And I’m taking days off just to do this.

I’m thinking about how I might get a second workstation in the act.

The Web sites themselves I’m downloading, as mentioned, using HTTrack. I intended to tarball-up the site structure and then upload to Amazon. I’m still working on CDIAC at ORNL. For future sites, I’m going to try to get HTTrack to mirror directly to Amazon using one of the mounts.

I asked around for more storage space, and my request was kindly answered by Scott Maxwell. Scott lives in Pasadena, California, and he used to work for NASA: he even had a job driving a Mars rover! He is now a site reliability engineer at Google, and he works on Google Drive. Scott is setting up a 10-terabyte account on Google Drive, which Jan and others will be able to use.

Meanwhile, Jan noticed some interesting technical problems: for some reason WebDrive is barely using the capacity of his network connection, so things are moving much more slowly than they could in theory.

Most recently, Sakari Maaranen offered his assistance. Sakari is a systems architect at Ubisecure, a firm in Finland that specializes in identity management, advanced user authentication, authorization, single sign-on, and federation. He wrote:

I have several terabytes worth in Helsinki (can get more) and a gigabit connection. I registered my offer but they [the DataRefuge people] didn’t reply though. I’m glad if that means you have help already and don’t need a copy in Helsinki.

I replied saying that the absence of a reply probably means that they’re overwhelmed by offers of help and are struggling to figure out exactly what to do. Scott said:

Hey, Sakari! Thank you for the generous offer!

I’m setting these guys up with Google Drive storage, as at least a short-term solution.

IMHO, our first order of business is just to get a copy of the data into a location we control—one that can’t easily be taken away from us. That’s the rationale for Google Drive: it fits into Jan’s existing workflow, so it’s the lowest-friction path to getting a copy of the data that’s under our control.

How about if I propose this: we let Jan go ahead with the plan of backing up the data in Drive. Then I’ll evaluate moving it from there to whatever other location we come up with. (Or copying instead of moving! More copies is better. :-) How does that sound to you?

I admit I haven’t gotten as far as thinking about Web serving at all—and it’s not my area of expertise anyway. Maybe you’d be kind enough to elaborate on your thoughts there.

Sakari responded with some information about servers. In late January, U. C. Riverside may help me with this—until then they are busy trying to get more storage space, for wholly unrelated reasons. But right now it seems the main job is to identify important data and get it under our control.

There are a lot of ways you could help.

Computer skills. Personally I’m not much use with anything technical about computers, but the rest of the Azimuth Data Backup gang probably has technical questions that some of you out there could answer… so, I encourage discussion of those questions. (Clearly some discussions are best done privately, and at some point we may encounter unfriendly forces, but this is a good place for roaming experts to answer questions.)

Security. Having a backup of climate data is not very useful if there are also fake databases floating around and you can’t prove yours is authentic. How can we create a kind of digital certificate that our database matches what was on a specific website at a specific time? We should do this if someone here has the expertise; one possible ingredient is sketched after this list.

Money. If we wind up wanting to set up a permanent database with a nice front end, accessible from the web, we may want money. We could do a Kickstarter campaign. People may be more interested in giving money now than later, unless the political situation immediately gets worse after January 20th.

Strategy. We should talk a bit about what to do next, though too much talk tends to prevent action. Eventually, if all goes well, our homegrown effort will be overshadowed by others, at least in sheer quantity. About 3 hours ago Eric Holthaus tweeted “we just got a donation of several petabytes”. If it becomes clear that others are putting together huge, secure databases with nice front ends, we can either quit or—better—cooperate with them, and specialize on something we’re good at and enjoy.
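Here is a minimal sketch of the ingredient mentioned under “Security” above: signing a time-stamped digest, so that anyone holding the matching public key can verify the claim later. This is only an illustration under simplifying assumptions (the URL, digest, and key handling are hypothetical placeholders), not a full certification scheme:

// Minimal sketch: sign a statement binding a digest to a URL and a time.
// A real scheme would also need a trusted record of when this was published.
const crypto = require('crypto');

// For illustration only; real keys would be generated once and kept safe.
const { publicKey, privateKey } = crypto.generateKeyPairSync('rsa', {
  modulusLength: 2048,
});

const statement = JSON.stringify({
  url: 'https://example.gov/dataset',   // hypothetical source
  sha256: '9f86d081884c7d65...',        // hypothetical (truncated) digest
  time: '2016-12-20T21:45:00Z',
});

const signature = crypto.createSign('SHA256')
                        .update(statement)
                        .sign(privateKey, 'hex');

// Anyone with the public key can check that the statement was not altered:
const ok = crypto.createVerify('SHA256')
                 .update(statement)
                 .verify(publicKey, signature, 'hex');
console.log(ok); // true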

