5 January, 2015

I was disappointed when Google gave up. In 2007, the company announced a bold initiative to fight global warming:

### Google’s Goal: Renewable Energy Cheaper than Coal

Creates renewable energy R&D group and supports breakthrough technologies

Mountain View, Calif. (November 27, 2007) – Google (NASDAQ: GOOG) today announced a new strategic initiative to develop electricity from renewable energy sources that will be cheaper than electricity produced from coal. The newly created initiative, known as RE<C, will focus initially on advanced solar thermal power, wind power technologies, enhanced geothermal systems and other potential breakthrough technologies. RE<C is hiring engineers and energy experts to lead its research and development work, which will begin with a significant effort on solar thermal technology, and will also investigate enhanced geothermal systems and other areas. In 2008, Google expects to spend tens of millions on research and development and related investments in renewable energy. As part of its capital planning process, the company also anticipates investing hundreds of millions of dollars in breakthrough renewable energy projects which generate positive returns.

But in 2011, Google shut down the program. I never heard why. Recently two engineers involved in the project have given a good explanation:

• Ross Koningstein and David Fork, What it would really take to reverse climate change, 18 November 2014.

But the short version is this. They couldn’t find a way to accomplish their goal: producing a gigawatt of renewable power more cheaply than a coal-fired plant — and in years, not decades.

And since then, they’ve been reflecting on their failure and they’ve realized something even more sobering. Even if they’d been able to realize their best-case scenario — a 55% carbon emissions cut by 2050 — it would not bring atmospheric CO2 back below 350 ppm during this century.

This is not surprising to me.

What would we need to accomplish this? They say two things. First, a cheap, dispatchable, distributed power source:

Consider an average U.S. coal or natural gas plant that has been in service for decades; its cost of electricity generation is about 4 to 6 U.S. cents per kilowatt-hour. Now imagine what it would take for the utility company that owns that plant to decide to shutter it and build a replacement plant using a zero-carbon energy source. The owner would have to factor in the capital investment for construction and continued costs of operation and maintenance—and still make a profit while generating electricity for less than $0.04/kWh to $0.06/kWh.

That’s a tough target to meet. But that’s not the whole story. Although the electricity from a giant coal plant is physically indistinguishable from the electricity from a rooftop solar panel, the value of generated electricity varies. In the marketplace, utility companies pay different prices for electricity, depending on how easily it can be supplied to reliably meet local demand.

“Dispatchable” power, which can be ramped up and down quickly, fetches the highest market price. Distributed power, generated close to the electricity meter, can also be worth more, as it avoids the costs and losses associated with transmission and distribution. Residential customers in the contiguous United States pay from $0.09/kWh to $0.20/kWh, a significant portion of which pays for transmission and distribution costs. And here we see an opportunity for change. A distributed, dispatchable power source could prompt a switchover if it could undercut those end-user prices, selling electricity for less than $0.09/kWh to $0.20/kWh in local marketplaces. At such prices, the zero-carbon system would simply be the thrifty choice.
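As a back-of-the-envelope illustration of these thresholds, here is a Python sketch. The dollar ranges are taken from the passage above; the classification scheme and the sample cost are invented for illustration:

```python
# Hypothetical cost comparison based on the thresholds quoted above.
# The dollar ranges come from the article; any candidate cost fed in
# is an invented illustration, not a real technology's price.

WHOLESALE_RANGE = (0.04, 0.06)  # $/kWh, existing coal/gas plants
RETAIL_RANGE = (0.09, 0.20)     # $/kWh, residential end-user prices

def competitiveness(cost_per_kwh):
    """Classify a zero-carbon source by which prices it can undercut."""
    if cost_per_kwh < WHOLESALE_RANGE[0]:
        return "cheaper than any existing coal or gas plant"
    if cost_per_kwh < WHOLESALE_RANGE[1]:
        return "cheaper than some existing plants"
    if cost_per_kwh < RETAIL_RANGE[0]:
        return "cheaper than any residential retail price"
    if cost_per_kwh < RETAIL_RANGE[1]:
        return "cheaper than retail prices in some local markets"
    return "not yet competitive"

print(competitiveness(0.08))  # a hypothetical distributed source
```

The point of the distributed/dispatchable argument is that the bar in the third and fourth bands is much lower than in the first two: a new source doesn't have to beat paid-off coal plants at the busbar, only the retail price at the meter.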

But “dispatchable”, they say, means “not solar”.

Second, a lot of carbon sequestration:

While this energy revolution is taking place, another field needs to progress as well. As Hansen has shown, if all power plants and industrial facilities switch over to zero-carbon energy sources right now, we’ll still be left with a ruinous amount of CO2 in the atmosphere. It would take centuries for atmospheric levels to return to normal, which means centuries of warming and instability. To bring levels down below the safety threshold, Hansen’s models show that we must not only cease emitting CO2 as soon as possible but also actively remove the gas from the air and store the carbon in a stable form. Hansen suggests reforestation as a carbon sink. We’re all for more trees, and we also exhort scientists and engineers to seek disruptive technologies in carbon storage.

How to achieve these two goals? They say government and energy businesses should spend 10% of employee time on “strange new ideas that have the potential to be truly disruptive”.

## Wind Power and the Smart Grid

18 June, 2014

Electric power companies complain about wind power because it’s intermittent: if suddenly the wind stops, they have to bring in other sources of power.

This is no big deal if we only use a little wind. Across the US, wind now supplies 4% of electric power; even in Germany it’s just 8%. The problem starts if we use a lot of wind. If we’re not careful, we’ll need big fossil-fuel-powered electric plants when the wind stops. And these need to be turned on, ready to pick up the slack at a moment’s notice!

So, a few years ago Xcel Energy, which supplies much of Colorado’s power, ran ads opposing a proposal that it use renewable sources for 10% of its power.

But now things have changed. Now Xcel gets about 15% of their power from wind, on average. And sometimes this spikes to much more!

Every few seconds, hundreds of turbines measure the wind speed. Every 5 minutes, they send this data to high-performance computers 100 miles away at the National Center for Atmospheric Research in Boulder. NCAR crunches these numbers along with data from weather satellites, weather stations, and other wind farms – and creates highly accurate wind power forecasts.

With better prediction, Xcel can do a better job of shutting down idling backup plants on days when they’re not needed. Last year was a breakthrough year – better forecasts saved Xcel nearly as much money as they had in the three previous years combined.

It’s all part of the emerging smart grid—an intelligent network that someday will include appliances and electric cars. With a good smart grid, we could set our washing machine to run when power is cheap. Maybe electric cars could store solar power in the day, use it to power neighborhoods when electricity demand peaks in the evening – then recharge their batteries using wind power in the early morning hours. And so on.

### References

I would love it if the Network Theory project could ever grow to the point of helping design the smart grid. So far we are doing much more ‘foundational’ work on control theory, along with a more applied project on predicting El Niños. I’ll talk about both of these soon! But I have big hopes and dreams, so I want to keep learning more about power grids and the like.

Here are two nice references:

• Kevin Bullis, Smart wind and solar power, from 10 breakthrough technologies, Technology Review, 23 April 2014.

• Keith Parks, Yih-Huei Wan, Gerry Wiener and Yubao Liu, Wind energy forecasting: a collaboration of the National Center for Atmospheric Research (NCAR) and Xcel Energy.

The first is fun and easy to read. The second has more technical details. It describes the software used (the picture on top of this article shows a bit of this), and also some of the underlying math and physics. Let me quote a bit:

#### High-resolution Mesoscale Ensemble Prediction Model (EPM)

It is known that atmospheric processes are chaotic in nature. This implies that even small errors in the model initial conditions combined with the imperfections inherent in the NWP model formulations, such as truncation errors and approximations in model dynamics and physics, can lead to a wind forecast with large errors for certain weather regimes. Thus, probabilistic wind prediction approaches are necessary for guiding wind power applications. Ensemble prediction is at present a practical approach for producing such probabilistic predictions. An innovative mesoscale Ensemble Real-Time Four Dimensional Data Assimilation (E-RTFDDA) and forecasting system that was developed at NCAR was used as the basis for incorporating this ensemble prediction capability into the Xcel forecasting system.

Ensemble prediction means that instead of a single weather forecast, we generate a probability distribution on the set of weather forecasts. The paper has references explaining this in more detail.
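As a toy illustration of ensemble prediction (not of NCAR’s actual E-RTFDDA system), one can run many randomly perturbed copies of a crude wind model through an idealized turbine power curve and read off a probabilistic power forecast. Everything here, from the persistence model to the turbine parameters, is invented:

```python
import random

def toy_member(wind_now, hours, noise=0.5):
    """One ensemble member: a toy persistence-plus-drift wind model.
    (A stand-in for a real NWP model run; purely illustrative.)"""
    w = wind_now
    for _ in range(hours):
        w = max(0.0, w + random.gauss(0.0, noise))
    return w

def turbine_power(wind, cut_in=3.0, rated_speed=12.0, rated_mw=2.0):
    """Idealized turbine power curve: zero below cut-in speed, cubic
    growth up to the rated speed, flat above it. Parameters invented."""
    if wind < cut_in:
        return 0.0
    if wind >= rated_speed:
        return rated_mw
    return rated_mw * (wind / rated_speed) ** 3

random.seed(0)
# 200 perturbed runs starting from an observed 8 m/s wind
members = [turbine_power(toy_member(8.0, hours=6)) for _ in range(200)]
mean = sum(members) / len(members)
spread = (sum((p - mean) ** 2 for p in members) / len(members)) ** 0.5
print(f"forecast power: {mean:.2f} MW, ensemble spread: {spread:.2f} MW")
```

A real system replaces the toy model with full NWP runs and calibrates the spread against observations, but the structure is the same: many runs in, one probability distribution out.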

We had a nice discussion of wind power and the smart grid over on G+. Among other things, John Despujols mentioned the role of ‘smart inverters’ in enhancing grid stability:

Smart solar inverters smooth out voltage fluctuations for grid stability, DigiKey article library.

A solar inverter converts the variable direct current output of a photovoltaic solar panel into alternating current usable by the electric grid. There’s a lot of math involved here—click the link for a Wikipedia summary. But solar inverters are getting smarter.

#### Wild fluctuations

While the solar inverter has long been the essential link between the photovoltaic panel and the electricity distribution network, converting DC to AC, its role is expanding due to the massive growth in solar energy generation. Utility companies and grid operators have become increasingly concerned about managing what can potentially be wildly fluctuating levels of energy produced by the huge (and still growing) number of grid-connected solar systems, whether they are rooftop systems or utility-scale solar farms. Intermittent production due to cloud cover or temporary faults has the potential to destabilize the grid. In addition, grid operators are struggling to plan ahead due to lack of accurate data on production from these systems as well as on true energy consumption.

In large-scale facilities, virtually all output is fed to the national grid or micro-grid, and is typically well monitored. At the rooftop level, although individually small, collectively the amount of energy produced has a significant potential. California estimated it has more than 150,000 residential rooftop grid-connected solar systems with a potential to generate 2.7 MW.

However, while in some systems all the solar energy generated is fed to the grid and not accessible to the producer, others allow energy generated to be used immediately by the producer, with only the excess fed to the grid. In the latter case, smart meters may only measure the net output for billing purposes. In many cases, information on production and consumption, supplied by smart meters to utility companies, may not be available to the grid operators.

#### Getting smarter

The solution according to industry experts is the smart inverter. Every inverter, whether at panel level or megawatt-scale, has a role to play in grid stability. Traditional inverters have, for safety reasons, become controllable, so that they can be disconnected from the grid at any sign of grid instability. It has been reported that sudden, widespread disconnects can exacerbate grid instability rather than help settle it.

Smart inverters, however, provide a greater degree of control and have been designed to help maintain grid stability. One trend in this area is to use synchrophasor measurements to detect and identify a grid instability event, rather than conventional ‘perturb-and-observe’ methods. The aim is to distinguish between a true island condition and a voltage or frequency disturbance which may benefit from additional power generation by the inverter rather than a disconnect.

Smart inverters can change the power factor. They can input or receive reactive power to manage voltage and power fluctuations, driving voltage up or down depending on immediate requirements. Adaptive volts-amps reactive (VAR) compensation techniques could enable ‘self-healing’ on the grid.

Two-way communications between smart inverter and smart grid not only allow fundamental data on production to be transmitted to the grid operator on a timely basis, but upstream data on voltage and current can help the smart inverter adjust its operation to improve power quality, regulate voltage, and improve grid stability without compromising safety. There are considerable challenges still to overcome in terms of agreeing and evolving national and international technical standards, but this topic is not covered here.

The benefits of the smart inverter over traditional devices have been recognized in Germany, Europe’s largest solar energy producer, where an initiative is underway to convert all solar energy producers’ inverters to smart inverters. Although the cost of smart inverters is slightly higher than traditional systems, the advantages gained in grid balancing and accurate data for planning purposes are considered worthwhile. Key features of smart inverters required by German national standards include power ramping and volt/VAR control, which directly influence improved grid stability.
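The volt/VAR control mentioned above is often described as a ‘droop’ curve: the inverter injects reactive power when voltage sags and absorbs it when voltage rises. Here is a minimal sketch of that idea; the deadband, slope and limits are illustrative placeholders, not values from any actual grid code:

```python
def volt_var_setpoint(v_pu, deadband=0.02, slope=10.0, q_max=0.44):
    """Piecewise-linear volt/VAR droop curve: inject reactive power
    (positive Q) when voltage sags, absorb it (negative Q) when
    voltage rises, and do nothing inside a small deadband. v_pu is
    the measured voltage in per-unit; the returned Q is a fraction
    of the inverter's rating. All parameters are illustrative."""
    dv = v_pu - 1.0
    if abs(dv) <= deadband:
        return 0.0
    if dv > 0:
        q = -slope * (dv - deadband)   # overvoltage: absorb VARs
    else:
        q = -slope * (dv + deadband)   # undervoltage: inject VARs
    return max(-q_max, min(q_max, q))

print(round(volt_var_setpoint(1.00), 3))  # 0.0  (inside the deadband)
print(round(volt_var_setpoint(0.95), 3))  # 0.3  (sag: inject VARs)
print(round(volt_var_setpoint(0.90), 3))  # 0.44 (deep sag: clipped)
```

Because every inverter computes its setpoint from the locally measured voltage, many small units can collectively regulate voltage without any central command, which is exactly the ‘self-healing’ behavior the article alludes to.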

## Carbon Emissions from Coal-Fired Power Plants

13 September, 2013

The 50 dirtiest electric power plants in the United States—all coal-fired—emit as much carbon dioxide as half of America’s 240 million cars.

The dirtiest 1% spew out a third of the carbon produced by US power plants.

And the 100 dirtiest plants—still a tiny fraction of the country’s 6,000 power plants—account for a fifth of all US carbon emissions.

According to this report, curbing the emissions of these worst offenders would be one of the best ways to cut US carbon emissions, reducing the risk that emissions will trigger dangerous climate change:

• Environment America Research and Policy Center, America’s dirtiest power plants: their oversized contribution to global warming and what we can do about it, 2013.

Some states in the US already limit carbon pollution from power plants. At the start of this year, California imposed a cap on carbon dioxide emissions, and in 2014 it will link with Quebec’s carbon market. Nine states from Maine to Maryland participate in the Regional Greenhouse Gas Initiative (RGGI), which caps emissions from power plants in the Northeast.

At the federal level, a big step forward was the 2007 Supreme Court decision saying the Environmental Protection Agency should develop plans to regulate carbon emissions. The EPA is now getting ready to impose carbon emission limits for all new power plants in the US. But some of the largest sources of carbon dioxide are existing power plants, so getting them to shape up or shut down could have big benefits.

### What to do?

Here’s what the report suggests:

• The Obama Administration should set strong limits on carbon dioxide pollution from new power plants to prevent the construction of a new generation of dirty power plants, and force existing power plants to clean up by setting strong limits on carbon dioxide emissions from all existing power plants.

• New plants – The Environmental Protection Agency (EPA) should work to meet its September 2013 deadline for re-proposing a stringent emissions standard for new power plants. It should also set a deadline for finalizing these standards no later than June 2015.

• Existing plants – The EPA should work to meet the timeline put forth by President Obama for proposing and finalizing emissions standards for existing power plants. This timeline calls for limits on existing plants to be proposed by June 2014 and finalized by June 2015. The standards should be based on the most recent climate science and designed to achieve the emissions reduction targets that are necessary to avoid the worst impacts of global warming.

In addition to cutting pollution from power plants, the United States should adopt a suite of clean energy policies at the local, state, and federal levels to curb emissions of carbon dioxide from energy use in other sectors.

In particular, the United States should prioritize establishing a comprehensive, national plan to reduce carbon pollution from all sources – including transportation, industrial activities, and the commercial and residential sectors.

Other policies to curb emissions include:

• Retrofitting three-quarters of America’s homes and businesses for improved energy efficiency, and implementing strong building energy codes to dramatically reduce fossil fuel consumption in new homes and businesses.

• Adopting a federal renewable electricity standard that calls for 25 percent of America’s electricity to come from clean, renewable sources by 2025.

• Strengthening and implementing state energy efficiency resource standards that require utilities to deliver energy efficiency improvements in homes, businesses and industries.

• Installing more than 200 gigawatts of solar panels and other forms of distributed renewable energy at residential, commercial and industrial buildings over the next two decades.

• Encouraging the use of energy-saving combined heat-and-power systems in industry.

• Facilitating the deployment of millions of plug-in vehicles that operate partly or solely on electricity, and adopting clean fuel standards that require a reduction in the carbon intensity of transportation fuels.

• Ensuring that the majority of new residential and commercial development in metropolitan areas takes place in compact, walkable communities with access to a range of transportation options.

• Expanding public transportation service to double ridership by 2030, encouraging further ridership increases through better transit service, and reducing per-mile global warming pollution from transit vehicles. The U.S. should also build high-speed rail lines in 11 high-priority corridors by 2030.

• Strengthening and expanding the Regional Greenhouse Gas Initiative, which limits carbon dioxide pollution from power plants in nine northeastern states, and implementing California’s Global Warming Solutions Act (AB32), which places an economy-wide cap on the state’s greenhouse gas emissions.

### Carbon emitted per power produced

An appendix to this report lists the power plants that emit the most carbon dioxide by name, along with estimates of their emissions. That’s great! But annoyingly, they do not seem to list the amounts of energy per year produced by these plants.

If carbon emissions were strictly proportional to the amount of energy produced, that would tend to undercut the notion that the biggest carbon emitters are especially naughty. But in fact there’s a lot of variability in the amount of carbon emitted per energy generated. You can see that in this chart of theirs:

So, it would be good to see a list of the worst power plants in terms of CO2 emitted per energy generated.

The people who prepared this report could probably create such a list without much extra work, since they write:

We obtained fuel consumption and electricity generation data for power plants operating in the United States from the U.S. Department of Energy’s Energy Information Administration (EIA) 2011 December EIA-923 Monthly Time Series.
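Given that data, the extra work really is modest. Here is a sketch of the computation such a list would involve, using invented plant names and numbers rather than actual EIA figures:

```python
# Sketch of the missing list: plants ranked by carbon intensity
# (CO2 emitted per unit of energy generated). Plant names and
# numbers are invented placeholders, not figures from EIA-923.

plants = [
    # (name, annual CO2 in million metric tons, annual generation in TWh)
    ("Plant A", 20.0, 18.0),
    ("Plant B", 15.0, 17.5),
    ("Plant C", 9.0, 5.0),
]

def carbon_intensity(co2_megatons, generation_twh):
    """Metric tons of CO2 emitted per MWh generated."""
    return (co2_megatons * 1e6) / (generation_twh * 1e6)

ranked = sorted(plants, key=lambda p: carbon_intensity(p[1], p[2]),
                reverse=True)
for name, co2, twh in ranked:
    print(f"{name}: {carbon_intensity(co2, twh):.2f} t CO2 / MWh")
```

Note how the ranking changes: the invented “Plant C” emits less CO2 in total than the other two, but tops the list per megawatt-hour, which is exactly the distinction the report’s appendix blurs.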

## John Harte

27 October, 2012

Earlier this week I gave a talk on the Mathematics of Planet Earth at the University of Southern California, and someone there recommended that I look into John Harte’s work on maximum entropy methods in ecology. He works at U.C. Berkeley.

I checked out his website and found that his goals resemble mine: save the planet and understand its ecosystems. He’s a lot further along than I am, since he comes from a long background in ecology while I’ve just recently blundered in from mathematical physics. I can’t really say what I think of his work since I’m just learning about it. But I thought I should point out its existence.

This free book is something a lot of people would find interesting:

• John and Mary Ellen Harte, Cool the Earth, Save the Economy: Solving the Climate Crisis Is EASY, 2008.

EASY? Well, it’s an acronym. Here’s the basic idea of the US-based plan described in this book:

Any proposed energy policy should include these two components:

Technical/Behavioral: What resources and technologies are to be used to supply energy? On the demand side, what technologies and lifestyle changes are being proposed to consumers?

Incentives/Economic Policy: How are the desired supply and demand options to be encouraged or forced? Here the options include taxes, subsidies, regulations, permits, research and development, and education.

And a successful energy policy should satisfy the AAA criteria:

Availability. The climate crisis will rapidly become costly to society if we do not take action expeditiously. We need to adopt now those technologies that are currently available, provided they meet the following two additional criteria:

Affordability. Because of the central role of energy in our society, its cost to consumers should not increase significantly. In fact, a successful energy policy could ultimately save consumers money.

Acceptability. All energy strategies have environmental, land use, and health and safety implications; these must be acceptable to the public. Moreover, while some interest groups will undoubtedly oppose any particular energy policy, political acceptability at a broad scale is necessary.

Our strategy for preventing climate catastrophe and achieving energy independence includes:

Energy Efficient Technology at home and at the workplace. Huge reductions in home energy use can be achieved with available technologies, including more efficient appliances such as refrigerators, water heaters, and light bulbs. Home retrofits and new home design features such as “smart” window coatings, lighter-colored roofs where there are hot summers, better home insulation, and passive solar designs can also reduce energy use. Together, energy efficiency in home and industry can save the U.S. up to approximately half of the energy currently consumed in those sectors, and at no net cost—just by making different choices. Sounds good, doesn’t it?

Automobile Fuel Efficiency. Phase in higher Corporate Average Fuel Economy (CAFE) standards for automobiles, SUVs and light trucks by requiring vehicles to go 35 miles per gallon of gas (mpg) by 2015, 45 mpg by 2020, and 60 mpg by 2030. This would rapidly wipe out our dependence on foreign oil and cut emissions from the vehicle sector by two-thirds. A combination of plug-in hybrid, lighter car body materials, re-design and other innovations could readily achieve these standards. This sounds good, too!

Solar and Wind Energy. Rooftop photovoltaic panels and solar water heating units should be phased in over the next 20 years, with the goal of solar installation on 75% of U.S. homes and commercial buildings by 2030. (Not all roofs receive sufficient sunlight to make solar panels practical for them.) Large wind farms, solar photovoltaic stations, and solar thermal stations should also be phased in so that by 2030, all U.S. electricity demand will be supplied by existing hydroelectric, existing and possibly some new nuclear, and, most importantly, new solar and wind units. This will require investment in expansion of the grid to bring the new supply to the demand, and in research and development to improve overnight storage systems. Achieving this goal would reduce our dependence on coal to practically zero. More good news!

You are part of the answer. Voting wisely for leaders who promote the first three components is one of the most important individual actions one can make. Other actions help, too. Just as molecules make up mountains, individual actions taken collectively have huge impacts. Improved driving skills, automobile maintenance, reusing and recycling, walking and biking, wearing sweaters in winter and light clothing in summer, installing timers on thermostats and insulating houses, carpooling, paying attention to energy efficiency labels on appliances, and many other simple practices and behaviors hugely influence energy consumption. A major education campaign, both in schools for youngsters and by the media for everyone, should be mounted to promote these consumer practices.

No part of EASY can be left out; all parts are closely integrated. Some parts might create much larger changes—for example, more efficient home appliances and automobiles—but all parts are essential. If, for example, we do not achieve the decrease in electricity demand that can be brought about with the E of EASY, then it is extremely doubtful that we could meet our electricity needs with the S of EASY.

It is equally urgent that once we start implementing the plan, we aggressively export it to other major emitting nations. We can reduce our own emissions all we want, but the planet will continue to warm if we can’t convince other major global emitters to reduce their emissions substantially, too.

What EASY will achieve. If no actions are taken to reduce carbon dioxide emissions, in the year 2030 the U.S. will be emitting about 2.2 billion tons of carbon in the form of carbon dioxide. This will be an increase of 25% from today’s emission rate of about 1.75 billion tons per year of carbon. By following the EASY plan, the U.S. share in a global effort to solve the climate crisis (that is, prevent catastrophic warming) will result in U.S. emissions of only about 0.4 billion tons of carbon by 2030, which represents a little less than 25% of 2007 carbon dioxide emissions. Stated differently, the plan provides a way to eliminate 1.8 billion tons per year of carbon by that date.

We must act urgently: in the 14 months it took us to write this book, atmospheric CO2 levels rose by several billion tons of carbon, and more climatic consequences have been observed. Let’s assume that we conserve our forests and other natural carbon reservoirs at our current levels, as well as maintain our current nuclear and hydroelectric plants (or replace them with more solar and wind generators). Here’s what implementing EASY will achieve, as illustrated by Figure 3.1 on the next page.
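The headline numbers in this passage can at least be checked for internal consistency:

```python
# All figures below are copied from the passage above; the checks
# are simple arithmetic, not new data.

business_as_usual_2030 = 2.2  # billion tons of carbon per year
current_rate = 1.75           # billion tons of carbon per year
easy_2030 = 0.4               # billion tons of carbon per year

# "an increase of 25% from today's emission rate"
assert abs(business_as_usual_2030 / current_rate - 1.25) < 0.01

# "a little less than 25% of 2007 carbon dioxide emissions"
assert easy_2030 / current_rate < 0.25

# "eliminate 1.8 billion tons per year of carbon by that date"
assert abs((business_as_usual_2030 - easy_2030) - 1.8) < 1e-9

print("the headline figures are mutually consistent")
```

This only shows the figures agree with each other, of course, not that the underlying projections are right.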

Please check out this book and help me figure out if the numbers add up! I could also use help understanding his research, for example:

• John Harte, Maximum Entropy and Ecology: A Theory of Abundance, Distribution, and Energetics, Oxford University Press, Oxford, 2011.

The book is not free but the first chapter is.

This paper looks really interesting too:

• J. Harte, T. Zillio, E. Conlisk and A. B. Smith, Maximum entropy and the state-variable approach to macroecology, Ecology 89 (2008), 2700–2711.

Again, it’s not freely available—tut tut. Ecologists should follow physicists and make their work free online; if you’re serious about saving the planet you should let everyone know what you’re doing! However, the abstract is visible to all, and of course I can use my academic superpowers to get ahold of the paper for myself:

Abstract: The biodiversity scaling metrics widely studied in macroecology include the species-area relationship (SAR), the scale-dependent species-abundance distribution (SAD), the distribution of masses or metabolic energies of individuals within and across species, the abundance-energy or abundance-mass relationship across species, and the species-level occupancy distributions across space. We propose a theoretical framework for predicting the scaling forms of these and other metrics based on the state-variable concept and an analytical method derived from information theory. In statistical physics, a method of inference based on information entropy results in a complete macro-scale description of classical thermodynamic systems in terms of the state variables volume, temperature, and number of molecules. In analogy, we take the state variables of an ecosystem to be its total area, the total number of species within any specified taxonomic group in that area, the total number of individuals across those species, and the summed metabolic energy rate for all those individuals. In terms solely of ratios of those state variables, and without invoking any specific ecological mechanisms, we show that realistic functional forms for the macroecological metrics listed above are inferred based on information entropy. The Fisher log series SAD emerges naturally from the theory. The SAR is predicted to have negative curvature on a log-log plot, but as the ratio of the number of species to the number of individuals decreases, the SAR becomes better and better approximated by a power law, with the predicted slope z in the range of 0.14-0.20. Using the 3/4 power mass-metabolism scaling relation to relate energy requirements and measured body sizes, the Damuth scaling rule relating mass and abundance is also predicted by the theory. 
We argue that the predicted forms of the macroecological metrics are in reasonable agreement with the patterns observed from plant census data across habitats and spatial scales. While this is encouraging, given the absence of adjustable fitting parameters in the theory, we further argue that even small discrepancies between data and predictions can help identify ecological mechanisms that influence macroecological patterns.
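To get a feel for the state-variable approach, here is a deliberately simplified sketch: maximize entropy over abundance classes subject only to the mean-abundance constraint ⟨n⟩ = N/S. That single constraint yields an exponential (geometric) distribution; the full theory, with its additional constraints, refines this to the Fisher log series mentioned in the abstract. The code is illustrative and is not the authors’:

```python
import math

def maxent_sad(n_species, n_individuals, n_max):
    """Maximum-entropy species-abundance distribution subject only to
    the mean-abundance constraint <n> = N/S. This simplified version
    yields an exponential (geometric) form; the full state-variable
    theory, with additional constraints, refines it to the Fisher
    log series. Illustrative sketch, not the authors' code."""
    target_mean = n_individuals / n_species
    ns = range(1, n_max + 1)

    def mean_abundance(lam):
        weights = [math.exp(-lam * n) for n in ns]
        z = sum(weights)  # the partition function
        return sum(n * w for n, w in zip(ns, weights)) / z

    # The mean decreases as the Lagrange multiplier grows, so we can
    # solve for it by bisection.
    lo, hi = 1e-8, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if mean_abundance(mid) > target_mean:
            lo = mid  # mean too high: increase the multiplier
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    weights = [math.exp(-lam * n) for n in ns]
    z = sum(weights)
    return [w / z for w in weights]

# 50 species, 1000 individuals: mean abundance of 20 per species
sad = maxent_sad(n_species=50, n_individuals=1000, n_max=1000)
print(sad[0] > sad[9] > sad[99])  # rare species most probable: True
```

Already at this crude level the theory’s flavor comes through: a realistic-looking “many rare species, few common ones” distribution falls out of the state variables alone, with no ecological mechanism and no fitting parameters.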

## The Living Smart Grid

7 April, 2012

guest post by Todd McKissick

The last few years, in energy circles, people have begun publicly promoting what they call the smart grid. This is touted as providing better control, prediction and utilization of our nation’s electrical grid system. However, it provides few benefits to anyone except the utilities. It’s expected to cost much more and to actually take away some of the convenience of having all the power you want, when you want it.

We can do better.

Let’s investigate the benefits of some changes to their so-called smart grid. If implemented, these changes will allow instant indirect control and balance of all local grid sections while automatically keeping supply in check with demand. They can drastically cut the baseload utilization of existing transmission lines. They can provide early benefits even when run in pseudo-parallel mode with no changes at all, simply by publishing customer-specific real-time prices. Once that gains some traction, a full implementation only requires adding smart meters to make it work. Both of these stages can be adopted at any rate, and the system delivers benefits in proportion to its adoption. Since it allows Demand Reduction (DR) and Distributed Generation (DG) from any small source to compete fairly in price with the big boys, it encourages tremendous competition between both generators and consumers.

To initiate this process, the real-time price must be determined for each customer.  This is easily done at the utility by breaking down their costs and overhead into three categories.  First, generation is monitored at its location.  Second, transmission is monitored for its contribution.  Both of these are being done already, so nothing new yet.  Third, distribution needs to be monitored at all the nodes and end points in the customer’s last leg of the chain.  Much of this is done and the rest is being done or planned through various smart meter movements.  Once all three of these prices are broken down, they can be applied to the various groups of customers and feeder segments.  This yields a total price to each customer that varies in real time with all the dynamics built in.  By simply publishing that price online, it signals the supply/demand imbalance that applies to them.

This is where the self correction aspect of the system comes into play.  If a transmission line goes down, the affected customers’ price will instantly spike, immediately causing loads to drop offline and storage systems and generation systems to boost their output.  This is purely price driven so no hard controls are sent to the customer equipment to make this happen.  Should a specific load be set to critical use, like a lifeline system for a person or business, they have less risk of losing power completely but will pay an increased amount for the duration of the event.  Even transmission rerouting decisions can be based on the price, allowing neighboring local grids to export their excess to aid a nearby shortfall.  Should an area find its price trending higher or lower over time, the economics will easily point to whatever and wherever something is needed to be added to the system.  This makes forecasting the need for new equipment easier at both the utility and the customer level.
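Here is a toy model of the price-driven self-correction just described; every number in it is invented:

```python
# Toy model of price-driven demand response. Every number is an
# invented illustration, not a real tariff.

def customer_price(generation, transmission, distribution):
    """Real-time price as the sum of the three monitored cost
    components ($/kWh), as described in the proposal."""
    return generation + transmission + distribution

def household_load(price, threshold=0.15, flexible_kw=2.0,
                   critical_kw=1.0):
    """Price-driven demand response: flexible loads (HVAC, washing
    machine) drop offline when the price spikes past a user-chosen
    threshold; critical loads keep running and simply pay more."""
    if price > threshold:
        return critical_kw
    return critical_kw + flexible_kw

normal = customer_price(0.05, 0.02, 0.03)     # quiet day
line_down = customer_price(0.05, 0.12, 0.03)  # transmission fault

print(household_load(normal))     # 3.0 kW: everything runs
print(household_load(line_down))  # 1.0 kW: flexible loads shed
```

No control signal ever reaches the household: the transmission fault shows up only as a higher price, and each customer’s own threshold decides what sheds, which is the “purely price driven” point of the proposal.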

If a CO2 or other emission charge were created, it could quickly be added to the cost of individual generators, allowing the rest of the system to re-balance around it automatically.

Once the price is published, people will begin tracking their homes and heavy-load appliances to calculate their exact electric bill.  When they learn they can adapt usage profiles and save money, they will create systems to do so automatically.  This will lead to intelligent, power-saving appliances, a new generation of smart thermostats, short-cycling algorithms in HVAC, and even more home automation.  The net effect of these operations is to balance demand to supply.
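The simplest such automation is a deferrable load that waits for the cheapest allowed hour. This is a toy sketch with a made-up hourly price forecast, not any actual appliance's algorithm.

```python
# Sketch of a price-aware appliance: given a published hourly price
# forecast, run a deferrable load (say, a dishwasher) in the cheapest
# hour of its allowed window. The price series is hypothetical.

def cheapest_hour(prices, earliest, latest):
    """Index of the lowest-price hour in the allowed window."""
    window = range(earliest, latest + 1)
    return min(window, key=lambda h: prices[h])

# 24 hourly prices in $/kWh, with an afternoon peak (illustrative).
prices = [0.08]*6 + [0.10]*6 + [0.25]*6 + [0.12]*6
start = cheapest_hour(prices, earliest=12, latest=23)  # run after noon
print(f"run the load at hour {start}")  # run the load at hour 18
```

Many customers running logic like this is exactly what moves demand out of the afternoon peak.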

When this process begins, the financial incentive becomes real for the customer, motivating them to request live billing.  This can happen one customer at a time, for anyone with a smart meter installed.  Both customer and utility benefit from the switchover.

A truly intelligent system like this eliminates the need for the full grid replacement some people are proposing.  Instead, it focuses on making the existing grid more stable.  Incrementally, and in proportion to adoption, grid stability and redundancy will naturally increase without further cost. Appliance manufacturers already have many load-predictive products waiting for the market to call for them, so the cost of advancing this whole system overlaps entirely with the cost of the replacement meters already installed or planned. We just need to ensure that the new meters have live-rate capability.

This is the single biggest solution to our energy crisis. It will standardize grid interconnection, which will entice distributed generation (DG).  As it stands now, most utilities view DG in a negative light with regard to grid stability, often citing issues such as voltage, frequency and phase regulation.  In reality, however, current inverter standards ensure that output is appropriately synchronized; the same applies to power factor.  But while reducing the power sent via the grid directly reduces the load, that’s only half of the picture.

DG with storage and vehicle-to-grid hybrids both give customers an opportunity to save up their excess and sell it to the grid when it earns the most.  Live prices will also encourage them to grow this market.  An obvious outgrowth is to buy and store power from the grid in the middle of the night and sell it back at a profit during afternoon peaks.  In fact, this is already happening in some markets.
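The night-to-peak arbitrage works whenever the price spread outweighs the storage losses. Here is a rough sketch; the prices and the battery's round-trip efficiency are hypothetical.

```python
# Sketch of the night-to-peak arbitrage described above: buy energy
# off-peak, store it, sell it back at the afternoon peak. Prices and
# the battery's round-trip efficiency are hypothetical.

def arbitrage_profit(kwh_bought, buy_price, sell_price, efficiency):
    """Net profit from one charge/discharge cycle ($)."""
    cost = kwh_bought * buy_price
    revenue = kwh_bought * efficiency * sell_price  # storage losses
    return revenue - cost

# 10 kWh bought at $0.05/kWh overnight, sold at $0.30/kWh at peak,
# through a battery with 85% round-trip efficiency.
profit = arbitrage_profit(10, buy_price=0.05, sell_price=0.30, efficiency=0.85)
print(f"profit per cycle: ${profit:.2f}")  # profit per cycle: $2.05
```

The cycle is only worthwhile when `sell_price * efficiency` exceeds `buy_price`, which is why live prices matter: they tell the customer exactly when that condition holds.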

Demand reduction (DR), or load shedding, acts like onsite generation in that it reduces the power sent via the grid.  It also acts like storage in that it can time-shift loads to cheaper rate periods.  To take best advantage of this, people will develop ever-better algorithms for price prediction.  The net effect is thousands of individuals competing on prediction techniques to flatten the peaks of the grid’s daily profile into its valleys.  This competition will be in direct proportion to the local grid instability in a given area.

According to Peter Mark Jansson and John Schmalzel [1]:

From material utilization perspectives significant hardware is manufactured and installed for this infrastructure often to be used at less than 20-40% of its operational capacity for most of its lifetime. These inefficiencies lead engineers to require additional grid support and conventional generation capacity additions when renewable technologies (such as solar and wind) and electric vehicles are to be added to the utility demand/supply mix. Using actual data from the PJM [PJM 2009] the work shows that consumer load management, real time price signals, sensors and intelligent demand/supply control offer a compelling path forward to increase the efficient utilization and carbon footprint reduction of the world’s grids. Underutilization factors from many distribution companies indicate that distribution feeders are often operated at only 70-80% of their peak capacity for a few hours per year, and on average are loaded to less than 30-40% of their capability.

At this time the utilities are limiting adoption rates to a couple of percent.  A well-known standard could replace that cap with a call for much more.  Instead of discouraging participation, it would encourage innovation and enhance forecasting, and do so without giving away control over how we wish to use our power.  Best of all, it is paid for by upgrades that are already being planned. How’s that for a low-cost SMART solution?

[1] Peter Mark Jansson and John Schmalzel, Increasing utilization of US electric grids via smart technologies: integration of load management, real time pricing, renewables, EVs and smart grid sensors, The International Journal of Technology, Knowledge and Society 7, 47–60.

## Azimuth on Google Plus (Part 6)

13 February, 2012

Lately the distribution of hits per hour on this blog has become very fat-tailed. In other words: the readership shoots up immensely now and then. I just noticed today’s statistics:

That spike on the right is what I’m talking about: 338 hits per hour, while before it was hovering in the low 80s, as usual for the weekend. Why? Someone on Hacker News posted an item saying:

John Baez will give his Google Talk tomorrow in the form of a robot.

That’s true! If you’re near Silicon Valley on Monday the 13th and you want to see me in the form of a robot, come to the Google campus and listen to my talk Energy, the Environment and What We Can Do.

It starts at 4 pm in the Paramaribo Room (Building 42, Floor 2). You’ll need to check in 15 minutes before that at the main visitor’s lounge in Building 43, and someone will escort you to the talk.

But if you can’t attend, don’t worry! A video will appear on YouTube, and I’ll point you to it when it does.

I tested out the robot a few days ago from a hotel room in Australia—it’s a strange sensation! Suzanne Brocato showed me the ropes. To talk to me easily, she lowered my ‘head’ until I was just 4 feet tall. “You’re so short!” she laughed. I rolled around the offices of Anybots and met the receptionist, who was also in the form of a robot. Then we went to the office of the CEO, Trevor Blackwell, and planned out my talk a little. I need to practice more today.

But why did someone at Hacker News post that comment just then? I suspect it’s because I reminded people about my talk on Google+ last night.

The fat-tailed distribution of blog hits is also happening at the scale of days, not just hours:

The spikes happen when I talk about a ‘hot topic’. January 27th was my biggest day so far. Slashdot discovered my post about the Elsevier boycott, and sent 3468 readers my way. But a total of 6499 people viewed that post, so a bunch must have come from other sources.

January 31st was also big: 3271 people came to read about The Faculty of 1000. 2140 of them were sent over by Hacker News.

If I were trying to make money from advertising on this blog, I’d be pushed toward more posts about hot topics. Forget the mind-bending articles on quantropy, packed with complicated equations!

But as it is, I’m trying to do some mixture of having fun, figuring out stuff, and getting people to save the planet. (Open access publishing fits into that mandate: it’s tragic how climate crackpots post on popular blogs while experts on climate change publish their papers in journals hidden from public view!) So, I don’t want to maximize readership: what matters more is getting people to do good stuff.

Do you have any suggestions on how I could do this better, while still being me? I’m not going to get a personality transplant, so there are limits on what I’ll do.

One good idea would be to make sure every post on a ‘hot topic’ offers readers something they can do now.

But enough of this navel-gazing! Here are some recent Azimuth articles about energy on Google+.

### Energy

1) In his State of the Union speech, Obama talked a lot about energy:

We’ve subsidized oil companies for a century. That’s long enough. It’s time to end the taxpayer giveaways to an industry that rarely has been more profitable, and double-down on a clean energy industry that never has been more promising.

He acknowledged that differences on Capitol Hill are “too deep right now” to pass a comprehensive climate bill, but he added that “there’s no reason why Congress shouldn’t at least set a clean-energy standard that creates a market for innovation.”

However, lest anyone think he actually wants to stop global warming, he also pledged “to open more than 75 percent of our potential offshore oil and gas resources.”

2) This paper claims a ‘phase change’ hit the oil markets around 2005:

• James Murray and David King, Climate policy: Oil’s tipping point has passed, Nature 481 (2011), 433–435.

They write:

In 2005, global production of regular crude oil reached about 72 million barrels per day. From then on, production capacity seems to have hit a ceiling at 75 million barrels per day. A plot of prices against production from 1998 to today shows this dramatic transition, from a time when supply could respond elastically to rising prices caused by increased demand, to when it could not (see ‘Phase shift’). As a result, prices swing wildly in response to small changes in demand. Other people have remarked on this step change in the economics of oil around the year 2005, but the point needs to be lodged more firmly in the minds of policy-makers.

3) Help out the famous climate blogger Joe Romm! He asks: What will the U.S. energy mix look like in 2050 if we cut CO2 emissions 80%?

How much total energy is consumed in 2050… How much coal, oil, and natural gas is being consumed (with carbon capture and storage of some coal and gas if you want to consider that)? What’s the price of oil? How much of our power is provided by nuclear power? How much by solar PV and how much by concentrated solar thermal? How much from wind power? How much from biomass? How much from other forms of renewable energy? What is the vehicle fleet like? How much electric? How much next-generation biofuels?

As he notes, there are lots of studies on these issues. Point him to the best ones!

4) Due to plunging prices for components, solar power prices in Germany dropped by half in the last 5 years. Now solar generates electricity at prices only slightly above what consumers pay. The subsidies will disappear entirely within a few years, when solar will be as cheap as conventional fossil fuels. Germany has added 14,000 megawatts of capacity in the last 2 years and now has 24,000 megawatts in total—enough green electricity to meet nearly 4% of the country’s power demand. That is expected to rise to 10% by 2020. Germany now has almost 10 times more installed capacity than the United States.
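As a sanity check on that 4% figure: the installed capacity comes from the text, but the capacity factor and Germany's annual electricity demand below are my own rough assumptions, so treat this as back-of-envelope only.

```python
# Rough check of the 4% figure. Installed capacity is from the text;
# the capacity factor and national demand are assumed values.

capacity_gw = 24.0       # installed solar capacity (from the text)
capacity_factor = 0.11   # assumed average for solar at Germany's latitude
demand_twh = 580.0       # assumed annual electricity consumption

generation_twh = capacity_gw * capacity_factor * 8760 / 1000  # GWh -> TWh
share = generation_twh / demand_twh
print(f"solar supplies about {share:.0%} of demand")  # about 4%
```

Under these assumptions the arithmetic does land near 4%, consistent with the claim.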

That’s all great—but, umm, what about the other 90%? What’s their long-term plan? Will they keep using coal-fired power plants? Will they buy more nuclear power from France?

In May 2011, Britain claimed it would halve carbon emissions by 2025. Is Germany making equally bold claims or not? Of course what matters is deeds, not words, but I’m curious.

5) Stephen Lacey presents some interesting charts showing the progress and problems with sustainability in the US. For example, there’s been a striking drop in how much energy is being used per dollar of GNP:

Sorry for the archaic ‘British Thermal Units’: we no longer have a king, but for some reason the U.S. failed to throw off the old British system of measurement. A BTU is a bit more than a kilojoule.

Despite these dramatic changes, Lacey says “we waste around 85% of the energy produced in the U.S.” But he doesn’t say how that number was arrived at. Does anyone know?

6) The American Council for an Energy-Efficient Economy (ACEEE) has a new report called The Long-Term Energy Efficiency Potential: What the Evidence Suggests. It describes some scenarios, including one where the US encourages a greater level of productive investments in energy efficiency so that by the year 2050, it reduces overall energy consumption by 40 to 60 percent. I’m very interested in how much efficiency can help. Some, but not all, of the improvements will be eaten up by the rebound effect.

## How to Cut Carbon Emissions and Save Money

27 January, 2012

McKinsey & Company is a management consulting firm. In 2010 they released this ‘carbon abatement cost curve’ for the whole world:

Click it to see a nice big version. So, they’re claiming:

By 2030 we can cut CO2 emissions by about 15 gigatonnes per year while saving lots of money.

By 2030 we can cut CO2 emissions by up to 37 gigatonnes per year before the total cost—that is, cost minus savings—becomes positive.

The graph is cute. The vertical axis of the graph says how many euros per tonne it would cost to cut CO2 emissions by 2030 using various measures. The horizontal axis says how many gigatonnes per year we could reduce CO2 emissions using these measures.

So, we get lots of blue rectangles. If a rectangle is below the horizontal axis, its area says how many euros per year we’d save by implementing that measure. If it’s above the axis, its area says how much that measure would cost.

I believe the total blue area below the axis equals the total blue area above the axis. So if we do all these things, the total cost is zero.
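The rectangle arithmetic can be made concrete with a toy version of the curve. These measures and numbers are invented for illustration, not McKinsey's actual figures.

```python
# Toy version of an abatement cost curve. Each measure has a width
# (Gt CO2/yr abated) and a height (euros per tonne; negative means
# it saves money). All numbers are made up for illustration.

measures = [
    ("LED lighting",        1.5, -90),   # saves money
    ("building insulation", 2.0, -60),   # saves money
    ("wind power",          3.0,  20),   # costs money
    ("carbon capture",      1.0,  45),   # costs money
]

total_abatement = sum(w for _, w, _ in measures)
# width (1e9 t/yr) x height (euros/t) = billions of euros per year
net_cost = sum(w * h for _, w, h in measures)
print(f"{total_abatement} Gt CO2/yr abated, "
      f"net cost {net_cost:+.0f} billion euros/yr")
```

When the negative areas (savings) exactly cancel the positive areas (costs), as I believe they do in McKinsey's curve, the net cost comes out to zero.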

37 gigatonnes of CO2 is roughly 10 gigatonnes of carbon: remember, there’s a crucial factor of $3\frac{2}{3}$ here. In 2004, Pacala and Socolow argued that the world needs to find ways to cut carbon emissions by about 7 gigatonnes/year by 2054 to keep emissions flat until then. By now we’d need 9 gigatonnes/year.
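That crucial factor is just the ratio of molecular weights: a CO2 molecule weighs 44 (12 for carbon plus two oxygens at 16 each), while carbon alone weighs 12, and 44/12 = $3\frac{2}{3}$. Applying it to McKinsey’s figure:

```python
# Converting a mass of CO2 to the mass of carbon it contains,
# using the molecular weight ratio 12/44.

co2_to_c = 12 / 44   # tonnes of carbon per tonne of CO2
gt_co2 = 37          # McKinsey's maximum zero-net-cost abatement
gt_c = gt_co2 * co2_to_c
print(f"{gt_co2} Gt CO2 = {gt_c:.1f} Gt C")  # 37 Gt CO2 = 10.1 Gt C
```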

If so, it seems the measures shown here could keep carbon emissions flat worldwide at no net cost!

But as usual, there are at least a few problems.

### Problem 1

Is McKinsey’s analysis correct? I don’t know. Here’s their report, along with some others:

For more details it’s good to read version 2.0:

• McKinsey & Company, Pathways to a low carbon economy: Version 2 of the global greenhouse gas abatement cost curve, 2009.

They’re free if you fill out some forms. But it’s not easy to check these things. Does anyone know papers that try to check McKinsey’s work? I find it’s more fun to study a problem like this after you see two sides of the same story.

### Problem 2

I said ‘no net cost’. But if you need to spend a lot of money, the fact that I’m saving a lot doesn’t compensate you. So there’s the nontrivial problem of taking money that’s saved on some measures and making sure it gets spent on others. Here’s where ‘big government’ might be required—which makes some people decide global warming is just a political conspiracy, nyeh-heh-heh.

Is there another way to make the money transfer happen, without top-down authority?

We could still get the job about half-done at a huge savings, of course. McKinsey says we could cut CO2 emissions by 15 gigatonnes per year doing things that only save money. That’s about 4 gigatonnes of carbon per year! We could at least do that.

### Problem 3

Keeping carbon emissions flat is not enough. Carbon dioxide, once put in the atmosphere, stays there a long time—though individual molecules come and go. As the saying goes, carbon is forever. (Click that link for more precise information.)

So, even Pacala and Socolow say keeping carbon emissions flat is a mere stopgap before we actually reduce carbon emissions, starting in 2054. But some more recent papers seem to suggest Pacala and Socolow were being overly optimistic.

Of course it depends on how much global warming you’re willing to tolerate! It also depends on lots of other things.

Anyway, this paper claims that if we cut global greenhouse gas emissions in half by 2050 (as compared to what they were in 1990), there’s a 12–45% probability that the world will get at least 2 °C warmer than its temperature before the industrial revolution:

• Malte Meinshausen et al, Greenhouse-gas emission targets for limiting global warming to 2 °C, Nature 458 (2009), 1158–1163.

Abstract: More than 100 countries have adopted a global warming limit of 2 °C or below (relative to pre-industrial levels) as a guiding principle for mitigation efforts to reduce climate change risks, impacts and damages. However, the greenhouse gas (GHG) emissions corresponding to a specified maximum warming are poorly known owing to uncertainties in the carbon cycle and the climate response. Here we provide a comprehensive probabilistic analysis aimed at quantifying GHG emission budgets for the 2000–50 period that would limit warming throughout the twenty-first century to below 2 °C, based on a combination of published distributions of climate system properties and observational constraints. We show that, for the chosen class of emission scenarios, both cumulative emissions up to 2050 and emission levels in 2050 are robust indicators of the probability that twenty-first century warming will not exceed 2 °C relative to pre-industrial temperatures.

Limiting cumulative CO2 emissions over 2000–50 to 1,000 Gt CO2 yields a 25% probability of warming exceeding 2 °C—and a limit of 1,440 Gt CO2 yields a 50% probability—given a representative estimate of the distribution of climate system properties. As known 2000–06 CO2 emissions were 234 Gt CO2, less than half the proven economically recoverable oil, gas and coal reserves can still be emitted up to 2050 to achieve such a goal. Recent G8 Communiqués envisage halved global GHG emissions by 2050, for which we estimate a 12–45% probability of exceeding 2 °C—assuming 1990 as emission base year and a range of published climate sensitivity distributions. Emissions levels in 2020 are a less robust indicator, but for the scenarios considered, the probability of exceeding 2 °C rises to 53–87% if global GHG emissions are still more than 25% above 2000 levels in 2020.

This paper says we’re basically doomed to suffer unless we revamp society:

• Ted Trainer, Can renewables etc. solve the greenhouse problem? The negative case, Energy Policy 38 (2010), 4107–4114.

Abstract: Virtually all current discussion of climate change and energy problems proceeds on the assumption that technical solutions are possible within basically affluent-consumer societies. There is however a substantial case that this assumption is mistaken. This case derives from a consideration of the scale of the tasks and of the limits of non-carbon energy sources, focusing especially on the need for redundant capacity in winter. The first line of argument is to do with the extremely high capital cost of the supply system that would be required, and the second is to do with the problems set by the intermittency of renewable sources. It is concluded that the general climate change and energy problem cannot be solved without large scale reductions in rates of economic production and consumption, and therefore without transition to fundamentally different social structures and systems.

It’s worth reading because it uses actual numbers, not just hand-waving. But it seeks much more than keeping carbon emissions flat until 2050; that’s one reason for the dire conclusions.

It’s worth noting this rebuttal, which says that everything about Trainer’s paper is fine except a premature dismissal of nuclear power:

• Barry Brook, Could nuclear fission energy, etc., solve the greenhouse problem? The affirmative case, Energy Policy, available online 16 December 2011.

To get your hands on Brook’s paper you either need a subscription or you need to email him. You can do that starting from his blog article about the paper… which is definitely worth reading:

• Barry Brook, Could nuclear fission energy, etc., solve the greenhouse problem? The affirmative case, BraveNewClimate, 14 January 2012.

According to Brook, we can keep global warming from getting too bad if we get really serious about nuclear power.

Of course, these three papers are just a few of many. I’m still trying to sift through the information and figure out what’s really going on. It’s hard. It may be impossible. But McKinsey’s list of ways to cut carbon emissions and save money points to some things we can start doing right now.