In the popular imagination, energy-storage technologies like batteries are a key part of the effort to reduce carbon dioxide emissions and fight climate change.
But storage has something of a dirty secret: Its net effect is often an increase in greenhouse gas emissions. The full causes and dynamics behind this are complex, having to do with what energy is being stored, what energy is being displaced when it is released, and what energy makes up for the energy lost (roughly 20 percent) in the round-trip journey to battery and back. If you want the full details, I wrote a deep-dive post on this last year.
Today I have a happier story to tell: how California realized that its enthusiastic deployment of batteries was increasing emissions and figured out a way to solve the problem.
The solution it has developed is clever in its own right, but it also illustrates how computing power is going to enable a cleaner grid. Once again, California is blazing a path that other states will follow.
First, a short bit of backstory.
The California Public Utilities Commission (CPUC) has a program called the Self-Generation Incentive Program (SGIP), which dates back to 2001 and the state's energy crisis. Initially designed to reduce peaks in demand, the program has since been revised, reformed, and updated several times. In 2009, the CPUC added the requirement that SGIP projects reduce greenhouse gas emissions.
Though SGIP has always included a range of eligible technologies, from biogas to waste heat recovery to wind turbines, it has tended to focus on a few. In the early 2000s, SGIP mostly supported solar panels, spurring the enormous growth of that industry. Then, for a few years, it was big on fuel cells. In 2011, it made energy storage eligible. In 2017, it shifted the programâ€™s funding so that 75 percent went to energy-storage projects, overwhelmingly batteries.
In 2015, the CPUC made explicit that the three goals of SGIP projects were to "improve reliability of the distribution and transmission system, reduce emissions of greenhouse gases, and lower grid infrastructure costs." Note that's an "and," not an "or."
The same year, the CPUC also boosted the required round-trip efficiency (RTE) of SGIP storage projects to at least 66.5 percent. The assumption was that batteries would be used to absorb excess renewable energy during the day and discharge it at night (in other words, reduce emissions), and thus RTE was seen as a rough proxy for emission reductions.
But that is not how things went. As it turns out, if the only metric is financial benefit to the battery owner, batteries tend to charge with cheap, dirty power at night and discharge during the day for peak reduction (to reduce commercial demand charges). That is, they tend to be operated in a way that increases emissions.
To the CPUC's credit, it did not ignore the problem. It brought in the research firm Itron to do a formal 2016 storage-impact evaluation (released in 2017). The evaluation found that while SGIP projects overall had reduced emissions, the storage projects had actually increased them. The net increase was relatively trivial in the grand scheme of things (less than 1,000 tons of emissions in a state that emits more than 400 million tons annually), but it clearly revealed that the program was not accomplishing one of its three goals with regard to storage.
When it comes to batteries and emissions, the report revealed that timing is everything. If they're charging and discharging at the right times, even a battery with low RTE will reduce emissions. If they're charging and discharging at the wrong times, no RTE is high enough. In other words, RTE is not a good proxy for emissions impact.
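To see why timing dominates efficiency, here is a back-of-the-envelope sketch in Python. The intensity numbers are invented for illustration; the function is just the arithmetic implied above: emissions added while charging, minus emissions avoided while discharging, discounted by round-trip losses.

```python
def net_emissions_kg(charge_mwh, charge_intensity, discharge_intensity, rte):
    """Net change in grid emissions (kg CO2) from one battery cycle.

    charge_intensity, discharge_intensity: marginal carbon intensity of
    the grid (kg CO2 per MWh) at charge time and at discharge time.
    rte: round-trip efficiency (0-1); only rte * charge_mwh comes back out.
    """
    emitted = charge_mwh * charge_intensity            # extra generation to fill the battery
    avoided = charge_mwh * rte * discharge_intensity   # generation displaced when it empties
    return emitted - avoided

# Good timing beats high efficiency (intensity values are made up):
# charge on midday solar (50 kg/MWh), displace evening gas (450 kg/MWh)
good_timing = net_emissions_kg(10, 50, 450, rte=0.70)   # negative: emissions fall
# charge on overnight gas (450 kg/MWh), displace a cleaner midday mix (300 kg/MWh)
bad_timing = net_emissions_kg(10, 450, 300, rte=0.90)   # positive: emissions rise
```

Note that the well-timed battery wins despite a much lower RTE, which is exactly why RTE alone failed as a proxy.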
A subsequent 2017 impact evaluation (released in 2018) showed the bad news getting worse: SGIP commercial-storage projects had increased annual GHG emissions by about 1,436 metric tons, and residential-storage systems by another 116. Still relatively trivial, but still bad: emissions went up, not down.
Again to its credit, CPUC did not ignore the report. In 2017, it convened a working group to analyze possible solutions. (Here's the group's final report.) In May 2019, the CPUC issued an official decision approving the working group's proposed changes, scheduled to go into effect in April 2020.
What are those changes, exactly? Remember, the problem is that battery operators are charging and discharging at the wrong times: they are optimizing for financial returns, which is not the same as optimizing for emissions reductions. They don't have any incentive to optimize around emissions, and even if they did, they don't have the information they would need to do so.
The solution is twofold: provide both the incentive and the information.
As for the incentive, under the proposal, new commercial-storage installations will still get the same amount of SGIP money, but only 50 percent will be paid up front. The other 50 percent will be paid out over five years based on demonstrated reductions in annual emissions, which must amount to 5 kilograms of CO2 for every kWh of capacity.
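The performance-based portion can be sketched as a payment schedule. The post does not spell out the exact accounting, so this sketch assumes, purely for illustration, that the deferred half is split into five equal annual installments, each paid only if that year's reduction meets the 5 kg-per-kWh target; the CPUC's actual formula may pro-rate differently.

```python
def sgip_payout_schedule(total_incentive, capacity_kwh, annual_reductions_kg):
    """Sketch of SGIP's performance-based payout (assumed pass/fail mechanics).

    50 percent is paid up front; the remainder is split into five equal
    annual installments, each paid only if that year's demonstrated
    reduction reaches 5 kg of CO2 per kWh of installed capacity.
    """
    target_kg = 5.0 * capacity_kwh            # required reduction per year
    upfront = 0.5 * total_incentive
    installment = 0.5 * total_incentive / 5   # one installment per year
    payments = [upfront]
    for reduction in annual_reductions_kg:    # five entries, one per year
        payments.append(installment if reduction >= target_kg else 0.0)
    return payments

# A 2,000 kWh system must show 10,000 kg of reductions each year; this one
# misses the target in years 2 and 4 and forfeits those two installments.
schedule = sgip_payout_schedule(100_000, 2_000, [12_000, 9_500, 10_000, 8_000, 11_000])
```

The point of the structure is visible in the sketch: the money at risk is exactly half the incentive, which is what gives operators a reason to chase the emissions target at all.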
Residential-battery installations are eligible if they are paired with solar panels (from which they draw at least 75 percent of their charge), have a single-cycle round-trip efficiency of at least 85 percent, and are enrolled in some kind of time-varying rate program.
Legacy commercial projects will be subject to the same reduction requirements; legacy residential projects, meanwhile, are exempt if they join a time-of-use rate program.
That's the incentive. But what about the information? That's the really cool part.
The question is: Even if storage-project owners want to reduce emissions, how can they? How can they know when to charge and when to discharge? Sometimes there are more natural-gas generators online and the grid is dirtier; sometimes more solar and wind are online and the grid is cleaner. The exact mix is constantly changing.
After much discussion, the working group decided that what was needed is a "GHG signal": real-time information about the carbon intensity, or dirtiness, of the grid, as well as a 24-hour forecast of its expected carbon intensity, available to all battery operators. That's the information they need to plan their operations.
The CPUC held an open bidding process to find the provider of the signal, and the winner was WattTime, a nonprofit tech company that has been operating as part of the Rocky Mountain Institute since 2017.
Faithful readers may find the name familiar. Earlier this year, WattTime rolled out Automated Emissions Reduction, a consumer-facing program that uses exactly this kind of real-time grid-emissions data to help customers better manage their distributed energy resources (DERs). Then, in May, it announced a program whereby it would use satellites and AI to track real-time emissions data at every power plant in the world, which could enable DER owners the world over to maximize their GHG impact.
WattTime uses EPA data on power-plant emissions (combined with wholesale market prices, fuel costs, wind and weather data, various other inputs, and a whole bunch of AI) to produce day-ahead forecasts of grid intensity at a granular level.
Best of all, WattTime is making its work open source in California. There's an API that battery operators can tap into for free, which means the forecasts can be built directly into their operating algorithms. (WattTime wrote a piece on the program that is worth reading.)
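To get a feel for how an operator might use such a signal, here is a minimal greedy scheduler against a 24-hour forecast. This is purely an illustrative sketch, not WattTime's API or algorithm; a real operator would also respect state-of-charge and cycling constraints and co-optimize against revenue.

```python
def plan_cycles(forecast_kg_per_mwh, charge_hours, discharge_hours):
    """Pick charge and discharge hours from a 24-hour carbon-intensity forecast.

    Greedy rule: charge during the cleanest hours, discharge during the
    dirtiest. Returns two sorted lists of hour indices (0-23).
    """
    ranked = sorted(range(len(forecast_kg_per_mwh)),
                    key=lambda h: forecast_kg_per_mwh[h])
    charge = sorted(ranked[:charge_hours])          # lowest-intensity hours
    discharge = sorted(ranked[-discharge_hours:])   # highest-intensity hours
    return charge, discharge

# A stylized day: overnight gas, a midday solar glut, an evening ramp.
forecast = [400] * 6 + [100] * 6 + [200] * 6 + [450] * 6
charge, discharge = plan_cycles(forecast, 4, 4)   # charge midday, discharge evening
```

Even this crude rule captures the key behavior the SGIP reform is after: soaking up the midday solar glut and displacing the evening gas ramp.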
The good news is, WattTime's modeling found that optimizing battery operation around even a modest GHG signal led to a 32 percent improvement in emissions performance with less than a 0.1 percent reduction in revenue. A broader look at the same question (the trade-off between emissions performance and revenue), published in the journal Energy, found that "marginal storage-induced CO2 emissions can be decreased significantly (25–50%) with little effect on revenue (1–5%)."
It's clear that operating storage purely based on revenue tends to increase emissions. The hope of everyone in California, especially those who sell battery systems, is that operating storage based on emissions performance will only modestly reduce revenue. It's difficult to know for sure until the SGIP changes go into effect.
What a cool experiment, though!
By way of concluding, I want to briefly emphasize three themes that this story highlights.
As more variable renewables and DERs come online, grid operation is becoming more fluid and complex, and the GHG impact of a given technology depends increasingly on time and place. Exactly when and where energy is being generated, stored, and released determines its effect on emissions.
Thus, maximizing emission reductions (not just for batteries, but for any flexible energy resource) crucially involves understanding the state of the grid on a minute-by-minute basis: what kind of energy is on it, what energy is available to it, and both its present and anticipated carbon intensity.
That's the kind of information WattTime is making available. The company notes that forecasts (which it is working on extending to 48 or 72 hours) are somewhat easier in California, since there's no coal on the grid and generation is dominated by natural gas and renewables, which makes for fewer variables. It's a more complex undertaking in other, more mixed grids, which is why the company charges a fee for access to that information.
But it is safe to say that this kind of information will eventually be available about all grids, representing a radical new level of transparency and empowerment for DER operators.
Eric Hittinger, a policy professor at the Rochester Institute of Technology, makes a point in this Twitter thread about the SGIP changes (and in the papers linked therein) that is worth emphasizing: It's a mistake to deploy batteries, or energy storage in general, as though they will inevitably reduce emissions. They might or might not. Indeed, it's probably a mistake to think of them as emissions-reducing technologies at all.
Rather, it's better to think of storage as akin to transmission lines. Wires can carry both clean and dirty energy; their impact on emissions depends on local circumstances. Their primary purpose is not to reduce emissions, though, but to make the grid run more smoothly. They're a grid tech, not a decarbonization tech. The same applies to batteries.
As it happens, making the grid more stable will have the effect of allowing more renewables to be integrated, thus reducing emissions. But the two are nonetheless distinct tasks, and batteries should be deployed mainly with the first in mind.
After all, it may be that some battery installations in California will want to provide grid services, emergency backup, or functions other than emission reductions. Being forced to reduce emissions might make it more difficult for storage to pursue those other revenue possibilities.
To be clear, Hittinger and I both think these SGIP changes are for the better. It's good to use whatever policy tools are at hand. But in the larger picture, clean-energy types need to rethink where storage is categorized in their mental model.
A theme I have returned to in several recent posts: a big part of the clean-energy transition is going to be using computing power to enable technologies and techniques that allow us to obtain the energy services we need (transportation, heat, etc.) using less labor and material.
Computing power is one of the few things in the modern world that consistently and reliably gets cheaper and more powerful. As it does, it helps us better understand and predict complex systems (like an energy grid) in real time, which in turn enables us to produce energy services more efficiently.
California's SGIP solution is a great example. Before and after, the program involves the same stuff, the same machines. What changed was the addition of new rules, plus the information needed to follow them. That kind of information, which WattTime is providing, is the product of computing power and algorithms that were unavailable even a few years ago.
In the end, just as much as money or policy, it is information that will accelerate the clean-energy transition.