FERC Gaslights America



Our power grid is in shambles, and the Federal Energy Regulatory Commission is asleep at the wheel.

Our confused and halting response to climate change is already exposing Americans to new and potentially devastating risks during periods of extreme heat and cold. But, notwithstanding the myriad news stories to the contrary, this is not the fault of record temperatures or increasingly extreme weather; it is the fault of the ill-considered way our attempts to cut greenhouse-gas emissions in the power sector have weakened the electric grid.

Electric grids truly are an engineering marvel. They must maintain a delicate balance whereby real-time power consumption equals real-time power generation, every second of every day. While total power demand varies, a certain minimum baseload power is always needed. For instance, a summer day in Texas might see electrical consumption reach a peak of 75 gigawatts (GW) at 6 p.m. and drop to a low of 50 GW at 4 a.m. In that scenario, the grid would need about 50 GW of the baseload power generated by sources that run all of the time, an additional 25 GW of peak load-generating capacity from more nimble generators adjusting output in response to varying demand throughout the day, and roughly an additional 12 to 15 GW of reserve margin to keep the system online when generators go down for maintenance or for times of unusual stress.
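The arithmetic of the hypothetical above can be sketched in a few lines. The demand figures come from the example in the text; the reserve fraction is an assumed value chosen to land in the 12-to-15 GW range the text cites, not an actual ERCOT planning parameter:

```python
# Back-of-the-envelope grid sizing for the hypothetical Texas summer day above.
peak_demand_gw = 75.0   # 6 p.m. peak, from the example
min_demand_gw = 50.0    # 4 a.m. trough, from the example

baseload_gw = min_demand_gw                  # always-on generation
peaking_gw = peak_demand_gw - min_demand_gw  # flexible generation for the daily swing

# Reserve margins are typically quoted as a fraction of peak demand; 17.5 percent
# is an assumed value that reproduces the text's rough 12-15 GW figure.
reserve_fraction = 0.175
reserve_gw = peak_demand_gw * reserve_fraction

total_capacity_gw = peak_demand_gw + reserve_gw
print(f"baseload: {baseload_gw} GW, peaking: {peaking_gw} GW, "
      f"reserve: {reserve_gw:.1f} GW, total capacity needed: {total_capacity_gw:.1f} GW")
```

The point of the exercise: even on an ordinary day, the system must carry far more capacity than it ever uses at once, and the 50 GW floor must be met by generators that run around the clock.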

Our ability to consistently meet changing demand has been weakened as intermittent sources like wind and solar have steadily displaced ultra-stable sources like nuclear and coal. Because the sun does not always shine and the wind does not always blow, solar and wind are not always available when called upon. Increasing use of intermittent energy generation makes the grid less resilient in times of stress. Serious problems manifest during periods of unusual heat or cold when electricity demand tends to spike. When a demand spike coincides with low available supply, grid operators are forced to ask customers to reduce power and, if the imbalance persists, eventually to order load shedding (i.e., blackouts) to prevent total grid collapse. 

The 2021 Texas power crisis is a striking demonstration. Things went awry when a cold snap caused wind turbines to fail, triggering a chain reaction that knocked the Texas grid down for days, killing at least 210 people and costing almost $300 billion. It was the deadliest American blackout in recent memory, but it won’t be the last. The North American Electric Reliability Corporation (NERC) has estimated that two-thirds of the United States faces a heightened risk of power outages. This summer, grid operators in Texas, California, and New England repeatedly asked customers to conserve power, narrowly avoiding blackouts. And meteorologists are warning that, due to the coincidence of a weak polar vortex and La Niña conditions, we may be in for an unusually cold winter.

You’d think, then, that careful policymaking to improve grid reliability would be a priority for federal energy regulators. But that’s not what we’re getting. Instead, a reflexive insistence that our problems stem from an increasingly bad climate rather than increasingly bad policy is the predominant response. Consider Richard Glick, chairman of the Federal Energy Regulatory Commission (FERC), the agency principally responsible for ensuring grid reliability throughout the country. Chairman Glick opined last September that the “main threat” to grid reliability is “extreme weather associated with climate change: extreme heat, record-setting cold, droughts that go on for years, wildfire seasons that start earlier and last longer and more ferocious hurricanes.” As a result, Glick has argued, reliability is served by the timely “retirement of coal-fired generators,” which, he says, “exacerbate the intensity and frequency of these extreme weather events.” 

In other words, rather than strengthening the grid, Glick’s way forward is to reduce global temperatures. This “solution” is both delusional and hubristic. American coal plants account for a minuscule portion of global greenhouse-gas emissions. And the effects of climate change on weather and temperature have so far been marginal. Current climate and weather conditions in the U.S. are well within the range of what other parts of the world have long been able to operate in. Dubai’s grid is reliable despite it being much hotter there than in the U.S. As many have pointed out, moreover, extreme weather is not a new phenomenon. And “[e]xtreme doesn’t mean rare,” as John Moura, director of reliability assessment and performance analysis at NERC, explained in a media briefing. “We know these conditions are not rare.”

The data bear this out. For example, while cold weather is unusual in Texas, similar freezes occurred in 1983, 1989, 2003, and 2011. Moreover, whatever may be happening to the climate, extreme cold events are not increasing. Record daily low temperatures in the U.S. have decreased over the last century, and it is more likely that the frequency of extreme cold weather will decrease further with a warming climate. 

As Chairman Glick’s colleague, Commissioner James Danly, has explained, the real “problem is federal and state policies which, by mandate or subsidy, spur the development of weather dependent generation resources at the expense of the dispatchable resources needed for system stability and resource adequacy.” In other words, “the main threat” to reliability is an energy policy that handsomely rewards wind and solar whenever they happen to show up, but fails to penalize them when they are not available to meet peak demand. 


The Decline of Stable Baseload Power

For those who pay attention to our energy policy, this is not news. Over the last few decades, there has been a dramatic transformation in how America generates electricity and manages its grids. The coal and nuclear plants that sustained America’s post-WWII boom are being phased out, and the steady, consistent baseload generation necessary for an affordable and reliable grid is being replaced with intermittent sources of power like solar and wind. Coal and nuclear combined made up roughly three-quarters of America’s net electricity generation in the ’80s and ’90s. Today, they account for only 40 percent, while renewables have grown to 20 percent and natural gas to 38 percent.


The reasons for these changes are complicated, but largely boil down to the rise of cheap natural gas and the ever-increasing push to decarbonize. Under the current regulatory model, much of the country’s electricity generation and transmission is managed by regional transmission coordinators—known as Regional Transmission Organizations (RTOs) and Independent System Operators (ISOs)—that hold day-ahead, hourly, and real-time auctions to determine which power plants can most cheaply meet demand. These regional coordinators are fairly new entities. Prior to the 1990s, most electric utilities owned both the generation plants and the transmission lines. A typical utility was highly regulated, limited to charging a “just and reasonable rate,” and penalized for any electricity disruption.

But a desire for increased competition led FERC to establish the RTO framework late in 1999. Under the new system, generation and transmission operate more independently with the aim of eliminating incumbent advantages. Often, these new “deregulated” retail utilities take responsibility primarily for distribution to customers. They purchase generated electricity in RTO-run wholesale energy markets and re-sell that electricity at a retail level to consumers—“rate payers,” in industry argot. 

In these quasi-governmental energy markets, the retailers specify how much electricity is needed, and energy generators bid generation quantities and price. The lowest bidders are selected in order until all demand is met, and winning generators receive the final selected price. In theory, dispatching units by lowest cost allows the market to meet energy demand at the lowest possible price. Today, about two-thirds of U.S. electricity demand is served by RTOs and ISOs.
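As a rough illustration of how such a merit-order, uniform-price auction clears, consider the sketch below. The plant names, bid prices, quantities, and demand figure are all hypothetical, not taken from any actual RTO market:

```python
# Minimal sketch of a uniform-price ("merit order") auction of the kind the
# text describes: cheapest bids are dispatched first, and every winner is
# paid the price of the last (marginal) unit selected.
def clear_auction(bids, demand_mw):
    dispatched, met = [], 0.0
    clearing_price = None
    for name, qty_mw, price in sorted(bids, key=lambda b: b[2]):  # cheapest first
        if met >= demand_mw:
            break
        take = min(qty_mw, demand_mw - met)  # dispatch only what demand requires
        dispatched.append((name, take))
        met += take
        clearing_price = price  # the marginal bid sets the price for everyone
    return dispatched, clearing_price

# Hypothetical bids: (plant, quantity in MW, bid in $/MWh). A subsidized wind
# plant can bid very low because subsidies cover part of its costs.
bids = [
    ("wind", 500, 5.0),
    ("nuclear", 800, 25.0),
    ("gas", 1000, 40.0),
    ("coal", 600, 45.0),
]
winners, price = clear_auction(bids, demand_mw=2000)
print(winners)  # wind, nuclear, and part of gas cover demand; coal is shut out
print(price)    # every winner is paid the marginal bid: 40.0
```

Note the dynamic the article goes on to describe: the subsidized low bidder always clears, while the highest-cost dispatchable plant is the first to lose out, regardless of how dependable it is.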

In running these wholesale auctions, RTOs are fuel-neutral. But the price at which the next megawatt of electricity can be generated is dictated primarily by forces outside the market. How low a given plant can bid depends on its overhead (which often includes paying off financed infrastructure), the regulatory environment, changing fuel costs, and, importantly, government subsidization, which is decidedly not fuel-neutral. Because of heavy subsidization, wind and solar can underbid more traditional power plants. Natural gas has been able to compete even on this uneven playing field: one study found that each 0.88 percent increase in solar or wind production was accompanied by a 1 percent growth in natural-gas generation capacity. This connection owes to the ability of gas-turbine generators to rapidly ramp up power during a lull in sun or wind, and to the historically low gas prices that followed the shale revolution. But no such benefit exists for coal and nuclear: when they are underbid often enough, such plants become unprofitable, and they have slowly but surely been retired.

There are some real advantages to the increased use of wind, solar, and natural gas. Wholesale electricity prices are low, and our power sector is remarkably clean. But all isn’t sunny. Wind generates on average only about 35 percent of its nameplate capacity and solar only 25 percent; at night, solar generates nothing. These low capacity factors mean that far more transmission is required to connect generators that may or may not be available, necessitating hundreds of billions of dollars of investment in transmission lines to accommodate wind and solar projects typically located far from people and power demand. The rising cost of transmission has kept retail prices from falling even as natural-gas prices dropped. And, of course, the direct costs of subsidies don’t show up on your electric bill.

An even bigger problem is that this bidding system reduces stable baseload capacity. A much larger share of our grid’s nominal capacity is now unavailable at any given moment. And while natural gas has helped lower electricity costs and improved American energy independence, in our current system it simply is not as dependable as coal and nuclear, both of which store ample reserve fuel on site. Regulations discourage onsite storage of liquefied natural gas (for good reason!), meaning that most natural-gas plants require just-in-time delivery, which can be disrupted if something goes wrong with the pipelines. (The market also provides no storage incentives.) Compounding this is FERC’s increasing hostility to new natural-gas-pipeline infrastructure owing to concerns about its effect on global climate change.

The Emerging Reliability Crisis

Our grid is therefore vulnerable to what Meredith Angwin, author of Shorting the Grid, describes as the “fatal trifecta”: overreliance on weather-dependent solar and wind, just-in-time natural-gas backstops, and imports of electricity from neighboring states. Grid Brief editor Emmet Penney put it pithily: “If the sun doesn’t shine, the wind doesn’t blow, the gas doesn’t flow, and your neighbors don’t show, you’re out of luck.”


That’s exactly what happened in Texas in February 2021. In 2003, Texas generated electricity mainly with a mix of coal, nuclear, and natural gas. But over the last two decades the Lone Star State has become a leader in wind generation, which, buoyed by generous federal taxpayer subsidies, has been driving more conventional generation out. Texas now gets nearly 20 percent of its power from wind and has more than 28 GW of wind capacity. But when the freeze struck, wind generated an average of only 3.8 GW. This shortfall started a chain reaction. When demand spiked late on February 14th, ERCOT, Texas’s grid operator, ordered load shedding to prevent total collapse.

Before the blackout, freeze-offs at wellheads and midstream facilities had already cut gas supply by about 20 percent. After the blackout began, natural-gas compressor stations—having been electrified to reduce greenhouse-gas emissions—could no longer deliver fuel to the natural-gas generators. Gas supply plummeted, those generators added another 20 GW to the deficit, and the system became utterly unrecoverable. Ten million people lost power. For many, it did not come back on for several days.

This wasn’t a surprise. Texas in fact has a program to help people buy home generators precisely because it knew that a crisis like this was virtually inevitable. But in Texas, at least, this “solution” worked much better in theory than in practice: those most in need of generators are often the least likely to prepare and the least able to afford them.

Despite the obvious need for better strategies to increase baseload power, however, climate idealists like Chairman Glick continue to beat the climate-change drum: “Make no mistake, climate change is the reliability challenge we face. If we fail to take serious action to mitigate climate change, we are signing up for ever-more-serious reliability problems in the years ahead.”

This is, to put it very gently, not true. What makes today’s grid fragile isn’t an increase in extreme weather driven by climate change, but our changing sources of electricity, changes made in large part to combat climate change. The damaging effect of this policy on reliability is then itself blamed on climate change. The prescription for this worsening condition is then to have more wind and solar, and when that makes things even worse, we’ll presumably be told we need an even higher dose, and so on and on. Blaming climate change for blackouts in the way Chairman Glick does thus isn’t just misleading—it’s regulatory Munchausen by proxy.

To be fair, FERC is at least taking some positive steps. A recently proposed rule would order NERC to modify its reliability standards to require better analysis of which power resources will be available during extreme heat and cold, and to force transmission providers to plan accordingly. But while this is a (small) step in the right direction, it will take several years to implement fully; it won’t and can’t fix the high risk of blackouts projected for the coming months. And while the proposed regulations might encourage improved resilience in natural-gas pipelines, they cannot fix the fundamental intermittency of wind and solar, nor are they guaranteed to drive the necessary policy reforms within ISOs and RTOs. Indeed, the U.S. Court of Appeals for the D.C. Circuit recently struck down FERC’s approval of New England’s winter reliability plan, which paid some power-plant owners to keep three days of fuel on site for harsh winter conditions, precisely the kind of measure that improves reliability and gives coal and nuclear plants an incentive to stay online.

More fundamental changes are needed. As currently constituted, RTOs hardly even resemble a market. The entire purpose of RTOs, which operate under FERC-approved agreements, is to leverage regional resources to provide lower electricity rates to customers while maintaining reliability. Arguably, every action that mandates or subsidizes intermittent and unreliable generation or the otherwise-unnecessary transmission lines upon which they depend is a violation of that operating principle. In the current arrangement, moreover, there’s no effective way to hold generating sources accountable if they fail to deliver the promised power. 

The complex interaction of state and federal policies has prevented energy price and reliability from having the last word on which generators win the RTO/ISO bidding battle. Instead, who wins is determined largely by administrative fiat filtered through an amalgam of regulatory counterbalances, all of which is channeled through the RTO bureaucracies. The result is a regulatory unicorn: a “market-based system” that both increases costs and decreases reliability. Is it any wonder, then, that the remaining traditional utilities—including that venerable New Deal dinosaur, the Tennessee Valley Authority—can outperform their “market-based” RTO and ISO peers?

One frequently proposed solution to our current debacle is to build out battery storage to create reserve capacity so that when the sun isn’t shining or the wind isn’t blowing, there’s still power there to meet demand. A Kepler-level scientific revolution in battery technology could provide such a solution. But, barring that, NERC’s Moura is right: “Batteries aren’t going to do it.” Even the largest battery-storage site in the world, California’s Moss Landing, can only supply 0.4 GW. To put that in perspective, California’s peak summer demand averages around 40 GW. And Moss Landing’s 0.4 GW can only be sustained for 4 hours before the battery’s capacity is depleted. 


The number of batteries that would be required to sustain the grid through a several-hour—much less a several-day—sun-and-wind lull is staggering: one analysis suggests that providing just 12 hours of storage would require 5,400 gigawatt-hours, or more than 3,000 Moss Landings. To make matters worse, batteries do not themselves generate any energy, which means an even greater volume of solar and wind capacity would need to be built so that there is enough generation both to power the grid and to charge the batteries held in reserve for when wind and solar can’t deliver. ISO New England is planning to use this “fix,” and it has estimated that its system’s reserve-generation capacity would need to increase from the current 15 percent to 300 percent by 2040 as more solar and wind are added. Who knows how much carbon dioxide would be released in building such a behemoth, not to mention the cost. Battery storage isn’t a solution; it’s a pipe dream.
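The "more than 3,000 Moss Landings" figure follows directly from the numbers already quoted, as this quick check shows (all inputs come from the article; the division is illustrative, not a grid-planning calculation):

```python
# Rough check of the storage arithmetic in the text.
storage_needed_gwh = 5400.0   # ~12 hours of storage, per the cited analysis
moss_landing_gw = 0.4         # Moss Landing's power output
moss_landing_hours = 4.0      # duration it can sustain that output

# Energy capacity of one Moss Landing-sized site: 0.4 GW x 4 h = 1.6 GWh
moss_landing_gwh = moss_landing_gw * moss_landing_hours

sites_needed = storage_needed_gwh / moss_landing_gwh
print(f"{sites_needed:.0f} Moss Landing-sized sites")  # more than 3,000
```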

The Hard Work of Getting Back on the Right Path

An affordable and reliable grid will require policy solutions that incentivize both of those goals, rather than picking favored technologies and then scrambling to cover for their shortcomings. You can’t just give the money to the lowest bidder; you have to give it to the lowest bidder who actually shows up, even when it’s cloudy or becalmed. Only that kind of system will send meaningful market signals for long-term investment in the grid.


Further, if our low-carbon ambitions are to be met, we will likely need new—and, perhaps, next-generation—nuclear-power plants. As has been shown time and again, nuclear is the most reliable form of electricity generation, the safest, and the cleanest, and its fuel is abundantly available domestically. There has been encouraging recent news on this front: Abilene Christian University applied for a construction license for an experimental molten-salt reactor, NuScale’s small modular reactor design was approved for further testing at Idaho National Laboratory, and Southern Company’s Vogtle Unit 3 was finally authorized to begin operation. But this good news comes after decades of mismanagement. The Nuclear Regulatory Commission has approved few new reactors, and even fewer new plants, since its inception in 1975, and those that were approved have been beset by Kafkaesque regulatory barriers, perpetual construction delays, and cost overruns. This 50-year hiatus in nuclear development has left the U.S. lacking the expertise and experience that would make new nuclear construction quickly achievable. Moreover, the NRC’s regulatory apparatus is not well-suited to new development, and its rules will need to be pared back if there is to be any hope of building smaller units.

We can—and should—move faster to catch up. Further increasing our use of natural gas would also help keep carbon emissions low. But the hard truth is that our ambitious climate goals will take a backseat to the need to keep the power on, an entirely predictable and tragic consequence of climate idealists’ irrational hostility to nuclear power. The rest of the world is ahead of us on this path. Germany is replacing Russian gas with coal and came perilously close to shutting down its remaining nuclear plants. The Netherlands has dropped environmental restrictions on coal until 2024. China has announced it will increase coal output by 300 million tons this year, and India wants to raise its coal production by more than 400 million tons by the end of 2023, most of which it will use to generate power domestically. China and India’s additional 700 million tons of coal alone will undo the greenhouse-gas-emission reductions the U.S. achieved between 2005 and 2020. In the U.S., California Governor Gavin Newsom has declared a state of emergency and ordered reductions in electricity usage to stave off blackouts, even asking electric-vehicle owners to limit charging. And coal plants scheduled for closure across the country—like Missouri’s Rush Island plant—are being kept open in a bid to maintain grid reliability. Nothing Chairman Glick, the RTOs, or anyone else in the U.S. can do will change this.

“We cannot blame our problems on the weather,” as Commissioner Danly writes. The U.S. has the resources to build a more reliable natural gas network and the next generation of nuclear reactors necessary for a clean, reliable energy sector. But until we lift the irrational regulatory barriers and incentivize reliability, we’re in for a lot of blackouts and a lot of coal.
