Power Grid Reliability
U.S. POWER GRID GETS LESS RELIABLE
IS THE GRID REALLY UNRELIABLE?
(This article was written in 2011. You may find some details that have changed over time. The basic idea still remains the same… the grid is a mess.)
Those concerned with preparedness don’t want to live off-grid but fear that they may have to. They see a compelling business case for investing in power outage contingencies because a significant risk of outages is considered a given.
Many in preparedness not only think the grid has become unreliable, but they also say that it’s getting less and less reliable.
Recently we asked ourselves: Does this have a solid basis in fact? The answer, examined from any angle, by any authority in the field: Yes.
But before delving into all the problems, it is important to know the specific questions we set out to answer.
Has the bulk power system in the U.S. and Canada been failing more often and for longer periods?
If so, is it likely to get worse?
These questions address solely outages caused by factors under the control of the companies that collectively run the bulk power systems. Examples include failures due to human error, component failure, and inadequate system resources and “smartness” to prevent a small outage from cascading into a big one.
Thus, this summary leaves out a substantial proportion of outages — those caused by external events that could not be controlled, like hurricanes.
Because of how reliability data is collected, excluding acts of God skews our conclusions toward understating the problem. Take the example of a heavy rain storm with high but not extraordinary (50 mph) winds. Most of the outages from such storms are due to tree branches falling on power lines.
Many such failures are avoidable, according to the California Public Utilities Commission. It took Pacific Gas and Electric to task a few years ago, compelling PG&E to hire more personnel to trim trees that posed an obviously high risk to power lines.
Nonetheless, failures from natural disasters — preventable or not — are not included here in our definition of “outage”.
RESULTS OF OUR REVIEW SURPRISED US
No matter what source we consulted, we found remarkable agreement. The electrical grid serving the U.S. and Canada has been getting less and less reliable. The root cause of the failures, greater stress on the system, makes it easy to predict that we will see more and more outages like the one in early September 2011 in Arizona, Southern California, and Mexico, which left 5 million people without power.
How can we say this? Because the points of greatest stress are well known. One article written three years ago proved prescient. After noting that excessive congestion is a major source of failures, the article identified Southern California as one of the two areas in the U.S. with excessive congestion: “Changes to the grid structure are needed to relieve stress in this area…”
NUMBERS THAT DESCRIBE HOW UNRELIABLE
An article published in 2011 in IEEE Spectrum (See footnote 1) said:
The U.S. electrical grid has been plagued by ever more and ever worse blackouts over the past 15 years. In an average year, outages total 92 minutes per year in the Midwest and 214 minutes in the Northeast. Japan, by contrast, averages only 4 minutes of interrupted service each year.
The study excludes interruptions caused by extraordinary events such as fires or extreme weather. When those impacts are included, the trend is even worse.
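To put those outage minutes in perspective, here is a quick sketch converting annual outage minutes into an uptime percentage. The regional figures are the ones quoted above from IEEE Spectrum; the conversion itself is just arithmetic.

```python
# Convert annual outage minutes into an availability (uptime) percentage.
# The regional outage figures come from the IEEE Spectrum quote above.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def availability(outage_minutes_per_year: float) -> float:
    """Return uptime as a percentage of the year."""
    return 100.0 * (1 - outage_minutes_per_year / MINUTES_PER_YEAR)

for region, minutes in [("Midwest", 92), ("Northeast", 214), ("Japan", 4)]:
    print(f"{region}: {availability(minutes):.4f}% uptime")
```

Even the worst figure looks deceptively high as a percentage, which is why the industry talks in outage minutes rather than “nines” of availability: 214 minutes a year is still 99.96% uptime, yet it is more than fifty times Japan’s figure.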
For the past 15 years, utilities have invested less money than required to keep the grid at the same capacity, age, and state of repair.
The Report Card for America’s Infrastructure (http://www.infrastructurereportcard.org/fact-sheet/energy), prepared by the American Society of Civil Engineers, says:
The U.S. power transmission system is in urgent need of modernization. Growth in electricity demand and investment in new power plants has not been matched by investment in new transmission facilities. Maintenance expenditures have decreased 1% per year since 1992. Existing transmission facilities were not designed for the current level of demand, resulting in an increased number of “bottlenecks,” which increase costs to consumers and elevate the risk of blackouts.
WHY THE GRID’S RELIABILITY MATTERS TO SIMPLE PUMP
The Simple Pump hand pump can be economically justified in a number of different ways, for use in a variety of applications, from some of the poorest countries to some of the most affluent.
For example, the Simple Pump is the lowest-cost and highest-reliability approach to the delivery of water to the poorest rural populations in the world, most notably in Africa and Haiti. On the other hand, its reliability and narrow yet strong profile enable it to fit alongside submersible pumps in most wells. More and more well owners in the U.S. and Canada are installing it to ensure continued access to water in the case of power failures.
It is this last trend that prompted us to ask: Is this application in “rich” nations really as compelling as other Simple Pump applications? Yes, unfortunately. There are a number of additional factors that will likely drag down reliability for at least the next decade.
ROOT CAUSE OF THE DECLINING U.S. AND CANADIAN GRID
The preponderance of factors driving reduced reliability stem from the deregulation of the grid that started in the late 1970s. Explaining just a bit about that deregulation makes it much easier to understand the forces at work now.
The bulk power industry was partially deregulated in the wave that deregulated the U.S. airline, telecommunications, banking, health care, and natural gas industries.
New Federal law forced utilities to purchase electricity from any qualified producer. To qualify, the power generator had to use alternative technologies like wind or solar, or meet an efficiency standard so lax that natural gas qualified. The intention was as with other industries: Cut prices for the little guy by enabling competition amongst providers.
This was a huge change from the status quo — which is covered next.
THE 1960S WERE THE GOOD OLD DAYS
One article noted that our current grid dates from the time before man walked on the moon, and years before cell phones were invented.
In those days, electric utilities generated power for, and were regulated within, local areas. Each utility handled everything in the supply chain of electricity production and distribution. Each utility’s transmission system was set up to do just that — serve its local customers. Transmission lines tied systems together only to cover problems arising during emergencies.
Utilities were regulated as monopolies and, by and large, invested to achieve a quality standard, passing the required costs to customers with regulators’ approval. The critical difference between that regulated time and every period since was that, just like all other costs of local utilities, the upkeep costs for transmission lines were funded. Obviously necessary for operation, the lines were kept in good repair.
As important, the lines were not routinely stressed by power pulled through a local utility’s grid to serve remote customers. The wear imposed by power flow was mostly incurred, and paid for, within each utility’s local area.
BULK POWER DEREGULATION: 1980 INTO THE 21ST CENTURY
In the years after 1980, there was a move toward free-market capitalism. The purpose of a utility, under the new model, was to make money for its stockholders. Growth was an important objective. In some states, utilities were forced to divest their assets, with the idea that the smaller pieces would encourage competition.
Electricity became a commodity like any other, with widespread trading in electricity contracts and futures; power plants were bought and sold. The new buyers were not necessarily in the utility business — some were hedge funds.
While deregulation enabled competition amongst power producers, there was not enough thinking about the transmission system — the grid. With the emphasis on buying from the cheapest supplier with little regard to what wear and tear transmission from that provider would impose, many utilities “generated” power by buying out of state.
But additional power flowing over lines causes premature wear. What deregulation did for the bulk power industry was, in some cases, to make power generation cheaper. But it ignored the cost of transport. It is as if they mandated competitive pricing for a commodity that could be provided from anywhere in the U.S., but gave away use of the Interstate highway system. No one had to pay what it was worth for transport.
At one point, the marketing of electrical energy became a huge source of revenue, apart from the actual generation of the power. Derivatives were created based upon future energy and capacity delivery.
If this sounds familiar, you probably remember Enron; in 2001 it became the poster child for deregulation-related excess. The result was some pullback on deregulation at the state level. However, the hallmarks of deregulation that raise havoc with the grid are still in place. There is still widespread trading of electricity across long distances and the use of derivatives and other financial instruments.
Today, U.S. states that are some of the largest consumers of electricity are importing over 25% of all their power! These over-25%-importers include California, Massachusetts, Minnesota, Maryland, New Jersey, and Virginia.
What we have now boggles the mind. A system that was designed in the 1960s as an array of small, fully-integrated, and self-sufficient utilities, with little thought given to cross-region transmission, now supports the interstate delivery of a good chunk of all electricity consumed. Worst of all, no one entity owns the grid, or even its planning and management, so businesses operating within this setup avoid outlays if at all possible — replacing components only when they fail, rather than near the anticipated end of life.
PRECIPITOUS DROP IN INVESTMENT AND R&D SPENDING
First, some background: On average, equipment in the bulk power industry has a useful life of 40 years. Companies therefore are allowed each year to write off as an expense 1/40 of what it originally cost to buy the equipment. Starting in 1995, the industry-wide total of that write-off of historical costs (depreciation and amortization) has exceeded utility construction expenditures.
In other words, for the past 15 years, utilities have invested less in the grid than required to replace existing equipment at the prices paid up to 40 years ago. The bulk power business has harvested more than they have planted. The result is an increasingly stressed grid. Indeed, some experts say that grid operators should be praised for keeping the lights on while managing a system with diminished shock absorbers.
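The “harvesting more than planting” arithmetic can be made concrete with a toy example. The 40-year straight-line write-off rule comes from the text above; the dollar figures below are hypothetical, chosen only to illustrate the comparison.

```python
# A toy illustration of the depreciation arithmetic described above.
# Only the 40-year straight-line write-off rule comes from the text;
# all dollar figures here are hypothetical.

def annual_depreciation(historical_cost: float, useful_life_years: int = 40) -> float:
    """Straight-line depreciation: write off 1/life of original cost each year."""
    return historical_cost / useful_life_years

# Suppose a utility's installed equipment originally cost $8 billion.
write_off = annual_depreciation(8_000_000_000)  # $200 million per year

# If construction spending falls below the write-off, the fleet is
# shrinking in book terms -- "harvesting more than planting".
construction_spending = 150_000_000
shortfall = write_off - construction_spending
print(f"Annual write-off:          ${write_off:,.0f}")
print(f"Shortfall vs. replacement: ${shortfall:,.0f}")
```

Note that the comparison is against historical cost; since replacement equipment costs more than it did up to 40 years ago, the real shortfall is even larger than this book-value calculation suggests.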
One way to work around the problem, rather than merely replacing equipment with the same technology, would be to invest in R&D. For example, existing computer technology could be adapted to build a “smart grid”. But things look even bleaker there. The IEEE Spectrum article says it best:
R&D spending for the electric power sector dropped 74 percent, from a high in 1993 of US $741 million to $193 million in 2000. R&D represented a meager 0.3 percent of revenue in the six-year period from 1995 to 2000, before declining even further to 0.17 percent from 2001 to 2006. Even the hotel industry put more into R&D.
By comparison, the computer industry invests almost 13% of revenue; pharmaceuticals invest over 10%.
THE IMPACT OF DEREGULATION ON ELECTRICAL TRANSMISSION
There are a number of consequences beyond those already discussed — declining investment and overuse between utilities. Some of these include:
1. MORE RAPID DETERIORATION
After deregulation, maximizing profits means selling electrical power from whichever plant can produce it most cheaply. The result is much more cycling on and off of power plants and the structures involved in transmission. Metal is heated and cooled far more frequently, accelerating deterioration.
2. UNPLANNED ADDITIONS TO THE GRID: RENEWABLE
According to NERC (the North American Electric Reliability Corporation):

Nearly 30 states and 4 provinces have Renewable Portfolio Standards in place in one form or another. Wind and solar are added to the grid, with the expectation that the grid will accommodate them.
There are a number of unplanned additions to the grid. States are mandating increased generation from renewables — but many of the abundant renewable resources are far away from load centers. So, as more alternative generation sources come online, just to keep the grid performing at the same level, additional lines must be built to bring wind, solar, and geothermal energies to market. But that investment is not planned.
Note that when the true cost of getting alternative power to the ultimate customer is not counted before deciding to mandate the new technologies, the size of the implicit subsidy is obscured.
3. OTHER UNPLANNED ADDITIONS TO THE GRID
“Merchant” (investor-owned) natural gas power plants are also added to the grid, sometimes without adequate consideration as to whether sufficient grid capacity exists to accommodate the additional production.
4. DIFFICULTY IN ASSIGNING COSTS BACK
Since the industry is more fragmented, if any transmission lines are added, the cost must somehow be allocated back to the many participants who will benefit. Ultimately, the cost must be paid by a consumer. Depending on the area involved, and therefore the state public utility commission with jurisdiction, these consumer rates may in fact be capped, so it may be difficult to recover the additional cost.
5. LITTLE INCENTIVE TO ADD GENERATING CAPACITY
Deregulation not only makes managing the grid much more complex. It also makes utilities wary of investing in new plants. As long as electricity can be bought and sold, utilities defer starting up major projects.
6. AGING WORKFORCE
The “aging workforce” and its impending impact on reliability has been a recurring theme in NERC’s recent Long-Term Reliability Assessments. (See Footnote 2.) Quoting NERC, a corporation now solely responsible for creating and enforcing reliability standards in the U.S. and Canada:
In 2007, NERC reported that, according to a recent Hay Group study, about 40 percent of senior electrical engineers and shift supervisors in the electricity industry will be eligible to retire in 2009. This loss of expertise, exacerbated by the lack of new recruits entering the field, is one of the more severe challenges facing reliability today.
A 2007 study by NERC confirmed industry concern on the issue, ranking the aging workforce as both highly likely to occur and as having a severe impact on the reliability of the bulk power system. It’s no wonder; KEMA says that one in three U.S. workers was age 50 or older in 2010. Meanwhile, the demand for workers is increasing. A 25 percent increase in demand for industry workers is anticipated by 2015.
Exacerbating the problem of a declining workforce is a simultaneous decline in the number of potential recruits from colleges and universities, as well as vocational schools. During the past two decades, reduced demand for industry workers has led to the decline and closure of many electric power engineering programs at colleges and universities.
7. FUTURE ADEQUACY & CAPACITY MARGINS
According to NERC,
…projected increases in peak demands continue to exceed projected committed resources beyond the first few years of the ten-year planning horizon.
8. NATURAL GAS DEPENDENCY
Natural gas has become the fuel of choice for new-build generation, as gas-fired plants are typically easy to construct, require little lead time, emit less CO2, and are generally cheaper to build than their coal and oil counterparts. Certain states have placed a moratorium on building new coal plants, citing environmental and emissions concerns as justification. These trends are expected to continue over the next several years, further increasing the number of new-build natural gas plants in areas with already high dependence.
Natural gas powers 19% of the U.S. electric industry’s generation, a share expected to rise to 22% within ten years. Meanwhile, imports of natural gas from Canada have recently peaked, opening a supply gap.
This supply gap is expected to be filled by new supplies of Liquefied Natural Gas (LNG) from overseas, which will require siting and construction of LNG terminals throughout North America. However, this terminal infrastructure is facing delays in most locations where it has been proposed.
GLIMMERS OF HOPE FOR THE GRID
The issue of freeloading use of the grid by energy producers now has the full attention of the Federal Energy Regulatory Commission. In late July 2011, it issued a new federal rule requiring that grid expansion be paid for only by those who benefit from it, guaranteeing that costs align with benefits as the country seeks to upgrade and expand its power-transmission infrastructure.
However, the same day as the new rule was announced, a number of “stakeholders” asked the U.S. Senate to oppose it. Clearly this will take a long time to sort out.
HACKERS ARE ANOTHER THREAT
And, in a perverse turn of events, the grid may be less reliable because of the very technology that has been implemented to make the grid “smarter” and more efficient. The design point for the earliest smart grid devices — ease of use and interoperability — makes it far too easy for anyone to maliciously hack the grid. In fact, there has been at least one hacking attempt coordinated from China, according to the Wall Street Journal (“Electricity Grid in U.S. Penetrated by Spies,” April 8, 2009).
THE ROOT PROBLEM IS ELECTRICAL
Deregulation of a number of industries has shown that price competition ultimately helps the consumer. But power transmission is a unique problem, because of the very nature of electricity.
Power flows throughout transmission networks along paths of least impedance, regardless of contractual obligations or political boundaries. Bulk power distribution decisions made by regulators in one location can conceivably have some impact on everyone in Canada and the U.S. Deregulating power generation only works if the power providers pay the real cost of supplying, including transmission.
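The physics behind “paths of least impedance” can be sketched with the simplest possible case: two parallel transmission paths, where Kirchhoff’s laws dictate that current divides in inverse proportion to impedance. The impedance values and the 1000 A figure below are hypothetical, chosen only to illustrate why contracts cannot dictate flow.

```python
# A minimal sketch of why power flow ignores contracts: across parallel
# paths, current divides in inverse proportion to impedance (Kirchhoff's
# laws). The impedance and current values here are hypothetical.

def current_split(total_current: float, z_a: float, z_b: float) -> tuple[float, float]:
    """Split a current between two parallel paths with impedances z_a and z_b."""
    i_a = total_current * z_b / (z_a + z_b)  # the lower-impedance path carries more
    i_b = total_current * z_a / (z_a + z_b)
    return i_a, i_b

# A trader schedules 1000 A "over" path A, but if path B's impedance is
# lower, most of the flow takes path B regardless of the contract.
i_a, i_b = current_split(1000, z_a=10.0, z_b=2.5)
print(f"Path A carries {i_a:.0f} A, path B carries {i_b:.0f} A")
```

In this sketch the contracted path ends up carrying only a fifth of the flow; the rest loads a neighboring utility’s lines, which is exactly the uncompensated wear the article describes.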
Finally, in 2007, the FERC (Federal Energy Regulatory Commission) acquired the authority (delegated to NERC, the North American Electric Reliability Corporation) to fine operators who don’t hold to standards. (However, it is instructive to note that, even today, some reliability standards have not been completely defined by NERC.)
So, part of the problem is solved, because one entity (NERC) has responsibility for defining and regulating reliability. However, after-the-fact enforcement, without the power to compel who will pay for grid projects, is insufficient. The Public Utility Commissions in the 50 U.S. states also exert considerable control over who pays for what, even though it has been obvious for some time that costs can be shifted from state to state. An MIT study comments:
Electric power industry policy is a hodgepodge, rooted in the federalism of 50 state laboratories. There is no coherent national vision and policy.
To take the best example, also from that MIT study:
Allocation of grid expansion project costs is often the most contentious issue a proposed high voltage transmission project encounters. Difficulties increase geometrically in proportion to the number of states involved.
The grid needs to be managed with one steady hand. Grid management and planning must include funding (and therefore chargeback) decisions, particularly when the average high-voltage transmission line takes 14 years to gain approval.
As it stands currently, FERC and its appointed agent, NERC, face the daunting challenge of herding 50 regulatory agencies. NERC does not have the authority to mandate allocation of grid-related costs back to any power generation, distribution, or transmission company in the U.S. and Canada. And granting it that authority is not on the horizon. Until that happens, the sheer complexity of the political situation will militate against an effective solution.
TO LEVERAGE SMART GRID, U.S. FEDERAL POLICY MUST MATURE QUICKLY
Many experts think that applying technology to manage the grid is the clearest way out. The “smart grid” could significantly reduce the amount of power that needs to be generated to get the same amount of power to consumers. Granular and accurate control of the grid would also make big rolling outages far less likely — but only if all the technology is designed to communicate in one unified control scheme.
One of the biggest benefits touted for smart grid is increased ability for grid operators to add variable renewables, especially wind, to their systems. Experts (including NERC) agree that the transmission capacity to support the currently-mandated renewables buildout over the next decade is just not there, so the Smart Grid could play a major role in making renewable buildout possible.
Robin Lunt of the National Association of Regulatory Utility Commissioners (NARUC) said state regulators have been hoping smart grids would help achieve renewable portfolio standards and clean power to meet EPA standards. (http://energy.aol.com/2011/09/19/gridweek-analysis-smart-grid-losing-to-epa/?icid=related1).
Yet, the pending spate of EPA rules tightening sulfur, nitrogen, mercury and particulate emissions, with deadlines hitting coal plants in the next four years, will force investment dollars into abatement projects and away from longer-term efforts like the smart grid.
“The EPA bubbles to the top,” said Jon Hawkins of Public Service Company of New Mexico (PNM). “We have to invest hundreds of millions at our coal plant. That elbows out smart grid funding.”
The obvious answer is to get EPA, Energy and technology companies together so the right decisions get made. You would have to be very hopeful to expect that this will happen quickly.
HOPE FOR THE BEST, PLAN FOR THE WORST
It is heartening to see that the fundamental problems that must be addressed are recognized by regulators at the U.S. and Canadian federal levels. It is even better to see that direct action has been taken, and will continue to be taken, in the right direction. However, the sheer complexity of the political change that must take place in order to have a chance at getting this right is daunting.
Fixing enough jurisdictional problems to start implementing a long-term plan will not happen next week or next year. This reminds us that it is always good to hope for the best while planning for the worst. And in this case, any model for predicting the worst case should assume continued declines in reliability for the next few years.
Footnotes
1. “U.S. Electrical Grid Gets Less Reliable”, by S. Massoud Amin, January 2011. IEEE Spectrum, the flagship publication of the IEEE (Institute of Electrical and Electronics Engineers), explores the development, applications, and implications of new technologies. (http://spectrum.ieee.org/energy/policy/us-electrical-grid-gets-less-reliable)
2. As of June 18, 2007, the U.S. Federal Energy Regulatory Commission (FERC) granted NERC the legal authority to enforce reliability standards with all users, owners, and operators of the bulk power system in the United States, and made compliance with those standards mandatory and enforceable. Reliability standards are also mandatory and enforceable in Ontario and New Brunswick, and NERC is seeking to achieve comparable results in the other Canadian provinces. NERC will seek recognition in Mexico once the necessary legislation is adopted.
NERC is a non-government organization that has a statutory responsibility to regulate bulk power system users, owners, and operators through the adoption and enforcement of standards for fair, ethical, and efficient practices.