The US electricity system, which precisely balances supply and demand while delivering electromagnetic energy that propagates at nearly light speed, has often been described as the most complex machine ever built. Indeed, in 2000 the US National Academy of Engineering named the electricity system as the greatest engineering achievement of the 20th century. The academy noted that the system is ingeniously engineered; it is a catalyst for new technologies and new industries; and it has an enormous impact on improving quality of life. It is arguably the most influential machine of the 20th century.
For much of that century, the electricity sector enjoyed steadily declining costs. But by 1970, diminishing economy-of-scale returns, combined with decelerating demand growth, higher fuel costs, and more rigorous environmental requirements, began to challenge the traditional declining-cost business model and to affect the economic and regulatory structure of the electric utility industry. The past 30 years have produced many well-intended efforts—culminating in a patchwork of competitive restructuring—all of which were focused on restoring the sector’s tradition of declining cost. But those efforts have failed to meet the challenge. And given today’s aging electricity infrastructure, no silver bullets are likely to succeed in the near future.
The period of well-intended efforts has also seen electricity become increasingly politicized as a retail entitlement. As a result, volatile market prices have essentially been allowed to move in only one direction—downward—often at the expense of sustaining the infrastructure for the long term. The combination of rising cost and artificially constrained price has created for electricity a monetary vise that is steadily tightening. The squeeze has led to a growing concern among industry insiders that the electricity sector’s aging infrastructure, workforce, and institutions are losing touch with the needs and opportunities of the 21st century.
Thomas Edison is best known for his inventions—particularly the incandescent lamp—but his contributions toward the development of the electric power grid are often underappreciated. Edison and his team designed the entire electrical system, down to the wall outlet, and in 1882 established the first power company. Edison’s system served the Wall Street section of New York City. Even today, vestiges of it supply DC power to about 2000 customers.
The grid was originally developed in so-called cohesive electrical zones. In most urban centers, electric utilities were given franchise rights to be the exclusive provider of electricity to a geographic area. In exchange for those rights, they typically agreed to pay franchise taxes to the cities and to serve any and all customers at an agreed-upon reasonable cost.
Beginning in the 1930s, relatively small, isolated electrical systems gradually melded into ever larger interconnected ones. During the 1950s and 1960s, that evolution produced extensive integrated systems and large regional pools. Higher transmission voltages allowed bulk delivery, over long distances, of enormous quantities of power that originated at large generating plants having easy access to fuel sources. The markets made possible by interconnected systems in turn supported the building of bigger generating units. With economies of scale, price declined, which further fueled demand.
During the past century, the role of electric power has grown steadily in both importance (see figure 1) and scope. In recent decades, developments in key technologies such as electric lighting, motor drive systems, microprocessors, computers, and telecommunications have altered the way we live and have reshaped commercial and industrial productivity. Nowadays, as figure 2 shows, worldwide electricity use is strongly correlated with economic output.
How the grid works
The electricity grid remains a mystery to most people. Its ubiquity and high reliability over the past 50 or more years have rendered it nearly invisible, more a backdrop for the workings of modern society than its central nervous system—at least until the lights go out. The blackout of 14 August 2003 brought the operation of the grid momentarily into prominence and raised questions about how it works and why it fails. How could a small local problem bring the lives of 50 million people to a standstill in a matter of minutes?
The grid represents the high-voltage transmission system that connects bulk power generation with the medium-voltage distribution systems that supply most consumers. (Some large, typically industrial consumers are connected directly to the grid.) In Canada and the US, the grid includes more than 200 000 miles of high-voltage lines that operate at or above 230 kilovolts and ultimately serve more than 300 million consumers. The US electricity delivery system, which consists of the grid and the downstream distribution system, is a $360 billion asset. Figure 3 represents schematically the relationships among the generating system, grid, and distribution systems.
Physically, the grid is a network of wires and other devices, collectively called circuits, that weave together electrical loads (clusters of consumer demand) with the sources of electrical power generation. It predominantly comprises three-phase AC circuits, though the US grid includes a number of high-voltage DC lines as well. Three-phase circuits are systems with three equal voltages having a fixed phase difference of 120°. Those voltages result from generators that have three sets of equally spaced magnets.
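The 120° spacing has a consequence that a few lines of code can make concrete: in a balanced three-phase system, the three instantaneous voltages sum to zero at every moment. A minimal sketch (the amplitude and time values are illustrative, not from the article):

```python
import math

def three_phase_voltages(t, amplitude=1.0, freq=60.0):
    """Instantaneous voltages of a balanced three-phase source, 120 degrees apart."""
    omega = 2 * math.pi * freq
    return [amplitude * math.sin(omega * t - k * 2 * math.pi / 3) for k in range(3)]

# At any instant, the three phase voltages of a balanced system sum to zero.
for t in (0.0, 0.001, 0.007):
    va, vb, vc = three_phase_voltages(t)
    assert abs(va + vb + vc) < 1e-9
```

That cancellation is why a balanced three-phase circuit needs no heavy return conductor and why it delivers constant, rather than pulsating, total power.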
AC circuits predominate in the US transmission system because they are compatible with transformers—devices that can step up voltage before electricity is transported or step it down before electricity is distributed to consumers. Transmission voltages in the US are typically 115, 138, 230, 345, or 500 kV, although there are a few extra-high voltage lines at 765 kV. The 230-kV system represents the backbone of the US electricity grid.
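The case for stepping up voltage before transmission can be made with a back-of-the-envelope calculation. Delivering a fixed power P at voltage V requires current I = P/V, so the resistive line loss I²R falls with the square of the transmission voltage. A sketch with hypothetical round numbers (the 100-MW transfer and 1-ohm line resistance are invented for illustration):

```python
def line_loss_fraction(power_w, voltage_v, line_resistance_ohm):
    """Fraction of transmitted power lost as heat (I^2 R) in the line."""
    current = power_w / voltage_v  # assumes unity power factor
    return current**2 * line_resistance_ohm / power_w

# Hypothetical example: 100 MW over a line with 1 ohm of resistance.
P, R = 100e6, 1.0
low = line_loss_fraction(P, 13e3, R)    # shipped at generator voltage, 13 kV
high = line_loss_fraction(P, 230e3, R)  # stepped up to 230 kV
# Stepping up from 13 kV to 230 kV cuts the loss by (230/13)^2, roughly 313x.
```

Raising the voltage from 13 kV to 230 kV cuts the resistive loss by a factor of about 300, which is why transformers, and hence AC, made long-distance transmission practical.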
A variety of sources, including dammed water, nuclear reactions, oil, coal, and natural gas, are exploited to generate electrical energy, typically at medium voltages of 13–25 kV. Higher voltages would potentially be more efficient, but to obtain them, one would need to insulate generators differently. That would unreasonably complicate generator construction.
Reliability and protection
Many different kinds of entities own a share of the electricity grid. In the main, the grid belongs to investor-owned utilities, but federal agencies, rural electric cooperatives, and municipalities own pieces of it as well. Prior to the 1960s, loosely connected cohesive electrical zones offered reasonable reliability to the nation’s consumers. But following the great Northeast blackout of 1965, policymakers and industry executives alike became increasingly concerned about the grid’s reliability. In response, the industry formed regional reliability organizations to coordinate activities related to the grid’s performance.
Nowadays, reliability is administered by approximately 150 control area operators in North America, coordinated by 10 regional reliability councils. The regional councils are organized under the North American Electric Reliability Council (NERC), which has established operating and planning standards based on seven concepts that include balancing generation and demand, maintaining system stability, and preparing for emergencies. NERC’s standards are actually guidelines, and compliance is voluntary. Until the NERC standards become federal law, the nation will be at the mercy of their varying interpretations and implementations.
Controlling the dynamic behavior of the interconnected electricity system is a great engineering and operational challenge. After all, power flow responds to the laws of physics: It flows freely over all available paths, roughly in inverse proportion to the magnitude of the impedance. And demand for electricity is constantly changing. Millions of consumers switch lights or air conditioners on and off; businesses cycle their office equipment and production processes. Generation and demand must be kept in balance over large regions to ensure that voltage and frequency are maintained within narrowly prescribed limits—from 59.98 to 60.02 Hz, say, for frequency. If either voltage or frequency strays too far from its prescribed range, the resulting mechanical stresses can severely damage power-system equipment. Thus the electric grid requires a protective overlay to minimize damage and to ensure that system operators can rapidly restore power when problems arise.
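The statement that power flows over all available paths roughly in inverse proportion to impedance can be sketched with the standard DC-power-flow approximation. The path impedances below are hypothetical:

```python
def parallel_flow_split(total_mw, impedances):
    """Split a power transfer across parallel paths in inverse proportion
    to each path's impedance (a DC-power-flow approximation)."""
    admittances = [1.0 / z for z in impedances]
    total_y = sum(admittances)
    return [total_mw * y / total_y for y in admittances]

# Hypothetical: 300 MW scheduled between two areas joined by three parallel
# paths of impedance 2, 4, and 8 ohms. Most flow takes the lowest-impedance path.
flows = parallel_flow_split(300.0, [2.0, 4.0, 8.0])
```

Note that the scheduled 300 MW cannot be confined to a contracted path: most of it takes the lowest-impedance route, whatever the commercial arrangement says.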
Throughout the network, protection schemes that rely upon calibrated circuit breakers and relays ensure that component failures and electrical faults from short circuits, lightning strikes, surges, and so forth are quickly isolated. Otherwise, the failures can propagate. Once an overloaded facility either fails or is removed from service by protective relaying, power moves to other system facilities along available routes. Those facilities may in turn become overloaded and either fail or be taken out of service by other protective relays. Such a repeated, uncontrolled cycle of overload and equipment removal or failure is called a cascading outage. In the August 2003 blackout, a local failure led to a cascading outage from the Great Lakes region to Canada and New York City—all in a few minutes (see figure 4). The system did what it was designed to do: protect itself. If not for its protection schemes, critical equipment would have been damaged and the restoration of power could have taken weeks or months instead of hours or days.
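The overload-and-trip cycle described above can be caricatured in a few lines. This toy model (the line flows and capacities are invented for illustration; real protective relaying and power flow are far more complex) redistributes a tripped line's flow equally among the survivors and trips any line pushed past its capacity:

```python
def cascade(lines, lost):
    """Toy cascading-outage model.  `lines` maps name -> (flow_mw, capacity_mw).
    When a line trips, its flow is shared equally among surviving lines;
    any line pushed past its capacity trips in turn."""
    live = dict(lines)
    tripped = [lost]
    to_shed = live.pop(lost)[0]
    while to_shed and live:
        share = to_shed / len(live)
        to_shed = 0.0
        for name in list(live):
            flow, cap = live[name]
            flow += share
            if flow > cap:          # protective relay removes the line
                tripped.append(name)
                to_shed += flow
                del live[name]
            else:
                live[name] = (flow, cap)
    return tripped

# Hypothetical system in which every line runs near capacity: losing one
# line overloads its neighbors one after another until all have tripped.
order = cascade({"A": (90, 100), "B": (85, 100), "C": (80, 100)}, "A")
```

With every line already loaded near its limit, a single trip cascades until nothing remains; the same event on a lightly loaded system stops immediately.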
The electricity system may be likened to a giant mobile with suspended weights. Snipping off one of the weights will cause oscillations to propagate through the entire system. Compensation must be made for the loss of the weight (representing a load) if the oscillatory patterns are to damp out quickly. If the oscillations are not damped out, they could break the fragile lines of other suspended weights and become amplified. Thus the initial snipping event would evolve into large-scale failure.
Power-system operators must perform a balancing act analogous to managing the weights in the mobile: They balance active and reactive power. Active power is that used for such tasks as lighting a room or turning a motor shaft. It may be expressed as I²R, where I is the current and R is the resistance, the real part of the impedance. Reactive power results from the interaction between AC power and the various wires, cables, transformers, and appliances connected to the system. It may be expressed as I²X, where X, the reactance, is the imaginary part of the impedance. Reactive power neither consumes nor supplies energy.
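The two expressions can be combined in a short sketch. For an RMS current I flowing through an impedance Z = R + jX, active power is I²R and reactive power is I²X. The 10-A current and 3 + 4j-ohm load below are a hypothetical example:

```python
def active_reactive_power(current_a, impedance_ohm):
    """Active power P = I^2 R (watts) and reactive power Q = I^2 X
    (volt-amperes reactive) for an RMS current through Z = R + jX."""
    p = current_a**2 * impedance_ohm.real
    q = current_a**2 * impedance_ohm.imag
    return p, q

# Hypothetical load: 10 A RMS through Z = 3 + 4j ohms.
p, q = active_reactive_power(10.0, complex(3.0, 4.0))
# p = 300 W does real work; q = 400 VAR circulates between source and load.
```

Operators must supply both: without enough reactive power to sustain voltage, the active power cannot be delivered.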
Power plants produce a combination of active and reactive power. System operators must understand and manage reactive power levels so that adequate amounts of active power can be conveyed from generators to consumers and their energy-consuming devices. If active and reactive power are not properly balanced, voltage may collapse in one part of the system, and the failure could propagate. The question of whether there was adequate reactive power, called VAR (volt-amperes reactive) support, in the Ohio region is a lingering controversy of the 2003 blackout.
The power delivery system is largely based on technology developed in the 1950s or earlier and installed as much as 50 years ago. The strain on this aging system is beginning to show, particularly as consumers ask it to do things it was not designed to do. Energy transmission has been further complicated by efforts to deregulate power generation and by the confusion arising from the overlapping jurisdictional authority of federal and state regulators. Among the numerous challenges facing the electricity industry are the rapid increase in wholesale transactions between such entities as independent power producers and distribution utilities; increasing grid congestion; continuing low levels of infrastructure investment; the application of technology to allow more options for consumers; the growing need for better grid security; and the precision power requirements of a digital society.
An additional and significant stress on the North American power delivery system results from the discrepancy between the growth in demand for power and the expansion of the delivery system to meet that demand. From 1988 to 1998, US electricity demand rose by nearly 30% while the transmission network’s capacity grew by only 15%. In its Electricity Technology Roadmap: 2003 Summary and Synthesis, the Electric Power Research Institute (EPRI) anticipates that the disparity will further increase during the period 1999–2009: The institute projects demand to grow by 20% and system capacity to increase by just 3.5%.
In the 1990s, capital expenditures of the US electricity sector were about 12% of total revenues. That life-support level of investment, about half the historic average, was previously approached only during the depths of the Great Depression and World War II, times when private investment was generally very low. Such low levels of investment are dangerous and unacceptable. Moreover, a large share of the investment during the 1990s was in power generation rather than improvements in the power-delivery infrastructure. That period of low investment saw the economic cost of power disturbances, from minor blips to major outages, grow to roughly $100 billion per year in the US, according to an EPRI survey of key industries. In other words, for every dollar spent on electricity, consumers are spending at least 50 cents on other goods and services to cover the costs of power failures.
Why is the commitment to infrastructure so low? Electricity service is, by its nature, an extraordinarily capital-intensive, politically constrained enterprise. It involves more than 3000 companies; is owned by investors, municipalities, cooperatives, and federal agencies; and is regulated by state and federal authorities whose jurisdictions overlap. Beginning with the Public Utility Regulatory Policies Act of 1978 and accelerating with the Energy Policy Act of 1992, orders to deregulate the electricity enterprise led to the partial dismantling of the institutional structure for the electricity sector. But that structure has not been replaced by an alternative with coherent institutions and rules. As a consequence, elements of the diverse electricity sector are in crisis, and the effects have been spreading. The investor-owned utilities, in particular, are laboring under inconsistent and conflicting regulations. Market reforms—worthy, limited experiments that have shown mixed success—have resulted in rules that differ from state to state and, in many cases, from utility to utility within a state. Order 888, issued by the Federal Energy Regulatory Commission in 1996, required open access for all transactions on interstate transmission circuits. Federal open-access orders, however, have not given clear direction for how open competition is to be implemented. Changing environmental policies have added to the uncertainty in the electricity sector, even though many proposals were intended to reduce uncertainty.
Any one of the problems just described might have been manageable. But the simultaneous convergence of several independent difficulties has caused serious turmoil in the business aspects of the electricity sector. Increasingly, wholesale markets are thwarted by the inability of the aging US power delivery system to support transactions. Further expansion of retail deregulation has essentially come to a stop. Credit markets have shut out nearly all of the high-risk energy trading companies, whose business has turned from boom to bust over the past several years. Other industry members have seen their credit ratings drop, and financing costs for the industry have risen dramatically. Those strains on business have affected any number of stakeholders in the electricity sector; they are unable to plan, unwilling to invest, and stalemated in their attempts to devise a way out of the current dilemma.
An additional threat to the power system is external—terrorist attack. Protecting against a determined attack is particularly complicated because the system’s equipment and facilities are so dispersed. Furthermore, the concern is not only about physical vulnerabilities in the power system, but also about disruptions to computer networks or communication systems. Terrorists might, for example, exploit the power system’s increasingly centralized control to magnify the effects of a localized strike. An attack that led to even a momentary interruption of power could be inconvenient for consumers, who have become ever more dependent on sensitive electronic systems, and could adversely affect the economy. A 20-minute outage at an integrated circuit fabrication plant, for example, could cost as much as $30 million.
Competition in US wholesale electricity markets has increased the stress on the North American power-delivery system. Companies that can generate power at low cost sell it to others so as to meet demand in a not-too-distant area, and they can do so by virtue of the interconnected delivery system. The desire to obtain the lowest-cost power has led to an increase in the number of power transactions carried over the system.
The North American power-delivery system, however, was not designed to meet the pace and rigor of competitive markets. The large number of wholesale transactions breaks down the cohesiveness of the system. Kirchhoff’s laws, not federal law, govern power flow, and so the industry had to quickly come up with a method to adjudicate contracts between buyers and sellers of electricity. The system now in place allows for any of the more than 150 control area operators in the US to declare that they are unable to accommodate a transaction: That declaration is made through a request for transmission loading relief. Figure 5 shows the number of level two (“I cannot fully accommodate the transaction”) or higher calls for transmission loading relief in North America. Such requests have increased about 10-fold over the past seven years—a direct result of growing congestion on the lines. If new power-delivery equipment is not built, problems associated with congestion will only worsen.
Efforts to expand consumer choice pose additional challenges for the electricity sector. The expanded options—all operating through the same delivery infrastructure—are part of a worldwide restructuring of the electricity industry. In addition to the services they want, consumers can choose their service providers, which can be utilities or other companies that offer various arrangements for billing and pricing. Real-time pricing, for example, is a growing trend. In that scheme, the price of electricity varies at different times according to the commodity’s marginal cost. In so-called interruptible load programs, consumers receive rate reductions in exchange for a commitment to curtail certain loads during peak periods. Also, as part of the restructuring, a limited number of small electricity generation and storage devices are being distributed throughout the power system on or near consumers’ premises. Those distributed energy resources can supplement traditional delivery mechanisms.
Probably the greatest long-term challenge to the electricity sector is that, due to digital technology, the nature of electricity demand is undergoing a profound change. When the personal computer was introduced 20 years ago, few people foresaw the widespread proliferation of smart devices. Today, for every microprocessor inside a computer, 30 operate in standalone applications; the result is a digitization of society. In applications ranging from industrial sensors to home appliances, microprocessors now number more than 12 billion in the US alone.
Digital devices are highly sensitive to even the slightest interruption of power; an outage of even a fraction of a single cycle can disrupt their performance. They are also quite sensitive to variations in power quality caused by transients, harmonics, and voltage surges and sags. Digital-quality power, with sufficient reliability and quality to serve growing digital loads, now represents about 10% of total electrical load in the US. The EPRI Electricity Technology Roadmap projects that the digital-quality power load will reach 30% by 2020 under business-as-usual conditions, and as much as 50% if the power system is revitalized to provide digital-grade service.
The current US electricity infrastructure, though, was designed decades ago to serve analog, or continuously varying, electric loads. It cannot consistently provide the level of digital-quality power required by the nation’s digital manufacturing assembly lines, information systems, and, increasingly, home appliances.
Smart power delivery
We in the US cannot afford to abandon or entirely replace our power delivery system. And we don’t need to. What we do need is to use advanced technology to modernize and enhance the use of the existing asset base.
Computers, sensors, and computational ability have transformed every major industry in the Western world except the electric power industry. Still, technology innovation in the electricity sector has been a cornerstone of global economic progress. The worldwide growth in gross domestic products over the past 50 years has been accompanied by improvements in both energy intensity (energy per dollar GDP) and labor productivity. Improved efficiencies in energy-consuming devices also provide direct environmental benefits.
Several available or emerging technologies will help transform the grid into a smart power system capable of supporting the digital society of the 21st century. In broad strokes, the transformed “intelligrid” will be an integrated, self-healing, electronically controlled electricity supply system of extreme resilience and responsiveness that is capable of responding in real time to the billions of decisions made by consumers and their increasingly sophisticated microprocessor agents. The transformation, we believe, will open the door to a convergence of electricity and communication that will usher in a new era of productivity and prosperity. Here is a partial list of enabling technologies.
Advanced conductors. Various techniques can increase the amount of power carried along existing transmission corridors. Some of them, but not all, involve new materials. The methods range from reconfiguring existing lines to using new types of conductors with carbon-fiber cores. The new conductors have higher current-carrying capability, and because of their greater strength and lighter weight, they sag less at the high temperatures associated with high power-flow rates. In the future, high-temperature superconducting cables in underground systems might carry triple the current of conventional conductors, perhaps more. They may also be suitable for retrofitting in some underground and ground-based conduits.
Distributed energy resources. Small generation and storage devices distributed throughout and seamlessly integrated with the power delivery system (see figure 6) offer potential solutions to several challenges the electric power industry currently faces. Those challenges include the needs to increase the resilience and reliability of the power-delivery infrastructure, make a range of services available to consumers, and provide low-cost, digital-quality power.
Automation. This is key to providing high levels of reliability and quality. To a distribution-system operator, automation may mean that in an emergency, a distribution feeder, local distributed energy resources, or both would be automatically isolated from the grid. To a power-system operator, automation could mean a self-healing, self-optimizing power-delivery system that anticipates and quickly responds to disturbances. As a result, power disruptions would be minimized or eliminated altogether.
Power-electronics controllers. Based on solid-state components, these devices offer control of the power-delivery system with the speed and accuracy of a microprocessor, but at a power level 500 million times higher.
Computer modeling of market tools. To accommodate changes in retail power markets worldwide, market-based mechanisms will need to offer appropriate incentives to buyers and sellers, facilitate efficient planning for the expansion of the power-delivery infrastructure, and effectively allocate risk. Computer modeling will play an important role in testing market models.
Communications architecture. To realize the vision of the smart power-delivery system, standardized communications architecture must first be developed and overlaid on today’s system. EPRI recommends that integrated energy and communications-system architecture be based on publicly available standards.
Energy portals. Distribution systems were designed to perform one function—to distribute power to consumers. But many value-added retail services require two-way information exchange between the consumer and the marketplace. An energy portal, which would sit between a consumer’s in-house communications network and a wide-area access network, would enable two-way, secure communication between a consumer’s equipment and energy-service or communications providers.
A worthwhile investment
Technological innovation in this century could transform the reliability, quality, security, and value of electricity. The potential implications of that transformation are profound. Right now, however, a large and growing gap exists between the performance capability of the electricity system and society’s needs and expectations. A commitment to infrastructure investment at the level needed to fully develop and deploy a smart, digital power system is fundamental to closing that gap. To maintain an artificially low electricity cost structure at the expense of system modernization is simply not sustainable: Eventually, consumers, investors, and society all must pay the price.
The current rate of investment in the power delivery system, some $18–20 billion per year, is just enough to meet demand growth and to replace failed equipment. To correct deficiencies in the existing system and to enable the smart power system of the future would require an additional $8–10 billion annually. That additional investment would lead to an average increase of 3–5% in consumers’ electric bills. But we believe the investment would pay for itself many times over by increasing the nation’s economic growth rate and reducing the cost to consumers for power disturbances.
The overarching priority of the electricity sector and all its stakeholders for the next 10 years will be to pursue the policies and actions needed to stimulate infrastructure investment. A technology program funded with a portion of that investment should address power-system reliability, environmental performance, and the development of higher-efficiency services. The result would be a valuable, smart electricity supply system capable of meeting the escalating needs and aspirations of 21st-century society.
Clark Gellings is vice president for power delivery and markets at the Electric Power Research Institute in Palo Alto, California. Earlier this year, Kurt Yeager retired as president and chief executive officer of EPRI.