Important Note: This website contains historical data from the INSP project. As of 2004 the site is no longer maintained and certain sections do not work correctly.



U.S. Nuclear Power Plant Performance

Information provided by Nuclear Energy Institute, April 1998


Table of Contents
Key Facts
The 1990s: Building on a Decade of Improvement
Drive for Excellence
Nuclear Electricity: Becoming More Competitive


Key Facts

  • In 1997, nuclear energy generated more electricity--631 billion kilowatt-hours--in the United States than any other fuel source except coal.

  • In 1997, U.S. nuclear power plants achieved an average capacity factor of 70.3%, based on gross generation. (Capacity factor, a yardstick for plant performance, measures the amount of electricity actually produced compared with the maximum output achievable.) The 1997 average is more than 16 percentage points higher than the 1980 average.

  • In 1997, 70% of U.S. nuclear power plants achieved a capacity factor of 70% or higher. Excluding the 10 units that did not operate at all during the year, 77% of America's nuclear power plants achieved capacity factors of 70% or higher. In 1980, only 19% of U.S. nuclear power plants operated at that level.
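The capacity-factor yardstick described above is a simple ratio. A minimal sketch of the calculation, using hypothetical figures for a single 1,000-megawatt unit (the per-plant numbers are illustrative, not from the article):

```python
def capacity_factor(gross_generation_mwh, capacity_mw, hours=8760):
    """Capacity factor: actual gross generation divided by the maximum
    the plant could produce running at full power for the whole period,
    expressed as a percentage."""
    return gross_generation_mwh / (capacity_mw * hours) * 100

# Hypothetical 1,000-MW unit generating 6.2 million MWh in a year:
print(round(capacity_factor(6_200_000, 1000), 1))  # → 70.8
```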

The 1990s: Building on a Decade of Improvement

Since 1980, more than 40 U.S. nuclear power plants have entered service. The number of nuclear power plants in commercial service now stands at 105, up from 68 in 1980.

U.S. nuclear power plant performance reached an all-time high during the mid-1990s. Power plant performance is commonly measured by capacity factor, which expresses the amount of electricity actually produced by a plant compared with the maximum achievable. U.S. nuclear power plants achieved a record capacity factor of 75.9% in 1995, and though it dipped to 70.3% in 1997, it remains significantly higher than the 1980 average of 54%. The decline in 1997 is largely attributable to the 10 units that did not operate at all during the year; idle plants depress the national average. Excluding those units, the 1997 average capacity factor for the remaining plants was 77.6%.

Seventy percent of U.S. nuclear plants (75 plants) operated at a capacity factor of 70% or better in 1997; only 19% (13 plants) achieved that level in 1980. Fifty-two percent (56 plants) had a capacity factor of 80% or higher in 1997, compared with only 6% in 1980.

Nationally, each percentage point increase in capacity factor is roughly equivalent to bringing another 1,000 megawatts of generating capacity on line. Improved nuclear power plant performance thus helps meet the growing demand for electricity in the United States.
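The rough equivalence above follows from the size of the installed fleet. A sketch of the arithmetic, assuming a total U.S. nuclear capacity on the order of 100,000 megawatts (the article does not state the total; 105 units averaging roughly 950 MW each is an assumption used here for illustration):

```python
# Hypothetical fleet size: 105 units averaging about 950 MW each.
total_capacity_mw = 105 * 950  # ≈ 99,750 MW

# A one-percentage-point rise in fleet-wide capacity factor yields the
# same additional energy as adding 1% of total capacity, run year-round:
equivalent_new_capacity_mw = total_capacity_mw * 0.01
print(round(equivalent_new_capacity_mw))  # → 998
```

With a fleet of roughly 100,000 MW, one percentage point of capacity factor is indeed close to the 1,000 MW of new capacity cited in the text.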

The Nuclear Energy Institute now calculates capacity factors based on gross generation--the format used outside the United States--rather than on net generation and reported maximum dependable capacity (MDC), which slightly overstates operating efficiency. Because of this change, made in 1998, U.S. capacity factors are several percentage points lower than when they were based on net generation and MDC.

In 1997, nuclear power plants provided 631 billion kilowatt-hours, one-fifth of the electricity generated by utilities in the United States. The rise in capacity factor over the past decade is the result of plant modifications, improved operating and maintenance practices, and more attention to training of nuclear plant personnel.

Drive for Excellence

During the 1980s, U.S. utilities committed to a major nuclear power plant improvement program. Its success is partly due to the initiatives of the Institute of Nuclear Power Operations (INPO). INPO is an industry-sponsored organization that evaluates U.S. nuclear power plants and sets goals for excellence in operations.

As part of its program, INPO monitors 10 key performance indicators, such as unplanned automatic shutdowns, safety system actuations and heat rate. INPO collects these data from each nuclear unit, calculates national averages, and submits them to the World Association of Nuclear Operators (WANO). Each of WANO's performance indicators reveals significant improvement since 1980.

Unit capability factor is the percentage of maximum energy generation a plant is capable of supplying. Although it slipped to 81.6% in 1997 from 82.5% in 1996, the industry's unit capability factor has risen nearly 20 percentage points since 1980--a jump of more than 30%.

Unplanned automatic scrams are plant shutdowns caused by some imbalance in operations. They are measured for 7,000 hours of operation (about one year). This number fell from 7.3 per plant in 1980 to virtually zero in 1997. The decrease in recent years points to the effectiveness of training and maintenance programs.
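The per-7,000-hour figure is a normalized rate rather than a raw count. A minimal sketch of how such a normalization works (the exact WANO formula is an assumption here, not quoted in the text):

```python
def scram_rate(unplanned_scrams, hours_of_operation):
    """Unplanned automatic scrams normalized to 7,000 hours of
    operation (roughly one year of running time)."""
    return unplanned_scrams * 7000 / hours_of_operation

# Hypothetical unit: 2 scrams over 6,500 hours of operation in a year:
print(round(scram_rate(2, 6500), 2))  # → 2.15
```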

Thermal performance measures "gross heat rate," or the number of British thermal units (Btu) required to produce a kilowatt-hour of electricity. The lower the number, the more efficient the plant. The gross heat rate has fallen from 10,504 Btu/kWh in 1980 to 10,138 in 1997.
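Heat rate converts directly to thermal efficiency, since one kilowatt-hour of electricity is equivalent to about 3,412 Btu. A sketch of the conversion, applied to the two heat rates cited above:

```python
BTU_PER_KWH = 3412  # energy content of one kilowatt-hour, in Btu

def thermal_efficiency(heat_rate_btu_per_kwh):
    """Fraction of the plant's heat input converted to electricity."""
    return BTU_PER_KWH / heat_rate_btu_per_kwh

# The 1980 and 1997 gross heat rates cited in the text:
print(round(thermal_efficiency(10504) * 100, 1))  # → 32.5
print(round(thermal_efficiency(10138) * 100, 1))  # → 33.7
```

The drop of roughly 370 Btu/kWh thus corresponds to about a one-percentage-point gain in thermal efficiency.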

Collective radiation exposure has trended downward since 1980 at both boiling water reactors (BWRs) and pressurized water reactors (PWRs). At BWRs, radiation exposures fell from 859 man-rem per unit in 1980 to an all-time low of 184 in 1997. At PWRs, exposures fell from 417 man-rem per unit in 1980 to a record low of 124 in 1997. The declines show the effectiveness of radiological protection programs.

Industrial safety accident rates have been cut by nearly 80% since 1980 and now stand at 0.45 industrial accidents per 200,000 man-hours worked.

Volumes of solid low-level radioactive waste produced by both PWRs and BWRs were less than the industry's goals for the year 2000. Low-level waste volume from PWRs averaged 18 cubic meters in 1997, substantially better than the 500 cubic meters produced in 1980. The typical BWR generated 77 cubic meters of low-level waste in 1997, down from 950 cubic meters in 1980.

Nuclear Electricity: Becoming More Competitive

The nuclear industry is a mature business. Nuclear power plants are operating more safely, more productively and more competitively. Since 1980, the industry has made significant changes in the way it operates nuclear power plants. These changes, which required increased staffing and safety improvement work, boosted plant performance, reliability and output. At the same time, they pushed up operating and maintenance (O&M) costs.

As these changes became institutionalized in utility programs, however, O&M costs stabilized. Average O&M costs for nuclear plants--measured in 1996 dollars--were 1.76 cents per kWh in 1990, 1.69 cents in 1991, 1.72 cents in 1992, 1.68 cents in 1993, 1.48 cents in 1994, 1.39 cents in 1995, and 1.36 cents in 1996, based on figures from the Utility Data Institute, an independent research organization. Moreover, nuclear energy is competitive with other sources of electricity. With average production costs--including fuel--of 1.91 cents per kWh in 1996, nuclear is only marginally more costly than coal at 1.82 cents/kWh, and considerably less expensive than natural gas at 3.38 cents/kWh and oil at 4.14 cents/kWh.

To help improve efficiency, the U.S. nuclear energy industry has created mechanisms to share good economic practices. A major industrywide benchmarking program was launched in 1995 to study work management and outage management practices at top-performing plants both in the United States and overseas.



----------
Please write to us at insp@pnl.gov
The content was last modified on 02/04/99.