Nuclear Is Not the Way
__Read a related essay from the Autumn 2006 issue, "Nuclear Power Is the Future," by Max Schulz.__
Decades after the promise of nuclear energy “too cheap to meter” was swamped in a sea of red ink and trampled by the Three Mile Island accident in 1979, the nuclear power industry is seeking to reinvent itself by claiming that it will help save the world from the perils of global warming. It has found an ally in the Bush administration, which has spurned the Kyoto Protocol as too costly even as it beats the drum for nuclear power at home and abroad. Last year, the administration persuaded Congress to pass an energy bill authorizing billions of dollars in potential subsidies for new nuclear power plants.
Could nuclear power really help save the world from what is arguably the worst environmental scourge ever to confront humanity? History suggests the need for two things: caution about the nuclear industry’s messianic proclamations, and careful analysis.
The technical facts are reasonably clear. In the United States, the largest source of carbon dioxide (CO2), the most important greenhouse gas, is the electric power sector, followed closely by transportation. Together, these sectors accounted for nearly 72 percent of U.S. CO2 emissions in 2004. Coal, the dirtiest of the fossil fuels, supplies 50 percent of U.S. electricity. By contrast, nuclear power emits far lower levels of CO2, even when uranium mining, enrichment, and fuel fabrication are taken into consideration.
At first blush, these facts would seem to support the promoters of nuclear energy. But a shortage of low- or zero-CO2 sources of energy is not the problem we face in confronting global warming. The scarce commodity is money. What will give the biggest bang for the global warming buck? A related question: What other problems may be created in the process of reducing CO2 emissions? Any energy source must meet the tests of safety, reliability, and cost. In addition, there are unique problems associated with nuclear energy: the potential for nuclear weapons proliferation arising from the fact that developing and using nuclear power creates the twin byproducts plutonium (in the spent nuclear fuel) and nuclear know-how. Moreover, an expansion of nuclear power would require a vast increase in the world’s uranium enrichment capacity—the very technology that the United States and other countries now desperately want to prevent Iran from acquiring. While commercial-power reactor fuel cannot be used in a nuclear bomb, commercial enrichment plants can be reconfigured to produce weapons-grade uranium.
Taken together, cost, proliferation, and accident risks caused the promise of nuclear power as a “magical” energy source, as Alvin Weinberg, the longtime director of Oak Ridge National Laboratory, once put it, to evaporate the first time around. How serious will these risks become if nuclear power has a second life?
The most important consideration is how many nuclear plants would be needed to significantly reduce future CO2 emissions. A 2003 study by researchers at the Massachusetts Institute of Technology, The Future of Nuclear Power, considered a reference case in which 1,000 one-gigawatt (GW) nuclear plants would be in operation around the world by 2050. (A gigawatt is enough electricity to power a U.S. city of half a million.) Even with such an increase, however, the proportion of electricity supplied by nuclear power worldwide would rise only slightly, from about 16 percent in 2000 to about 20 percent in 2050. As a result, the number of fossil fuel power plants, and thus the amount of CO2 emissions, would continue to increase.
A more serious effort to limit carbon emissions through the use of nuclear power would require a larger number of reactors. In Insurmountable Risks: The Dangers of Using Nuclear Power to Combat Climate Change (2006), one of us used the same projected growth in electricity demand employed in the MIT report to estimate the number of reactors required simply to maintain the electricity sector’s CO2 emissions at their 2000 levels. Some 2,500 one-GW nuclear plants would be needed by midcentury. To meet that goal, one plant would have to come online somewhere in the world every six days between 2010 and 2050.
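That build rate follows from simple arithmetic. The sketch below checks it under the assumption of a uniform 40-year construction window from 2010 to 2050, ignoring the replacement of existing reactors that retire in the meantime.

```python
# Rough check of the build rate implied by 2,500 one-GW plants by 2050.
# Assumes a uniform 40-year construction window (2010-2050) and ignores
# retirements of existing reactors.
plants_needed = 2500
years = 2050 - 2010
days = years * 365.25

days_per_plant = days / plants_needed
print(f"One new plant every {days_per_plant:.1f} days")  # ~5.8 days
```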
The largest risk of such an expansion of nuclear power is likely to be the increased potential for proliferation of nuclear weapons. It has been known since the dawn of the nuclear age that nuclear power and proliferation are inextricably linked. In order to fuel 2,500 reactors, the world’s uranium enrichment capacity would need to increase by approximately six times. Just one percent of that capacity could supply enough highly enriched uranium to create 500 nuclear weapons every year. The Iranian enrichment facility at Natanz that has created an international uproar and rumblings of war would, if completed, represent less than 0.1 percent of the enrichment capacity needed to fuel 2,500 reactors. If the plutonium in the spent fuel discharged from that number of reactors each year was separated, it would be enough to make more than 60,000 nuclear bombs, about twice the number in the world’s nuclear arsenals today.
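The scale of those figures can be checked with rule-of-thumb quantities. The per-reactor and per-weapon values in the sketch below are typical published approximations supplied for illustration; they are assumptions on my part, not numbers drawn from the authors' analysis.

```python
# Back-of-envelope check of the enrichment and plutonium figures.
# All per-reactor and per-weapon values are rough, standard approximations
# (illustrative assumptions, not the authors' exact inputs).
reactors = 2500
swu_per_reactor_year = 120_000      # separative work to fuel one 1-GW LWR for a year
swu_per_weapon = 5_000              # rough SWU for ~25 kg of weapons-grade uranium
current_world_capacity_swu = 50e6   # approximate mid-2000s global enrichment capacity

total_swu = reactors * swu_per_reactor_year
print(f"Enrichment capacity needed: ~{total_swu / current_world_capacity_swu:.0f}x current")
print(f"Weapons/year from 1% of that capacity: ~{0.01 * total_swu / swu_per_weapon:.0f}")

pu_per_reactor_year_kg = 200        # plutonium in spent fuel per GW-year, roughly
pu_per_weapon_kg = 8                # IAEA 'significant quantity' for plutonium
print(f"Potential bombs/year if the plutonium were separated: "
      f"~{reactors * pu_per_reactor_year_kg / pu_per_weapon_kg:,.0f}")
```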
Proposals to reduce proliferation risks require intrusive inspections and a consensus that countries will not use commercial technology for weapons purposes even in a crisis. The 1970 Treaty on the Non-Proliferation of Nuclear Weapons (NPT) gives more than 180 non–nuclear weapon states that are signatories the “inalienable right” to nuclear power technology. It also requires the five recognized nuclear-armed states that are signatories to get rid of their weapons, according to a World Court advisory interpretation of the NPT. Yet the United States and the other four powers show no signs of moving toward fulfillment of that commitment. Without a clear movement toward disarmament, the desire for at least the potential to build nuclear weapons will remain widespread, and the acquisition of commercial nuclear technology will remain the most attractive means of keeping that potential alive. No overt move toward nuclear weapons is required. But it is interesting that Brazil opened a commercial uranium enrichment plant in 2005 and Argentina has announced that it is returning to pursuit of commercial enrichment.
The Bush administration’s proposed Global Nuclear Energy Partnership may be accelerating the trend toward national nuclear capability. The proposal, which the administration is pursuing in cooperation with Russia, is to have countries with existing facilities supply fresh fuel to other countries and take back the spent fuel under international guarantee. Essentially, the proposal would void the “inalienable right” guarantee for those countries without enrichment or plutonium separation technology.
Another unique danger of nuclear power is the potential for a catastrophic accident or well-coordinated terrorist attack to release a large amount of radiation. Such a release could have severe health and environmental consequences, as the 1986 Chernobyl accident showed. The accident at Three Mile Island was not a radiological catastrophe of Chernobyl’s magnitude because the secondary containment (the concrete wall encasing the entire reactor structure) held. But even an accident without a breach of the secondary containment would cost a great deal.
The Three Mile Island accident was followed by a rapid escalation of nuclear power plant costs, partly because of necessary new safety rules and partly because of rapidly rising interest rates. The largest bond default in utility history occurred in the early 1980s, when the Washington Public Power Supply System defaulted on bonds sold to finance nuclear reactors in Washington State that were ultimately canceled. The accident and the bond default figured significantly among the factors that made Wall Street skittish about financing more nuclear power plants, and that hesitation persists today.
The risk of an accident is very difficult to estimate. The calculations rely on estimates of failure probabilities for which data are scant, so they inevitably involve many subjective judgments. William D. Ruckelshaus, the head of the U.S. Environmental Protection Agency under Presidents Richard M. Nixon and Ronald Reagan, cautioned that “risk assessment data can be like the captured spy: If you torture it long enough, it will tell you anything you want to know.”
Uncertainties in risk estimates make it much more difficult for Wall Street to assess the risk that an investment will go sour. As Peter Bradford, a former commissioner of the Nuclear Regulatory Commission, told The New York Times last year, “The abiding lesson that Three Mile Island taught Wall Street was that a group of NRC-licensed reactor operators, as good as any others, could turn a $2 billion asset into a $1 billion cleanup job in about 90 minutes.”
In the nearly 3,000 reactor-years of experience at U.S. nuclear plants, there have been one partial core meltdown and a number of near misses and close calls. By comparison, the total number of reactor-years worldwide if 2,500 reactors were to be built between now and 2050 would be roughly 46,000, assuming a constant rate of growth. Using the median accident probability derived from the American experience, and assuming that future plants will be 10 times safer than today’s, we find a likelihood of better than one chance in two that at least three accidents comparable to the one at Three Mile Island would occur by midcentury. A single severe accident by itself could bring the whole approach of using large numbers of nuclear reactors to a screeching halt, leaving plans for CO2 reduction in disarray.
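The essay does not reproduce the accident rate used in Insurmountable Risks, but the structure of such an estimate is a standard Poisson calculation: multiply an assumed per-reactor-year accident frequency by the projected reactor-years to get an expected number of accidents, then compute the probability of three or more. The sketch below shows only that machinery, with two illustrative rates (the bare U.S. historical frequency of roughly one partial meltdown per 3,000 reactor-years, and a tenfold improvement on it); the book's own median estimate is not given here, and the result is quite sensitive to that input.

```python
from math import exp, factorial

def prob_at_least(k, rate_per_reactor_year, reactor_years):
    """P(at least k accidents), treating accidents as a Poisson process."""
    lam = rate_per_reactor_year * reactor_years   # expected number of accidents
    p_fewer = sum(exp(-lam) * lam**i / factorial(i) for i in range(k))
    return 1 - p_fewer

reactor_years = 46_000   # projected worldwide total, 2010-2050, per the essay
# Illustrative rates only: the bare U.S. historical frequency, and ten times safer.
for rate in (1 / 3_000, 1 / 30_000):
    print(f"rate = 1 per {1/rate:,.0f} reactor-years -> "
          f"P(>=3 TMI-scale accidents) = {prob_at_least(3, rate, reactor_years):.2f}")
```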
Finally, there is the difficulty of managing radioactive waste. Building 2,500 reactors by 2050 would lead to nearly a quadrupling of the average rate at which spent fuel is generated. Assuming a constant rate of growth, one repository with the legal capacity of the U.S. government’s Yucca Mountain facility in Nevada would have to come online somewhere in the world every three years. The seriousness of that challenge is illustrated by the fact that Yucca Mountain itself is years from being operational. Its opening was originally scheduled for 1998. It is now set, at the earliest, for 2017, and even that target is unlikely to be met. And the U.S. Department of Energy has already spent nearly $9 billion on Yucca Mountain—money that federal law requires nuclear utilities to charge their ratepayers. In the meantime, the cost of storing spent fuel at the country’s 66 reactor sites has been soaring, and utilities have sued the Energy Department for breach of contract for not removing their spent fuel. The lack of a repository has become a major stumbling block to the expansion of nuclear power.
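The "one repository every three years" figure can be roughly reproduced from two standard quantities, both assumptions on my part rather than numbers quoted in the essay: a light-water reactor discharges on the order of 20 metric tons of spent fuel per gigawatt-year, and Yucca Mountain's capacity is capped by statute at 70,000 metric tons.

```python
# Rough reproduction of the "one Yucca Mountain every three years" figure.
# Both inputs are typical published values (assumptions, not quoted in the essay).
reactor_years = 46_000                 # projected worldwide total, 2010-2050
spent_fuel_per_gw_year_t = 20          # metric tons of spent fuel per GW-year
yucca_legal_capacity_t = 70_000        # statutory cap under U.S. law

total_spent_fuel_t = reactor_years * spent_fuel_per_gw_year_t
repositories_needed = total_spent_fuel_t / yucca_legal_capacity_t
print(f"Repositories needed by 2050: ~{repositories_needed:.0f}")
print(f"One new repository every {40 / repositories_needed:.1f} years")
```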
Alternatives to repository disposal are unlikely to be feasible. Reprocessing the spent fuel, as some propose, would greatly increase the dangers of nuclear power because it involves the separation of weapons-usable plutonium from fission products. While proponents claim that reprocessing would greatly reduce the space needed for a repository, the claim depends largely on the assumption that uranium, which constitutes 95 percent of the weight of spent fuel, would be disposed of in shallow storage facilities of the type used for “low-level” radioactive waste, even though it is far too radioactive for such disposal. The authors of the 2003 MIT study argued against reprocessing. Instead, they proposed interim storage of nuclear wastes accompanied by expanded research on a technique called deep borehole disposal. At several thousand feet, the boreholes would be much deeper than a typical geologic repository; in concept, each would hold less spent fuel, and the greater depth would reduce environmental impacts. But the costs and pitfalls of this strategy are not yet well understood.
Committing to a large increase in the rate of waste generation based only on the potential plausibility of a new waste management strategy such as deep boreholes would be to repeat the central error of the past. The concept of repositories like Yucca Mountain dates back to at least 1957, but not one spent fuel rod has yet been permanently disposed of.
Even with optimistic but plausible assumptions for cost improvements, nuclear power is likely to remain an expensive source of electricity compared to fossil fuels. According to the 2003 MIT study and a 2004 study by researchers at the University of Chicago, both of which advocated the pursuit of nuclear power, electricity from new nuclear plants is likely to cost between six and seven cents per kilowatt-hour (kWh). By contrast, new coal-fired plants produce power at about four cents per kWh (without CO2 sequestration). Proponents of nuclear power speak of further cost improvements, but these remain speculative. Rising interest rates and skepticism on Wall Street, the ultimate underwriter of any nuclear expansion in the United States, suggest that costs actually could be much higher.
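Figures like these are levelized costs: annualized capital charges plus operation, maintenance, and fuel, divided by the electricity generated. The sketch below shows that arithmetic with round illustrative inputs in the neighborhood of mid-2000s published estimates; they are not the actual assumptions of the MIT or University of Chicago studies.

```python
# Minimal levelized-cost sketch (illustrative inputs, not the studies' own).
def lcoe_cents_per_kwh(overnight_cost_per_kw, fixed_charge_rate,
                       capacity_factor, om_cents, fuel_cents):
    """Levelized cost: annualized capital per kWh plus O&M and fuel."""
    kwh_per_kw_year = 8760 * capacity_factor
    capital_cents = overnight_cost_per_kw * fixed_charge_rate / kwh_per_kw_year * 100
    return capital_cents + om_cents + fuel_cents

# Round numbers roughly in line with mid-2000s estimates (assumed, for illustration).
nuclear = lcoe_cents_per_kwh(2000, 0.15, 0.85, om_cents=1.5, fuel_cents=0.5)
coal    = lcoe_cents_per_kwh(1200, 0.12, 0.85, om_cents=0.8, fuel_cents=1.2)
print(f"Nuclear: ~{nuclear:.1f} cents/kWh, coal: ~{coal:.1f} cents/kWh")
```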
Are there any reasonable alternatives that can reduce CO2 emissions for the same cost? Of the available near-term options (i.e., those likely to be available during the next 10 years), the two most important in the United States are an increase in efficiency and an expansion of wind power. Efficiency is a no-brainer: efficiency measures generally pay for themselves and yield a net economic gain. So we will assume that any approach would adopt all economical efficiency measures. What about supply?
At approximately four to six cents per kWh, wind power at favorable sites in the United States is already competitive with natural gas and new nuclear power. With the proper priorities on upgrading the transmission and distribution infrastructure and changing regulations, wind power could expand rapidly. Without any major changes in the existing electricity grid, wind power could generate 15 to 20 percent of the U.S. electricity supply—almost the same fraction as nuclear power now supplies. In other words, wind energy can accomplish what nuclear advocates claim, at a lower cost, and without the proliferation headaches, so long as the total amount of wind energy is less than about 20 percent. (Because wind is an intermittent energy source whose availability varies from day to day, boosting wind’s share of the electricity supply in the United States beyond these levels would require the development of new energy storage facilities.) Wind energy development in sensitive or scenic areas is not necessary to achieve this. The potential wind energy supply in the Midwest, Southwest, and Rocky Mountain states, where the prospect of substantial royalties makes turbines very attractive to farmers and ranchers, is two-and-a-half times total U.S. electricity generation and 12 times total U.S. nuclear power generation.
As for solar power, recent technological breakthroughs in thin-film solar cells promise to lower costs from about $5 or $6 per peak watt today to only $1 to $1.50 per peak watt in less than five years. (A peak watt is a measure of output at the peak of sunshine in the summer.) This would put solar in about the same cost category as wind. But solar has the advantage of low transmission and distribution costs, since the units can be located right where their output is used. On-site solar can be put into the same grid as off-site wind in an arrangement called a “distributed grid.” Such a grid can reduce the fluctuations associated with each of these intermittent power sources by capitalizing on the fact that they often do not fluctuate in tandem.
Still, intermittency remains a challenge. For instance, there are many times when the wind falls off after sunset, but electricity is still needed. The problem can be overcome in two ways. The first is to invest in some form of storage. The second is to install capacity that can operate on demand—that is, capacity that is not dependent on the weather. These can be used in complementary fashion.
The most immediately available form of storage is pumped hydropower. Wind and solar electricity can be used to pump water into existing reservoirs, from which hydroelectricity could be generated during periods of insufficient sunlight or wind. Also immediately available are gas turbines and “combined-cycle” power plants, both already in wide use. Natural gas is now so expensive a fuel that it would pay to idle part of the existing gas-fired fleet and keep it in reserve for periods when wind and solar generation falls short. When used together, wind, sun, pumped hydro, and natural gas can provide as large a share of electricity as coal does today (about 50 percent) for about the same cost as new nuclear power. And that’s only at current prices. In the future, nuclear power will likely be more expensive than promised, while wind and solar costs have been coming down steadily and are likely to continue falling.
Yet another non-nuclear route to reducing CO2 emissions lies in applying new techniques and technologies to today’s largest and dirtiest source of electricity. Integrated gasification combined-cycle (IGCC) plants turn coal into a gas that can be burned, making it easier to capture coal-related pollutants, including toxic metals such as mercury, and, most important, CO2. The captured CO2 can be injected into geologic formations, such as exhausted oil and gas fields, where it is estimated to remain for centuries or longer. Injection is not an exotic technique; it has been used as a way to enhance oil recovery since at least 1972. And the energy industry has demonstrated the feasibility of sequestering CO2 at both the Sleipner gas fields in the North Sea and the In Salah natural gas fields in Algeria. The Sleipner sequestration project began after the imposition of a tax on carbon emissions by the Norwegian government, and the In Salah project was undertaken, in part, to further demonstrate the viability of geologic storage of CO2. While the costs of such strategies are more uncertain than those of other mitigation options, estimates of the cost of electricity from IGCC plants with carbon sequestration range from 4.2 to 8 cents per kWh. Of course, this technique does not overcome other disadvantages of coal, such as the destruction wrought by surface mining, which can only be mitigated by government regulation.
Physics was never an obstacle to nuclear power. In theory, fission could be the world’s biggest source of electrical power. But the nuclear promise was defeated by engineering realities that led to high costs, the risk of accidents with consequences for many generations, waste disposal headaches, and, most worrisome of all, a much increased potential for the proliferation of nuclear weapons. To rely upon nuclear power to combat global warming would pose risks so severe that they should, by any sensible accounting, be unacceptable, given that safer alternatives exist. These alternatives are not cost free. But if our children don’t like to look at windmills or solar panels, they can always do away with them. The same cannot be said of nuclear weapons and nuclear waste spread to the far corners of the world.
__Max Schulz responds:__
It will serve nobody’s interest to offer a line-by-line rebuttal of Arjun Makhijani’s and Brice Smith’s essay. The space allotted is too short. In any event, I have offered my own argument, which I am happy to let stand on its own. Still, there are a few broad points I want to address about their piece, which, while eloquently argued, I believe ultimately falls short.
Messrs. Makhijani and Smith hang their criticism of nuclear power on three things: “cost, proliferation, and accident risks.” They cite these as the principal factors that dealt a blow to the nuclear industry after the 1979 Three Mile Island accident, and they go on to wonder, ominously, “How serious will these risks become if nuclear power has a second life?”
Let’s start with the question of cost. First, it is fair to say the issue of cost is not really a “risk,” at least not like the very real ones associated either with proliferation or an accident at a nuclear plant. As to what the precise costs are for generating electricity from nuclear power, as opposed to other sources such as wind (cited by the authors as a “reasonable alternative . . . for the same cost”), observant readers will note that we use different figures. What gives? As with many points, determining these costs is open to interpretation.
What is not open to interpretation, however, is the fact that nuclear power produces about 20 percent of America’s electricity. Wind generates a fraction of one percent. Whether nuclear power is an economic method to produce electricity or not is something I am happy to let the market decide. Same with wind and other renewables. So far, the market is making a very definitive statement about the relative merits of these technologies.
The authors hint at wind and solar’s problems. Because the wind doesn’t always blow and the sun doesn’t always shine, the power derived from these sources can be intermittent and unreliable. (They do not point out the flip side, though: the obvious advantage of nuclear is that it can provide huge volumes of high-grade, absolutely reliable power around the clock.) But they dismiss this, treating it as if it were just a minor problem to be tweaked. And how to do that? No problem. Just “develop . . . new energy storage facilities,” as if this were as easy as building a huge storage tanker to hold electricity generated when the wind is blowing hardest and the sun shining brightest.
But it doesn’t work that way. Electricity cannot be stored like coal or gas. It must be generated essentially at the time of use. The idea of a large-scale storage facility for electricity is something akin to cold fusion or the perpetual-motion machine: wonderful ideas, but existing only in the imagination. Messrs. Makhijani and Smith hold out pumped hydropower as an example of storage, where “wind and solar electricity can be used to pump water into existing reservoirs, from which hydroelectricity could be generated during periods of insufficient sunlight or wind.” But there is something of a Rube Goldberg feel to pumped hydropower, which requires two different reservoirs at varying heights, massive quantities of water, and hydropower turbines to generate electricity, in addition to the large-scale wind facilities or massive solar panels that would be needed to power pumps to move the water from the lower reservoir to the upper one. Such a system would require huge amounts of space. Is this really a practical and economical answer to meeting the challenge of greatly increased electricity demand in the coming decades, not to mention supplanting any electricity already being generated by nuclear power or coal?
The authors’ other objections to nuclear power center on the risks of proliferation as well as the possibility of a nuclear accident. These are not illegitimate concerns, and they have been with us since the dawn of the nuclear age. We could decommission every commercial nuclear energy plant in the United States tomorrow, however, and we would still face a serious proliferation threat worldwide. The nuclear genie is out of the bottle, to use the old cliché, and there is no putting it back. (And that would have been the case had there been no Manhattan Project and had the United States forsworn the pursuit of an atomic weapon. The Nazis or the Soviets certainly would have developed one, given enough time.) The emergence of the threat posed by a nuclear Iran or North Korea owes much more to the malign ambitions of evil regimes and the impotence of international institutions such as the United Nations, the International Atomic Energy Agency, and the Nonproliferation Treaty than to the inherent dangers of nuclear fission. The IAEA has effectively abdicated its responsibility to police the nuclear arena, and the world is far less safe as a result. That is why nonproliferation must be among the very highest priorities for every American presidential administration.
The possibility of an accident at a nuclear plant, similarly, is not to be dismissed out of hand or taken lightly. This is particularly the case, as the authors point out, if we see widespread construction of new nuclear plants around the world in the coming decades. Still, one thing they fail to mention is that nuclear power has experienced huge advances in operational safety since Three Mile Island. Part of this is the result of safety procedures mandated by the Nuclear Regulatory Commission. Part is due to self-regulation on the part of industry. Part is the product of improvements and efficiencies in modern technologies after a quarter century. All the newest next-generation nuclear research around the world is aimed at developing systems that are safer and more resistant to proliferation than the ones presently in use.
No energy technology is without risks. Nuclear is no exception. The questions are whether the benefits outweigh its risks and whether it outperforms the alternatives. In the case of nuclear energy, the answer to both questions is yes.