Nuclear sees power wane: soaring costs and cheap natural gas have dealt more setbacks to the industry

DEVELOPERS of the Vogtle nuclear project in Georgia announced costs had swelled to more than $25 billion and predicted completion will be delayed 18 more months. (John Bazemore Associated Press)

The nuclear energy industry has had a bad couple of weeks.

On July 31, utilities in South Carolina announced that they are stopping work on two new reactors at the V.C. Summer Nuclear Generating Station, saying cost estimates came to more than $20 billion, almost twice what was expected. About $9 billion has already been spent on the project since 2008.

Then, two days later, developers of another nuclear project in the South — the Vogtle Electric Generating Plant in Georgia — announced costs had swelled from $14 billion to more than $25 billion and predicted completion will be delayed an additional 18 months.

The two announcements come at a particularly bad time for the industry. No nuclear power plant has been built in the United States for 30 years, the nation’s fleet of 99 reactors is getting older, and 10 existing plants have announced plans to shut down in the coming years, including Diablo Canyon, the last remaining nuclear plant in California.

And in March, Westinghouse Electric Co., long considered the leader in nuclear power development, filed for bankruptcy protection.

Then there is nuclear’s problem when it comes to competing with natural gas.

Driven by developments in hydraulic fracturing and horizontal drilling techniques, oil and gas producers working sites such as the Marcellus shale formation have dramatically increased the amount of natural gas produced across the U.S.

The abundance has driven down prices, and utilities have increasingly turned to natural gas as an alternative to nuclear, as well as coal.

“I think what happened to the operating plants is the price of natural gas fell to levels that no one had ever predicted,” said Jay Silberg, a nuclear energy lawyer and partner at the Washington law firm of Pillsbury Winthrop Shaw Pittman.

Another factor is the increasing amount of renewable energy on the grid.

“It doesn’t surprise me at all that these plants are getting canceled,” said Rochelle Becker, a longtime critic of nuclear power and executive director of Alliance for Nuclear Responsibility in San Luis Obispo.

“They’re like very expensive dominoes that are falling…. With the price of natural gas and the availability of renewable and new sources that are continuing to hit the market, nuclear is pretty much dead in this country.”

According to the most recent numbers from the California Energy Commission, renewable energy made up 27.9% of in-state generation of electricity in 2016, almost twice as much as in 2009.

In addition, California is one of 28 states that have instituted Renewable Portfolio Standards (RPS), mandating utilities to include increasing amounts of clean-energy sources such as wind and solar in their grids.

The most recent iteration of California’s RPS calls for the state to derive 50% of its electricity from sources that do not emit carbon by 2030, and there’s a bill in the Legislature this session that would take the target all the way to 100% by 2045.

Like most states, California does not classify nuclear as part of its clean-energy portfolio.

Another stumbling block is the radioactive waste that accompanies nuclear power plants.

Even though the San Onofre Nuclear Generating Station has not produced electricity for more than five years, 3.55 million pounds of spent nuclear fuel remains at the plant within sight of the Pacific Ocean.

A CONSTRUCTION vehicle at the Vogtle plant. On July 31, utilities in nearby South Carolina announced that they are stopping work on two new reactors. (Erik S. Lesser European Pressphoto Agency)

As at nuclear sites across the country, San Onofre’s waste has been stranded because the federal government has not fulfilled its promise to complete a storage facility where nuclear waste can be deposited.

But nuclear energy still has its supporters.

The projects in Georgia and South Carolina each adopted an advanced reactor design called AP1000 developed by Westinghouse. Though the design has its share of critics, a former president of the American Nuclear Society defended it.

“The AP1000 is an excellent technology,” said Ted Quinn, who runs a consulting firm in Dana Point. “In the area of technology, we’re good. In the area of construction practices, it’s a combination of the workforce and the type of contracts that are written. We’re challenged in that area.”

Quinn and others say nuclear energy needs to be retained because of its ability to ensure reliable base load power for the grid, and they maintain that the industry’s ability to generate large amounts of energy without emitting greenhouse gases makes it essential in reaching targets to reduce global warming.

“Without an aggressive buildout of nuclear power, climate goals are still attainable, but at much greater expense,” Jeffrey Sachs, director of Sustainable Development Solutions Network, told Bloomberg News. “We’d make a big mistake if we decide right now we don’t need it.”

A report from Environmental Progress, a pro-nuclear environmental group in Berkeley, said California’s power-sector emissions are 2½ times higher today than they would have been had the state kept its existing nuclear plants open and built the ones it had planned.

Before San Onofre was shut down, nuclear power accounted for 18% of California’s in-state generation. Since its closure, the figure has dropped to 9%, with Diablo Canyon as the only nuclear plant left. Natural gas’ contribution to the mix — and that of renewables — has gone up.

The Brattle Group, an international consulting company, came out with a study in December 2016 that said premature retirements of nuclear plants could increase greenhouse gas emissions. Its research showed that reductions made today have more effect than those made in the future.

“Since CO2 emissions persist for many years in the atmosphere, near-term emission reductions are more helpful for climate protection than later ones,” the study said. “Thus, preserving existing nuclear plants will improve the effectiveness of any climate policy approach, by holding down cumulative emissions.”

But Becker said nuclear’s waste issues blunt that argument.

“For 60 years we haven’t been able to find a solution to the waste that’s left behind at these nuclear plants,” she said. “What you have is very expensive back-end costs. In fact, the back-end costs can be as large as the front-end costs.”

The nuclear industry sees promise in a new generation of plants, including “small modular reactors” (SMRs) that take up a fraction of the space of current facilities and can be used in a multitude of locations, including remote sites.

San Diego-based General Atomics has been working on what’s called the Energy Multiplier Module. But no SMRs are online yet. General Atomics hopes to have its project ready in 2030.

Internationally, the forecast is mixed. Russia and China are building nuclear projects, with China expected to complete five new plants this year alone.

But Germany swore off nuclear power after the Fukushima disaster in Japan, and two countries that have embraced nuclear in the past may be making an about-face.

The environment minister in France said last month that the country may close up to 17 reactors to reduce its reliance on nuclear power and boost its amount of renewable energy, and South Korea just elected a new president who has promised to deemphasize nuclear power.

“We will abolish our nuclear-centered energy policy and move toward a nuclear-free era,” Moon Jae-In said in June.

For its critics, nuclear is becoming yesterday’s news.

“The future is not big, base-load plants,” Becker said. “It’s distributed generation and renewable and other energy that is on the table that we haven’t even talked about.”

But nuclear’s supporters see a brighter future, even if the present is problematic.

“I don’t think we’ll see large nuclear power plants in the near future,” Silberg said. “We may see SMRs if they in fact get designed and licensed. But I don’t think at this point in the next 15 or 20 years utility management is going to want to invest very much in a large nuclear plant project, unless of course natural gas prices go back up and the industry figures out how to overcome the problems that showed up at Vogtle and Summer.”

Originally written for the San Diego Union Tribune


Atomic power stations out at sea may be better than inland ones

Floating reactors are on their way. Submarine ones may follow.

AFTER the events of March 11th 2011, when an earthquake and tsunami led to a meltdown of three nuclear reactors at the Fukushima Dai-ichi power plant in Japan, you might be forgiven for concluding that atomic power and seawater don’t mix. Many engineers, though, do not agree. They would like to see more seawater involved, not less. In fact, they have plans to site nuclear power plants in the ocean rather than on land—either floating on the surface or moored beneath it.

At first, this sounds a mad idea. It is not. Land-based power stations are bespoke structures, built by the techniques of civil engineering, in which each is slightly different and teams of specialists come and go according to the phase of the project. Marine stations, by contrast, could be mass-produced in factories using, if not the techniques of the assembly line, then at least those of the shipyard, with crews constantly employed.

That would make power stations at sea cheaper than those on land. Jacopo Buongiorno, a nuclear engineer at the Massachusetts Institute of Technology, reckons that, when all is done and dusted, electricity from a marine station would cost at least a third less than that from a terrestrial equivalent. It would also make them safer. A reactor anchored on the seabed would never lack emergency cooling, the problem that caused the Fukushima meltdown. Nor would it need to be protected against the risk of terrorists flying an aircraft into it. It would be tsunami-proof, too. Though tsunamis become great and destructive waves when they arrive in shallow water, in the open ocean they are mere ripples. Indeed, were it deep enough (100 metres or so), such a submarine reactor would not even be affected by passing storms.

Water power

All these reasons, observes Jacques Chénais, an engineer at France’s atomic-energy commission, CEA, make underwater nuclear power stations an idea worth investigating. Dr Chénais is head of small reactors at CEA, and has had experience with one well-established type of underwater reactor—that which powers submarines. He and his team are now assisting Naval Group, a French military contractor, to design reactors that will stay put instead of moving around on a boat. The plan is to encase a reactor and an electricity-generating steam turbine in a steel cylinder the length of a football pitch and with a weight of around 12,000 tonnes.

The whole system, dubbed Flexblue, would be anchored to the seabed between five and 15km from the coast—far enough for safety in case of an emergency, but near enough to be serviced easily. The electricity generated (up to 250 megawatts, enough for 1m people) would be transmitted ashore by an undersea cable. For refuelling, and for maintenance that cannot be carried out underwater, the cylinder would be floated to the surface with air injected into its ballast tanks. And, when a station came to the end of its useful life, it could be towed to a specialist facility to be dismantled safely, rather than requiring yet another lot of civil engineers to demolish it.

Naval Group has not, as yet, attracted any customers for its designs. But a slightly less ambitious approach to marine reactors—anchoring them on the surface rather than below it—is about to come to fruition in Russia. The first such, Akademik Lomonosov, is under construction at the Baltic Shipyard, in St Petersburg. According to Andrey Bukhovtsev of Rosatom, the agency that runs Russia’s civil nuclear programme, it is 96% complete. It will be launched later this year, towed to Murmansk, and thence transported to Pevek, a port in Russia’s Far East, where it will begin generating power in 2019.

Akademik Lomonosov consists of two 35MW reactors mounted on a barge. The reactors are modified versions of those used to power Taymyr-class icebreakers. As such, they are designed to be able to take quite a battering, so the storms of the Arctic Ocean should not trouble them. To add to their safety, the barge bearing them will be moored, about 200 metres from shore, behind a storm-and-tsunami-resistant breakwater.

Altogether, Akademik Lomonosov will cost $480m to build and install—far less than would have to be spent constructing an equivalent power station on land in such a remote and hostile environment. And, on the presumption that the whole thing will work, plans for a second, similar plant are being laid.

Nor is Russia alone in planning floating reactors. China has similar ambitions—though the destinations of the devices concerned are more controversial than those of Russia’s. Specifically, the Chinese government intends, during the 2020s, to build up to 20 floating nuclear plants, with reactors as powerful as 200MW, to supply artificial islands it is building as part of its plan to enforce the country’s claim to much of the South China Sea—a claim disputed by every other country in the area.

The firms involved in this project intend to tsunami-proof some of their reactors in the same way as the French, by stationing them in water too deep for massive tsunami waves to form. Because they are at the surface, though, that will not save them from storms—and locating them far from shore means the Russian approach of building sheltering breakwaters will not work either. That matters. Typhoons in the South China Sea can whip up waves with an amplitude exceeding 20 metres.

To withstand such storms, the barges will have anchors that are attached to swivelling “mooring turrets” under their bows. These will cause a barge to behave like a weather vane, always pointing into the wind. Since that is the direction waves come from, it will remain bow-on to those waves, giving it the best chance of riding out any storm that nature cares to throw at it. The barges’ bows will also be built high, in order to cut through waves. This way, claims Mark Tipping of Lloyd’s Register, a British firm that is advising on the plants’ design, they will be able to survive a “10,000-year storm”.

The South China Sea is also a busy area for shipping, so any floating power stations there will need to be able to withstand a direct hit by a heavy-laden cargo vessel travelling at a speed of, say, 20 knots—whether that collision be accidental or the result of hostile action. One way to do this, says Chen Haibo, a naval architect working on the problem at Lloyd’s Register’s Beijing office, is to fit the barges with crumple zones packed with materials such as corrugated steel and wood.

Not everyone is delighted with the idea of marine nuclear power. Rashid Alimov, head of energy projects at Greenpeace Russia, an environmental charity, argues that offshore plants could be boarded by pirates or terrorists, be struck by an iceberg or might evade safety rules that are hard to enforce at sea. On July 21st Greenpeace scored a victory when Rosatom said that Akademik Lomonosov’s nuclear fuel would be loaded in an unpopulated area away from St Petersburg.

That, though, is a pinprick. The future of marine nuclear power stations is more likely to depend on the future of nuclear power itself than on the actions of pressure groups such as Greenpeace. If, as many who worry about the climate-changing potential of fossil-fuel power stations think, uranium has an important part to play in generating electricity over coming decades, then many new nuclear plants will be needed. And if that does turn out to be the case, siting such plants out at sea may well prove a good idea.

This article appeared in the Science and technology section of the print edition under the headline “Putting to sea”


Sulfur Injections for a Cooler Planet

To keep global temperature rise in check, annual stratospheric sulfur injections would have to be similar in scale to the eruption of Mount Pinatubo on 12 June 1991.
PHOTO: ARLAN NAEG/STRINGER/GETTY IMAGES

Science 21 Jul 2017:
Vol. 357, Issue 6348, pp. 246-248
DOI: 10.1126/science.aan3317

Achieving the Paris Agreement’s aim to limit the global temperature increase to at most 2°C above preindustrial levels will require rapid, substantial greenhouse gas emission reductions together with large-scale use of “negative emission” strategies for capturing carbon dioxide (CO2) from the air (1). It remains unclear, however, how or indeed whether large net-negative emissions can be achieved, and neither the technology nor sufficient storage capacity for captured carbon is available (2). Limited commitment to sufficient mitigation efforts and the uncertainty related to net-negative emissions have intensified calls for options that may help to reduce the worst climate effects (3). One suggested approach is the artificial reduction of sunlight reaching Earth’s surface by increasing the reflectivity of Earth’s surface or atmosphere.

Research in this area gained traction after Crutzen (4) called for investigating the effects of continuous sulfur injections into the stratosphere—or stratospheric aerosol modification (SAM)—as one method to deliberately mitigate anthropogenic global warming. The effect is analogous to the observed lowering of temperatures after large volcanic eruptions. SAM could be seen as a last-resort option to reduce the severity of climate change effects such as heat waves, floods, droughts, and sea level rise. Another possibility could be the seeding of ice clouds—an artificial enhancement of terrestrial radiation leaving the atmosphere—to reduce climate warming (5).

SAM technologies are presently not developed. Scientists are merely beginning to grasp the potential risks and benefits of these kinds of interventions (6). Earth-system model simulations have been used to explore idealized scenarios and thereby improve the understanding of the climatic impacts of such approaches within the geoengineering model intercomparison project (GeoMIP). Results suggest that use of SAM would mitigate greenhouse gas–induced changes in global temperatures and extreme precipitation. However, different models consistently identify side effects; for example, the reduction of incoming solar radiation at Earth’s surface reduces evaporation, which in turn reduces precipitation (7). This slowing of the hydrological cycle affects water availability, mostly in the tropics, and reduces monsoon precipitation.

Model studies have helped to improve the understanding of sulfur aerosol microphysics and transport; for example, models have successfully reproduced aerosol distributions after recent volcanic eruptions (8). It has also become clear that the cooling efficiency—that is, the cooling per injected unit of sulfur—falls with increasing injection rate (9). Thus, the more SAM is done, the less effective further injections are at reducing temperatures (see the figure, left panel). But the extent of injection required for a given level of cooling is uncertain, varying widely between models (10). The magnitude of the cooling effect also depends on injection location, height, and area and differs between models of different complexity.
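To see what this diminishing-returns behavior means in practice, consider a toy numerical sketch. The functional form and constants below are invented for illustration (chosen so that roughly 8 TgS/year yields about 1°C of cooling, the order of magnitude of the scenario discussed later in this piece); real estimates vary widely between models (9, 10).

```python
# Toy numbers only: total cooling is assumed to scale sub-linearly with
# the injection rate (cooling = k * rate**b, b < 1). The constants k and
# b are invented for illustration; actual estimates differ widely
# between models (9, 10).

def total_cooling(rate_tgs_per_yr: float, k: float = 0.25, b: float = 0.7) -> float:
    """Equilibrium cooling (°C) for a sustained injection rate (toy model)."""
    return k * rate_tgs_per_yr ** b

for rate in (2, 4, 8, 16):
    marginal = total_cooling(rate) - total_cooling(rate - 1)
    print(f"{rate:>2} TgS/yr: total {total_cooling(rate):.2f} °C, "
          f"marginal {marginal:.3f} °C per extra TgS/yr")
```

Each doubling of the injection rate buys progressively less additional cooling, which is the falling cooling efficiency the text describes.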

Furthermore, the aerosol distribution patterns that result from SAM are uncertain and depend on aerosol microphysics and transport in the models. Stratospheric sulfate absorbs terrestrial radiation and thereby warms the stratosphere. This warming affects stratospheric dynamics; for example, it may increase the wind velocity in the equatorial wind systems of the stratosphere, increasing the tropical confinement of the aerosols and reducing the poleward transport of aerosols (11). This has consequences not only for sulfate but for the transport of all stratospheric constituents. In models, changes in stratospheric chemistry caused by SAM have been shown to affect stratospheric ozone concentrations and cause a delay of the Antarctic ozone recovery by several decades (12).

Aerosols with different characteristics than sulfur may eventually be developed to reduce some side effects (13). Small-scale experiments in the stratosphere have been proposed to further understand chemical and aerosol microphysical characteristics of sulfur and alternative aerosols (14). Those experiments, however, will not contribute to the understanding of large-scale climate impacts of SAM due to SAM-related changes in stratospheric temperature and dynamics. Global changes and impacts can only be assessed with Earth system models, although it is difficult to attribute impacts to SAM.

GRAPHIC: N. CARY/SCIENCE

Most current Earth system models do not adequately capture important interactions, such as the coupling between stratospheric aerosols, chemistry, radiation, and climate. They cannot, therefore, simulate the full impact of the interventions. A comprehensive description of these interactions in models as well as coupling with ice, ocean, and land are expected to provide a better estimation of the uncertainties and risks. Processes in Earth system models can be further improved through expanded continuous observations of the atmosphere’s composition. Such observation capability would also ensure high-quality measurements after rare large volcanic eruptions.

Beyond the scientific assessments of possible impacts, it is crucial to understand the economic costs and technological requirements of stratospheric sulfur injection. Assuming a scenario in which aggressive mitigation and large-scale carbon capture and removal start as late as 2040, sulfur must be injected for 160 years, with a peak injection of 8 TgS/year, to limit the temperature increase to 2°C above preindustrial levels (see the figure, right panel) (15); this injection amount is equivalent to one Mount Pinatubo eruption per year (see the photo). Without the intervention, temperatures would have risen by 3°C. The estimated delivery cost of sulfur into the stratosphere for ∼1°C of cooling with aircraft newly developed for SAM is US$20 billion/year (10), requiring 6700 flights per day. The cost would increase for higher injection rates because of the decreasing cooling efficiency (9).

Additional costs arise from the need to set up a comprehensive observation system with which to monitor atmospheric changes, including aerosol distribution, impact on chemistry, and climate. The necessary amount of sulfur injection would need to be estimated according to comprehensive forecast models, requiring extensive modeling capabilities. The total cost of SAM would also need to include compensation for potential side effects and would thus be much higher than the delivery costs (16).

Currently, a single person, company, or state may be able to deploy SAM without in-depth assessments of the risks, potentially causing global impacts that could rapidly lead to conflict. As such, it is essential that international agreements are reached to regulate whether and how SAM should be implemented (3). A liability regime would rapidly become essential to resolve conflicts, especially because existing international liability rules do not provide equitable and effective compensation for potential SAM damage (17). Such complexities will require the establishment of international governance of climate intervention, overseeing research with frequent assessments of benefits and side effects.

Climate intervention should only be seen as a supplement and not a replacement for greenhouse gas mitigation and decarbonization efforts because the necessary level and application time of SAM would continuously grow with the need for more cooling to counteract increasing greenhouse gas concentrations. A sudden disruption of SAM would cause an extremely fast increase in global temperature. Also, SAM does not ameliorate major consequences of the CO2 increase in the atmosphere, such as ocean acidification, which would continue to worsen.

References and Notes
1. J. Rogelj et al., Nature 534, 631 (2016).
2. K. Anderson, G. Peters, Science 354, 182 (2016).
3. J. Pasztor, Science 357, 231 (2017).
4. P. J. Crutzen, Clim. Change 77, 211 (2006).
5. U. Lohmann, B. Gasparini, Science 357, 248 (2017).
6. A. Robock, Earth’s Future 4, 644 (2016).
7. U. Niemeier et al., J. Geophys. Res. 118, 11905 (2013).
8. M. J. Mills et al., J. Geophys. Res. 121, 2332 (2016).
9. U. Niemeier, C. Timmreck, Atmos. Chem. Phys. 15, 9129 (2015).
10. R. Moriyama et al., Mitig. Adapt. Strat. Global Change 21, 1 (2016).
11. D. Visioni, G. Pitari, V. Aquila, Atmos. Chem. Phys. 17, 3879 (2017).
12. S. Tilmes, R. Müller, R. Salawitch, Science 320, 1201 (2008).
13. D. Keith, D. K. Weisenstein, J. A. Dykema, F. N. Keutsch, Proc. Natl. Acad. Sci. U.S.A. 113, 14910 (2016).
14. J. Dykema, D. Keith, J. G. Anderson, D. Weisenstein, Philos. Trans. R. Soc. A 372, 20140059 (2014).
15. S. Tilmes, B. M. Sanderson, B. C. O’Neill, Geophys. Res. Lett. 43, 8222 (2016).
16. J. Reynolds, A. Parker, P. Irvine, Earth’s Future 4, 562 (2016).
17. B. Saxler et al., Law Innov. Technol. 7, 112 (2015).
18. Acknowledgments: We thank B. Sanderson, Y. Richter, and H. Schmidt for very valuable comments and A. Jones and C. Kleinschmidt for providing data for the figure.


The Problem With Electric Cars? Not Enough Chargers

Public charging stations that exist are in parking lots and at businesses in cities where early adopters live. PHOTO: DANIA MAXWELL/BLOOMBERG NEWS

It’s the dawn of the age of the electric vehicle. For real, this time. Probably.

The evidence: Tesla’s delivery of its first “affordable” compact sedans, the Model 3, and the road maps of more or less every other automaker on the planet promising widely available electric cars in the next three to five years.

Within a decade, electric cars will even have similar sticker prices to their gasoline competitors, says Stephen Zoepf, executive director of the Center for Automotive Research at Stanford. Some analyses say EVs are already cost-competitive, if you factor in savings on fuel and maintenance.

Aggressive pricing and sales projections are all part of the seemingly self-fulfilling prophecy of rapid EV adoption. To hit Chief Executive Elon Musk’s targets, Tesla must sell 430,000 cars by the end of 2018 and continue to sell 10,000 a week after that.

But if Tesla and its competitors succeed, they face a new problem: Where are all those cars going to plug in?

A ChargePoint Inc. charging station in Los Angeles, one of 44,000 charging stations in the U.S. Many more may be needed in coming years. PHOTO: DANIA MAXWELL/BLOOMBERG NEWS

At present, electric cars represent only about 1% of cars sold in the U.S., and 0.2% of our total automobile fleet. They aren’t yet taxing our electrical grid or fighting each other for the roughly 44,000 public charging stations now available in the U.S. Yet if anything like analysts’ projections comes to pass, they could rapidly dwarf that number.

Electric-car owners at present overwhelmingly charge at home. What public stations exist are found in parking lots and at businesses in cities and wealthy suburbs where early adopters reside. But the current charging infrastructure offers little support for a larger pool of people who have both the income and the impetus to buy EVs: city dwellers who lack garages.

“You see models that say, ‘We’ll sell a million EVs this year, then two, then four and so on,’ but I have concerns about the practicalities of this transition,” says Francis O’Sullivan, director of research for the MIT Energy Initiative.

“All things cannot be sorted before the industry starts,” says Pasquale Romano, chief executive of ChargePoint, which controls the largest U.S. network of charging stations.

On-street parking doesn’t offer many options for adding charging stations. Shown, a Tesla Model S, a full-size luxury liftback, in Trondheim, Norway, in October 2016. PHOTO: ISTOCK

Charging infrastructure is adequate to meet current demand, and there’s no reason to believe it won’t continue to scale in line with future demand, he argues. ChargePoint makes and sells charging stations to businesses, individuals and governments, charging monthly to maintain the stations and accepting payments for the electricity they provide.

ChargePoint was part of an initiative in Los Angeles to put charging stations in existing lampposts, says Matt Petersen, until recently L.A.’s chief sustainability officer. (The city has installed 82 so far.)

That makes sense because a good chunk of a new charging station’s cost—which can hit $5,000—is installing it and wiring it up, ChargePoint’s Mr. Romano says.

German firm Ubitricity is pioneering relatively low-cost, low-power plugs that go directly into lampposts, and can be accessed with an internet-connected “smart” power cable that handles all metering and billing.

Kieran Fitsall, head of service improvement and transformation for the Westminster City Council of central London, says it has installed 20 Ubitricity plugs in street lamps. The plan is to increase that to 100 by March 2018.

A Tesla charging station at Cochran Commons shopping center in Charlotte, N.C., on June 24. The U.S. power grid is handling demand from electric vehicles now but might not be up to it in the future. PHOTO: CHUCK BURTON/ASSOCIATED PRESS

One of Ubitricity’s advantages is the plugs don’t require the council to designate EV-only parking spots, which are unpopular with people who don’t drive them, Mr. Fitsall says. Ubitricity currently has no U.S. presence but is seeking investment to expand, says company co-founder Knut Hechtfischer.

While these efforts may show where the technology is headed, it isn’t clear that it’s rolling out at anywhere close to the pace automakers anticipate they will sell vehicles.

The biggest challenge for those building out charging infrastructure is that no one can predict the demand for charging as EVs become commonplace, says MIT’s Dr. O’Sullivan. In fact, he calls some of the behavioral factors needed to make such predictions “exceptionally opaque.” These include the time of day people will choose to charge, how responsive they will be to price incentives on electricity designed to encourage them to charge at the “right” time, and how often they’ll use “superchargers” versus lower-power outlets for overnight charging.

This brings us to another looming issue: America’s often-overtaxed power grids won’t be able to handle a large influx of new demand without careful management. This generally won’t be a problem if cars charge at night, when the power grid is underutilized. But as EVs proliferate, drivers who can’t charge them at home will want to charge them at work, during the day. They’ll also seek superchargers, which typically are installed along highways and designed for fast charging and long-distance travel.

“Superchargers are enormous power draws,” says Jesse Jenkins, a researcher at the MIT Energy Initiative. “Chargers in parking garages or superchargers at rest stops are not a solution for charging EVs en masse unless we are OK with significant costs to upgrade distribution grids.”

Even the regular charger found in homes and businesses could present a costly problem when cars charge during demand peaks. Anything that increases peak demand could increase the cost of electricity for everyone, says Stanford’s Dr. Zoepf.

The sheer scale of the transformation of the electrical grid needed to accommodate mainstream adoption of EVs boggles the mind. A major portion of the energy currently trapped in automotive fuels will have to arrive in the form of electrons instead. Some analyses indicate America’s existing electrical grid can handle it, but perhaps only if millions of American consumers can be coaxed to play along and charge at the right place and time.
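A back-of-the-envelope sketch gives a sense of that scale. The inputs below are rough outside approximations rather than figures from this article, but they suggest an all-electric fleet would add demand on the order of a fifth of today’s generation.

```python
# Back-of-the-envelope sketch of the grid demand from an all-electric
# U.S. fleet. All four inputs are rough outside approximations, not
# figures from this article.

KWH_PER_MILE = 0.30        # typical EV consumption
MILES_PER_YEAR = 12_000    # average annual mileage per U.S. car
FLEET_SIZE = 250e6         # U.S. light-duty vehicles, roughly
US_GENERATION_TWH = 4_000  # total annual U.S. generation, roughly

fleet_demand_twh = KWH_PER_MILE * MILES_PER_YEAR * FLEET_SIZE / 1e9
print(f"All-electric fleet: ~{fleet_demand_twh:,.0f} TWh/yr, "
      f"about {100 * fleet_demand_twh / US_GENERATION_TWH:.0f}% "
      f"of current U.S. generation")
```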

That’s also assuming private companies and public utilities can get the needed charging infrastructure to the public at a price they are willing to pay.

If Elon Musk and his competitors succeed at selling as many electric vehicles as they project, keeping them all full of electricity will be a long, hugely expensive and potentially contentious undertaking. It could also be quite lucrative for the people who figure it out.


Tab Swells to $25 Billion for Nuclear-Power Plant in Georgia

The Vogtle Electric Generating Plant in Waynesboro, Georgia, has seen its costs double since it was first proposed in 2008. PHOTO: GEORGIA POWER/REUTERS

The cost of building the only nuclear power plant under construction in the U.S. has ballooned to more than $25 billion, but chief owner Southern Co. said it isn’t ready to throw in the towel on the project.

The company released the new cost estimate for Georgia’s Vogtle Electric Generating Plant on Wednesday, adding that it expects completion of the plant, which has already seen years of delays and rising costs, to be delayed by another 18 months until February 2021 at the earliest.

The price would be split between Southern, three regional power companies, which are partners in the project, and Toshiba Corp., whose subsidiary, Westinghouse Electric Corp., went bankrupt earlier this year while building the plant.

The disclosure from Southern comes two days after Scana Corp. pulled the plug on a similar nuclear plant in South Carolina. It also had years of delays and cost increases that put final completion of that facility above $25 billion as well.

Southern Chairman and Chief Executive Thomas A. Fanning said during a conference call with analysts that he wasn’t ready to give up on the Vogtle plant.

“When you abandon, you have nothing to show for the amount of money you have spent,” he said. “If you go forward, you have a nuclear plant that will serve us for decades to come.”

He then added: “But please understand there has been no decision made.”

The escalating expenses have heightened concern that what was supposed to be a rebirth of the nuclear power industry in the U.S., driven by Westinghouse reactors, is becoming a costly failure.

In 2008, Southern’s plant was supposed to cost $14 billion. Scana’s plant was projected at $11.4 billion.

The plants have identical designs, using a new approach that is supposed to be simpler and easier to build. But numerous changes—some for safety enhancements, others because construction began while final plans were still being developed—drove up costs.

Mr. Fanning did not disclose any specific reasons for the most recent estimates, but he said that since Southern took over construction of the Georgia plant earlier this summer, productivity has improved.

“Our near term experience tells us that we can do a better job than Westinghouse should we go down that road,” he said.

Southern said it would make a recommendation to Georgia regulators later this month about whether it would proceed with the project. Construction at the Georgia facility is 44% complete, compared to 35% for the South Carolina plant.

Vogtle is the only nuclear power plant still under construction in the U.S., and the first to be started since the 1980s.

The new plants’ troubles come at a time when the idea of generating electricity from nuclear power has received a boost. Some environmentalists have supported nuclear plants as a way of providing power that doesn’t emit carbon dioxide. And President Donald Trump said earlier this summer he wanted to “revive and expand our nuclear-energy sector.”

Atlanta-based Southern is going through what Mr. Fanning called “tumultuous times.” The company on Wednesday reported a $1.38 billion loss for the second quarter, compared to a $623 million profit in the same period a year earlier. It is only the second loss posted by the company since 1993, according to data from S&P Capital IQ.

The loss was due to a $2.8 billion pre-tax charge the company took related to an expensive, and ultimately unsuccessful, attempt to build a “clean coal” power plant in Mississippi.

In June, Mississippi regulators said they were unwilling to pass any additional costs on to electricity customers. The plant took seven years and $7.5 billion to build, but Southern couldn’t get the carbon dioxide technology to operate properly for extended periods.


Scientists dim sunlight, suck up carbon dioxide to cool planet

A facility built by Swiss company Climeworks AG for capturing CO2 from air sits on the roof of a waste incineration plant in Hinwil, Switzerland, July 18, 2017. Photo by Arnd Wiegmann

OSLO (Reuters) – Scientists are sucking carbon dioxide from the air with giant fans and preparing to release chemicals from a balloon to dim the sun’s rays as part of a climate engineering push to cool the planet.

Backers say the risky, often expensive projects are urgently needed to find ways of meeting the goals of the Paris climate deal to curb global warming that researchers blame for causing more heatwaves, downpours and rising sea levels.

The United Nations says the targets are way off track and will not be met simply by reducing emissions, for example from factories or cars – particularly after U.S. President Donald Trump’s decision to pull out of the 2015 pact.

Backers of climate engineering are pushing for other ways to keep temperatures down.

In the countryside near Zurich, Swiss company Climeworks began to suck greenhouse gases from thin air in May with giant fans and filters in a $23 million project that it calls the world’s first “commercial carbon dioxide capture plant”.

Worldwide, “direct air capture” research by a handful of companies such as Climeworks has gained tens of millions of dollars in recent years from sources including governments, Microsoft founder Bill Gates and the European Space Agency.

If buried underground, vast amounts of greenhouse gases extracted from the air would help reduce global temperatures, a radical step beyond cuts in emissions that are the main focus of the Paris Agreement.

Climeworks reckons it now costs about $600 to extract a tonne of carbon dioxide from the air and the plant’s full capacity due by the end of 2017 is only 900 tonnes a year. That’s equivalent to the annual emissions of only 45 Americans.

And Climeworks sells the gas, at a loss, to nearby greenhouses as a fertilizer to grow tomatoes and cucumbers and has a partnership with carmaker Audi, which hopes to use carbon in greener fuels.

Jan Wurzbacher, director and founder of Climeworks, says the company has planet-altering ambitions: cutting costs to about $100 a tonne and capturing one percent of global man-made carbon emissions a year by 2025.

“Since the Paris Agreement, the business substantially changed,” he said, with a shift in investor and shareholder interest away from industrial uses of carbon to curbing climate change.

But penalties for factories, power plants and cars to emit carbon dioxide into the atmosphere are low or non-existent. It costs 5 euros ($5.82) a tonne in the European Union.

And isolating carbon dioxide is complex because the gas makes up just 0.04 percent of the air. Pure carbon dioxide delivered by trucks, for use in greenhouses or to make drinks fizzy, costs up to about $300 a tonne in Switzerland.
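Simple arithmetic puts these figures in perspective. In the sketch below, the global-emissions total (roughly 36 billion tonnes of CO2 a year in the mid-2010s) is an outside approximation; the plant capacity, the “45 Americans” comparison and the $100-a-tonne target come from the article.

```python
# Scale check on the Climeworks figures quoted above. The global
# emissions total is an outside approximation (~36 Gt CO2/year in the
# mid-2010s); the other numbers are from the article.

GLOBAL_EMISSIONS_T = 36e9      # tonnes CO2/year, approximate
plant_capacity_t = 900         # Hinwil plant at full capacity, per the article

# The per-capita figure implied by the article's "45 Americans" comparison:
print(plant_capacity_t / 45)   # 20.0 tonnes CO2 per American per year

# Capturing 1% of global emissions at the $100/tonne target:
target_t = 0.01 * GLOBAL_EMISSIONS_T
print(f"{target_t / plant_capacity_t:,.0f} plants the size of Hinwil, "
      f"costing ~${target_t * 100 / 1e9:.0f} billion/year at $100/tonne")
```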

Other companies involved in direct air capture include Carbon Engineering in Canada, Global Thermostat in the United States and Skytree in the Netherlands, a spinoff of the European Space Agency originally set up to find ways to filter out carbon dioxide breathed out by astronauts in spacecraft.

Not Science Fiction

The Paris Agreement seeks to limit a rise in world temperatures this century to less than 2 degrees Celsius (3.6 Fahrenheit), ideally 1.5C (2.7F) above pre-industrial times.

But U.N. data show that current plans for cuts in emissions will be insufficient, especially without the United States, and that the world will have to switch to net “negative emissions” this century by extracting carbon from nature.

Riskier “geo-engineering” solutions could be a backstop, such as dimming the world’s sunshine, dumping iron into the oceans to soak up carbon, or trying to create clouds.

Among new university research, a Harvard geo-engineering project into dimming sunlight to cool the planet set up in 2016 has raised $7.5 million from private donors. It plans a first outdoor experiment in 2018 above Arizona.

“If you want to be confident to get to 1.5 degrees you need to have solar geo-engineering,” said David Keith, of Harvard.

Keith’s team aims to release about 1 kilo (2.2 lbs) of sun dimming material, perhaps calcium carbonate, from a high-altitude balloon above Arizona next year in a tiny experiment to see how it affects the microphysics of the stratosphere.

“I don’t think it’s science fiction … to me it’s normal atmospheric science,” he said.

Some research has suggested that geo-engineering with sun-dimming chemicals, for instance, could affect global weather patterns and disrupt vital monsoons.

And many experts fear that pinning hopes on any technology to fix climate change is a distraction from cuts in emissions blamed for heating the planet.

“Relying on big future deployments of carbon removal technologies is like eating lots of dessert today, with great hopes for liposuction tomorrow,” Christopher Field, a Stanford University professor of climate change, wrote in May.

Jim Thomas of ETC Group in Canada, which opposes climate engineering, said direct air capture could create “the illusion of a fix that can be used cynically or naively to entertain policy ideas such as ‘overshoot'” of the Paris goals.

But governments face a dilemma. Average surface temperatures are already about 1C (1.8F) above pre-industrial levels and hit record highs last year.

“We’re in trouble,” said Janos Pasztor, head of the new Carnegie Climate Geoengineering Governance Project. “The question is not whether or not there will be an overshoot but by how many degrees and for how many decades.”

Faced with hard choices, many experts say that extracting carbon from the atmosphere is among the less risky options. Leaders of major economies, except Trump, said at a summit in Germany this month that the Paris accord was “irreversible.”

Barking Mad

Raymond Pierrehumbert, a professor of physics at Oxford University, said solar geo-engineering projects seemed “barking mad”.

By contrast, he said “carbon dioxide removal is challenging technologically, but deserves investment and trial.”

The most natural way to extract carbon from the air is to plant forests that absorb the gas as they grow, but that would divert vast tracts of land from farming. Another option is to build power plants that burn wood and bury the carbon dioxide released.

Carbon Engineering, set up in 2009 with support from Gates and Murray Edwards, chairman of oil and gas group Canadian Natural Resources Ltd, has raised about $40 million and extracts about a tonne of carbon dioxide a day with turbines and filters.

“We’re mainly looking to synthesize fuels” for markets such as California with high carbon prices, said Geoffrey Holmes, business development manager at Carbon Engineering.

But he added that “the Paris Agreement helps” with longer-term options of sucking large amounts from the air.

Among other possible geo-engineering techniques are to create clouds that reflect sunlight back into space, perhaps by using a mist of sea spray.

That might be used locally, for instance, to protect the Great Barrier Reef in Australia, said Kelly Wanser, principal director of the U.S.-based Marine Cloud Brightening Project.

Among new ideas, Wurzbacher at Climeworks is sounding out investors on what he says is the first offer to capture and bury 50 tonnes of carbon dioxide from the air, for $500 a tonne.

That might appeal to a company wanting to be at the forefront of a new green technology, he said, even though it makes no apparent economic sense.

Editing by Anna Willard


The Gene Editors Are Only Getting Started

ILLUSTRATION: KEN FALLIN

Would you eradicate malaria-carrying insects? Change your baby’s DNA? Scientists soon may have the power to do both.

Rewriting the code of life has never been so easy. In 2012 scientists demonstrated a new DNA-editing technique called Crispr. Five years later it is being used to cure mice with HIV and hemophilia. Geneticists are engineering pigs to make them suitable as human organ donors. Bill Gates is spending $75 million to endow a few Anopheles mosquitoes, which spread malaria, with a sort of genetic time bomb that could wipe out the species. A team at Harvard plans to edit 1.5 million letters of elephant DNA to resurrect the woolly mammoth.

“I frankly have been flabbergasted at the pace of the field,” says Jennifer Doudna, a Crispr pioneer who runs a lab at the University of California, Berkeley. “We’re barely five years out, and it’s already in early clinical trials for cancer. It’s unbelievable.”

The thing to understand about Crispr isn’t its acronym—for the record, it stands for Clustered Regularly Interspaced Short Palindromic Repeats—but that it makes editing DNA easy, cheap and precise. Scientists have fiddled with genes for decades, but in clumsy ways. They zapped plants with radiation to flip letters of DNA at random, then looked for useful mutations. They hijacked the infection mechanisms of viruses and bacteria to deliver beneficial payloads. They shot cells with “gene guns,” which are pretty much what they sound like. The first one, invented in the 1980s, was an air pistol modified to fire particles coated with genetic material.

Crispr is much more precise, as Ms. Doudna explains in her new book, “A Crack in Creation.” It works like this: An enzyme called Cas9 can be programmed to latch onto any 20-letter sequence of DNA. Once there, the enzyme cuts the double helix, splitting the DNA strand in two. Scientists supply a snippet of genetic material they want to insert, making sure its ends match up with the cut strands. When the cell’s repair mechanism kicks in to fix the cut, it pastes in the new DNA.
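For readers who think in code, the cut-and-paste logic can be mimicked with ordinary string operations. The sketch below is a toy analogy only: the genome, guide and insert are invented strings, and it glosses over everything the cell’s actual repair machinery does.

```python
import random

# Toy analogy for the Crispr/Cas9 edit described above: find a 20-letter
# target, cut there, and paste in a new snippet. Purely illustrative;
# the sequences are random strings, not real DNA.

def crispr_edit(genome: str, guide: str, insert: str) -> str:
    """Cut at the 20-letter guide match and splice in `insert`."""
    assert len(guide) == 20, "Cas9 is programmed with a 20-letter sequence"
    site = genome.find(guide)
    if site == -1:
        return genome                 # no match: no edit
    cut = site + len(guide) // 2      # cut inside the matched region
    return genome[:cut] + insert + genome[cut:]

random.seed(0)
genome = "".join(random.choice("ACGT") for _ in range(120))
guide = genome[40:60]                 # a 20-letter target in the genome
edited = crispr_edit(genome, guide, "TTTTTT")
print(len(edited) - len(genome))      # 6: the snippet was spliced in
```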

It’s so exact that Crispr blurs the meaning of “genetically modified organism.” The activists yelling about “frankenfish” are generally upset about transgenic plants and animals—those with DNA inserted from other species. But what about using Crispr to alter only a few letters of an organism’s own genome, the kind of mutation that could happen naturally?

Last year a professor at Penn State created blemish-resistant mushrooms by knocking out a gene that causes them to turn brown when handled. “It attracted attention,” Ms. Doudna says, “because the U.S. Department of Agriculture ruled that that type of plant product would not be regulated as a genetically modified organism.”

Ms. Doudna welcomes this kind of streamlining as the Food and Drug Administration considers its own approach to Crispr crops. “It’s crazy. It takes years and years and years to bring a plant to market,” she says. “I’m all for safety of course and that has to come first. But I think it has to be done with knowledge of the science that makes sense.”

Medical labs are also putting Crispr to work, since it is potentially meticulous enough for routine use on people. The human genome is 3.2 billion letters, and in the wrong place a single typo—a dozen or so misplaced atoms—can create misery. For patients with disorders like cystic fibrosis, the obstacles to fixing the genetic glitch with Crispr seem mostly practical.

First, there’s delivery: A human body contains some 50 trillion cells. How do you get Crispr to the affected ones, and what percentage need to be edited successfully to matter? Ms. Doudna says injecting Crispr-laden viruses into animal tissues has resulted in rates of editing on the order of 70%—enough to have a therapeutic benefit: “In muscular dystrophy, for example, it looks like you only need to have somewhere between 10% to 20%.”

Second, there’s the risk: Although Crispr aims at a 20-letter DNA sequence, occasionally it can hit a partial match and make an unintended edit. “For any drug that we’re developing for treatment, you’re going to have some kind of risk factors,” Ms. Doudna says. “In this case it might be changes to DNA, and you have to decide what’s the right level that you would tolerate.” There are ways to minimize the mistakes, and some studies show so few off-target edits “that it’s difficult to distinguish them from just errors in DNA sequencing.”

What seems to merit the risk today? “Sickle-cell disease,” Ms. Doudna says: “Well-known mutation. Single gene is involved. No treatments right now for people. They have these horrible crises where they’re in terrible pain.” Moreover, the faulty red blood cells can be drawn from a vein and isolated. “The actual DNA editing can be done outside the body,” she says, “validated first, and then the cells implanted and allowed to repopulate the blood supply.” The approach may work for cancer, too: A Crispr clinical trial awaiting FDA approval would pull white blood cells, give them tumor-killing superpowers, and then put them back into action.

It would be technically simpler, rather than working in fully grown patients, to fix genetic disorders early, in human eggs, sperm or embryos. But this raises thorny moral questions, since edits made to these cells would pass down to future generations, who can’t consent to having their genes tweaked. In debates about this, the word “eugenics” comes up.

At first, Ms. Doudna was reflexively opposed. “I’m not a religious person,” she says, “but it’s more, just—I don’t know—sort of an intrinsic reaction, that it feels like a realm where maybe we shouldn’t be messing around.” Her position softened, somewhat to her own surprise, as she heard from hundreds of people facing horrific genetic diseases. “They’re reaching out because they’re desperate,” she says. “A lot of them are asking me the questions you’re asking about: How soon? How long will it be? Is there hope for my child?”

Ms. Doudna recalls an email from a 26-year-old woman who’d found out she carried a mutation in the gene BRCA1 that is associated with a 60% risk of breast cancer by age 70: “She said, ‘Should I have a mastectomy?’ ”—this was right after Angelina Jolie, worried about a similar mutation, did the same—“ ‘Or do you think that gene-editing is going to come along in time for me? Or if not for me, at least so that I can get rid of this mutation in my eggs?’ ”

There was a man who watched his father die of Huntington’s disease and had three sisters diagnosed. There was a woman whose daughter had given birth to a child with Fragile X syndrome, which causes intellectual disability, but deeply wanted to conceive again. “She was very emotional,” Ms. Doudna recounts. “She said, ‘If there were a way to use this, and if I could use it in embryos or germ cells, I would have absolutely no hesitation about doing it.’ ”

A few bioethicists have even argued that research on editing human embryos is a “moral imperative,” since roughly 6% of all babies have “serious birth defects.” As for the risk of “off target” edits, merely smoking cigarettes can create mutations in a man’s sperm. One academic joked that if old-fashioned sex were up for regulatory review, the FDA would never sign off.

Not everyone has the same reaction. A reporter interviewing Ms. Doudna once revealed she had a son with Down syndrome. “She said, ‘I just want you to know that he’s perfect just the way he is.’ It was very touching,” Ms. Doudna recalls, her voice flickering with emotion. Even if Crispr could have fixed that genetic defect, the woman said she wouldn’t change it. Some people in the deaf community feel the same way, and Ms. Doudna respects that. “Everyone’s feeling about DNA and about their inheritance and their children is going to be different,” she says. “It has to be a choice. People can decide what they want to do.”

Ms. Doudna remains opposed to nontherapeutic editing, often characterized as “designer babies,” and she says regulators won’t allow it, at least in the U.S. But other countries are less stringent. Is news of the first Crispr baby simply going to break one day? “It would be naive to think that that won’t happen at some point,” she says. Pressure to push forward will come not only from desperate people but also clinics abroad that may drum up business by saying: “We’ll do things here that will be advantageous for your children that are not allowed elsewhere.”

That’s why Ms. Doudna sees the ethical debate as vital. “It’s very hard to enforce any kind of global regulations on anything, but certainly on science,” she says. “So I think the next best thing is to try to encourage a global consensus that is strong enough that people feel some pressure to conform to it.”

The plan to eradicate the Anopheles mosquito presents a similar problem of collective decision-making. One iteration would involve a version of the insect edited to carry DNA that creates sterile females. That trait could then be forced into the wild population using a “gene drive.” Recall the basic rules of heredity—think back to that Punnett square from high school. Normally, an edited male in the wild would pass on the sterility gene to only half its offspring. Over many generations, the edited DNA would be diluted into oblivion.

That’s where the gene drive comes in. Scientists using Crispr in the lab have given the mosquito DNA that causes its cells to create Crispr. The result is a recursive, self-propagating gene that slices its reproductive competition. The edited mosquito passes on the sterility gene to nearly 100% of its offspring—which in turn do the same. Theoretically, releasing a single gene-drive insect, or letting one escape out an air-conditioning vent, could spread the edited DNA to the entire species.
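The arithmetic behind that sweep is easy to sketch. The toy model below uses invented parameters and ignores the fitness cost of the sterility trait (the force that, in the wild, actually pushes an ordinary edit toward oblivion); it simply shows why near-100% transmission lets a gene race through a population while 50% transmission leaves it rare.

```python
# Toy comparison of Mendelian inheritance (gene passed to ~50% of
# offspring) versus a gene drive (~100%). Deterministic random-mating
# model with invented parameters; fitness costs are ignored.

def carrier_frequency(transmission: float, f0: float = 0.001,
                      generations: int = 25) -> float:
    """Fraction of the population carrying the edited gene after
    `generations` rounds, if offspring inherit it with probability
    `transmission` whenever at least one parent is a carrier."""
    f = f0
    for _ in range(generations):
        at_least_one_carrier = 1 - (1 - f) ** 2
        f = transmission * at_least_one_carrier
    return f

print(f"Mendelian (50%):  {carrier_frequency(0.50):.4f}")  # stays rare
print(f"Gene drive (99%): {carrier_frequency(0.99):.4f}")  # sweeps to ~0.99
```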

Theoretically. “Although we understand that these gene drives can work in a laboratory setting efficiently in fruit flies and things like that, how well would they really work environmentally?” Ms. Doudna asks. “Evolution is a very strong force. If you put a species in a wild setting where they have to compete with other species, if they have a disadvantage reproductively, even if it’s a small disadvantage, they’re going to lose out.”

Ms. Doudna still needs to be convinced, too, of the wisdom of letting loose a gene drive. She cites her native Hawaii. “Species were introduced to that environment that ended up having large unintended consequences,” she says. Seeing that made her “very respectful of nature and very cautious about human beings’ thinking they have the knowledge to predict what will happen.”

A final Crispr worry is that it makes DNA editing so easy anybody can do it. Simple hobby kits sell online for $150, and a community biotech lab in Brooklyn offers a class for $400. Jennifer Lopez is reportedly working on a TV drama called “C.R.I.S.P.R.” that, according to the Hollywood Reporter, “explores the next generation of terror: DNA hacking.”

Ms. Doudna provides a bit of assurance. “Genetics is complicated. You have to have quite a bit of knowledge, I think, to be able to do anything that’s truly dangerous,” she says. “There’s been a little bit of hype, in my opinion, about DIY kits and are we going to have rogue scientists—or even nonscientists—randomly doing crazy stuff. I think that’s not too likely.”

Still, a couple of years ago Ms. Doudna had a dream in which a colleague asked her to explain gene-editing to someone very important. Turns out it was Hitler, except with the face of a pig. This, she says now, was her awakening to Crispr’s potential. “Try to imagine: We’re biochemists here, we’re futzing around with bacteria, just fartin’ around the lab, and students are doing experiments,” she says. “Then suddenly you have this discovery that you realize can be harnessed in a very different way.”

A few moments later she adds: “It was just this growing realization that this is no joke. This is a really seriously powerful technology.”

Mr. Peterson is the Journal’s deputy editorial features editor.


Coal, Nuclear on the Losing End of Power Shift

Coal is losing out to natural gas and renewables for electricity generation. A coal-fired plant in Wyoming operated by PacifiCorp. PHOTO: JIM URQUHART/REUTERS

Not long ago, coal provided 98% of the electricity for the pulp-and-paper mills and iron-ore producers around the western edge of Lake Superior, as well as the port city of Duluth, Minn. That was 2005. Today, coal use is plunging, and by 2025 it is expected to supply just one-third of the region’s electricity.

This is all part of a plan released last month by the local utility, Minnesota Power, to generate 44% of its electric power from renewable sources like wind farms. It also plans to build a new high-efficiency natural-gas power plant and has already shut down six of its eight coal-fired units.

This is an extreme example of the transition happening across the U.S. power grid. Natural gas, wind and solar power are expanding rapidly, while electricity generation from coal and nuclear reactors is shrinking.

The transition in the grid comes as the Trump administration has signaled it would like to help coal make a comeback, and last week President Donald Trump said he wanted to “revive and expand our nuclear-energy sector” and announced a policy review. Because U.S. power demand isn’t growing, promoting coal and nuclear would come at the expense of gas and renewables—and vice versa. This has set up a power-grid showdown: Will new federal policies bring back coal and nukes, or will gas and renewables continue to grab market share?

It isn’t just small utilities like Minnesota Power that are changing their generating mix.

Duke Energy Corp., a large utility based in Charlotte, N.C., with power plants in five states, generated 7% of its power from gas and renewables in 2005. Last year, Duke got 32% from those new sources, and it expects the portion to hit 44% by 2026.

Last week, Oregon utility PacifiCorp, which is owned by Warren Buffett’s Berkshire Hathaway Energy, filed plans to spend $3.5 billion on wind generation and transmission projects. The parent company had previously said it planned to spend $13.6 billion between 2017 and 2019, primarily on wind and solar projects. PacifiCorp said its renewable projects are “the most cost-effective option to meet customers’ energy needs over the next 20 years.”

Overall, gas, wind and solar now meet 40% of U.S. power needs, up from 22% a decade ago, according to the U.S. Energy Information Administration.

As gas and renewables have grown, the past few years have been a bloodbath for coal, the mainstay of American electricity generation for decades. Three of every 10 coal generators have closed permanently in the past five years.

Nukes, another mainstay for decades, are imperiled, too. By 2023, there may be 54 nuclear-power plants, down from 65 a decade earlier. Only new state subsidies can keep more from closing, plant operators argue.

Coal and nuclear plants have provided so-called base-load power for decades, running around the clock to ensure a reliable stream of electricity. As those sources of power lose ground to gas and renewables, some worry the grid could become unstable.

Natural gas has been the main agent of change, mostly because the advent of hydraulic fracturing unlocked vast new natural gas reserves in the U.S., creating very low prices for the fuel.

“That is what is making coal go away,” said Pat Vincent-Collawn, chairman and chief executive of PNM Resources Inc., a New Mexico utility. It expects coal to drop from 51% of its generation last year to 41% next year.

Natural gas, wind and solar now meet 40% of U.S. power needs. PHOTO: WILL VRAGOVIC/ZUMA PRESS

Gas is not only inexpensive and abundant; new gas-fired power plants are also much more efficient than those built even five years ago.

“Not only have we gotten better at getting the natural gas molecule out of the ground, but we have gotten much better at getting as much electricity out of that molecule as possible,” said Josh Rhodes, a research fellow at the University of Texas Energy Institute.

Until a few years ago, gas plants typically operated about 30% of the time, turning on and off as the grid needed power. Today, they run more than half the time. Many are on virtually nonstop, taking over the role once played by coal and nuclear.

Ben Fowke, chairman and chief executive of Xcel Energy Inc., a large utility that covers parts of Colorado, Minnesota and six other states, says wind and solar aren’t responsible for the demise of coal and nuclear plants. “I hope it doesn’t come out that renewables are to blame,” he said. “Wind is saving our customers money.” For now, renewable energy enjoys a federal tax subsidy.

Few utility chiefs agree that the administration should step in to prop up coal. There is more agreement that nuclear power plants should be saved from premature retirement.

Nukes, which have no carbon emissions, are struggling to compete with low wholesale power prices brought on by inexpensive natural gas and renewable generation. Low power prices are great until baseload assets are on the line, said Joe Dominguez, executive vice president of governmental and regulatory affairs for Exelon Corp. “Policy makers need to step in and address that,” he said.

For Minnesota Power, a mixture of a lot of wind, some solar, hydro power from Canadian dams and a state-of-the-art gas plant will make it easier to provide reliable electricity, even with the loss of so much coal.

“The combination of flexible natural gas and renewables really work well together,” said Julie Pierce, Minnesota Power’s vice president of strategy and planning.

—Dan Molinski contributed to this article.

Corrections & Amplifications

PacifiCorp has filed plans with state regulators to spend $3.5 billion on wind and transmission projects. An earlier version of this article incorrectly said it had filed plans to spend $13.6 billion. PacifiCorp’s parent company had previously said it planned to spend $13.6 billion between 2017 and 2019 primarily on wind and solar projects. (July 7, 2017)


1,800 tons of radioactive waste has an ocean view and nowhere to go

A military helicopter prepares to take off over the coast from Camp Pendleton, located south of the decommissioned San Onofre Nuclear Generating Station. (Allen J. Schaben / Los Angeles Times)

The massive, 150-ton turbines have stopped spinning. The mile-long cooling pipes that extend into the Pacific will likely become undersea relics. The high-voltage electricity that once energized the homes of more than a million Californians is down to zero.

But the San Onofre nuclear power plant will loom for a long time as a landmark, its 1,800 tons of lethal radioactive waste stored on the edge of the Pacific and within sight of the busy 5 Freeway.

Across the site, deep pools of water and massive concrete casks confine high-energy gamma radiation and other forms of radioactivity emitted by 890,000 spent fuel rods that nobody wants there.

And like the other 79,000 tons of spent fuel spread across the nation, San Onofre’s nuclear waste has nowhere to go.

The nation’s inability to find a permanent home for the dangerous byproduct of its 50-year adventure in nuclear energy represents one of the biggest and longest-running policy failures in federal government history.

Now, the Trump administration and Congress are proposing a fast-track fix. The new plan aims, after decades of delays, to move the waste to one or more temporary central storage sites that would hold it until a geologic repository can be built in Nevada or somewhere else.

But the new strategy faces many of the same challenges that have dogged past efforts, leaving some experts doubtful that it can succeed.

America’s nuclear waste failure

Left: Construction is underway on the Independent Spent Fuel Storage Installation (ISFSI), where used nuclear fuel will be stored vertically in dry casks, at the closed San Onofre Nuclear Generating Station. Right: A view of the two domes at the San Onofre Nuclear Generating Station; the I-beams in the foreground are part of the turbine structure, where steam was turned into electricity at the now-closed facility. (Allen J. Schaben / Los Angeles Times)

The shuttered San Onofre facility — notwithstanding its overlook of prime surf breaks — is similar to about a dozen other former nuclear power plants nationwide that now have to babysit waste to prevent natural disasters, human errors or terrorist plots from causing an environmental or health catastrophe.

Though utilities and government regulators say such risks are remote, they have inflamed public fear at least since 1979’s Three Mile Island reactor accident in Pennsylvania.

The sites are located on the scenic shores of northern Lake Michigan, along a bucolic river in Maine, on the high plateau of Colorado and along the densely populated Eastern Seaboard — each environmentally sensitive for different reasons.

No one wants that waste near them — including officials in the sleepy beach town of San Clemente, just north of San Onofre. Even Southern California Edison Co. officials, while insisting the waste is safe, agree it should be moved as soon as possible.

“It doesn’t make any sense to store the fuel at all these sites,” said Thomas Palmisano, chief nuclear officer at the Southern California Edison plant. “The public doesn’t want the spent fuel here. Well, the fuel is here.”

But every attempt to solve the problem almost instantly gets tangled in complex federal litigation and imposes enormous expense on taxpayers.

The Energy Department was legally bound to haul away the waste by 1998 under the Nuclear Waste Policy Act, making the agency about 20 years late in fulfilling its promise. That has saddled utilities with multibillion-dollar costs to store the waste onsite.

An aerial view of the closed San Onofre Nuclear Generating Station hangs on the wall of a conference room at the facility. (Allen J. Schaben / Los Angeles Times)

As a result, every nuclear utility, including Southern California Edison, has sued to recover its waste storage costs. So far, they have won judgments and settlements of $6.1 billion, and the Energy Department has projected that it may be liable for up to $25 billion more.

But the new plan is fraught with complex legal, political and financial questions. It has yet to be fully defined, vetted among powerful interest groups, approved by Congress or tested against inevitable court challenges.

The House Energy and Commerce Committee last week overwhelmingly approved legislation that could clear up many legal questions. Similar bills have been introduced in recent years and failed to move ahead, but this legislation has strong bipartisan support and is backed by the White House.

Still, a lot could go wrong with the plan, as it has for every plan for decades.

Two little-known privately held companies, New Jersey-based Holtec International and Texas-based WCS, have unveiled plans and begun licensing applications with the Nuclear Regulatory Commission for interim storage sites on each side of the New Mexico-Texas border. Officials in the area, a booming center of oil production, are enthusiastic about the potential economic benefits. And nuclear utilities have offered encouragement.

Company officials and other proponents say such temporary dumps could be opened in as little as three or four years, assuming the licensing goes smoothly. But other nuclear waste experts expect a timetable of 10 to 15 years for a temporary dump and much longer for a permanent repository.

Two dozen antinuclear activist groups and leading environmental nonprofits already have signaled in letters to the NRC that they will dispute the idea of creating temporary consolidated storage sites.

The groups, along with many longtime nuclear waste technical experts, worry that temporary storage will weaken the government’s resolve to build a permanent repository. And they assert the plan would require transporting the fuel twice, first to the temporary site and then to a permanent dump, magnifying transportation costs and the fuel’s exposure to accidents or attacks by terrorists.

“These trains hauling nuclear waste would go right by Trump’s hotel in Las Vegas,” said Marta Adams, a now-retired deputy attorney general in Nevada who is consulting with the state on its renewed legal battle.

Serious business problems cloud the plan. Among the most important is who would own and be legally responsible for the waste once it leaves the utility plant sites.

The federal government promised in the Nuclear Waste Policy Act of 1982 to take ownership at a government-owned dump, but it never authorized such ownership at a temporary private facility — one of the legal questions that the Energy and Commerce Committee’s legislation would clear up.

A temporary facility by Holtec or another organization is intended as a segue to a permanent dump at Yucca Mountain, about 100 miles north of Las Vegas. Along with an interim storage site, the Trump administration wants to restart licensing of Yucca Mountain, which President Obama suspended.

But reviving Yucca Mountain is a long shot. A decade ago, the Energy Department estimated Yucca Mountain would cost nearly $100 billion, a figure that has undoubtedly increased. The cost could be a problem for deficit-minded Republicans.

The Energy Department collected a tiny monthly fee from utility customers to build the dump, and currently a so-called trust fund has $39 billion reserved for the purpose.

But a little-known clause in federal budget law 20 years ago decreed that contributions to the trust fund would count against the federal deficit. Unlike the Social Security Trust Fund, there are no securities or bonds backing up the fund. As a result, every dollar spent on Yucca Mountain will have to be appropriated, and the money will add to the national debt.

“The money was collected for one purpose and used for another,” said Dale Klein, a former NRC chairman who is now associate vice chancellor for research at the University of Texas. “There is a moral obligation to address the issue. It will be a challenge to get Congress to pay for it.”

The Trump plan has also rekindled the strident bipartisan political opposition of Nevada officials, including the governor, senators, representatives and attorney general, among others. They vow to erect every legal and political obstacle to delay or kill the Yucca Mountain dump.

The state filed nearly 300 formal objections to the plan before the Obama administration suspended licensing. They must be individually examined by the NRC, a process that could take five years.

Then the design and construction of the underground dump would require about two dozen big industrial buildings and 300 miles of new railroad track. The project could cost $1 billion or more every year, ranking it among the largest federal operations.

A permanent repository could take 10 to 20 years to complete, by most estimates.

On the beach

Nowhere is the nuclear waste problem more urgent than at shuttered power plants like San Onofre.

After utilities dismantle the reactors, haul away the concrete debris and restore the sites to nearly pristine condition, the nuclear waste remains. Security officers with high-powered automatic weapons guard the sites round the clock.

About five years after the spent fuel rods cool off in a 40- to 50-foot-deep pool, they are transferred to massive steel and concrete dry casks about 20 feet tall. Almost every government and outside nuclear expert considers the dry casks much safer than the pools.

The 3 Yankee Cos., which are safeguarding dry casks at three former New England reactors, spend about $10 million annually per site for maintenance and security, company officials say. The costs could be higher at San Onofre if the waste is left in place, Palmisano said.

Clockwise from top left: A model of a fuel assembly inside the conference room at the closed San Onofre Nuclear Generating Station. Thomas Palmisano, decommissioning and chief nuclear officer, holds No. 11 rebar, which is used in the structure of the dry cask storage. Construction is underway on the Independent Spent Fuel Storage Installation, where used nuclear fuel will be stored vertically in dry casks. A view of a concrete dry cask facility where 50 canisters, about one-third of the used nuclear fuel, are stored horizontally. (Allen J. Schaben / Los Angeles Times)

Edison is building a massive concrete monolith for more storage, using a Holtec design called Hi-Storm UMAX. It will hold about two-thirds of the plant’s spent fuel in 73 stainless-steel canisters about 125 feet from the ocean. The 25-foot structure is about half-buried with the underground foundation just above the mean high-tide line. Tall cranes and swarms of hard hats are moving construction ahead.

The crucial question is whether it will be safe, especially if congressional inaction or litigation by opposition groups keeps it on-site for years.

“The top has four feet of steel-reinforced concrete,” said Ed Mayer, program director at Holtec. “It is remarkably strong. The … steel lids are designed to take an aircraft impact.”

NRC officials say the design is safe and meets all federal requirements. Although nuclear issues are within the NRC’s jurisdiction, the Coastal Commission also examined the potential for a tsunami, sea level rise or an earthquake to undermine the facility.

“Under our authority, which is limited, the commission approved the permit, and behind that is the evaluation that it is safe for a period of 20 years,” said Alison Dettmer, deputy director of the commission.

But suspicion lingers. San Clemente city officials have demanded that the fuel be removed as soon as possible. An activist group, Citizens’ Oversight, has sued Edison for starting construction and the California Coastal Commission for approving it.

The waste “is right down by the water, just inches from the high-tide line,” said Ray Lutz, the group’s founder. “It is the most ridiculous place they could find.”

In an effort to assuage local concerns, Edison participates in a “community engagement panel” that meets at least quarterly, led by UC San Diego professor David Victor.

“Early on, I was surprised by how many people did not understand there was no place for the fuel to go,” he said. Over the last year, the possibility of a temporary storage site has raised people’s hopes for a quicker solution, he said.

The history of nuclear waste, however, is replete with solutions that seem plausible but succumb to obscure and unanticipated legal, technical or financial issues.

Decades of delay

Two decades ago, the Skull Valley Band of Goshute Indians sought to create an interim storage facility for nuclear waste on its reservation about an hour out of Salt Lake City.

The NRC spent nine years examining the license application and approved it. But Utah officials and a broad swath of major environmental groups opposed the plan. Eventually, the state blocked shipping routes to the reservation.

Michael C. Layton, director of the NRC’s division of spent fuel management, said a temporary facility would use the same technology as existing dry cask storage sites, like San Onofre.

But Layton said it is unclear how long it will take to license a consolidated storage site. The formal review is scheduled for three years, but the Skull Valley license that took nine years is the only actual licensing effort to compare it to, he added. Palmisano, the Edison executive, estimates that an off-site temporary storage facility could be operating in 10 to 15 years.

Problems have already delayed WCS, which wants to build a storage site in Andrews, Texas. It asked the NRC in April to suspend its license application.

The $7.5-million cost of just the license application review “is significantly higher than we originally anticipated,” the company said, noting that it is under additional financial stress because the Justice Department has sued it to block a merger.

Holtec officials say that WCS’ problems haven’t deterred their plans for an underground storage site; they argue interim storage could save the federal government billions of dollars, particularly if the Yucca Mountain plan is again postponed.

The company has strong support in New Mexico, which already has a dump for nuclear weapons waste, a uranium enrichment plant, a nuclear weapons armory and two nuclear weapons laboratories.

“We are very well-informed,” said Sam Cobb, mayor of nearby Hobbs, rejecting arguments by antinuclear groups that the industry preys on communities that need money and don’t understand the risk.

“It is not a death grab to get money,” he said. “We believe if we have an interim storage site, we will be the center for future nuclear fuel reprocessing.”

Transportation to an interim site would cost the federal government billions of dollars under the pending legislation. Aides at the House Energy and Commerce Committee said those costs would be recovered when the federal government no longer has to pay for legal settlements for failing to take the waste in the first place.

Thomas Palmisano, left, decommissioning and chief nuclear officer, and Lou Bosch, center, Southern California Edison plant manager, lead a tour near the electricity switch yard where two-thirds of the used nuclear fuel is in wet storage. (Allen J. Schaben / Los Angeles Times)

Even if an interim site is built, it is uncertain who would get to ship waste there first. The timing of waste shipments to a permanent site is determined by the so-called standard contract queue, a legal document so complex that federal bureaucrats have dedicated their entire careers to managing it.

The queue was structured so that the oldest waste would go into a future dump first. In the unlikely event that Yucca Mountain were opened in 2024, Edison’s fuel would be in line to start shipping in 2028 with the last bit of waste arriving in 2049, Palmisano said.

Whether that queue would apply to an interim site is unclear, even under the pending legislation.

The dry casks are designed to keep spent fuel confined only for decades, while the health standard for a permanent repository covers hundreds of thousands of years — longer than humans have roamed Earth. If the radioactive waste sits around in temporary storage for hundreds of years, it could be neglected and eventually forgotten.

So one outcome that nobody seems to want is for a temporary site to eventually become permanent by default.

“It would derail momentum for a permanent repository,” said Edwin Lyman, a nuclear physicist at the Union of Concerned Scientists. “This issue has always pitted one community against another and those in between.”


Carbon in Atmosphere Is Rising, Even as Emissions Stabilize

The Cape Grim Baseline Air Pollution Station in Tasmania.
COMMONWEALTH SCIENTIFIC AND INDUSTRIAL RESEARCH ORGANIZATION

CAPE GRIM, Tasmania — On the best days, the wind howling across this rugged promontory has not touched land for thousands of miles, and the arriving air seems as if it should be the cleanest in the world.

But on a cliff above the sea, inside a low-slung government building, a bank of sophisticated machines sniffs that air day and night, revealing telltale indicators of the way human activity is altering the planet on a major scale.

For more than two years, the monitoring station here, along with its counterparts across the world, has been flashing a warning: The excess carbon dioxide scorching the planet rose at the highest rate on record in 2015 and 2016. A slightly slower but still unusual rate of increase has continued into 2017.

Scientists are concerned about the cause of the rapid rises because, in one of the most hopeful signs since the global climate crisis became widely understood in the 1980s, the amount of carbon dioxide that people are pumping into the air seems to have stabilized in recent years, at least judging from the data that countries compile on their own emissions.

That raises a conundrum: If the amount of the gas that people are putting out has stopped rising, how can the amount that stays in the air be going up faster than ever? Does it mean the natural sponges that have been absorbing carbon dioxide are now changing?

“To me, it’s a warning,” said Josep G. Canadell, an Australian climate scientist who runs the Global Carbon Project, a collaboration among several countries to monitor emissions trends.

Scientists have spent decades measuring what was happening to all of the carbon dioxide that was produced when people burned coal, oil and natural gas. They established that less than half of the gas was remaining in the atmosphere and warming the planet. The rest was being absorbed by the ocean and the land surface, in roughly equal amounts.

In essence, these natural sponges were doing humanity a huge service by disposing of much of its gaseous waste. But as emissions have risen higher and higher, it has been unclear how much longer the natural sponges will be able to keep up.
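That bookkeeping can be sketched with back-of-the-envelope arithmetic. The snippet below is an illustration, not the scientists’ method: the factor of about 2.13 gigatons of carbon per part per million of CO2 is a standard conversion, while the emissions and growth figures are round assumptions in line with the numbers reported later in this article.

    # Back-of-the-envelope airborne-fraction arithmetic (illustrative numbers).
    GTC_PER_PPM = 2.13        # gigatons of carbon per 1 ppm of atmospheric CO2
    CO2_TO_C = 12.0 / 44.0    # molecular-weight ratio converting CO2 mass to carbon

    emissions_gt_co2 = 40.0   # assumed annual human emissions, in Gt of CO2
    observed_rise_ppm = 2.5   # assumed recent annual rise in atmospheric CO2

    emitted_ppm = emissions_gt_co2 * CO2_TO_C / GTC_PER_PPM  # ~5.1 ppm worth of CO2
    airborne_fraction = observed_rise_ppm / emitted_ppm      # ~0.49

    print(f"emissions are equivalent to ~{emitted_ppm:.1f} ppm per year")
    print(f"~{airborne_fraction:.0%} stays in the air; the ocean and land sponges "
          f"take up the remaining ~{1 - airborne_fraction:.0%} in roughly equal parts")

With those inputs, roughly half of each year’s emissions shows up as a higher atmospheric concentration, matching the “less than half” finding described above.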

A raging fire in South Sumatra in September 2015. Huge fires that year in Indonesia sent a pulse of carbon dioxide into the atmosphere.
ANTARA FOTO / REUTERS

Should they weaken, the result would be something akin to garbage workers going on strike, but on a grand scale: The amount of carbon dioxide in the atmosphere would rise faster, speeding global warming even beyond its present rate. It is already fast enough to destabilize the weather, cause the seas to rise and threaten the polar ice sheets.


The record increases of airborne carbon dioxide in 2015 and 2016 thus raise the question of whether this has now come to pass. Scientists are worried, but they are not ready to draw that conclusion, saying more time is needed to get a clear picture.

Many of them suspect an El Niño climate pattern that spanned those two years, one of the strongest on record, may have caused the faster-than-usual rise in carbon dioxide, by drying out large parts of the tropics. The drying contributed to huge fires in Indonesia in late 2015 that sent a pulse of carbon dioxide into the atmosphere. Past El Niños have also produced rapid increases in the gas, though not as large as the recent ones.

Yet scientists are not entirely certain that the El Niño was the main culprit; the idea cannot explain why a high rate of increase in carbon dioxide has continued into 2017, even though the El Niño ended early last year.

Scientists say their inability to know for certain is a reflection not just of the scientific difficulty of the problem, but also of society’s failure to invest in an adequate monitoring system to keep up with the profound changes humans are wreaking on the planet.

“It’s really bare bones, our network, contrary to common misperceptions about the government wasting money,” said Pieter Tans, chief of a unit that monitors greenhouse gases at the National Oceanic and Atmospheric Administration.

While the recent events have made the scientific need for an improved network clear, the situation may be about to get worse, not better. President Trump’s administration has targeted American science agencies for cutbacks, with NOAA, the lead agency for tracking greenhouse gases, being one of those on the chopping block.

Australia also had a recent fight over proposed cutbacks in climate science, but so far that country’s conservative government has promised continued funds for the Cape Grim science program, Australia’s most important contribution to global climate monitoring. The atmospheric observatory here, which receives some money from NASA, is one of the most advanced among scores of facilities around the world where greenhouse gases and other pollutants are monitored.

A monorail moving through a smoky haze, which blew over from Indonesia, in the Singapore port in September 2015.
WONG MAYE-E / ASSOCIATED PRESS

The network is complete enough to give a clear picture of the overall global trends in industrial gases in the air, scientists say. But it is too sparse to give definitive information about which parts of the planet are absorbing or releasing greenhouse gases at a given moment. Lacking such data, scientists have trouble resolving some important questions, like the reasons for the rapid increase of carbon dioxide over the past three years.

“It’s really important that people get that there’s an awful lot that’s just not known yet,” Sam Cleland, the manager of the Cape Grim station, said.

Human activity is estimated to be pumping almost 40 billion tons of carbon dioxide into the air every year, an amount that Dr. Canadell of the Global Carbon Project called “staggering.” The atmospheric concentration of the gas has risen by about 43 percent since the Industrial Revolution.

That, in turn, has warmed the Earth by around 2 degrees Fahrenheit, a large number for the surface of an entire planet.
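As a rough consistency check (the arithmetic here is mine, not the article’s): the pre-industrial concentration is commonly put at about 280 parts per million, and a 43 percent increase implies roughly 280 × 1.43 ≈ 400 ppm, which is about what monitoring stations such as Cape Grim now measure.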

With a better monitoring network, scientists say they might be able to specify in greater detail what is causing variations in the amount of carbon dioxide staying in the air — and, perhaps, to give a timely warning if they detect a permanent shift in the ability of the natural sponges to absorb more.

Dr. Tans of NOAA would like to put sensors on perhaps a hundred commercial airplanes to get a clearer picture of what is happening just above land in the United States. The effort would cost some $20 million a year, but the government has not financed the project.

The uncertainty stemming from the recent increases in carbon dioxide is all the more acute given that global emissions from human activity seem to have stabilized over the past three years. That is primarily because of changes in China, the largest polluter, where an economic slowdown has coincided with a conscious effort to cut emissions.

“I’d estimate that we are about at the emissions peak, or if there are further rises, they won’t be much,” said Wang Yi, a professor at the Chinese Academy of Sciences in Beijing, who also belongs to the national legislature and advises the government on climate policy.

Emissions in the United States, the second-largest polluter after China, have also been relatively flat, but Mr. Trump has started tearing up President Barack Obama’s climate policies, raising the possibility that greenhouse gases could rise in coming years.

Dr. Tans said that if global emissions flattened out at today’s high level, the world would still be in grave trouble.

“If emissions were to stay flat for the next two decades, which could be called an achievement in some sense, it’s terrible for the climate problem,” he said.
