Novim News

Coal, Nuclear on the Losing End of Power Shift

Coal is losing out to natural gas and renewables for electricity generation. A coal-fired plant in Wyoming operated by PacifiCorp. PHOTO: JIM URQUHART/REUTERS

Not long ago, coal provided 98% of the electricity for the pulp-and-paper mills and iron-ore producers around the western edge of Lake Superior, as well as the port city of Duluth, Minn. That was 2005. Today, coal use is plunging, and by 2025 coal is expected to supply just one-third of this region’s power.

This is all part of a plan released last month by the local utility, Minnesota Power, to generate 44% of its electric power from renewable sources like wind farms. It also plans to build a new high-efficiency natural-gas power plant and has already shut down six of its eight coal-fired units.

This is an extreme example of the transition happening across the U.S. power grid. Natural gas, wind and solar power are expanding rapidly, while electricity generation from coal and nuclear reactors is shrinking.

The transition in the grid comes as the Trump administration has signaled it would like to help coal make a comeback, and last week President Donald Trump said he wanted to “revive and expand our nuclear-energy sector” and announced a policy review. Because U.S. power demand isn’t growing, promoting coal and nuclear would come at the expense of gas and renewables—and vice versa. This has set up a power-grid showdown: Will new federal policies bring back coal and nukes, or will gas and renewables continue to grab market share?

It isn’t just small utilities like Minnesota Power that are changing their generating mix.

Duke Energy Corp., a large utility based in Charlotte, N.C., with power plants in five states, generated 7% of its power from gas and renewables in 2005. Last year, Duke got 32% from those new sources, and it expects the portion to hit 44% by 2026.

Last week, Oregon utility PacifiCorp, which is owned by Warren Buffett’s Berkshire Hathaway Energy, filed plans to spend $3.5 billion on wind generation and transmission projects. The parent company had previously said it planned to spend $13.6 billion between 2017 and 2019, primarily on wind and solar projects. PacifiCorp said its renewable projects are “the most cost-effective option to meet customers’ energy needs over the next 20 years.”

Overall, gas, wind and solar now meet 40% of U.S. power needs, up from 22% a decade ago, according to the U.S. Energy Information Administration.

As gas and renewables have grown, the past few years have been a bloodbath for coal, the mainstay of American electricity generation for decades. Three of every 10 coal generators have closed permanently in the past five years.

Nukes, another mainstay for decades, are imperiled, too. By 2023, there may be 54 nuclear-power plants, down from 65 a decade earlier. Only new state subsidies can keep more from closing, plant operators argue.

Coal and nuclear plants have provided so-called base-load power for decades, running around the clock to ensure a reliable stream of electricity. As those sources of power lose ground to gas and renewables, some worry the grid could become unstable.

Natural gas has been the main agent of change, mostly because the advent of hydraulic fracturing unlocked vast new natural gas reserves in the U.S., creating very low prices for the fuel.

“That is what is making coal go away,” said Pat Vincent-Collawn, chairman and chief executive of PNM Resources Inc., a New Mexico utility. It expects coal to drop from 51% of its generation last year to 41% next year.

Natural gas, wind and solar now meet 40% of U.S. power needs. PHOTO: WILL VRAGOVIC/ZUMA PRESS

Gas is not only inexpensive and abundant; new gas-fired power plants are also much more efficient than they were even five years ago.

“Not only have we gotten better at getting the natural gas molecule out of the ground, but we have gotten much better at getting as much electricity out of that molecule as possible,” said Josh Rhodes, a research fellow at the University of Texas Energy Institute.

Until a few years ago, gas plants typically operated about 30% of the time, turning on and off as the grid needed power. Today, they run more than half the time. Many are on virtually nonstop, taking over the role once played by coal and nuclear.

Ben Fowke, chairman and chief executive of Xcel Energy Inc., a large utility that covers parts of Colorado, Minnesota and six other states, says wind and solar aren’t responsible for the demise of coal and nuclear plants. “I hope it doesn’t come out that renewables are to blame,” he said. “Wind is saving our customers money.” For now, renewable energy enjoys a federal tax subsidy.

Few utility chiefs agree that the administration should step in to prop up coal. There is more agreement that nuclear power plants should be saved from premature retirement.

Nukes, which have no carbon emissions, are struggling to compete with low wholesale power prices brought on by inexpensive natural gas and renewable generation. Low power prices are great until baseload assets are on the line, said Joe Dominguez, executive vice president of governmental and regulatory affairs for Exelon Corp. “Policy makers need to step in and address that,” he said.

For Minnesota Power, a mixture of a lot of wind, some solar, hydro power from Canadian dams and a state-of-the-art gas plant will make it easier to provide reliable electricity, even with the loss of so much coal.

“The combination of flexible natural gas and renewables really work well together,” said Julie Pierce, Minnesota Power’s vice president of strategy and planning.

—Dan Molinski contributed to this article.

Corrections & Amplifications

PacifiCorp has filed plans with state regulators to spend $3.5 billion on wind and transmission projects. An earlier version of this article incorrectly said it had filed plans to spend $13.6 billion. PacifiCorp’s parent company had previously said it planned to spend $13.6 billion between 2017 and 2019 primarily on wind and solar projects. (July 7, 2017)

1,800 tons of radioactive waste has an ocean view and nowhere to go

A military helicopter prepares to take off over the coast from Camp Pendleton, located south of the decommissioned San Onofre Nuclear Generating Station. (Allen J. Schaben / Los Angeles Times)

The massive, 150-ton turbines have stopped spinning. The mile-long cooling pipes that extend into the Pacific will likely become undersea relics. High voltage that once energized the homes of more than a million Californians is down to zero.

But the San Onofre nuclear power plant will loom for a long time as a landmark, its 1,800 tons of lethal radioactive waste stored on the edge of the Pacific and within sight of the busy 5 Freeway.

Across the site, deep pools of water and massive concrete casks confine high-power gamma radiation and other forms of radioactivity emitted by 890,000 spent fuel rods that nobody wants there.

And like the other 79,000 tons of spent fuel spread across the nation, San Onofre’s nuclear waste has nowhere to go.

The nation’s inability to find a permanent home for the dangerous byproduct of its 50-year adventure in nuclear energy represents one of the biggest and longest-running policy failures in federal government history.

Now, the Trump administration and Congress are proposing a fast track fix. The new plan aims, after decades of delays, to move the waste to one or more temporary central storage sites that would hold it until a geologic repository can be built in Nevada or somewhere else.

But the new strategy faces many of the same challenges that have dogged past efforts, leaving some experts doubtful that it can succeed.

America’s nuclear waste failure

Left, construction is underway on the Independent Spent Fuel Storage Installation (ISFSI), where used nuclear fuel in dry casks will be stored vertically at the closed San Onofre Nuclear Generating Station. Right, a view of the two domes at San Onofre Nuclear Generating Station. The I-beams in the foreground are part of the turbine structure, where steam was turned into electricity at the now-closed facility. (Allen J. Schaben / Los Angeles Times)

The shuttered San Onofre facility — notwithstanding its overlook of prime surf breaks — is similar to about a dozen other former nuclear power plants nationwide that now have to babysit waste to prevent natural disasters, human errors or terrorist plots from causing an environmental or health catastrophe.

Though utilities and government regulators say such risks are remote, they have inflamed public fear at least since 1979’s Three Mile Island reactor accident in Pennsylvania.

The sites are located on the scenic shores of northern Lake Michigan, along a bucolic river in Maine, on the high plateau of Colorado and along the densely populated Eastern Seaboard — each environmentally sensitive for different reasons.

No one wants that waste near them — including officials in the sleepy beach town of San Clemente, just north of San Onofre. Even Southern California Edison Co. officials, while insisting the waste is safe, agree it should be moved as soon as possible.

“It doesn’t make any sense to store the fuel at all these sites,” said Thomas Palmisano, chief nuclear officer at the Southern California Edison plant. “The public doesn’t want the spent fuel here. Well, the fuel is here.”

But every attempt to solve the problem almost instantly gets tangled in complex federal litigation and imposes enormous expense on taxpayers.

The Energy Department was legally bound to haul away the waste by 1998 under the Nuclear Waste Policy Act, making the agency about 20 years late in fulfilling its promise. That has saddled utilities with multibillion-dollar costs to store the waste onsite.

An aerial view of the closed San Onofre Nuclear Generating Station hangs on the wall of a conference room at the facility. (Allen J. Schaben / Los Angeles Times)

As a result, every nuclear utility, including Southern California Edison, has sued to recover its waste storage costs. So far, they have won judgments and settlements of $6.1 billion, and the Energy Department has projected that it may be liable for up to $25 billion more.

But the new plan is fraught with complex legal, political and financial questions, and it has yet to be fully defined, vetted among powerful interest groups, approved by Congress or tested in the inevitable court challenges.

The House Energy and Commerce Committee last week overwhelmingly approved legislation that could clear up many legal questions. Similar bills have been introduced in recent years and failed to move ahead, but this legislation has strong bipartisan support and is backed by the White House.

Still, a lot could go wrong with the plan, as it has for every plan for decades.

Two little-known privately held companies, New Jersey-based Holtec International and Texas-based WCS, have unveiled plans and begun licensing applications with the Nuclear Regulatory Commission for interim storage sites on each side of the New Mexico-Texas border. Officials in the area, a booming center of oil production, are enthusiastic about the potential economic benefits. And nuclear utilities have offered encouragement.

Company officials and other proponents say such temporary dumps could be opened in as little as three or four years, assuming the licensing goes smoothly. But other nuclear waste experts expect a timetable of 10 to 15 years for a temporary dump and much longer for a permanent repository.

Two dozen antinuclear activist groups and leading environmental nonprofits already have signaled in letters to the NRC that they will dispute the idea of creating temporary consolidated storage sites.

The groups, along with many longtime nuclear waste technical experts, worry that temporary storage will weaken the government’s resolve to build a permanent repository. And they assert the plan would require transporting the fuel twice, first to the temporary site and then to a permanent dump, magnifying transportation costs and the fuel’s exposure to accidents or attacks by terrorists.

“These trains hauling nuclear waste would go right by Trump’s hotel in Las Vegas,” said Marta Adams, a now-retired deputy attorney general in Nevada who is consulting with the state on its renewed legal battle.

Serious business problems cloud the plan. Among the most important is who would own and be legally responsible for the waste once it leaves the utility plant sites.

The federal government promised in the Nuclear Waste Policy Act of 1982 to take ownership at a government-owned dump, but it never authorized such ownership at a temporary private facility — one of the legal questions that the Energy and Commerce Committee’s legislation would clear up.

A temporary facility by Holtec or another organization is intended as a segue to a permanent dump at Yucca Mountain, about 100 miles north of Las Vegas. Along with an interim storage site, the Trump administration wants to restart licensing of Yucca Mountain, which President Obama suspended.

But reviving Yucca Mountain is a long shot. A decade ago, the Energy Department estimated Yucca Mountain would cost nearly $100 billion, a figure that has undoubtedly increased. The cost could be a problem for deficit-minded Republicans.

The Energy Department collected a tiny monthly fee from utility customers to build the dump, and currently a so-called trust fund has $39 billion reserved for the purpose.

But a little-known clause in federal budget law 20 years ago decreed that contributions to the trust fund would count against the federal deficit. Unlike the Social Security Trust Fund, there are no securities or bonds backing up the fund. As a result, every dollar spent on Yucca Mountain will have to be appropriated, and the money will add to the national debt.

“The money was collected for one purpose and used for another,” said Dale Klein, a former NRC chairman who is now associate vice chancellor for research at the University of Texas. “There is a moral obligation to address the issue. It will be a challenge to get Congress to pay for it.”

The Trump plan has also rekindled the strident bipartisan political opposition of Nevada officials, including the governor, senators, representatives and attorney general, among others. They vow to erect every legal and political obstacle to delay or kill the Yucca Mountain dump.

The state filed nearly 300 formal objections to the plan before the Obama administration suspended licensing. They must be individually examined by the NRC, a process that could take five years.

Then there is the design and construction of the underground dump itself, which would require about two dozen big industrial buildings and 300 miles of new railroad track. It could cost $1 billion or more every year, ranking among the largest federal operations.

A permanent repository could take 10 years to 20 years by most estimates.

On the beach

Nowhere is the nuclear waste problem more urgent than at shuttered power plants like San Onofre.

After utilities dismantle the reactors, haul away the concrete debris and restore the sites to nearly pristine condition, the nuclear waste remains. Security officers with high-powered automatic weapons guard the sites round the clock.

About five years after the spent fuel rods cool off in a 40- to 50-foot-deep pool, they are transferred to massive steel and concrete dry casks about 20 feet tall. Almost every government and outside nuclear expert considers the dry casks much safer than the pools.

The 3 Yankee Cos., which are safeguarding dry casks at three former New England reactors, spend about $10 million annually per site for maintenance and security, company officials say. The costs could be higher at San Onofre if the waste is left in place, Palmisano said.

Clockwise from top left: A model of a fuel assembly inside the conference room at the closed San Onofre Nuclear Generating Station. Thomas Palmisano, decommissioning and chief nuclear officer, holds No. 11 rebar, which is used in the structure of the dry cask storage. Construction is underway on the Independent Spent Fuel Storage Installation, where used nuclear fuel in dry casks will be stored vertically. A view of a concrete dry cask facility where 50 canisters, about one-third of the used nuclear fuel, are stored horizontally. (Allen J. Schaben / Los Angeles Times)

Edison is building a massive concrete monolith for more storage, using a Holtec design called Hi-Storm UMAX. It will hold about two-thirds of the plant’s spent fuel in 73 stainless-steel canisters about 125 feet from the ocean. The 25-foot structure is about half-buried with the underground foundation just above the mean high-tide line. Tall cranes and swarms of hard hats are moving construction ahead.

The crucial question is whether it will be safe, especially if congressional inaction or litigation by opposition groups keeps it on-site for years.

“The top has four feet of steel-reinforced concrete,” said Ed Mayer, program director at Holtec. “It is remarkably strong. The … steel lids are designed to take an aircraft impact.”

NRC officials say the design is safe and meets all federal requirements. Although nuclear issues are within the NRC’s jurisdiction, the Coastal Commission also examined the potential for a tsunami, sea level rise or an earthquake to undermine the facility.

“Under our authority, which is limited, the commission approved the permit, and behind that is the evaluation that it is safe for a period of 20 years,” said Alison Dettmer, deputy director of the commission.

But suspicion lingers. San Clemente city officials have demanded that the fuel be removed as soon as possible. An activist group, Citizens’ Oversight, has sued Edison for starting construction and the California Coastal Commission for approving it.

The waste “is right down by the water, just inches from the high-tide line,” said Ray Lutz, the group’s founder. “It is the most ridiculous place they could find.”

In an effort to assuage local concerns, Edison participates in a “community engagement panel” that meets at least quarterly, led by UC San Diego professor David Victor.

“Early on, I was surprised by how many people did not understand there was no place for the fuel to go,” he said. Over the last year, the possibility of a temporary storage site has raised people’s hopes for a quicker solution, he said.

The history of nuclear waste, however, is replete with solutions that seem plausible but succumb to obscure and unanticipated legal, technical or financial issues.

Decades of delay

Two decades ago, the Skull Valley Band of Goshute Indians sought to create an interim storage facility for nuclear waste on its reservation about an hour out of Salt Lake City.

The NRC spent nine years examining the license application and approved it. But Utah officials and a broad swath of major environmental groups opposed the plan. Eventually, the state blocked shipping routes to the reservation.

Michael C. Layton, director of the NRC’s division of spent fuel management, said a temporary facility would use the same technology as existing dry cask storage sites, like San Onofre.

But Layton said it is unclear how long it will take to license a consolidated storage site. The formal review is scheduled for three years, but the Skull Valley license that took nine years is the only actual licensing effort to compare it to, he added. Palmisano, the Edison executive, estimates that an off-site temporary storage facility could be operating in 10 to 15 years.

Problems have already delayed WCS, which wants to build a storage site in Andrews, Texas. It asked the NRC in April to suspend its license application.

The $7.5-million cost of just the license application review “is significantly higher than we originally anticipated,” the company said, noting that it is under additional financial stress because the Justice Department has sued it to block a merger.

Holtec officials say that WCS’ problems haven’t deterred their plans for an underground storage site, saying interim storage could save the federal government billions of dollars, particularly if the Yucca Mountain plan is again postponed.

The company has strong support in New Mexico, which already has a dump for nuclear weapons waste, a uranium enrichment plant, a nuclear weapons armory and two nuclear weapons laboratories.

“We are very well-informed,” said Sam Cobb, mayor of nearby Hobbs, rejecting arguments by antinuclear groups that the industry preys on communities that need money and don’t understand the risk.

“It is not a death grab to get money,” he said. “We believe if we have an interim storage site, we will be the center for future nuclear fuel reprocessing.”

Transportation to an interim site would cost the federal government billions of dollars under the pending legislation. Aides at the House Energy and Commerce Committee said those costs would be recovered when the federal government no longer has to pay for legal settlements for failing to take the waste in the first place.

Thomas Palmisano, left, decommissioning and chief nuclear officer, and Lou Bosch, center, Southern California Edison plant manager, lead a tour near the electricity switch yard where two-thirds of the used nuclear fuel is in wet storage. (Allen J. Schaben / Los Angeles Times)

Even if an interim site is built, it is uncertain who would get to ship waste there first. The timing of waste shipments to a permanent site is determined by the so-called standard contract queue, a legal document so complex that federal bureaucrats have dedicated their entire careers to managing it.

The queue was structured so that the oldest waste would go into a future dump first. In the unlikely event that Yucca Mountain were opened in 2024, Edison’s fuel would be in line to start shipping in 2028 with the last bit of waste arriving in 2049, Palmisano said.

Whether that queue would apply to an interim site is unclear, even under the pending legislation.

The dry casks are designed to keep spent fuel confined only for decades, while the health standard for a permanent repository covers hundreds of thousands of years — longer than humans have roamed Earth. If the radioactive waste sits around in temporary storage for hundreds of years, it could be neglected and eventually forgotten.

So one outcome that nobody seems to want is for a temporary site to eventually become permanent by default.

“It would derail momentum for a permanent repository,” said Edwin Lyman, a nuclear physicist at the Union of Concerned Scientists. “This issue has always pitted one community against another and those in between.”

Carbon in Atmosphere Is Rising, Even as Emissions Stabilize

The Cape Grim Baseline Air Pollution Station in Tasmania.
COMMONWEALTH SCIENTIFIC AND INDUSTRIAL RESEARCH ORGANIZATION

CAPE GRIM, Tasmania — On the best days, the wind howling across this rugged promontory has not touched land for thousands of miles, and the arriving air seems as if it should be the cleanest in the world.

But on a cliff above the sea, inside a low-slung government building, a bank of sophisticated machines sniffs that air day and night, revealing telltale indicators of the way human activity is altering the planet on a major scale.

For more than two years, the monitoring station here, along with its counterparts across the world, has been flashing a warning: The excess carbon dioxide scorching the planet rose at the highest rate on record in 2015 and 2016. A slightly slower but still unusual rate of increase has continued into 2017.

Scientists are concerned about the cause of the rapid rises because, in one of the most hopeful signs since the global climate crisis became widely understood in the 1980s, the amount of carbon dioxide that people are pumping into the air seems to have stabilized in recent years, at least judging from the data that countries compile on their own emissions.

That raises a conundrum: If the amount of the gas that people are putting out has stopped rising, how can the amount that stays in the air be going up faster than ever? Does it mean the natural sponges that have been absorbing carbon dioxide are now changing?

“To me, it’s a warning,” said Josep G. Canadell, an Australian climate scientist who runs the Global Carbon Project, a collaboration among several countries to monitor emissions trends.

Scientists have spent decades measuring what was happening to all of the carbon dioxide that was produced when people burned coal, oil and natural gas. They established that less than half of the gas was remaining in the atmosphere and warming the planet. The rest was being absorbed by the ocean and the land surface, in roughly equal amounts.

In essence, these natural sponges were doing humanity a huge service by disposing of much of its gaseous waste. But as emissions have risen higher and higher, it has been unclear how much longer the natural sponges will be able to keep up.

A raging fire in South Sumatra in September 2015. Huge fires that year in Indonesia sent a pulse of carbon dioxide into the atmosphere.
ANTARA FOTO / REUTERS

Should they weaken, the result would be something akin to garbage workers going on strike, but on a grand scale: The amount of carbon dioxide in the atmosphere would rise faster, speeding global warming even beyond its present rate. It is already fast enough to destabilize the weather, cause the seas to rise and threaten the polar ice sheets.


The record increases of airborne carbon dioxide in 2015 and 2016 thus raise the question of whether this has now come to pass. Scientists are worried, but they are not ready to draw that conclusion, saying more time is needed to get a clear picture.

Many of them suspect an El Niño climate pattern that spanned those two years, one of the strongest on record, may have caused the faster-than-usual rise in carbon dioxide, by drying out large parts of the tropics. The drying contributed to huge fires in Indonesia in late 2015 that sent a pulse of carbon dioxide into the atmosphere. Past El Niños have also produced rapid increases in the gas, though not as large as the recent ones.

Yet scientists are not entirely certain that the El Niño was the main culprit; the idea cannot explain why a high rate of increase in carbon dioxide has continued into 2017, even though the El Niño ended early last year.

Scientists say their inability to know for certain is a reflection not just of the scientific difficulty of the problem, but also of society’s failure to invest in an adequate monitoring system to keep up with the profound changes humans are wreaking on the planet.

“It’s really bare bones, our network, contrary to common misperceptions about the government wasting money,” said Pieter Tans, chief of a unit that monitors greenhouse gases at the National Oceanic and Atmospheric Administration.

While the recent events have made the scientific need for an improved network clear, the situation may be about to get worse, not better. President Trump’s administration has targeted American science agencies for cutbacks, with NOAA, the lead agency for tracking greenhouse gases, being one of those on the chopping block.

Australia also had a recent fight over proposed cutbacks in climate science, but so far that country’s conservative government has promised continued funds for the Cape Grim science program, Australia’s most important contribution to global climate monitoring. The atmospheric observatory here, which receives some money from NASA, is one of the most advanced among scores of facilities around the world where greenhouse gases and other pollutants are monitored.

A monorail moving through a smoky haze, which blew over from Indonesia, in the Singapore port in September 2015.
WONG MAYE-E / ASSOCIATED PRESS

The network is complete enough to give a clear picture of the overall global trends in industrial gases in the air, scientists say. But it is too sparse to give definitive information about which parts of the planet are absorbing or releasing greenhouse gases at a given moment. Lacking such data, scientists have trouble resolving some important questions, like the reasons for the rapid increase of carbon dioxide over the past three years.

“It’s really important that people get that there’s an awful lot that’s just not known yet,” Sam Cleland, the manager of the Cape Grim station, said.

Human activity is estimated to be pumping almost 40 billion tons of carbon dioxide into the air every year, an amount that Dr. Canadell of the Global Carbon Project called “staggering.” The atmospheric concentration of the gas has risen by about 43 percent since the Industrial Revolution.
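Those figures can be tied back to the airborne-fraction arithmetic above with a short back-of-the-envelope sketch. In the Python snippet below, the conversion factor of roughly 7.8 billion tons of CO2 per part per million and the pre-industrial baseline of about 280 ppm are assumed round numbers; the emissions total, the roughly 43 percent rise and the "less than half stays airborne" share come from this article.

```python
# Back-of-the-envelope check (assumptions: ~7.8 Gt of CO2 per 1 ppm of atmospheric
# CO2 and a pre-industrial level of ~280 ppm; the emissions figure and the
# "less than half stays airborne" fraction are the article's numbers).
emissions_gt_per_year = 40.0      # ~40 billion tons of CO2 emitted each year
airborne_fraction = 0.45          # portion that remains in the atmosphere
gt_co2_per_ppm = 7.8              # assumed conversion factor

ppm_rise_per_year = emissions_gt_per_year * airborne_fraction / gt_co2_per_ppm
print(f"Implied rise: ~{ppm_rise_per_year:.1f} ppm per year")     # ~2.3 ppm/yr

preindustrial_ppm = 280.0
current_ppm = preindustrial_ppm * 1.43                            # "risen by about 43 percent"
print(f"Implied concentration today: ~{current_ppm:.0f} ppm")     # ~400 ppm
```

An implied increase of a couple of parts per million a year is the same order of magnitude as the changes the monitoring network tracks, which is why a shift in the airborne fraction would show up quickly in these records.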

That, in turn, has warmed the Earth by around 2 degrees Fahrenheit, a large number for the surface of an entire planet.

With a better monitoring network, scientists say they might be able to specify in greater detail what is causing variations in the amount of carbon dioxide staying in the air — and, perhaps, to give a timely warning if they detect a permanent shift in the ability of the natural sponges to absorb more.

Dr. Tans of NOAA would like to put sensors on perhaps a hundred commercial airplanes to get a clearer picture of what is happening just above land in the United States. The effort would cost some $20 million a year, but the government has not financed the project.

The uncertainty stemming from the recent increases in carbon dioxide is all the more acute given that global emissions from human activity seem to have stabilized over the past three years. That is primarily because of changes in China, the largest polluter, where an economic slowdown has coincided with a conscious effort to cut emissions.

“I’d estimate that we are about at the emissions peak, or if there are further rises, they won’t be much,” said Wang Yi, a professor at the Chinese Academy of Sciences in Beijing, who also belongs to the national legislature and advises the government on climate policy.

Emissions in the United States, the second-largest polluter after China, have also been relatively flat, but Mr. Trump has started tearing up President Barack Obama’s climate policies, raising the possibility that greenhouse gases could rise in coming years.

Dr. Tans said that if global emissions flattened out at today’s high level, the world would still be in grave trouble.

“If emissions were to stay flat for the next two decades, which could be called an achievement in some sense, it’s terrible for the climate problem,” he said.

Fisticuffs Over the Route to a Clean-Energy Future

Offshore wind farm turbines near Block Island, R.I. Claims that it is quite feasible to power the American economy entirely with energy from wind, sun and water are under fire.
CHANG W. LEE / THE NEW YORK TIMES

Could the entire American economy run on renewable energy alone?

This may seem like an irrelevant question, given that both the White House and Congress are controlled by a party that rejects the scientific consensus about human-driven climate change. But the proposition that it could, long a dream of an environmental movement as wary of nuclear energy as it is of fossil fuels, has been gaining ground among policy makers committed to reducing the nation’s carbon footprint. Democrats in both the United States Senate and in the California Assembly have proposed legislation this year calling for a full transition to renewable energy sources.

They are relying on what looks like a watertight scholarly analysis to support their call: the work of a prominent energy systems engineer from Stanford University, Mark Z. Jacobson. With three co-authors, he published a widely heralded article two years ago asserting that it would be eminently feasible to power the American economy by midcentury almost entirely with energy from the wind, the sun and water. What’s more, it would be cheaper than running it on fossil fuels.

And yet the proposition is hardly as solid as Professor Jacobson asserts.

In a long-awaited article published this week in The Proceedings of the National Academy of Sciences — the same journal in which Professor Jacobson’s manifesto appeared — a group of 21 prominent scholars, including physicists and engineers, climate scientists and sociologists, took a fine comb to the Jacobson paper and dismantled its conclusions bit by bit.

“I had largely ignored the papers arguing that doing all with renewables was possible at negative costs because they struck me as obviously incorrect,” said David Victor of the University of California, San Diego, a co-author of the new critique of Professor Jacobson’s work. “But when policy makers started using this paper for scientific support, I thought, ‘this paper is dangerous.’”

The conclusion of the critique is damning: Professor Jacobson relied on “invalid modeling tools,” committed “modeling errors” and made “implausible and inadequately supported assumptions,” the scholars wrote. “Our paper is pretty devastating,” said Varun Sivaram from the Council on Foreign Relations, a co-author of the new critique.

The experts are not opposed to aggressive investments in renewable energy. But they argue, as does most of the scientific community represented on the Intergovernmental Panel on Climate Change, that other energy sources — atomic power, say, or natural gas coupled with technologies to remove carbon from the atmosphere — are likely to prove indispensable in the global effort to combat climate change. Ignoring them risks derailing the effort to combat climate change.

But with the stakes so high, the gloves are clearly off.

Professor Jacobson is punching back hard. In an article published in the same issue of the Proceedings and in a related blog post, he argues that his critics’ analysis “is riddled with errors and has no impact” on his conclusions.

In a conversation over the weekend, he accused his critics of being shills for the fossil fuel and nuclear industries, without the standing to review his work. “Their paper is really a dangerous paper,” he told me.

In San Francisco, cooking oil is collected for recycling into biofuels. Mark Z. Jacobson, a Stanford engineer, claims renewables can provide 100 percent of the nation’s energy needs in a few decades without bioenergy, which today contributes about half of the country’s renewable energy production.
JUSTIN SULLIVAN / GETTY IMAGES

But on close examination, Professor Jacobson’s premise does seem a leap of faith.

Renewable sources provide only about a tenth of the United States’ energy consumption. Increasing the penetration of intermittent energy sources from the sun and the wind is already proving a challenge for the electricity grid in many parts of the world.

Professor Jacobson not only claims renewables’ share can be ramped up on the cheap to 100 percent within a few decades, but also that it can be done without bioenergy, which today contributes about half of the country’s renewable-energy production.

And yet under the microscope of the critics — led by Christopher Clack, chief executive of the grid modeling firm Vibrant Clean Energy and formerly with the National Oceanic and Atmospheric Administration and the University of Colorado, Boulder — his proposed system does not hold together.

The weakness of energy systems powered by the sun and the wind is their intermittency. Where will the energy come from when the sun isn’t shining and the wind isn’t blowing? Professor Jacobson addresses this in two ways: by vastly increasing the nation’s peak hydroelectricity capacity and by deploying energy storage at a vast scale.

“To repower the world, we need to expand a lot of things to a large scale,” Professor Jacobson told me. “But there is no reason we can’t scale up.”

Actually, there are reasons. The main energy storage technologies he proposes — hydrogen and heat stored in rocks buried underground — have never been put in place at anywhere near the scale required to power a nation, or even a large city.

His system requires storing seven weeks’ worth of energy consumption. Today, the 10 biggest storage systems in the United States combined store some 43 minutes’ worth. Hydrogen production would have to be scaled up by a factor of 100,000 or more to meet the requirements in Professor Jacobson’s analysis, according to his critics.
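Spelling out that gap with the numbers just quoted is straightforward; the sketch below is plain unit arithmetic, with nothing assumed beyond the figures above.

```python
# How far existing U.S. storage (~43 minutes' worth) falls short of the
# roughly seven weeks' worth Jacobson's system would require.
required_minutes = 7 * 7 * 24 * 60     # seven weeks expressed in minutes
existing_minutes = 43                  # the 10 biggest U.S. storage systems combined
shortfall = required_minutes / existing_minutes
print(f"Required storage is ~{shortfall:,.0f} times what exists today")   # ~1,640x
```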

Professor Jacobson notes that Denmark has deployed a heating system similar to the one he proposes. But Denmark adapted an existing underground pipe infrastructure to transport the heat, whereas a system would have to be built from scratch in American cities.

Professor Jacobson envisions extensive systems to store the intermittent energy produced by solar and wind technologies.
SCOTT MCINTYRE FOR THE NEW YORK TIMES

A common thread to the Jacobson approach is how little regard it shows for the political, social and technical plausibility of what would undoubtedly be wrenching transformations across the economy.

He argues for the viability of hydrogen-fueled aviation by noting the existence of a hydrogen-powered four-seat jet. Jumping from that to assert that hydrogen can economically fuel the nation’s fleet within a few decades seems akin to arguing that because the United States sent a few astronauts to the moon we will all be able to move there soon.

He proposes building and deploying energy systems at a scale that has never been achieved and at a speed that nobody has ever tried. He assumes an implausibly low cost of capital. He asserts that most American industry will easily adjust its schedule to the availability of energy — unplugging when the wind and sun are down regardless of the needs of workers, suppliers, customers and other stakeholders.

And even after all this, the system fails unless it can obtain vast amounts of additional power from hydroelectricity as a backup at moments when other sources are weak: no less than 1,300 gigawatts. That is about 25 percent more power than is produced by all sources combined in the United States today, the equivalent of 600 Hoover Dams.

Building dams is hardly uncontroversial. So Professor Jacobson proposes adding this capacity with “zero increase in dam size, no annual increase in the use of water, no new land,” simply by adding a lot more turbines to existing dams. It is not obvious that so many turbines can be added, however, or at what cost, especially considering that they would sit unproductive 90 percent of the time and serve only as a backstop. What’s more, adding turbines does not increase the available energy at any given time unless there is more water pushing through them.

Ken Caldeira of the Carnegie Institution for Science, one of the lead authors of the critique, put it this way: The discharge rate needed from the nation’s dams to achieve the 1,300 gigawatts would be equivalent to about 100 times the flow of the Mississippi River. Even if this kind of push were available, it is not hard to imagine that people living downstream might object to the release of such vast amounts of water.
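Both scale comparisons can be roughly reproduced with a few lines of arithmetic. In the sketch below, the Hoover Dam capacity of about 2.08 gigawatts, the Mississippi’s mean discharge of roughly 16,800 cubic meters per second, and the assumed 100-meter average head and 90 percent turbine efficiency are illustrative round numbers; only the 1,300-gigawatt target comes from the critique.

```python
# Rough reproduction of the hydro comparisons (inputs below are assumed
# round numbers; only the 1,300 GW target comes from the critique).
RHO, G = 1000.0, 9.81                  # water density (kg/m^3), gravity (m/s^2)
target_w = 1300e9                      # 1,300 gigawatts of backup hydro, in watts

hoover_w = 2.08e9                      # assumed Hoover Dam nameplate capacity
print(f"Hoover Dam equivalents: ~{target_w / hoover_w:.0f}")          # ~625

head_m, efficiency = 100.0, 0.9        # assumed average head and turbine efficiency
flow_m3_s = target_w / (RHO * G * head_m * efficiency)                # from P = rho*g*Q*h*eta
mississippi_m3_s = 16_800.0            # assumed mean Mississippi discharge
print(f"Required flow: ~{flow_m3_s / mississippi_m3_s:.0f} times the Mississippi")  # ~88x
```

With those assumptions the arithmetic lands near 625 Hoover Dams and roughly 90 times the Mississippi’s flow, in the same ballpark as the published comparisons; the flow figure is sensitive to the head assumed.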

“The whole system falls apart because this is the very last thing that is used,” Professor Clack noted. “If you remove any of this, the model fails.”

It is critically important to bring this debate into the open. For too long, climate advocacy and policy have been inflected by a hope that the energy transformation before us can be achieved cheaply and virtuously — in harmony with nature. But the transformation is likely to be costly. And though sun, wind and water are likely to account for a much larger share of the nation’s energy supply, less palatable technologies are also likely to play a part.

Policy makers rushing to unplug existing nuclear reactors and embrace renewables should take note: Shuttering viable technological paths could send us down a cul-de-sac. And it might not be possible to correct course fast enough.

Correction: June 20, 2017
An earlier version of this column included an outdated affiliation for one scientist, Christopher Clack. He is now chief executive of the grid modeling firm Vibrant Clean Energy; he is no longer with the National Oceanic and Atmospheric Administration and the University of Colorado, Boulder.

Safety lapses undermine nuclear warhead work at Los Alamos

A 2014 aerial view of the Los Alamos National Laboratory Plutonium Facility 4 where production setbacks occurred after a safety near miss. (Google)

An extended shutdown of the nation’s only scientific laboratory for producing and testing the plutonium cores for its nuclear weapons has taken a toll on America’s arsenal, with key work postponed and delays looming in the production of components for new nuclear warheads, according to government documents and officials.

The unique research and production facility is located at Los Alamos National Laboratory (LANL) in New Mexico, the birthplace of the U.S. atomic arsenal. The lab’s director ordered the shutdown in 2013 after the Washington official in charge of America’s warhead production expressed worries that the facility was ill-equipped to prevent an accident that would kill its workers and potentially others nearby.

Parts of the facility began renewed operations last year, but with only partial success. And workers there last year were still violating safety rules for handling plutonium, the unstable man-made metal that serves as the sparkplug of the thermonuclear explosions that American bombs are designed to create.

Los Alamos’s persistent shortcomings in plutonium safety have been cited in more than 40 reports by government oversight agencies, teams of nuclear safety experts and the lab’s own employees over the past 11 years. Some of these reports say that safety takes a back seat to meeting specific goals for nuclear warhead maintenance and production by private contractors running the labs. Nuclear workers and experts say the contractors have been chasing lucrative government bonuses tied to those goals.

With key work at Los Alamos deferred due to safety problems, officials and experts say the United States risks falling behind on an ambitious $1 trillion update of its nuclear arsenal, which former president Barack Obama supported and President Trump has said he wants to “greatly strengthen and expand.”

During the hiatus, Los Alamos has had to forego 29 planned tests of the safety and reliability of plutonium cores in warheads now deployed atop U.S. submarine-launched and land-based missiles and in bombs carried by aircraft. The facility also hasn’t been able to make new plutonium cores to replace those regularly withdrawn from the nuclear arsenal for testing or to be fit into warheads, which are being modernized for those missiles and bombers at a projected cost of billions of dollars.

“The laboratory shut down an important facility doing important work,” said James McConnell, the associate administrator for safety, infrastructure and operations at the National Nuclear Security Administration (NNSA), a semiautonomous arm of the Energy Department, in a recent interview at the agency’s Washington headquarters. “What we didn’t have was the quality program that we want.”

Ernest Moniz, the Massachusetts Institute of Technology physicist who served almost four years as President Obama’s energy secretary, said in a separate interview that “we were obviously quite concerned about” the shutdown at Los Alamos. Moniz said he considered the situation there a “mess” and the testing interruption “significant.”

“I don’t think it has, at this stage, in any way seriously compromised” the nuclear arsenal, Moniz said. But he added that it was still his conviction that “obviously we’ve got to get back to that” work as soon as possible. A mock plutonium core was made at Los Alamos last year in a demonstration timed to coincide with a visit by Ashton B. Carter, then secretary of defense.

U.S. Secretary of Defense Ash Carter tours the Los Alamos National Laboratory Plutonium Facility 4 in 2016. (Los Alamos National Laboratory)

At a public hearing in Santa Fe on June 7, McConnell said that while Los Alamos is making progress, it is still unable to resolve the safety issue that provoked its shutdown four years ago, namely an acute shortage of engineers who are trained in keeping the plutonium at the facility from becoming “critical” and fissioning uncontrollably. “They’re not where we need them yet,” he said of the lab and its managers.

A February report by the Defense Nuclear Facilities Safety Board, an independent safety advisory group chartered by Congress, detailed the magnitude of the gap. It said Los Alamos needs 27 fully qualified safety engineers specialized in keeping the plutonium from fissioning out of control. The lab has 10.

Some of the reports obtained by the Center for Public Integrity described flimsy workplace safety policies that left workers ignorant of proper procedures as well as incidents where plutonium was packed hundreds of times into dangerously close quarters or without the shielding needed to block a serious accident. The safety risks at the Los Alamos plutonium facility, which is known as PF-4, were alarmingly highlighted in August 2011, when a “criticality accident,” as it’s known, was narrowly averted, one of several factors prompting many safety officials there to quit.

A criticality accident is an uncontrolled chain reaction involving a fissionable material such as plutonium that releases energy and generates a deadly burst of radiation. Its prevention has been an important challenge for the nuclear weapons program since the 1940s. Criticality accidents have occurred 60 times at various nuclear sites in the last half-century, causing a total of 21 agonizing deaths.

Three workers at Los Alamos died in preventable criticality accidents in the 1940s and 1950s. The most recent criticality-related deaths elsewhere occurred in 1999 at a factory north of Tokyo, where Japanese technicians accidentally mixed too much highly enriched uranium into some wide-mouth buckets. A burst of radiation — and its resulting characteristic blue glow — provoked school and road closures and the evacuation of those living nearby, plus a Japanese government order for 310,000 others to shelter in place.

The building in Japan where a 1999 criticality accident caused deaths and an evacuation. (Nuclear Regulatory Commission)

The problems at Los Alamos were revealed by a year-long investigation by the Center for Public Integrity, which also found several unpublicized accidents at other privately run U.S. nuclear facilities. The investigation, which can be read in full at the Center for Public Integrity’s website, also showed that the penalties imposed by the government for these errors were typically small, relative to the tens of millions of dollars the NNSA gives to each of the contractors annually in pure profit. Some contractors involved in repeated workplace safety incidents were also awarded contract extensions and renewals by officials in Washington.

Asked about the Los Alamos facility’s record, NNSA spokesman Gregory Wolf responded that “we expect our contractors to perform work in a safe and secure manner that protects our employees, our facilities, and the public. When accidents do occur, our focus is to determine causes, identify corrective actions and prevent recurrences.”

Kevin Roark, the spokesman for the consortium of firms hired by the government to run the lab, said in an email that he would defer to the NNSA’s response. Charles McMillan, the Los Alamos lab’s director since 2011, who receives government-funded compensation exceeding $1 million a year, declined to be interviewed about its safety records or the national security consequences of the shutdown. But he said in a 2015 promotional video that “the only way” the lab can accomplish its vital national security mission “is by doing it safely.”

A near-calamity

Los Alamos’s handling of plutonium was the target of internal and external criticism a decade ago, around the time of its takeover by three profit-making firms — Bechtel National Inc., URS (now AECOM) and BWXT Government Group Inc. — in an alliance with the University of California. “We couldn’t prove we were safe,” said Douglas Bowen, a nuclear engineer on the laboratory’s criticality safety staff at the time, “not even close.”

In September 2007, the facility in question — technically known as PF-4 for Plutonium Facility Four and located in a highly secure part of the Los Alamos campus in the mountains above Santa Fe — was shut for a month while managers conducted new training and created an internal safety board to fix its problems. But in 2010, when the Energy Department did a checkup, it found “no official notes or records” the board had ever met, according to a report at the time.

Alarms were sounded more loudly after a nuclear technician positioned eight plutonium rods dangerously close together inside what is called a glovebox — a sealed container meant to contain the cancer-causing plutonium particles — on the afternoon of Aug. 11, 2011, to take a photograph for senior managers. Doing so posed the risk that neutrons emitted routinely by the metal in the rods would strike other plutonium atoms, causing enough of them to fission to provoke further collisions and begin an uncontrolled chain reaction of atom splitting.

Rods of plutonium placed precariously close for the purpose of taking this 2011 photo. The error caused a multiyear production setback. (NNSA)

As luck had it, a supervisor returned from her lunch break and noticed the dangerous configuration. But she then ordered the technician to reach into the box and move the rods apart, and a more senior lab official ordered others present to keep working. Both decisions increased, rather than diminished, the likelihood of an accident, because bodies — and even hands — contain water that can reflect and slow the neutrons, increasing the likelihood of a criticality and its resulting radiation burst.

“The weird thing about criticality safety is it’s not intuitive,” Don Nichols, a former chief for defense nuclear safety at NNSA, said in an interview. The calculations involved in avoiding criticality — which take account of the shape, size, form, quantity and geometric configuration of the plutonium as it moves through more than a dozen messy industrial processes — are so complex that it takes 18 months of training for an engineer to become qualified, and as many as five years to become proficient.

That’s why the consequences of the 2011 incident were so severe, even though a criticality did not occur. Virtually all the criticality specialists responsible for helping to keep workers safe at Los Alamos decided to quit, having become frustrated by the sloppy work demonstrated in the incident and what they considered the lab management’s callousness about nuclear risks when higher profits were at stake, according to interviews and government reports.

Bowen recalled frequently hearing an official with one of the private contractors running PF-4 say “we don’t even need a criticality-safety program,” and that the work was costing the contractor too much money. Former NNSA official Nichols confirmed the exodus of trained experts, saying that due to “some mismanagement, people voted with their feet. They left.” The attrition rate was around 100 percent, according to a “lessons-learned” report completed last month by the lab’s current criticality safety chief and the lone NNSA expert assigned to that issue in the agency’s Los Alamos oversight office.

Workers at the Los Alamos National Laboratory Plutonium Facility 4. (NNSA/Los Alamos)

The exodus provokes the shutdown

The lab’s inability to fend off a deadly accident eventually became apparent to Washington.

Four NNSA staff members briefed Neile Miller, the agency’s acting administrator, in an anteroom of her office overlooking the Mall in 2013, Miller recalled. The precise risks did not need an explanation, she said. She said that criticality is “one of those trigger words” that should immediately get the attention of anyone responsible for preventing a nuclear weapons disaster.

With two of the four experts remaining in her office, Miller picked up the phone that day and called McMillan at the Los Alamos complex, which is financed by a federal payment exceeding $2 billion a year. She recommended that the key plutonium lab inside PF-4 be shut down, immediately, while the safety deficiencies were fixed.

McMillan responded that he had believed the problems could be solved while that lab kept operating, Miller said. He was “reluctant” to shut it down, she recalled. But as the telephone conversation proceeded, he became open to her view that the risks were too high, she added. So on McMillan’s order, the lab was shut within a day, with little public notice.

The exact cost to taxpayers of idling the facility is unclear, but an internal Los Alamos report estimated in 2013 that shutting down the facility where such work is conducted costs the government as much as $1.36 million a day in lost productivity.

Initially, McMillan promised the staff that a “pause” lasting less than a year wouldn’t cause “any significant impact to mission deliverables.” But at the end of 2013, a new group of safety experts commissioned by the lab declared in an internal report that “management has not yet fully embraced its commitment to criticality safety.” It listed nine weaknesses in the lab’s safety culture that were rooted in a “production focus” to meet deadlines. Workers say these deadlines are typically linked to managers’ financial bonuses.

Los Alamos’s leaders, the report said, had made the right promises, but failed to alter the underlying safety culture. “The focus appears to remain short-term and compliance-oriented rather than based on a strategic plan,” it said.

Shortfalls persisted in 2015, and new ones were discovered while the facility, still mostly shut down, was used for test runs. On May 6, 2015, for example, the NNSA sent Los Alamos’s managing contractors a letter again criticizing the lab for being slow to fix criticality risks. The Defense Nuclear Facilities Safety Board said the letter cited “more than 60 unresolved infractions,” many present for months “or even years.”

In January and again in April 2015, workers discovered tubes of liquids containing plutonium in seldom-used rooms at PF-4, with labels that made it hard to know how much plutonium the tubes held or where they’d come from, the safety board said. In May, workers packed a drum of nuclear waste with too much plutonium, posing a criticality risk, and in the ensuing probe, it became clear that they were relying on inaccurate and confusing documentation. Safety experts had miscalculated how much plutonium the drum could safely hold.

“These issues are very similar to the issues that contributed to the LANL Director’s decision to pause operations in June of 2013,” safety board inspectors wrote.

New troubles

In 2016, for the third straight year, the Energy Department and the Defense Nuclear Facilities Safety Board each listed criticality safety at Los Alamos as one of the most pressing problems facing the nuclear weapons program, in their annual reports to Congress. “Required improvements to the Criticality Safety program are moving at an unacceptably slow pace,” the most recent NNSA performance evaluation of Los Alamos, released in Nov. 2016, said.

Hazardous operations at PF-4 slowly started to resume in 2016, but problems continued. In June, after technicians working in a glovebox spilled about 7 tablespoons of a liquid containing plutonium, workers violated safety rules by sopping up the spill with organic cheesecloth and throwing it in waste bins with other nuclear materials, posing the risk of a chemical reaction and fire, according to an internal Los Alamos report. A similar chemical reaction stemming from the sloppy disposal of Los Alamos’s nuclear waste in 2014 provoked a shutdown of more than two years at the deep-underground storage site in New Mexico where that waste is buried, a Department of Energy accident investigation concluded. That incident cost the government more than a billion dollars in cleanup and other expenses.

Frank G. Klotz, the NNSA director, has tried to be upbeat. In March, he told hundreds of nuclear contractors packed into a Washington hotel ballroom for an industry gathering that PF-4 was fully back in business, having “safely resumed all plutonium activities there after a three-year pause.”

Klotz said the updated nuclear weapons would be delivered “on time and on budget.”

But a subsequent analysis by the Government Accountability Office clashed with Klotz’s description. In an April report on costs associated with the NNSA’s ongoing weapons modernization, the GAO disclosed the existence of an internal NNSA report forecasting that PF-4 will be unable to meet the plutonium-pit production deadlines.

Moreover, late last year, when Los Alamos conducted its first scheduled invasive test of a plutonium pit since the shutdown of PF-4 more than three years earlier, it did not produce the needed results, according to NNSA’s annual evaluation of Los Alamos’s performance. The test involved the core of a refurbished warhead scheduled to be delivered to the Navy by the end of 2019 for use atop the Trident missiles carried by U.S. submarines. A second attempt involving a different warhead was canceled because the safety analysis was incomplete, NNSA’s evaluation said.

The purpose of such stockpile surveillance tests, as Vice President Joe Biden said in a 2010 National Defense University speech, is to “anticipate potential problems and reduce their impact on our arsenal.” Weapons designers say these tests are akin to what car owners would do if they were storing a vehicle for years while still expecting the engine to start and the vehicle to speed down the road at the sudden turn of a key.

At the public hearing in Santa Fe on June 7, NNSA’s McConnell said the agency is studying whether to keep plutonium-pit operations at Los Alamos. Options being considered include upgrading the facilities there or “adding capabilities or leveraging existing capabilities elsewhere in the country, at other sites where plutonium is already present or has been used.”

Active NNSA sites that fit that description include the Savannah River Site in South Carolina, the Pantex plant in Texas and the Nevada National Security Site. The NNSA expects to complete its analysis by late summer.

This article is from the Center for Public Integrity, a nonprofit, nonpartisan investigative media organization in Washington.


Climate Science Meets a Stubborn Obstacle: Students

Gwen Beatty in James Sutter’s classroom at Wellston High School in Ohio, where she and Mr. Sutter butted heads over the issue of human-caused climate change. Credit Maddie McGarvey for The New York Times

WELLSTON, Ohio — To Gwen Beatty, a junior at the high school in this proud, struggling, Trump-supporting town, the new science teacher’s lessons on climate change seemed explicitly designed to provoke her.

So she provoked him back.

When the teacher, James Sutter, ascribed the recent warming of the Earth to heat-trapping gases released by burning fossil fuels like the coal her father had once mined, she asserted that it could be a result of other, natural causes.

When he described the flooding, droughts and fierce storms that scientists predict within the century if such carbon emissions are not sharply reduced, she challenged him to prove it. “Scientists are wrong all the time,” she said with a shrug, echoing those celebrating President Trump’s announcement last week that the United States would withdraw from the Paris climate accord.

When Mr. Sutter lamented that information about climate change had been removed from the White House website after Mr. Trump’s inauguration, she rolled her eyes.

“It’s his website,” she said.

Mr. Sutter during his Advanced Placement environmental science class. He was hired from a program that recruits science professionals into teaching. Credit Maddie McGarvey for The New York Times

For his part, Mr. Sutter occasionally fell short of his goal of providing Gwen — the most vocal of a raft of student climate skeptics — with calm, evidence-based responses. “Why would I lie to you?” he demanded one morning. “It’s not like I’m making a lot of money here.”

She was, he knew, a straight-A student. She would have had no trouble comprehending the evidence, embedded in ancient tree rings, ice, leaves and shells, as well as sophisticated computer models, that atmospheric carbon dioxide is the chief culprit when it comes to warming the world. Or the graph he showed of how sharply it has spiked since the Industrial Revolution, when humans began pumping vast quantities of it into the air.

Thinking it a useful soothing device, Mr. Sutter assented to Gwen’s request that she be allowed to sand the bark off the sections of wood he used to illustrate tree rings during class. When she did so with an energy that, classmates said, increased during discussion points with which she disagreed, he let it go.

When she insisted that teachers “are supposed to be open to opinions,” however, Mr. Sutter held his ground.

“It’s not about opinions,” he told her. “It’s about the evidence.”

“It’s like you can’t disagree with a scientist or you’re ‘denying science,’” she sniffed to her friends.

Gwen, 17, could not put her finger on why she found Mr. Sutter, whose biology class she had enjoyed, suddenly so insufferable. Mr. Sutter, sensing that his facts and figures were not helping, was at a loss. And the day she grew so agitated by a documentary he was showing that she bolted out of the school left them both shaken.

“I have a runner,” Mr. Sutter called down to the office, switching off the video.

He had chosen the video, an episode from an Emmy-winning series that featured a Christian climate activist and high production values, as a counterpoint to another of Gwen’s objections, that a belief in climate change does not jibe with Christianity.

“It was just so biased toward saying climate change is real,” she said later, trying to explain her flight. “And that all these people that I pretty much am like are wrong and stupid.”

Classroom Culture Wars

As more of the nation’s teachers seek to integrate climate science into the curriculum, many of them are reckoning with students for whom suspicion of the subject is deeply rooted.

In rural Wellston, a former coal and manufacturing town seeking its next act, rejecting the key findings of climate science can seem like a matter of loyalty to a way of life already under siege. Originally tied, perhaps, to economic self-interest, climate skepticism has itself become a proxy for conservative ideals of hard work, small government and what people here call “self-sustainability.”

A tractor near Wellston, an area where coal and manufacturing were once the primary employment opportunities. Credit Maddie McGarvey for The New York Times

Assiduously promoted by fossil fuel interests, that powerful link to a collective worldview largely explains why just 22 percent of Mr. Trump’s supporters in a 2016 poll said they believed that human activity is warming the planet, compared with half of all registered voters. And the prevailing outlook among his base may in turn have facilitated the president’s move to withdraw from the global agreement to battle rising temperatures.

“What people ‘believe’ about global warming doesn’t reflect what they know,” Dan Kahan, a Yale researcher who studies political polarization, has stressed in talks, papers and blog posts. “It expresses who they are.”

But public-school science classrooms are also proving to be a rare place where views on climate change may shift, research has found. There, in contrast with much of adult life, it can be hard to entirely tune out new information.

“Adolescents are still heavily influenced by their parents, but they’re also figuring themselves out,” said Kathryn Stevenson, a researcher at North Carolina State University who studies climate literacy.

Gwen’s father died when she was young, and her mother and uncle, both Trump supporters, doubt climate change as much as she does.

“If she was in math class and teacher told her two plus two equals four and she argued with him about that, I would say she’s wrong,” said her uncle, Mark Beatty. “But no one knows if she’s wrong.”

As Gwen clashed with her teacher over the notion of human-caused climate change, one of her best friends, Jacynda Patton, was still circling the taboo subject. “I learned some stuff, that’s all,” Jacynda told Gwen, on whom she often relied to supply the $2.40 for school lunch that she could not otherwise afford.

Jacynda Patton, right, during Mr. Sutter’s class. “I thought it would be an easy A,” she said. “It wasn’t.” Credit Maddie McGarvey for The New York Times

Hired a year earlier, Mr. Sutter was the first science teacher at Wellston to emphasize climate science. He happened to do so at a time when the mounting evidence of the toll that global warming is likely to take, and the Trump administration’s considerable efforts to discredit those findings, are drawing new attention to the classroom from both sides of the nation’s culture war.

Since March, the Heartland Institute, a think tank that rejects the scientific consensus on climate change, has sent tens of thousands of science teachers a book of misinformation titled “Why Scientists Disagree About Global Warming,” in an effort to influence “the next generation of thought,” said Joseph Bast, the group’s chief executive.

The Alliance for Climate Education, which runs assemblies based on the consensus science for high schools across the country, received new funding from a donor who sees teenagers as the best means of reaching and influencing their parents.

Idaho, however, this year joined several other states that have declined to adopt new science standards that emphasize the role human activities play in climate change.

At Wellston, where most students live below the poverty line and the needle-strewn bike path that abuts the marching band’s practice field is known as “heroin highway,” climate change is not regarded as the most pressing issue. And since most Wellston graduates do not go on to obtain a four-year college degree, this may be the only chance many of them have to study the impact of global warming.

But Mr. Sutter’s classroom shows how curriculum can sometimes influence culture on a subject that stands to have a more profound impact on today’s high schoolers than on their parents.

“I thought it would be an easy A,” said Jacynda, 16, an outspoken Trump supporter. “It wasn’t.”

God’s Gift to Wellston?

Mr. Sutter, who grew up three hours north of Wellston in the largely Democratic city of Akron, applied for the job at Wellston High straight from a program to recruit science professionals into teaching, a kind of science-focused Teach for America.

He already had a graduate-level certificate in environmental science from the University of Akron and a private sector job assessing environmental risk for corporations. But a series of personal crises that included his sister’s suicide, he said, had compelled him to look for a way to channel his knowledge to more meaningful use.

The fellowship gave him a degree in science education in exchange for a three-year commitment to teach in a high-needs Ohio school district. Megan Sowers, the principal, had been looking for someone qualified to teach an Advanced Placement course, which could help improve her financially challenged school’s poor performance ranking. She hired him on the spot.

Mr. Sutter walking with his students on a nature trail near the high school, where he pointed out evidence of climate change. Credit Maddie McGarvey for The New York Times

But at a school where most teachers were raised in the same southeastern corner of Appalachian Ohio as their students, Mr. Sutter’s credentials themselves could raise hackles.

“He says, ‘I left a higher-paying job to come teach in an area like this,’” Jacynda recalled. “We’re like, ‘What is that supposed to mean?’”

“He acts,” Gwen said with her patented eye roll, “like he’s God’s gift to Wellston.”

In truth, he was largely winging it.

Some 20 states, including a handful of red ones, have recently begun requiring students to learn that human activity is a major cause of climate change, but few, if any, have provided a road map for how to teach it, and most science teachers, according to one recent survey, spend at most two hours on the subject.

Chagrined to learn that none of his students could recall a school visit by a scientist, Mr. Sutter hosted several graduate students from nearby Ohio University.

On a field trip to a biology laboratory there, many of his students took their first ride on an escalator. To illustrate why some scientists in the 1970s believed the world was cooling rather than warming (“So why should we believe them now?” students sometimes asked), he brought in a 1968 push-button phone and a 1980s Nintendo game cartridge.

“Our data and our ability to process it is just so much better now,” he said.

In the A.P. class, Mr. Sutter took an informal poll midway through: In all, 14 of 17 students said their parents thought he was, at best, wasting their time. “My stepdad says they’re brainwashing me,” one said.

Jacynda’s father, for one, did not raise an eyebrow when his daughter stopped attending Mr. Sutter’s class for a period in the early winter. A former coal miner who had endured two years of unemployment before taking a construction job, he declined a request to talk about it.

“I think it’s that it’s taken a lot from him,” Jacynda said. “He sees it as the environmental people have taken his job.”

And having listened to Mr. Sutter reiterate the overwhelming agreement among scientists regarding humanity’s role in global warming in answer to another classmate’s questions — “What if we’re not the cause of it? What if this is something that’s natural?” — Jacynda texted the classmate one night using an expletive to refer to Mr. Sutter’s teaching approach.

But even the staunchest climate-change skeptics could not ignore the dearth of snow days last winter, the cap to a year that turned out to be the warmest Earth has experienced since 1880, according to NASA. The high mark eclipsed the record set just the year before, which had eclipsed the year before that.

In woods behind the school, where Mr. Sutter had his students scout out a nature trail, he showed them the preponderance of emerald ash borers, an invasive insect that, because of the warm weather, had not experienced the usual die-off that winter. There was flooding, too: Once, more than 5.5 inches of rain fell in 48 hours.

The field trip to a local stream where the water runs neon orange also made an impression. Mr. Sutter had the class collect water samples: The pH levels were as acidic as “the white vinegar you buy at a grocery store,” he told them. And the drainage, they could see, was from the mine.

It was the realization that she had failed to grasp the damage done to her immediate environment, Jacynda said, that made her begin to pay more attention. She did some reading. She also began thinking that she might enjoy a job working for the Environmental Protection Agency — until she learned that, under Mr. Trump, the agency would undergo huge layoffs.

“O.K., I’m not going to lie. I did a 180,” she said that afternoon in the library with Gwen, casting a guilty look at her friend. “This is happening, and we have to fix it.”

After fleeing Mr. Sutter’s classroom that day, Gwen never returned, a pragmatic decision about which he has regrets. “That’s one student I feel I failed a little bit,” he said.

As an alternative, Gwen took an online class for environmental science credit, which she does not recall ever mentioning climate change. She and Jacynda had other things to talk about, like planning a bonfire after prom.

As they tried on dresses last month, Jacynda mentioned that others in their circle, including the boys they had invited to prom, believed the world was dangerously warming, and that humans were to blame. By the last days of school, most of Mr. Sutter’s doubters, in fact, had come to that conclusion.

“I know,” Gwen said, pausing for a moment. “Now help me zip this up.”


A startup has invented a power cycle that runs on carbon dioxide—without emitting it

Between the energy hub of Houston, Texas, and the Gulf Coast lies a sprawling petropolis: a sea of refineries and oil storage tanks, power lines, and smokestacks, all dedicated to converting fossil fuels into dollars. They are the reason why the Houston area emits more carbon dioxide (CO2) than anyplace else in the United States.

But here, on the eastern edge of that CO2 hot spot, a new fossil fuel power plant showcases a potential remedy for Houston’s outsized greenhouse gas footprint. The facility looks suspiciously like its forebears, a complex the size of two U.S. football fields, chock-a-block with snaking pipes and pumps. It has a turbine and a combustor. But there is one thing it doesn’t need: smokestacks.

Zero-emission fossil fuel power sounds like an oxymoron. But when that 25-megawatt demonstration plant is fired up later this year, it will burn natural gas in pure oxygen. The result: a stream of nearly pure CO2, which can be piped away and stored underground or blasted into depleted oil reservoirs to free more oil, a process called enhanced oil recovery (EOR). Either way, the CO2 will be sequestered from the atmosphere and the climate.

That has long been the hope for carbon capture and storage (CCS), a strategy that climate experts say will be necessary if the world is to make any headway in limiting climate change (see sidebar, p. 798). But CCS systems bolted to conventional fossil fuel plants have struggled to take off because CO2 makes up only a small fraction of their exhaust. Capturing it saps up to 30% of a power plant’s energy and drives up the cost of electricity.

In contrast, NET Power, the startup backing the new plant, says it expects to produce emission-free power at about $0.06 per kilowatt-hour. That’s about the same cost as power from a state-of-the-art natural gas-fired plant—and cheaper than most renewable energy. The key to its efficiency is a new thermodynamic cycle that swaps CO2 for the steam that drives turbines in conventional plants. Invented by an unlikely trio—a retired British engineer and a pair of technology geeks who had tired of their day jobs—the scheme may soon get a bigger test. If the prototype lives up to hopes, NET Power says, it will forge ahead with a full-scale, 300-megawatt power plant—enough to power more than 200,000 homes—which could open in 2021 at a cost of about $300 million. Both the company and CCS experts hope that the technology will then proliferate. “This is a game-changer if they achieve 100% of their goals,” says John Thompson, a carbon capture expert at the Clean Air Task Force, an environmental nonprofit with an office in Carbondale, Illinois.

Engineer Rodney Allam conceived the carbon dioxide cycle at the heart of the new power plant PHOTO: MARC WILSON

NET POWER CEO BILL BROWN, 62, never set out to remake the energy market. A decade ago, as a dealmaking lawyer in New York City, he crafted financial trading strategies for Morgan Stanley. But he was restless. So he called Miles Palmer, a buddy from his undergraduate days at the Massachusetts Institute of Technology (MIT) in Cambridge. Palmer was a chemist for Science Applications International Corporation (SAIC), a defense contractor that designed everything from rail guns to drones. Brown suggested they “make something good for a change.” In 2008, as the economy was collapsing, they left their jobs and started 8 Rivers, a technology incubator in Durham, North Carolina, where Brown also taught law at Duke University.

They needed something to incubate. They liked the thought of doing something in the energy sector, a famously risk-averse arena, but one in which a breakthrough technology can make a fortune. First came a brief, fruitless attempt to make biofuels from algae. Then, in 2009, the Obama administration’s stimulus package offered billions of dollars in grants for “clean coal” projects—ways to reduce coal’s CO2 emissions. Palmer knew that, worldwide, coal wasn’t going away anytime soon, and he understood how it threatened the climate. “I wanted to solve that problem,” he says.

Cleaning up coal has been tough. Not only does coal release twice as much carbon pollution as natural gas, but that CO2 also makes up just 14% of the flue gas from a conventional power plant. Still, coal is plentiful and cheap, and until recently few people cared about the CO2 it unleashes. So coal-fired power plants haven’t changed much since 1882, when Thomas Edison’s company built the first one in London. Most still burn coal to boil water. The steam drives a turbine to generate electricity. At the turbine’s back end, cooling towers condense the steam into water, lest the high-pressure steam there drive the turbine in reverse. Those towers vent much of the energy used to boil the water in the first place. Overall, just 38% of coal’s energy yields electricity. “All that energy is just wasted,” Brown says.

That inefficiency helped drive utilities to natural gas. Not only is gas cleaner—and, in the United States, cheaper than coal—but because it is a gas to begin with, engineers can take advantage of an explosive expansion as it burns to drive a gas turbine. The heat of the turbine exhaust then boils water to make steam that drives additional turbines. The best natural gas “combined cycle” plants achieve nearly 60% efficiency.
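
The roughly 60% figure for combined-cycle plants follows from stacking two heat engines: the bottoming steam cycle runs on heat the gas turbine would otherwise throw away. A minimal sketch of that arithmetic, using assumed single-stage efficiencies rather than figures from the article:

```python
# Rough combined-cycle arithmetic (illustrative values, not data from any plant).
# The bottoming steam cycle runs on the gas turbine's exhaust heat, so the stages stack.

def combined_cycle_efficiency(gas_eff: float, steam_eff: float) -> float:
    """Overall efficiency when a steam cycle recovers the gas turbine's waste heat."""
    return gas_eff + (1.0 - gas_eff) * steam_eff

# Assumed stage efficiencies: ~40% for the gas turbine, ~33% for the steam bottoming cycle.
print(f"{combined_cycle_efficiency(0.40, 0.33):.0%}")  # prints roughly 60%
```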

The prototype NET Power plant near Houston, Texas, is testing an emission-free technology designed to compete with conventional fossil power. PHOTO: CHICAGO BRIDGE & IRON

Still, Palmer was focused on coal, the bigger climate problem. He built on work he had done at SAIC on a high-pressure combustor for burning coal in pure oxygen. It was more efficient and smaller, and so it would cost less to build. It also produced an exhaust of concentrated CO2, thus avoiding the separation costs. “I got it to work almost as well as a conventional coal plant, but with zero emissions,” Palmer says. “But it wasn’t good enough.”

Palmer and Brown needed to nudge the efficiency higher. In 2009, they contacted Rodney Allam, a chemical engineer who had run European R&D operations for Air Products, an industrial giant in the United Kingdom. Later, in 2012, Allam won a share of the $600,000 Global Energy Prize, sponsored by the Russian energy industry, for his work on industrial gas production. But at the time, he was mostly retired, concentrating on his fishing, lawn bowling, and gardening.

Palmer and Brown hired Allam as a consultant. Inspired by some Russian research from the 1930s, Allam thought he saw a way to radically reinvent the staid steam cycle. Forget about boilers, he thought. He would drive everything with the CO2 itself, making an ally out of his enemy. “The only way you could proceed was to develop a totally new power system,” Allam says.

ALLAM ENVISIONED THE CO2 circulating in a loop, cycling between a gas and what’s called a supercritical fluid. At high pressure and temperature, supercritical CO2 expands to fill a container like a gas but flows like a liquid.

For decades, engineers have worked on Brayton cycles—thermodynamic loops that take advantage of the properties of supercritical fluids, which could be air or CO2 (see Perspectives, p. 805). Supercritical fluids offer advantages: Because they are fluids, a pump can pressurize them, which takes far less energy than a compressor needs to pressurize a gas. And because of the fluidlike gas’s extra density, it can efficiently gain or shed heat at heat exchangers.

In Allam’s particular Brayton cycle, CO2 is compressed to 300 times atmospheric pressure—equivalent to a depth of 3 kilometers in the ocean. Then fuel is burned to heat the CO2 to 1150°C, which turns it supercritical. After the CO2 drives a turbine, the gas’s pressure drops and it turns into a normal gas again. The CO2 is then repressurized and returned to the front end of the loop. A tiny amount of excess CO2—exactly as much as burning the fuel created—is shunted into a pipeline for disposal.
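
For reference, CO2's published critical point is about 31°C and 73.8 bar; the turbine-inlet conditions quoted here sit far above both, which is what makes the working fluid supercritical. A quick check using only the article's figures and that critical point:

```python
# Check of the stated Allam-cycle turbine-inlet conditions against CO2's critical point
# (about 31 C and 73.8 bar). Above both, CO2 is a supercritical fluid.

CO2_CRITICAL_T_C = 31.0       # degrees Celsius
CO2_CRITICAL_P_BAR = 73.8     # bar

turbine_inlet_t_c = 1150.0          # temperature cited in the article
turbine_inlet_p_bar = 300 * 1.013   # "300 times atmospheric pressure" ~ 304 bar

is_supercritical = (turbine_inlet_t_c > CO2_CRITICAL_T_C
                    and turbine_inlet_p_bar > CO2_CRITICAL_P_BAR)
print(is_supercritical)  # True: far above both thresholds
```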

The Allam cycle, as it is now called, comes with costs. Giant cryogenic refrigerators must chill air—which is mostly nitrogen—to extract the pure oxygen needed for combustion. Compressing CO2 into a supercritical state also sucks up energy. But both steps are well-known industrial processes. Allam calculated that discarding the steam cycle would boost the 38% efficiency of a coal plant to 56%. That would put it within striking distance of the efficiency of a contemporary combined cycle plant. As a bonus, the exhaust is nearly pure CO2 that can be sold for EOR. Another perk is that the Allam cycle generates water as a byproduct of combustion, instead of consuming it voraciously as conventional steam cycles do, which could make plants easier to site in arid parts of the world.

At this point, Brown and Palmer were still planning to use coal as their fuel. But when they sent Allam’s handiwork to the engineering firm Babcock & Wilcox, to see whether the system would work on an industrial scale, “they had good news and bad news,” Brown says. On the downside, the Allam cycle would be tough to pull off with coal, at least initially, because the coal would first have to be converted to a synthetic gas, which adds cost. Also, sulfur and mercury in that syngas would have to be filtered out of the exhaust. But on the upside, the engineers saw no reason why the technique wouldn’t work with natural gas, which is ready to burn and doesn’t have the extra contaminants.

Brown and Palmer gave up on winning a clean coal grant from the government. Instead, they sought private investment for a far bigger prize: revolutionizing energy production with carbon capture. By 2014, 8 Rivers had secured $140 million in funding from Exelon and Chicago Bridge & Iron, two industrial giants that now co-own the NET Power demo plant. In March 2016, the company broke ground on its pilot plant outside Houston.

“This is the biggest thing in carbon capture,” says Howard Herzog, a chemical engineer and carbon capture expert at MIT. “It’s very sound on paper. We’ll see soon if it works in reality. There are only a million things that can go wrong.”

ONE OF THOSE IS THE NEW TURBINE, which needs to work at intense temperatures and pressures. Some steam turbines reach those extremes, but “no one had ever designed a turbine to do that with CO2 as the working fluid,” says NET Power spokesperson Walker Dimmig. In 2012, NET Power officials inked a deal to have the Japanese conglomerate Toshiba retool one of its high-pressure steam turbines to work with supercritical CO2, which required changing the lengths and angles of the turbine blades. Toshiba also engineered a new combustor to mix and burn small amounts of oxygen and natural gas in the midst of a gust of hot supercritical CO2—a problem not unlike trying to keep a fire going while dousing it with a fire extinguisher.

The re-engineered combustor and turbine were tested in 2013 and delivered to the demo plant in November 2016. Now, they are being integrated with the rest of the facility’s components, and the plant is undergoing preliminary testing before ramping up to full power sometime this fall. “I’m 100% confident it will work,” Allam says.

If it does, Brown says, NET Power will have advantages that could encourage widespread market adoption. First, the CO2 emerging from the plant is already pressurized, ready to be injected underground for EOR, unlike CO2 recovered from natural gas wells—the usual source.

Another advantage is the plant’s size. Not only are the heat exchangers much smaller and cheaper to build than massive boilers, but so are many of the other components. The 25-megawatt supercritical CO2 turbine, for example, is about 10% the size of an equivalent steam turbine. Overall, NET Power plants are expected to be just one-quarter the size of an equivalent advanced coal plant with carbon capture, and about half the size of a natural gas combined cycle with carbon capture. That means less concrete and steel and lower capital costs. “For many CCS projects, the upfront costs are daunting,” says Julio Friedmann, a carbon capture expert at Lawrence Livermore National Laboratory in Livermore, California. “Avoiding those costs really matters.” What’s more, unlike gas plants without carbon capture, NET Power will be able to sell its CO2 for EOR.

EVEN IF NET POWER’S TECHNOLOGY works as advertised, not everyone will be a fan. Lukas Ross, who directs the climate and energy campaign at Friends of the Earth in Washington, D.C., notes that the natural gas that powers the plant comes from hydraulic fracturing, or “fracking,” and other potentially destructive practices. And providing a steady supply of high-pressure gas for EOR, he adds, will only perpetuate a reliance on fossil fuels. Ross argues that money would be better spent on encouraging broad deployment of renewable energy sources, such as solar and wind power.


Yet oddly enough, NET Power could help smooth the way for renewables to expand. The renewable portfolio standards in many countries and U.S. states require solar, wind, and other carbon-free sources to produce an increasing proportion of the electric power supply. But those sources are intermittent: The power comes only when the sun is shining and the wind is blowing. Nuclear and fossil fuel sources provide “base load” power that fills the gaps when renewables aren’t available. Conventional natural gas power plants, in particular, are viewed as a renewable-friendly technology because they can be ramped up and down quickly depending on the supply of renewable power.

As an emission-free alternative, NET Power’s plants could enable communities to deploy even more renewables without having to add dirty base-load sources. “Fossil fuel carbon-free power allows even more aggressive deployment of renewables,” says George Peridas, an environmental policy analyst with the Natural Resources Defense Council in San Francisco, California.

That’s a combination Allam wants to promote. “I’m not knocking renewables, but they can’t meet future power demands by themselves,” he says. Allam, a longtime member of the Intergovernmental Panel on Climate Change, says time for solving carbon pollution is running short—for both the world and himself. “I’m 76,” he says. “I’ve got to do this quickly.”


Exelon Moves to Pull Plug on Three Mile Island Nuclear Power Plant

The Three Mile Island nuclear power plant was the site of a partial core meltdown in 1979. Its owner says it may now shut it down. PHOTO: MATT ROURKE/ASSOCIATED PRESS

Exelon Corp. warned Tuesday that it will close the Three Mile Island nuclear power plant in Pennsylvania in 2019 unless it receives government aid, the latest sign of how the sector is in danger of shrinking as it faces intense competition in the U.S.

A global symbol of the potential perils of nuclear power after suffering a partial meltdown in 1979, the plant has been losing money for years. Last week, it failed to sell its electricity in advance in a regional power auction for 2020 and 2021, the third year in a row it did not find a buyer.

As a result, Exelon said it was accelerating its retirement unless it receives assistance from the federal government or the state, which has been reluctant to subsidize it as some states have done to keep their nuclear facilities running. Three Mile Island has a federal license to operate until 2034.

“Like New York and Illinois before it, [Pennsylvania] has an opportunity to take a leadership role by implementing a policy solution to preserve its nuclear energy facilities,” said Exelon Chief Executive Chris Crane. The company said it was taking one-time charges of up to $110 million for 2017 in connection with the planned closure.

Utilities have been closing U.S. nuclear-power plants at a rapid clip due to political pressure from critics and growing competition from other electricity sources, notably the increasing number of plants fired by natural gas as horizontal drilling and hydraulic fracturing unlock mass quantities of the fuel.

Power demand in the U.S. has been flat for nearly a decade, creating a battle for market share. Last year, natural gas generated 34% of the electricity in the U.S., according to federal data. Nuclear power generated 20%, and coal 30%. The rest came from renewable sources, including hydroelectric dams.

Three Mile Island would be at least the fifth U.S. nuclear facility set to close by 2025, including PG&E Corp.’s Diablo Canyon plant in California, and Entergy Corp.’s Palisades unit in Michigan and the Indian Point plant in New York.

Four other facilities have already closed in the past four years, including Dominion Resources Inc.’s Kewaunee plant in Wisconsin. The retirements would leave about 60 nuclear plants in the U.S.

A little more than a decade ago, the U.S. nuclear industry was talking about a rebirth. But the first new nuclear units being built in the country in years, facilities in Georgia and South Carolina, are years behind schedule and billions over budget.

Southern Co. and Scana Corp., the utilities behind the new plants, are now scrambling to determine how much it will cost to finish them after their builder, Westinghouse Electric Co., declared bankruptcy in March.

The fate of the new plants could help determine the future of U.S. nuclear power. Late last year, the Tennessee Valley Authority sold two unfinished nuclear units in northern Alabama for $111 million after spending billions since the 1970s on the project.

“We have to find a way to build these reactors in the U.S.,” Jose Gutierrez, Westinghouse’s interim chief executive, said last week. “Otherwise, the future is going to be compromised.”

Even if the nukes get built, their hardships underscore the fact that nuclear power remains a complex business full of booby traps, analysts say.

“The nuclear renaissance is dead for the foreseeable future,” said Steve Fleishman, managing director at Wolfe Research.

Exelon and other operators have sought state subsidies to keep plants running, arguing that they create high-paying jobs and do not emit air pollution or greenhouse gases.

Three Mile Island employs 675 people and contracts with another 1,500 workers. Exelon said Tuesday that it provides roughly 93% of the emissions-free electricity in Pennsylvania.

Exelon has succeeded in persuading some states to provide new financial incentives. Last year, Illinois lawmakers voted to allow Exelon to collect as much as $235 million annually from customers in exchange for keeping two nuclear power plants open.

But the deals have been controversial due to opposition from critics of nuclear power and from such independent power producers as Dynegy Inc. and NRG Energy Inc. that own coal-fired power plants and other sources of electricity.

Pennsylvania lawmakers in March formed a bipartisan caucus to discuss possible funding. State Rep. Dave Hickernell, who represents the area where Three Mile Island is located and is a member of the caucus, said he hopes Exelon’s decision can be reversed.

A spokesman for Pennsylvania Gov. Tom Wolf said the Democrat was concerned about potential layoffs from a Three Mile Island closure and was open to a conversation with state lawmakers about the future of nuclear power in the state.

Three Mile Island drew international attention in 1979 when a partial core meltdown in one of its two reactors led to five days of panic. The reactor involved was permanently shut down and the incident was followed by 14 years of expensive cleanup, heightening awareness of the potential safety problems of nuclear plants.

Shares of Exelon were up 0.5% at $36 around 3:30 p.m. EDT Tuesday.

—Miguel Bustillo contributed to this article.


Beyond Batteries: Other Ways to Capture and Store Energy

Storing electricity on a large scale has been a bigger challenge than generating it and keeping it flowing. PHOTO: STEVE HOCKSTEIN/BLOOMBERG NEWS

Unlike oil, which can be stored in tanks, and natural gas, which can be kept in underground caverns, electricity has been a challenge to bottle.

But that is starting to change.

These days, companies including Elon Musk’s Tesla Inc. are selling lithium-ion batteries, similar to those that power electric cars, to utilities, businesses and homeowners, who use them to store electricity, mostly for short periods.

But now, some nonbattery technologies are gaining traction as utilities continue to look for economical ways to capture and store power.

These alternatives have a longer lifetime than chemical batteries, which generally need to be switched out after about 10 years, and some can store and discharge more electricity.

Here’s a look at three of the technologies.

PUMPED HYDROPOWER: Pumped hydropower is a century-old technology that is getting a fresh look, as developers turn old mines into holding tanks for water. During periods when electricity is cheap and abundant, pumps are used to push large volumes of water uphill, where it is stored in giant basins. When extra power is needed on the grid, the water is released and gravity pulls it downhill and through generators that produce electricity.

Eagle Crest Energy Co. plans to build a $2 billion pumped-hydropower facility at an abandoned iron mine east of Palm Springs, Calif. The plant would have a capacity of 1,300 megawatts, enough to power nearly one million homes, and would be able to generate power for about 10 hours at a time. The plant could soak up excess power overnight, when demand is slack, and during the day, when California’s solar farms are churning out electricity, and then return the juice in the evening, after the sun sets and power use rises in cities and towns. Several similar projects are awaiting government permits.
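
For a sense of scale, the water a plant of that size would need follows from the potential-energy relation E = ρgVh. The head and discharge efficiency below are hypothetical placeholders, not figures from Eagle Crest's plans; only the power and duration come from the article:

```python
# Back-of-envelope sizing for a pumped-hydro plant delivering 1,300 MW for 10 hours.
# The head and discharge efficiency are hypothetical; only the power and duration
# come from the article.

RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2

energy_j = 1300e6 * 10 * 3600        # 1,300 MW for 10 hours, in joules
assumed_head_m = 400.0               # hypothetical elevation drop between basins
assumed_discharge_eff = 0.85         # hypothetical turbine/generator efficiency

volume_m3 = energy_j / (RHO_WATER * G * assumed_head_m * assumed_discharge_eff)
print(f"~{volume_m3 / 1e6:.0f} million cubic meters of water")  # on the order of 10^7 m^3
```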

FLYWHEELS: Flywheels store electricity in the form of kinetic energy. The basic technology, in which a wheel spins at high speed, has been around for decades and used for various applications, including storing and discharging power in momentary spurts. Newer flywheels, such as those developed by Amber Kinetics Inc., based in Union City, Calif., can hold their rotation longer, creating electricity that can be discharged over four hours.

With Amber Kinetics’ technology, an electric motor turns a 5,000-pound steel rotor until it is spinning at thousands of rotations a minute, a process that takes a few hours. The rotor is housed inside a vacuum chamber—the air is sucked out to remove friction. An electromagnet overhead lifts the steel rotor off its bearings, which allows it to spin quickly without requiring a lot of electricity. Indeed, the company says the steel disks, which resemble giant hockey pucks, can maintain their rotation with the same amount of electricity as it takes to power a 75-watt lightbulb.

The flywheel stores the energy in its continuous motion. When power is needed on the grid, the flywheel connects to a generator, and its momentum turns the generator’s shaft to produce electricity.
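
How much energy such a rotor can hold follows from the rotational kinetic-energy formula, E = ½Iω². Only the 5,000-pound mass comes from the article; the radius and spin rate in this sketch are assumptions chosen for illustration:

```python
# Rotational kinetic energy of a flywheel rotor, E = 1/2 * I * omega^2.
# Only the 5,000-lb mass comes from the article; radius and spin rate are assumptions.

import math

mass_kg = 5000 * 0.4536          # 5,000-lb steel rotor
assumed_radius_m = 0.5           # hypothetical disk radius
assumed_rpm = 8000               # hypothetical spin rate ("thousands of rotations a minute")

moment_of_inertia = 0.5 * mass_kg * assumed_radius_m ** 2   # solid disk: I = 1/2 m r^2
omega_rad_s = assumed_rpm * 2 * math.pi / 60
energy_j = 0.5 * moment_of_inertia * omega_rad_s ** 2

print(f"~{energy_j / 3.6e6:.0f} kWh stored")  # a few tens of kWh per rotor
```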

COMPRESSED AIR: Machines that use compressed air have been around for more than 100 years. Various attempts to use compressed air to store electricity have been tried over the past few decades, but high costs and technical challenges kept the technology from advancing, until now, according to Toronto-based Hydrostor, which is building next-generation compressed-air energy-storage facilities in Canada and Aruba, and says it is in talks with two utilities for additional projects.

Hydrostor uses electricity when it is cheap and abundant to run an air compressor, purchased off the shelf from General Electric Co. or Siemens. The compressor squeezes air into a pipeline and down into a cavern the length and width of a football field and up to four stories tall, which the company digs deep underground and fills with water. When the pressurized air is piped into the underground cavern, it displaces the water up a shaft and into an adjacent pond. Then the pipeline valve is shut. When electricity is needed, the valve of the air pipe is opened and the air rushes up.

When air is compressed, it becomes hot. To be stored, the air needs to be cooled, but to be reused to generate electricity, it needs to be hot. In the old days, the heat from the compressed air would be vented, and later the air would be reheated using a natural-gas-fired motor. Hydrostor, however, removes the heat from the compressed air and stores it in a tank filled with waxes and salts. When the air is brought back up to the surface, it is reheated with the hot wax and salts, then pushed through a turbine, where it generates electricity.
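
An order-of-magnitude sense of such a cavern's capacity can come from the ideal isothermal work of the stored air. The depth, and therefore the storage pressure set by the water column, is an assumption for illustration, not a Hydrostor figure:

```python
# Order-of-magnitude capacity of a water-compensated compressed-air cavern roughly a
# football field in footprint and about four stories tall. The depth (and so the storage
# pressure set by the water column) is an assumption, not a Hydrostor figure.

import math

P_ATM = 1.013e5                       # Pa
assumed_depth_m = 300                 # hypothetical cavern depth
storage_pressure_pa = P_ATM + 1000 * 9.81 * assumed_depth_m   # hydrostatic pressure

cavern_volume_m3 = 100 * 50 * 12      # ~football-field footprint, four stories tall

# Ideal isothermal work recoverable from air held at that pressure in that volume.
energy_j = storage_pressure_pa * cavern_volume_m3 * math.log(storage_pressure_pa / P_ATM)
print(f"~{energy_j / 3.6e9:.0f} MWh (ideal, before heat-recovery and turbine losses)")
```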

Ms. Sweet is a writer in San Francisco. She can be reached at reports@wsj.com.

Appeared in the May 22, 2017, print edition as ‘Beyond Batteries.’


California set an ambitious goal for fighting global warming. Now comes the hard part.

When Stanford University energy economist Danny Cullenward looks at California’s policies on climate change, he sees a potential time bomb.

The state wants to slash greenhouse gas emissions so deeply in the coming years that oil refineries and other industries could face skyrocketing costs to comply with regulations, driving up gasoline prices until the system loses political support. If that happens, an effort touted as an international model for fighting global warming could collapse.

Not everyone agrees with Cullenward’s assessment, but it reflects how experts, officials and lawmakers are starting to reckon with the state’s steep ambitions and the understanding that its current policies may no longer be adequate. Although California has been gliding toward its initial goal of reducing emissions to 1990 levels by 2020, it must cut an additional 40% by 2030 under a law signed by Gov. Jerry Brown last year.

“It’s going to take bold proposals to get us to where we need to be,” said Cullenward, who has helped shape legislation in the Capitol.

Getting the details right means the difference between California burnishing its role as an incubator for innovation or proving itself to be a canary in the coal mine, and lawmakers are sorting through a flood of ideas this year. One proposal would accelerate the adoption of renewable energy and eventually phase out all fossil fuels for generating electricity. Some advocates want a regional power grid to share clean energy across state lines. Everyone is looking for ways to turn climate policies into jobs in their local communities.

“We’ve already decided as a state and as a Legislature that we want to dramatically reduce pollution and move forward toward a clean energy future,” Senate leader Kevin de León (D-Los Angeles) said. “That debate is over. Now we’re deciding how to get there.”

Those conversations, however, could prove just as contentious as previous debates, expose divisions among environmentalists and force lawmakers to make difficult decisions about squeezing a state with the world’s sixth-largest economy into a dramatically smaller emissions footprint.

“Everything is a huge question mark,” said Rob Lapsley, president of the California Business Roundtable, which represents the state’s largest corporations.

Front and center is the haggling over extending the cap-and-trade program, which requires companies to buy permits to release greenhouse gases into the atmosphere. Permits can also be traded on a secondary market. It’s the only system of its kind in the country, and it faces steep legal challenges that can only be fully resolved with new legislation to keep it operating.

Brown wants to settle the issue next month, and there’s wide consensus around keeping the program in some form. Oil companies that once launched an expensive ballot campaign to block it are now negotiating its extension — albeit on terms that are friendlier to their industry — and even some Republicans are on board with the idea.

But there are still disagreements over how to move forward, some of which were highlighted with the recent release of a proposal Cullenward worked on with one of his Stanford colleagues, environmental law professor Michael Wara.

The legislation, SB 775, which was written by Sen. Bob Wieckowski (D-Fremont) and backed by De León, would create a higher minimum price for emission permits and increase it annually to provide a steeper incentive for companies to clean up their operations.

There would also be a firm ceiling on how high prices could climb to guard against sticker shock as permits become more valuable while the state ratchets down emissions to meet its 2030 goal.
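
One way to picture that floor-and-ceiling mechanism is as a price collar that escalates each year. The sketch below uses placeholder prices, growth rates and a placeholder start year; SB 775's actual figures are not given in this article:

```python
# Sketch of the floor-and-ceiling ("price collar") idea described for SB 775.
# All prices, growth rates, and the start year are placeholders, not figures from the bill.

def permit_price(market_price: float, year: int,
                 base_floor: float = 20.0, base_ceiling: float = 60.0,
                 annual_growth: float = 0.05, start_year: int = 2021) -> float:
    """Clamp the market-clearing permit price between that year's floor and ceiling."""
    years = year - start_year
    floor = base_floor * (1 + annual_growth) ** years
    ceiling = base_ceiling * (1 + annual_growth) ** years
    return min(max(market_price, floor), ceiling)

print(permit_price(market_price=15.0, year=2025))   # below the floor -> pays the floor
print(permit_price(market_price=95.0, year=2025))   # above the ceiling -> capped
```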

The legislation would make another significant change. Instead of sending cap-and-trade revenue only to projects intended to reduce emissions, such as subsidies for electric cars or affordable housing near mass transit, a chunk of the money would be distributed to Californians — much like a tax rebate.

Although it’s not yet clear how the rebates would function, the proposal is an acknowledgement that costs for gasoline and electricity are likely to rise, and lawmakers want to help insulate voters from the effects. The state’s transition toward low-emission technology could prove expensive over time, requiring the purchase of millions of electric vehicles and shuttering natural gas operations in favor of new solar plants.

“This is a massive infrastructure replacement program for California,” said Snuller Price, senior partner at E3, an energy efficiency consulting firm that has worked with state regulators. “We’re swapping out all of our things.”

Wara said California needs different policies to set a new target.

“It’s a totally different animal. We need to acknowledge that,” he said. “It’s going to a level of [emissions] reductions that no one has ever achieved.”

The idea has been embraced by some policy wonks — “state of the art,” one writer proclaimed — but others see peril in this approach. No longer would companies be able to finance offsets — green projects intended to lower emissions anywhere in the country — to meet their obligations under the program, cutting the flow of cash from California industries to environmental efforts nationwide.

The legislation also includes a modification to the program that some environmental advocates fear would make it harder to ensure the state meets its goals. Under the new proposal, the state would sell an unlimited number of permits if prices reach the ceiling, rather than restricting how many are available as the program does now.

That adjustment would make cap and trade function more like a tax, said Nathaniel Keohane, vice president at the Environmental Defense Fund, which is critical of the proposal and doesn’t see the same threat of price spikes.

“It’s a fundamental, philosophical thing,” he said.

Senate leader Kevin de León launches his push to phase out the use of fossil fuels for generating electricity at a solar farm in Davis on May 2. (Chris Megerian / Los Angeles Times)

De León wants to accelerate the process of reducing emissions from generating electricity. He launched new legislation, SB 100, to require the state to use renewable sources such as wind and solar for 60% of its power by 2030, up from the current target of 50%. By 2045, the use of fossil fuels such as coal and natural gas to generate electricity would no longer be allowed.

It’s a big climb — about 20% of the state’s electricity came from renewable sources in 2015, the latest figures available. The proposal has been embraced by labor groups who see jobs in building new infrastructure, but some are skeptical.

Brent Newell, legal director at the Center on Race, Poverty and the Environment, doesn’t want to see incentives for producing bio-gas from cow manure at industrial-scale dairies, which are a source of air and water pollution.

Although the legislation is “pointing the compass” in the right direction, he said, “that’s not clean energy.”

Reaching the point where fossil fuels aren’t used to keep the lights on will require new approaches to California’s electricity grid. Renewable sources can be difficult to manage because it’s impossible to control when the sun shines or the wind blows. The challenge is finding ways to soak up electricity when there’s too much, such as charging batteries or pumping water into reservoirs, and then releasing it when needed.
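
That balancing act is, at heart, a storage-dispatch problem: bank surplus renewable output, then draw it down when generation falls short. A toy hour-by-hour version, with purely illustrative numbers:

```python
# Toy hour-by-hour storage dispatch for the balancing problem described: bank surplus
# renewable output, discharge when generation falls short. All numbers are illustrative.

def dispatch(renewables_mw, demand_mw, capacity_mwh, charge_eff=0.85):
    """Greedy dispatch; returns the unmet demand (MW) in each hour."""
    stored_mwh = 0.0
    unmet = []
    for gen, load in zip(renewables_mw, demand_mw):
        surplus = gen - load
        if surplus > 0:                                  # charge with excess solar/wind
            stored_mwh = min(capacity_mwh, stored_mwh + surplus * charge_eff)
            unmet.append(0.0)
        else:                                            # discharge to cover the gap
            draw = min(stored_mwh, -surplus)
            stored_mwh -= draw
            unmet.append(-surplus - draw)
    return unmet

# Midday solar surplus, evening shortfall:
print(dispatch([900, 1200, 400, 100], [600, 700, 800, 900], capacity_mwh=500))
```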

Another approach involves integrating California’s grid with other states, providing a wider market for excess solar energy that’s produced here on sunny days and allowing more wind energy to flow in from turbines elsewhere in the region.

“There’s a huge amount of efficiency to be gained,” said Don Furman, who directs the Fix the Grid campaign.

The idea would require California to share control of the electricity grid with other states, which unnerves some lawmakers and advocates. Unions also fear changes that would make building energy projects more attractive outside of California.

Debates over these issues are drawing the most attention in the Capitol, but other proposals are bubbling up as well, a sign that many lawmakers want to get involved in the issue.

One measure would make it easier for low-income Californians to access solar power. Another would create a system for tracking electricity consumption to help pinpoint areas for more efficiency.

“A lot of small steps create big momentum,” said Lauren Navarro, a senior policy manager at the Environmental Defense Fund. “These are pieces of what it takes to get to a clean-energy economy.”
