Novim News

Fisticuffs Over the Route to a Clean-Energy Future

Offshore wind farm turbines near Block Island, R.I. Claims that it is quite feasible to power the American economy entirely with energy from wind, sun and water are under fire.
CHANG W. LEE / THE NEW YORK TIMES

Could the entire American economy run on renewable energy alone?

This may seem like an irrelevant question, given that both the White House and Congress are controlled by a party that rejects the scientific consensus about human-driven climate change. But the proposition that it could, long a dream of an environmental movement as wary of nuclear energy as it is of fossil fuels, has been gaining ground among policy makers committed to reducing the nation’s carbon footprint. Democrats in both the United States Senate and the California Assembly have proposed legislation this year calling for a full transition to renewable energy sources.

They are relying on what looks like a watertight scholarly analysis to support their call: the work of a prominent energy systems engineer from Stanford University, Mark Z. Jacobson. With three co-authors, he published a widely heralded article two years ago asserting that it would be eminently feasible to power the American economy by midcentury almost entirely with energy from the wind, the sun and water. What’s more, it would be cheaper than running it on fossil fuels.

And yet the proposition is hardly as solid as Professor Jacobson asserts.

In a long-awaited article published this week in The Proceedings of the National Academy of Sciences — the same journal in which Professor Jacobson’s manifesto appeared — a group of 21 prominent scholars, including physicists and engineers, climate scientists and sociologists, took a fine-tooth comb to the Jacobson paper and dismantled its conclusions bit by bit.

“I had largely ignored the papers arguing that doing all with renewables was possible at negative costs because they struck me as obviously incorrect,” said David Victor of the University of California, San Diego, a co-author of the new critique of Professor Jacobson’s work. “But when policy makers started using this paper for scientific support, I thought, ‘this paper is dangerous.’”

The conclusion of the critique is damning: Professor Jacobson relied on “invalid modeling tools,” committed “modeling errors” and made “implausible and inadequately supported assumptions,” the scholars wrote. “Our paper is pretty devastating,” said Varun Sivaram from the Council on Foreign Relations, a co-author of the new critique.

The experts are not opposed to aggressive investments in renewable energy. But they argue, as does most of the scientific community represented on the Intergovernmental Panel on Climate Change, that other energy sources — atomic power, say, or natural gas coupled with technologies to remove carbon from the atmosphere — are likely to prove indispensable in the global effort to combat climate change. Ignoring them risks derailing that effort.

But with the stakes so high, the gloves are clearly off.

Professor Jacobson is punching back hard. In an article published in the same issue of the Proceedings and in a related blog post, he argues that his critics’ analysis “is riddled with errors and has no impact” on his conclusions.

In a conversation over the weekend, he accused his critics of being shills for the fossil fuel and nuclear industries, without the standing to review his work. “Their paper is really a dangerous paper,” he told me.

In San Francisco, cooking oil is collected for recycling into biofuels. Mark Z. Jacobson, a Stanford engineer, claims renewables can provide 100 percent of the nation’s energy needs in a few decades without bioenergy, which today contributes about half of the country’s renewable energy production.
JUSTIN SULLIVAN / GETTY IMAGES

But on close examination, Professor Jacobson’s premise does seem a leap of faith.

Renewable sources provide only about a tenth of the United States’ energy consumption. Increasing the penetration of intermittent energy sources from the sun and the wind is already proving a challenge for the electricity grid in many parts of the world.

Professor Jacobson not only claims renewables’ share can be ramped up on the cheap to 100 percent within a few decades, but also that it can be done without bioenergy, which today contributes about half of the country’s renewable-energy production.

And yet under the microscope of the critics — led by Christopher Clack, chief executive of the grid modeling firm Vibrant Clean Energy and formerly with the National Oceanic and Atmospheric Administration and the University of Colorado, Boulder — his proposed system does not hold together.

The weakness of energy systems powered by the sun and the wind is their intermittency. Where will the energy come from when the sun isn’t shining and the wind isn’t blowing? Professor Jacobson addresses this in two ways: vastly increasing the nation’s peak hydroelectricity capacity and deploying energy storage on an enormous scale.

“To repower the world, we need to expand a lot of things to a large scale,” Professor Jacobson told me. “But there is no reason we can’t scale up.”

Actually, there are reasons. The main energy storage technologies he proposes — hydrogen and heat stored in rocks buried underground — have never been put in place at anywhere near the scale required to power a nation, or even a large city.

His system requires storing seven weeks’ worth of energy consumption. Today, the 10 biggest storage systems in the United States combined can store some 43 minutes’ worth. Hydrogen production would have to be scaled up by a factor of 100,000 or more to meet the requirements of Professor Jacobson’s analysis, according to his critics.
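
To get a rough sense of that gap, here is a back-of-the-envelope comparison that uses only the two figures quoted above; the arithmetic is illustrative, not part of either paper.

# Rough ratio between the storage the plan requires and what the ten largest
# U.S. storage systems hold today, using only the figures quoted above.
weeks_required = 7
minutes_required = weeks_required * 7 * 24 * 60    # 70,560 minutes of national demand
minutes_stored_today = 43                          # ten biggest U.S. systems, combined
print(round(minutes_required / minutes_stored_today))   # roughly a 1,600-fold gap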

Professor Jacobson notes that Denmark has deployed a heating system similar to the one he proposes. But Denmark adapted an existing underground pipe infrastructure to transport the heat, whereas a system would have to be built from scratch in American cities.

Professor Jacobson envisions extensive systems to store the intermittent energy produced by solar and wind technologies.
SCOTT MCINTYRE FOR THE NEW YORK TIMES

A common thread to the Jacobson approach is how little regard it shows for the political, social and technical plausibility of what would undoubtedly be wrenching transformations across the economy.

He argues for the viability of hydrogen-fueled aviation by noting the existence of a hydrogen-powered four-seat jet. Jumping from that to assert that hydrogen can economically fuel the nation’s fleet within a few decades seems akin to arguing that because the United States sent a few astronauts to the moon we will all be able to move there soon.

He proposes building and deploying energy systems at a scale that has never been achieved and at a speed that nobody has ever tried. He assumes an implausibly low cost of capital. He asserts that most American industry will easily adjust its schedule to the availability of energy — unplugging when the wind and sun are down regardless of the needs of workers, suppliers, customers and other stakeholders.

And even after all this, the system fails unless it can obtain vast amounts of additional power from hydroelectricity as a backup at moments when other sources are weak: no less than 1,300 gigawatts. That is about 25 percent more than the combined generating capacity of every power plant in the United States today, the equivalent of 600 Hoover Dams.

Building dams is hardly uncontroversial. So Professor Jacobson proposes adding this capacity with “zero increase in dam size, no annual increase in the use of water, no new land,” simply by adding a lot more turbines to existing dams. It is not obvious that so many turbines can be added, however, or at what cost, especially considering that they would sit idle some 90 percent of the time, serving only as a backstop. What’s more, adding turbines does not increase the energy available at any given moment unless more water is pushed through them.

Ken Caldeira of the Carnegie Institution for Science, one of the lead authors of the critique, put it this way: The discharge rate needed from the nation’s dams to achieve the 1,300 gigawatts would be equivalent to about 100 times the flow of the Mississippi River. Even if this kind of push were available, it is not hard to imagine that people living downstream might object to the release of such vast amounts of water.
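
Caldeira’s comparison can be roughly checked with the standard hydropower relation P = ρ·g·Q·h, where Q is flow and h is the hydraulic head. The head value used in the sketch below is an illustrative assumption, not a figure from the article; the point is only the order of magnitude.

# Back-of-the-envelope check of the 1,300-gigawatt hydro backstop.
rho, g = 1000.0, 9.81             # water density (kg/m^3) and gravity (m/s^2)
power_watts = 1.3e12              # 1,300 gigawatts expressed in watts
head_m = 75.0                     # assumed average hydraulic head, in meters
flow = power_watts / (rho * g * head_m)    # required flow, roughly 1.8 million m^3/s
mississippi_flow = 17_000         # approximate average Mississippi discharge, m^3/s
print(flow / mississippi_flow)    # on the order of 100, consistent with Caldeira's figure
print(power_watts / 2.08e9)       # about 625 Hoover Dams (Hoover's capacity is ~2.08 GW)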

“The whole system falls apart because this is the very last thing that is used,” Professor Clack noted. “If you remove any of this, the model fails.”

It is critically important to bring this debate into the open. For too long, climate advocacy and policy have been inflected by a hope that the energy transformation before us can be achieved cheaply and virtuously — in harmony with nature. But the transformation is likely to be costly. And though sun, wind and water are likely to account for a much larger share of the nation’s energy supply, less palatable technologies are also likely to play a part.

Policy makers rushing to unplug existing nuclear reactors and embrace renewables should take note: Shuttering viable technological paths could send us down a cul-de-sac. And it might not be possible to correct course fast enough.

Correction: June 20, 2017
An earlier version of this column included an outdated affiliation for one scientist, Christopher Clack. He is now chief executive of the grid modeling firm Vibrant Clean Energy; he is no longer with the National Oceanic and Atmospheric Administration and the University of Colorado, Boulder.


Safety lapses undermine nuclear warhead work at Los Alamos

A 2014 aerial view of the Los Alamos National Laboratory Plutonium Facility 4 where production setbacks occurred after a safety near miss. (Google)

An extended shutdown of the nation’s only scientific laboratory for producing and testing the plutonium cores for its nuclear weapons has taken a toll on America’s arsenal, with key work postponed and delays looming in the production of components for new nuclear warheads, according to government documents and officials.

The unique research and production facility is located at Los Alamos National Laboratory (LANL) in New Mexico, the birthplace of the U.S. atomic arsenal. The lab’s director ordered the shutdown in 2013 after the Washington official in charge of America’s warhead production expressed worries that the facility was ill-equipped to prevent an accident that would kill its workers and potentially others nearby.

Parts of the facility began renewed operations last year, but with only partial success. And workers there last year were still violating safety rules for handling plutonium, the unstable man-made metal that serves as the sparkplug of the thermonuclear explosions that American bombs are designed to create.

Los Alamos’s persistent shortcomings in plutonium safety have been cited in more than 40 reports by government oversight agencies, teams of nuclear safety experts and the lab’s own employees over the past 11 years. Some of these reports say that safety takes a back seat to meeting specific goals for nuclear warhead maintenance and production by private contractors running the labs. Nuclear workers and experts say the contractors have been chasing lucrative government bonuses tied to those goals.

With key work at Los Alamos deferred due to safety problems, officials and experts say the United States risks falling behind on an ambitious $1 trillion update of its nuclear arsenal, which former president Barack Obama supported and President Trump has said he wants to “greatly strengthen and expand.”

During the hiatus, Los Alamos has had to forego 29 planned tests of the safety and reliability of plutonium cores in warheads now deployed atop U.S. submarine-launched and land-based missiles and in bombs carried by aircraft. The facility also hasn’t been able to make new plutonium cores to replace those regularly withdrawn from the nuclear arsenal for testing or to be fit into warheads, which are being modernized for those missiles and bombers at a projected cost of billions of dollars.

“The laboratory shut down an important facility doing important work,” said James McConnell, the associate administrator for safety, infrastructure and operations at the National Nuclear Security Administration (NNSA), a semiautonomous arm of the Energy Department, in a recent interview at the agency’s Washington headquarters. “What we didn’t have was the quality program that we want.”

Ernest Moniz, the Massachusetts Institute of Technology physicist who served almost four years as President Obama’s energy secretary, said in a separate interview that “we were obviously quite concerned about” the shutdown at Los Alamos. Moniz said he considered the situation there a “mess” and the testing interruption “significant.”

“I don’t think it has, at this stage, in any way seriously compromised” the nuclear arsenal, Moniz said. But he added that it was still his conviction that “obviously we’ve got to get back to that” work as soon as possible. A mock plutonium core was made at Los Alamos last year in a demonstration timed to coincide with a visit by Ashton B. Carter, then secretary of defense.

U.S. Secretary of Defense Ash Carter tours the Los Alamos National Laboratory Plutonium Facility 4 in 2016. (Los Alamos National Laboratory)

At a public hearing in Santa Fe on June 7, McConnell said that while Los Alamos is making progress, it is still unable to resolve the safety issue that provoked its shutdown four years ago, namely an acute shortage of engineers who are trained in keeping the plutonium at the facility from becoming “critical” and fissioning uncontrollably. “They’re not where we need them yet,” he said of the lab and its managers.

A February report by the Defense Nuclear Facilities Safety Board, an independent safety advisory group chartered by Congress, detailed the magnitude of the gap. It said Los Alamos needs 27 fully qualified safety engineers specialized in keeping the plutonium from fissioning out of control. The lab has 10.

Some of the reports obtained by the Center for Public Integrity described flimsy workplace safety policies that left workers ignorant of proper procedures as well as incidents where plutonium was packed hundreds of times into dangerously close quarters or without the shielding needed to block a serious accident. The safety risks at the Los Alamos plutonium facility, which is known as PF-4, were alarmingly highlighted in August 2011, when a “criticality accident,” as it’s known, was narrowly averted, one of several factors prompting many safety officials there to quit.

A criticality accident is an uncontrolled chain reaction involving a fissionable material such as plutonium that releases energy and generates a deadly burst of radiation. Its prevention has been an important challenge for the nuclear weapons program since the 1940s. Criticality accidents have occurred 60 times at various nuclear sites in the last half-century, causing a total of 21 agonizing deaths.

Three workers at Los Alamos died in preventable criticality accidents in the 1940s and 1950s. The most recent criticality-related deaths elsewhere occurred in 1999 at a factory north of Tokyo, where Japanese technicians accidentally mixed too much highly enriched uranium into some wide-mouth buckets. A burst of radiation — and its resulting characteristic blue glow — provoked school and road closures and the evacuation of those living nearby, plus a Japanese government order for 310,000 others to shelter in place.

The building in Japan where a 1999 criticality accident caused deaths and an evacuation. (Nuclear Regulatory Commission)

The problems at Los Alamos were revealed by a year-long investigation by the Center for Public Integrity, which also found several unpublicized accidents at other privately run U.S. nuclear facilities. The investigation, which can be read in full at the Center for Public Integrity’s website, also showed that the penalties imposed by the government for these errors were typically small, relative to the tens of millions of dollars the NNSA gives to each of the contractors annually in pure profit. Some contractors involved in repeated workplace safety incidents were also awarded contract extensions and renewals by officials in Washington.

Asked about the Los Alamos facility’s record, NNSA spokesman Gregory Wolf responded that “we expect our contractors to perform work in a safe and secure manner that protects our employees, our facilities, and the public. When accidents do occur, our focus is to determine causes, identify corrective actions and prevent recurrences.”

Kevin Roark, the spokesman for the consortium of firms hired by the government to run the lab, said in an email that he would defer to the NNSA’s response. Charles McMillan, the Los Alamos lab’s director since 2011, who receives government-funded compensation exceeding $1 million a year, declined to be interviewed about its safety records or the national security consequences of the shutdown. But he said in a 2015 promotional video that “the only way” the lab can accomplish its vital national security mission “is by doing it safely.”

A near-calamity

Los Alamos’s handling of plutonium was the target of internal and external criticism a decade ago, around the time of its takeover by three profit-making firms — Bechtel National Inc., URS (now AECOM) and BWXT Government Group Inc. — in an alliance with the University of California. “We couldn’t prove we were safe,” said Douglas Bowen, a nuclear engineer on the laboratory’s criticality safety staff at the time, “not even close.”

In September 2007, the facility in question — technically known as PF-4 for Plutonium Facility Four and located in a highly secure part of the Los Alamos campus in the mountains above Santa Fe — was shut for a month while managers conducted new training and created an internal safety board to fix its problems. But in 2010, when the Energy Department did a checkup, it found “no official notes or records” the board had ever met, according to a report at the time.

Alarms were sounded more loudly after a nuclear technician positioned eight plutonium rods dangerously close together inside what is called a glovebox — a sealed container meant to contain the cancer-causing plutonium particles — on the afternoon of Aug. 11, 2011, to take a photograph for senior managers. Doing so posed the risk that neutrons emitted routinely by the metal in the rods would strike the nuclei of other atoms, causing enough of them to fission to provoke further collisions and begin an uncontrolled chain reaction of atom splitting.

Rods of plutonium placed precariously close for the purpose of taking this 2011 photo. The error caused a multiyear production setback. (NNSA)

As luck would have it, a supervisor returned from her lunch break and noticed the dangerous configuration. But she then ordered the technician to reach into the box and move the rods apart, and a more senior lab official ordered others present to keep working. Both decisions increased, rather than diminished, the likelihood of an accident, because bodies — and even hands — contain water that can reflect and slow the neutrons, increasing the likelihood of a criticality and its resulting radiation burst.

“The weird thing about criticality safety is it’s not intuitive,” Don Nichols, a former chief for defense nuclear safety at NNSA, said in an interview. The calculations involved in avoiding criticality — which take account of the shape, size, form, quantity and geometric configuration of the plutonium as it moves through more than a dozen messy industrial processes — are so complex that it takes 18 months of training for an engineer to become qualified, and as many as five years to become proficient.

That’s why the consequences of the 2011 incident were so severe, even though a criticality did not occur. Virtually all the criticality specialists responsible for helping to keep workers safe at Los Alamos decided to quit, having become frustrated by the sloppy work demonstrated in the incident and what they considered the lab management’s callousness about nuclear risks when higher profits were at stake, according to interviews and government reports.

Bowen recalled frequently hearing an official with one of the private contractors running PF-4 say “we don’t even need a criticality-safety program,” and that the work was costing the contractor too much money. Former NNSA official Nichols confirmed the exodus of trained experts, saying that due to “some mismanagement, people voted with their feet. They left.” The attrition rate was around 100 percent, according to a “lessons-learned” report completed last month by the lab’s current criticality safety chief and the lone NNSA expert assigned to that issue in the agency’s Los Alamos oversight office.

Workers at the Los Alamos National Laboratory Plutonium Facility 4. (NNSA/Los Alamos)

The exodus provokes the shutdown

The lab’s inability to fend off a deadly accident eventually became apparent to Washington.

Four NNSA staff members briefed Neile Miller, then the agency’s acting administrator, in an anteroom of her office overlooking the Mall in 2013, Miller recalled. The precise risks did not need an explanation, she said: criticality is “one of those trigger words” that should immediately get the attention of anyone responsible for preventing a nuclear weapons disaster.

With two of the four experts remaining in her office, Miller picked up the phone that day and called McMillan at the Los Alamos complex, which is financed by a federal payment exceeding $2 billion a year. She recommended that the key plutonium lab inside PF-4 be shut down, immediately, while the safety deficiencies were fixed.

McMillan responded that he had believed the problems could be solved while that lab kept operating, Miller said. He was “reluctant” to shut it down, she recalled. But as the telephone conversation proceeded, he became open to her view that the risks were too high, she added. So on McMillan’s order, the lab was shut within a day, with little public notice.

The exact cost to taxpayers of idling the facility is unclear, but an internal Los Alamos report estimated in 2013 that the shutdown was costing the government as much as $1.36 million a day in lost productivity.

Initially, McMillan promised the staff that a “pause” lasting less than a year wouldn’t cause “any significant impact to mission deliverables.” But at the end of 2013, a new group of safety experts commissioned by the lab declared in an internal report that “management has not yet fully embraced its commitment to criticality safety.” It listed nine weaknesses in the lab’s safety culture that were rooted in a “production focus” to meet deadlines. Workers say these deadlines are typically linked to managers’ financial bonuses.

Los Alamos’s leaders, the report said, had made the right promises, but failed to alter the underlying safety culture. “The focus appears to remain short-term and compliance-oriented rather than based on a strategic plan,” it said.

Shortfalls persisted in 2015, and new ones were discovered while the facility, still mostly shut down, was used for test runs. On May 6, 2015, for example, the NNSA sent Los Alamos’s managing contractors a letter again criticizing the lab for being slow to fix criticality risks. The Defense Nuclear Facilities Safety Board said the letter cited “more than 60 unresolved infractions,” many present for months “or even years.”

In January and again in April 2015, workers discovered tubes of liquids containing plutonium in seldom-used rooms at PF-4, with labels that made it hard to know how much plutonium the tubes held or where they’d come from, the safety board said. In May, workers packed a drum of nuclear waste with too much plutonium, posing a criticality risk, and in the ensuing probe, it became clear that they were relying on inaccurate and confusing documentation. Safety experts had miscalculated how much plutonium the drum could safely hold.

“These issues are very similar to the issues that contributed to the LANL Director’s decision to pause operations in June of 2013,” safety board inspectors wrote.

New troubles

In 2016, for the third straight year, the Energy Department and the Defense Nuclear Facilities Safety Board each listed criticality safety at Los Alamos as one of the most pressing problems facing the nuclear weapons program, in their annual reports to Congress. “Required improvements to the Criticality Safety program are moving at an unacceptably slow pace,” the most recent NNSA performance evaluation of Los Alamos, released in Nov. 2016, said.

Hazardous operations at PF-4 slowly started to resume in 2016, but problems continued. In June, after technicians working in a glovebox spilled about 7 tablespoons of a liquid containing plutonium, workers violated safety rules by sopping up the spill with organic cheesecloth and throwing it in waste bins with other nuclear materials, posing the risk of a chemical reaction and fire, according to an internal Los Alamos report. A similar chemical reaction, stemming from the sloppy disposal of Los Alamos’s nuclear waste in 2014, provoked the more than two-year shutdown of a deep-underground storage site for that waste in New Mexico, a Department of Energy accident investigation concluded. That incident cost the government more than a billion dollars in cleanup and other expenses.

Frank G. Klotz, the NNSA director, has tried to be upbeat. In March, he told hundreds of nuclear contractors packed into a Washington hotel ballroom for an industry gathering that PF-4 was fully back in business, having “safely resumed all plutonium activities there after a three-year pause.”

Klotz said the updated nuclear weapons would be delivered “on time and on budget.”

But a subsequent analysis by the Government Accountability Office clashed with Klotz’s description. In an April report on costs associated with the NNSA’s ongoing weapons modernization, the GAO disclosed the existence of an internal NNSA report forecasting that PF-4 will be unable to meet the plutonium-pit production deadlines.

Moreover, late last year when Los Alamos conducted its first scheduled invasive test of a plutonium pit since the shutdown of PF-4 more than three years ago, it did not produce the needed results, according to NNSA’s annual evaluation of Los Alamos’s performance last year. The test involved the core of a refurbished warhead scheduled to be delivered to the Navy by the end of 2019 for use atop the Trident missiles carried by U.S. submarines. A second attempt involving a different warhead was canceled because the safety analysis was incomplete, NNSA’s evaluation said.

The purpose of such stockpile surveillance tests, as Vice President Joe Biden said in a 2010 National Defense University speech, is to “anticipate potential problems and reduce their impact on our arsenal.” Weapons designers say these tests are akin to what car owners would do if they were storing a vehicle for years while still expecting the engine to start and the vehicle to speed down the road at the sudden turn of a key.

At the public hearing in Santa Fe on June 7, NNSA’s McConnell said the agency is studying whether to keep plutonium-pit operations at Los Alamos. Options being considered include upgrading the facilities there or “adding capabilities or leveraging existing capabilities elsewhere in the country, at other sites where plutonium is already present or has been used.”

Active NNSA sites that fit that description include the Savannah River Site in South Carolina, the Pantex plant in Texas and the Nevada National Security Site. The NNSA expects to complete its analysis by late summer.

This article is from the Center for Public Integrity, a nonprofit, nonpartisan investigative media organization in Washington.


Climate Science Meets a Stubborn Obstacle: Students

Gwen Beatty in James Sutter’s classroom at Wellston High School in Ohio, where she and Mr. Sutter butted heads over the issue of human-caused climate change. Credit Maddie McGarvey for The New York Times

WELLSTON, Ohio — To Gwen Beatty, a junior at the high school in this proud, struggling, Trump-supporting town, the new science teacher’s lessons on climate change seemed explicitly designed to provoke her.

So she provoked him back.

When the teacher, James Sutter, ascribed the recent warming of the Earth to heat-trapping gases released by burning fossil fuels like the coal her father had once mined, she asserted that it could be a result of other, natural causes.

When he described the flooding, droughts and fierce storms that scientists predict within the century if such carbon emissions are not sharply reduced, she challenged him to prove it. “Scientists are wrong all the time,” she said with a shrug, echoing those celebrating President Trump’s announcement last week that the United States would withdraw from the Paris climate accord.

When Mr. Sutter lamented that information about climate change had been removed from the White House website after Mr. Trump’s inauguration, she rolled her eyes.

“It’s his website,” she said.

Mr. Sutter during his Advanced Placement environmental science class. He was hired from a program that recruits science professionals into teaching. Credit Maddie McGarvey for The New York Times

For his part, Mr. Sutter occasionally fell short of his goal of providing Gwen — the most vocal of a raft of student climate skeptics — with calm, evidence-based responses. “Why would I lie to you?” he demanded one morning. “It’s not like I’m making a lot of money here.”

She was, he knew, a straight-A student. She would have had no trouble comprehending the evidence, embedded in ancient tree rings, ice, leaves and shells, as well as sophisticated computer models, that atmospheric carbon dioxide is the chief culprit when it comes to warming the world. Or the graph he showed of how sharply it has spiked since the Industrial Revolution, when humans began pumping vast quantities of it into the air.

Thinking it a useful soothing device, Mr. Sutter assented to Gwen’s request that she be allowed to sand the bark off the sections of wood he used to illustrate tree rings during class. When she did so with an energy that, classmates said, increased during discussion points with which she disagreed, he let it go.

When she insisted that teachers “are supposed to be open to opinions,” however, Mr. Sutter held his ground.

“It’s not about opinions,” he told her. “It’s about the evidence.”

“It’s like you can’t disagree with a scientist or you’re ‘denying science,’” she sniffed to her friends.

Gwen, 17, could not put her finger on why she found Mr. Sutter, whose biology class she had enjoyed, suddenly so insufferable. Mr. Sutter, sensing that his facts and figures were not helping, was at a loss. And the day she grew so agitated by a documentary he was showing that she bolted out of the school left them both shaken.

“I have a runner,” Mr. Sutter called down to the office, switching off the video.

He had chosen the video, an episode from an Emmy-winning series that featured a Christian climate activist and high production values, as a counterpoint to another of Gwen’s objections, that a belief in climate change does not jibe with Christianity.

“It was just so biased toward saying climate change is real,” she said later, trying to explain her flight. “And that all these people that I pretty much am like are wrong and stupid.”

Classroom Culture Wars

As more of the nation’s teachers seek to integrate climate science into the curriculum, many of them are reckoning with students for whom suspicion of the subject is deeply rooted.

In rural Wellston, a former coal and manufacturing town seeking its next act, rejecting the key findings of climate science can seem like a matter of loyalty to a way of life already under siege. Originally tied, perhaps, to economic self-interest, climate skepticism has itself become a proxy for conservative ideals of hard work, small government and what people here call “self-sustainability.”

A tractor near Wellston, an area where coal and manufacturing were once the primary employment opportunities. Credit Maddie McGarvey for The New York Times

Assiduously promoted by fossil fuel interests, that powerful link to a collective worldview largely explains why just 22 percent of Mr. Trump’s supporters in a 2016 poll said they believed that human activity is warming the planet, compared with half of all registered voters. And the prevailing outlook among his base may in turn have facilitated the president’s move to withdraw from the global agreement to battle rising temperatures.

“What people ‘believe’ about global warming doesn’t reflect what they know,” Dan Kahan, a Yale researcher who studies political polarization, has stressed in talks, papers and blog posts. “It expresses who they are.”

But public-school science classrooms are also proving to be a rare place where views on climate change may shift, research has found. There, in contrast with much of adult life, it can be hard to entirely tune out new information.

“Adolescents are still heavily influenced by their parents, but they’re also figuring themselves out,” said Kathryn Stevenson, a researcher at North Carolina State University who studies climate literacy.

Gwen’s father died when she was young, and her mother and uncle, both Trump supporters, doubt climate change as much as she does.

“If she was in math class and the teacher told her two plus two equals four and she argued with him about that, I would say she’s wrong,” said her uncle, Mark Beatty. “But no one knows if she’s wrong.”

As Gwen clashed with her teacher over the notion of human-caused climate change, one of her best friends, Jacynda Patton, was still circling the taboo subject. “I learned some stuff, that’s all,” Jacynda told Gwen, on whom she often relied to supply the $2.40 for school lunch that she could not otherwise afford.

Jacynda Patton, right, during Mr. Sutter’s class. “I thought it would be an easy A,” she said. “It wasn’t.” Credit Maddie McGarvey for The New York Times

Hired a year earlier, Mr. Sutter was the first science teacher at Wellston to emphasize climate science. He happened to do so at a time when the mounting evidence of the toll that global warming is likely to take, and the Trump administration’s considerable efforts to discredit those findings, are drawing new attention to the classroom from both sides of the nation’s culture war.

Since March, the Heartland Institute, a think tank that rejects the scientific consensus on climate change, has sent tens of thousands of science teachers a book of misinformation titled “Why Scientists Disagree About Global Warming,” in an effort to influence “the next generation of thought,” said Joseph Bast, the group’s chief executive.

The Alliance for Climate Education, which runs assemblies based on the consensus science for high schools across the country, received new funding from a donor who sees teenagers as the best means of reaching and influencing their parents.

Idaho, however, this year joined several other states that have declined to adopt new science standards that emphasize the role human activities play in climate change.

At Wellston, where most students live below the poverty line and the needle-strewn bike path that abuts the marching band’s practice field is known as “heroin highway,” climate change is not regarded as the most pressing issue. And since most Wellston graduates do not go on to obtain a four-year college degree, this may be the only chance many of them have to study the impact of global warming.

But Mr. Sutter’s classroom shows how curriculum can sometimes influence culture on a subject that stands to have a more profound impact on today’s high schoolers than on their parents.

“I thought it would be an easy A,” said Jacynda, 16, an outspoken Trump supporter. “It wasn’t.”

God’s Gift to Wellston?

Mr. Sutter, who grew up three hours north of Wellston in the largely Democratic city of Akron, applied for the job at Wellston High straight from a program to recruit science professionals into teaching, a kind of science-focused Teach for America.

He already had a graduate-level certificate in environmental science from the University of Akron and a private sector job assessing environmental risk for corporations. But a series of personal crises that included his sister’s suicide, he said, had compelled him to look for a way to channel his knowledge to more meaningful use.

The fellowship gave him a degree in science education in exchange for a three-year commitment to teach in a high-needs Ohio school district. Megan Sowers, the principal, had been looking for someone qualified to teach an Advanced Placement course, which could help improve her financially challenged school’s poor performance ranking. She hired him on the spot.

Mr. Sutter walking with his students on a nature trail near the high school, where he pointed out evidence of climate change. Credit Maddie McGarvey for The New York Times

But at a school where most teachers were raised in the same southeastern corner of Appalachian Ohio as their students, Mr. Sutter’s credentials themselves could raise hackles.

“He says, ‘I left a higher-paying job to come teach in an area like this,’” Jacynda recalled. “We’re like, ‘What is that supposed to mean?’”

“He acts,” Gwen said with her patented eye roll, “like he’s God’s gift to Wellston.”

In truth, he was largely winging it.

Some 20 states, including a handful of red ones, have recently begun requiring students to learn that human activity is a major cause of climate change, but few, if any, have provided a road map for how to teach it, and most science teachers, according to one recent survey, spend at most two hours on the subject.

Chagrined to learn that none of his students could recall a school visit by a scientist, Mr. Sutter hosted several graduate students from nearby Ohio University.

On a field trip to a biology laboratory there, many of his students took their first ride on an escalator. To illustrate why some scientists in the 1970s believed the world was cooling rather than warming (“So why should we believe them now?” students sometimes asked), he brought in a 1968 push-button phone and a 1980s Nintendo game cartridge.

“Our data and our ability to process it is just so much better now,” he said.

In the A.P. class, Mr. Sutter took an informal poll midway through: In all, 14 of 17 students said their parents thought he was, at best, wasting their time. “My stepdad says they’re brainwashing me,” one said.

Jacynda’s father, for one, did not raise an eyebrow when his daughter stopped attending Mr. Sutter’s class for a period in the early winter. A former coal miner who had endured two years of unemployment before taking a construction job, he declined a request to talk about it.

“I think it’s that it’s taken a lot from him,” Jacynda said. “He sees it as the environmental people have taken his job.”

And having listened to Mr. Sutter reiterate the overwhelming agreement among scientists regarding humanity’s role in global warming in answer to another classmate’s questions — “What if we’re not the cause of it? What if this is something that’s natural?” — Jacynda texted the classmate one night using an expletive to refer to Mr. Sutter’s teaching approach.

But even the staunchest climate-change skeptics could not ignore the dearth of snow days last winter, the cap to a year that turned out to be the warmest Earth has experienced since 1880, according to NASA. The high mark eclipsed the record set just the year before, which had eclipsed the year before that.

In woods behind the school, where Mr. Sutter had his students scout out a nature trail, he showed them the preponderance of emerald ash borers, an invasive insect that, because of the warm weather, had not experienced the usual die-off that winter. There was flooding, too: Once, more than 5.5 inches of rain fell in 48 hours.

The field trip to a local stream where the water runs neon orange also made an impression. Mr. Sutter had the class collect water samples: The pH levels were as acidic as “the white vinegar you buy at a grocery store,” he told them. And the drainage, they could see, was from the mine.

It was the realization that she had failed to grasp the damage done to her immediate environment, Jacynda said, that made her begin to pay more attention. She did some reading. She also began thinking that she might enjoy a job working for the Environmental Protection Agency — until she learned that, under Mr. Trump, the agency would undergo huge layoffs.

“O.K., I’m not going to lie. I did a 180,” she said that afternoon in the library with Gwen, casting a guilty look at her friend. “This is happening, and we have to fix it.”

After fleeing Mr. Sutter’s classroom that day, Gwen never returned, a pragmatic decision about which he has regrets. “That’s one student I feel I failed a little bit,” he said.

As an alternative, Gwen took an online class for environmental science credit, which she does not recall ever mentioning climate change. She and Jacynda had other things to talk about, like planning a bonfire after prom.

As they tried on dresses last month, Jacynda mentioned that others in their circle, including the boys they had invited to prom, believed the world was dangerously warming, and that humans were to blame. By the last days of school, most of Mr. Sutter’s doubters, in fact, had come to that conclusion.

“I know,” Gwen said, pausing for a moment. “Now help me zip this up.”


A startup has invented a power cycle that runs on carbon dioxide—without emitting it

Between the energy hub of Houston, Texas, and the Gulf Coast lies a sprawling petropolis: a sea of refineries and oil storage tanks, power lines, and smokestacks, all dedicated to converting fossil fuels into dollars. They are the reason why the Houston area emits more carbon dioxide (CO2) than anyplace else in the United States.

But here, on the eastern edge of that CO2 hot spot, a new fossil fuel power plant showcases a potential remedy for Houston’s outsized greenhouse gas footprint. The facility looks suspiciously like its forebears, a complex the size of two U.S. football fields, chock-a-block with snaking pipes and pumps. It has a turbine and a combustor. But there is one thing it doesn’t need: smokestacks.

Zero-emission fossil fuel power sounds like an oxymoron. But when that 25-megawatt demonstration plant is fired up later this year, it will burn natural gas in pure oxygen. The result: a stream of nearly pure CO2, which can be piped away and stored underground or blasted into depleted oil reservoirs to free more oil, a process called enhanced oil recovery (EOR). Either way, the CO2 will be sequestered from the atmosphere and the climate.

That has long been the hope for carbon capture and storage (CCS), a strategy that climate experts say will be necessary if the world is to make any headway in limiting climate change (see sidebar, p. 798). But CCS systems bolted to conventional fossil fuel plants have struggled to take off because CO2 makes up only a small fraction of their exhaust. Capturing it saps up to 30% of a power plant’s energy and drives up the cost of electricity.
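
That energy penalty translates directly into cost. Here is a minimal sketch of the arithmetic, assuming a 40-percent-efficient conventional plant whose capture equipment consumes 30 percent of its output (the upper figure cited above); both numbers are assumptions for illustration, not figures from the article.

# Illustrative effect of a 30% parasitic load on a conventional plant with CCS.
baseline_efficiency = 0.40                   # assumed efficiency without capture
parasitic_share = 0.30                       # share of output consumed by capture
net_efficiency = baseline_efficiency * (1 - parasitic_share)    # 0.28
fuel_cost_multiplier = 1 / (1 - parasitic_share)                # ~1.43
print(net_efficiency, fuel_cost_multiplier)  # fuel burned per delivered kWh rises ~43%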

In contrast, NET Power, the startup backing the new plant, says it expects to produce emission-free power at about $0.06 per kilowatt-hour. That’s about the same cost as power from a state-of-the-art natural gas-fired plant—and cheaper than most renewable energy. The key to its efficiency is a new thermodynamic cycle that swaps CO2 for the steam that drives turbines in conventional plants. Invented by an unlikely trio—a retired British engineer and a pair of technology geeks who had tired of their day jobs—the scheme may soon get a bigger test. If the prototype lives up to hopes, NET Power says, it will forge ahead with a full-scale, 300-megawatt power plant—enough to power more than 200,000 homes—which could open in 2021 at a cost of about $300 million. Both the company and CCS experts hope that the technology will then proliferate. “This is a game-changer if they achieve 100% of their goals,” says John Thompson, a carbon capture expert at the Clean Air Task Force, an environmental nonprofit with an office in Carbondale, Illinois.

Engineer Rodney Allam conceived the carbon dioxide cycle at the heart of the new power plant. PHOTO: MARC WILSON

NET POWER CEO BILL BROWN, 62, never set out to remake the energy market. A decade ago, as a dealmaking lawyer in New York City, he crafted financial trading strategies for Morgan Stanley. But he was restless. So he called Miles Palmer, a buddy from his undergraduate days at the Massachusetts Institute of Technology (MIT) in Cambridge. Palmer was a chemist for Science Applications International Corporation (SAIC), a defense contractor that designed everything from rail guns to drones. Brown suggested they “make something good for a change.” In 2008, as the economy was collapsing, they left their jobs and started 8 Rivers, a technology incubator in Durham, North Carolina, where Brown also taught law at Duke University.

They needed something to incubate. They liked the thought of doing something in the energy sector, a famously risk-averse arena, but one in which a breakthrough technology can make a fortune. First came a brief, fruitless attempt to make biofuels from algae. Then, in 2009, the Obama administration’s stimulus package offered billions of dollars in grants for “clean coal” projects—ways to reduce coal’s CO2 emissions. Palmer knew that, worldwide, coal wasn’t going away anytime soon, and he understood how it threatened the climate. “I wanted to solve that problem,” he says.

Cleaning up coal has been tough. Not only does coal release twice as much carbon pollution as natural gas, but that CO2 also makes up just 14% of the flue gas from a conventional power plant. Still, coal is plentiful and cheap, and until recently few people cared about the CO2 it unleashes. So coal-fired power plants haven’t changed much since 1882, when Thomas Edison’s company built the first one in London. Most still burn coal to boil water. The steam drives a turbine to generate electricity. At the turbine’s back end, cooling towers condense the steam into water, lest the high-pressure steam there drive the turbine in reverse. Those towers vent much of the energy used to boil the water in the first place. Overall, just 38% of coal’s energy yields electricity. “All that energy is just wasted,” Brown says.

That inefficiency helped drive utilities to natural gas. Not only is gas cleaner—and, in the United States, cheaper than coal—but because it is a gas to begin with, engineers can take advantage of an explosive expansion as it burns to drive a gas turbine. The heat of the turbine exhaust then boils water to make steam that drives additional turbines. The best natural gas “combined cycle” plants achieve nearly 60% efficiency.
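
The roughly 60 percent figure follows from stacking two heat engines: the steam bottoming cycle recovers part of the energy the gas turbine would otherwise reject. A minimal sketch, with illustrative component efficiencies that are assumptions rather than numbers from the article:

# Combined-cycle efficiency from assumed component efficiencies.
gas_turbine_eff = 0.38                       # assumed gas turbine efficiency
steam_cycle_eff = 0.35                       # assumed steam bottoming cycle efficiency
combined_eff = gas_turbine_eff + (1 - gas_turbine_eff) * steam_cycle_eff
print(combined_eff)                          # about 0.60, in line with "nearly 60%"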

The prototype NET Power plant near Houston, Texas, is testing an emission-free technology designed to compete with conventional fossil power. PHOTO: CHICAGO BRIDGE & IRON

Still, Palmer was focused on coal, the bigger climate problem. He built on work he had done at SAIC on a high-pressure combustor for burning coal in pure oxygen. It was more efficient and smaller, and so it would cost less to build. It also produced an exhaust of concentrated CO2, thus avoiding the separation costs. “I got it to work almost as well as a conventional coal plant, but with zero emissions,” Palmer says. “But it wasn’t good enough.”

Palmer and Brown needed to nudge the efficiency higher. In 2009, they contacted Rodney Allam, a chemical engineer who had run European R&D operations for Air Products, an industrial giant in the United Kingdom. Later, in 2012, Allam won a share of the $600,000 Global Energy Prize, sponsored by the Russian energy industry, for his work on industrial gas production. But at the time, he was mostly retired, concentrating on his fishing, lawn bowling, and gardening.

Palmer and Brown hired Allam as a consultant. Inspired by some Russian research from the 1930s, Allam thought he saw a way to radically reinvent the staid steam cycle. Forget about boilers, he thought. He would drive everything with the CO2 itself, making an ally out of his enemy. “The only way you could proceed was to develop a totally new power system,” Allam says.

ALLAM ENVISIONED THE CO2 circulating in a loop, cycling between a gas and what’s called a supercritical fluid. At high pressure and temperature, supercritical CO2 expands to fill a container like a gas but flows like a liquid.

For decades, engineers have worked on Brayton cycles—thermodynamic loops that take advantage of the properties of supercritical fluids, which could be air or CO2 (see Perspectives, p. 805). Supercritical fluids offer advantages: Because they are fluids, a pump can pressurize them, which takes far less energy than a compressor needs to pressurize a gas. And because of the fluidlike gas’s extra density, it can efficiently gain or shed heat at heat exchangers.

In Allam’s particular Brayton cycle, CO2 is compressed to 300 times atmospheric pressure—equivalent to a depth of 3 kilometers in the ocean. Then fuel is burned to heat the CO2 to 1150°C, which turns it supercritical. After the CO2 drives a turbine, the gas’s pressure drops and it turns into a normal gas again. The CO2 is then repressurized and returned to the front end of the loop. A tiny amount of excess CO2—exactly as much as burning the fuel created—is shunted into a pipeline for disposal.
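
The “300 times atmospheric pressure” and “3 kilometers of ocean” figures are consistent with simple hydrostatics, as the quick check below shows; the seawater density is an assumed value. (For reference, CO2’s critical point is about 31°C and 74 atmospheres, so at 300 atmospheres and 1,150°C the working fluid is far into the supercritical regime.)

# Hydrostatic check: pressure at 3 km depth, assuming seawater density ~1,025 kg/m^3.
rho_seawater, g = 1025.0, 9.81               # kg/m^3, m/s^2
depth_m = 3000.0                             # meters
pressure_pa = rho_seawater * g * depth_m     # about 3.0e7 pascals
print(pressure_pa / 101_325)                 # roughly 300 atmospheres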

The Allam cycle, as it is now called, comes with costs. Giant cryogenic refrigerators must chill air—which is mostly nitrogen—to extract the pure oxygen needed for combustion. Compressing CO2 into a supercritical state also sucks up energy. But both steps are well-known industrial processes. Allam calculated that discarding the steam cycle would boost the 38% efficiency of a coal plant to 56%. That would put it within striking distance of the efficiency of a contemporary combined cycle plant. As a bonus, the exhaust is nearly pure CO2 that can be sold for EOR. Another perk is that the Allam cycle generates water as a byproduct of combustion, instead of consuming it voraciously as conventional steam cycles do, which could make plants easier to site in arid parts of the world.

At this point, Brown and Palmer were still planning to use coal as their fuel. But when they sent Allam’s handiwork to the engineering firm Babcock & Wilcox, to see whether the system would work on an industrial scale, “they had good news and bad news,” Brown says. On the downside, the Allam cycle would be tough to pull off with coal, at least initially, because the coal would first have to be converted to a synthetic gas, which adds cost. Also, sulfur and mercury in that syngas would have to be filtered out of the exhaust. But on the upside, the engineers saw no reason why the technique wouldn’t work with natural gas, which is ready to burn and doesn’t have the extra contaminants.

Brown and Palmer gave up on winning a clean coal grant from the government. Instead, they sought private investment for a far bigger prize: revolutionizing energy production with carbon capture. By 2014, 8 Rivers had secured $140 million in funding from Exelon and Chicago Bridge & Iron, two industrial giants that now co-own the NET Power demo plant. In March 2016, the company broke ground on its pilot plant outside Houston.

“This is the biggest thing in carbon capture,” says Howard Herzog, a chemical engineer and carbon capture expert at MIT. “It’s very sound on paper. We’ll see soon if it works in reality. There are only a million things that can go wrong.”

ONE OF THOSE IS THE NEW TURBINE, which needs to work at intense temperatures and pressures. Some steam turbines reach those extremes, but “no one had ever designed a turbine to do that with CO2 as the working fluid,” says NET Power spokesperson Walker Dimmig. In 2012, NET Power officials inked a deal to have the Japanese conglomerate Toshiba retool one of its high-pressure steam turbines to work with supercritical CO2, which required changing the lengths and angles of the turbine blades. Toshiba also engineered a new combustor to mix and burn small amounts of oxygen and natural gas in the midst of a gust of hot supercritical CO2—a problem not unlike trying to keep a fire going while dousing it with a fire extinguisher.

The re-engineered combustor and turbine were tested in 2013 and delivered to the demo plant in November 2016. Now, they are being integrated with the rest of the facility’s components, and the plant is undergoing preliminary testing before ramping up to full power sometime this fall. “I’m 100% confident it will work,” Allam says.

If it does, Brown says, NET Power will have advantages that could encourage widespread market adoption. First, the CO2 emerging from the plant is already pressurized, ready to be injected underground for EOR, unlike CO2 recovered from natural gas wells—the usual source.

Another advantage is the plant’s size. Not only are the heat exchangers much smaller and cheaper to build than massive boilers, but so are many of the other components. The 25-megawatt supercritical CO2 turbine, for example, is about 10% the size of an equivalent steam turbine. Overall, NET Power plants are expected to be just one-quarter the size of an equivalent advanced coal plant with carbon capture, and about half the size of a natural gas combined cycle with carbon capture. That means less concrete and steel and lower capital costs. “For many CCS projects, the upfront costs are daunting,” says Julio Friedmann, a carbon capture expert at Lawrence Livermore National Laboratory in Livermore, California. “Avoiding those costs really matters.” What’s more, unlike gas plants without carbon capture, NET Power will be able to sell its CO2 for EOR.

EVEN IF NET POWER’S TECHNOLOGY works as advertised, not everyone will be a fan. Lukas Ross, who directs the climate and energy campaign at Friends of the Earth in Washington, D.C., notes that the natural gas that powers the plant comes from hydraulic fracturing, or “fracking,” and other potentially destructive practices. And providing a steady supply of high-pressure gas for EOR, he adds, will only perpetuate a reliance on fossil fuels. Ross argues that money would be better spent on encouraging broad deployment of renewable energy sources, such as solar and wind power.


Yet oddly enough, NET Power could help smooth the way for renewables to expand. The renewable portfolio standards in many countries and U.S. states require solar, wind, and other carbon-free sources to produce an increasing proportion of the electric power supply. But those sources are intermittent: The power comes only when the sun is shining and the wind is blowing. Nuclear and fossil fuel sources provide “base load” power that fills the gaps when renewables aren’t available. Conventional natural gas power plants, in particular, are viewed as a renewable-friendly technology because they can be ramped up and down quickly depending on the supply of renewable power.

As an emission-free alternative, NET Power’s plants could enable communities to deploy even more renewables without having to add dirty base-load sources. “Fossil fuel carbon-free power allows even more aggressive deployment of renewables,” says George Peridas, an environmental policy analyst with the Natural Resources Defense Council in San Francisco, California.

That’s a combination Allam wants to promote. “I’m not knocking renewables, but they can’t meet future power demands by themselves,” he says. Allam, a longtime member of the Intergovernmental Panel on Climate Change, says time for solving carbon pollution is running short—for both the world and himself. “I’m 76,” he says. “I’ve got to do this quickly.”


Exelon Moves to Pull Plug on Three Mile Island Nuclear Power Plant

The Three Mile Island nuclear power plant was the site of a partial core meltdown in 1979. Its owner says it may now shut it down. PHOTO: MATT ROURKE/ASSOCIATED PRESS

Exelon Corp. warned Tuesday that it will close the Three Mile Island nuclear power plant in Pennsylvania in 2019 unless it receives government aid, the latest sign of how the sector is in danger of shrinking as it faces intense competition in the U.S.

A global symbol of the potential perils of nuclear power after suffering a partial meltdown in 1979, the plant has been losing money for years. Last week, it failed to sell its electricity in advance in a regional power auction for 2020 and 2021, the third year in a row it did not find a buyer.

As a result, Exelon said it would accelerate the plant’s retirement unless it receives assistance from the federal government or the state, which has been reluctant to subsidize nuclear power as some other states have done to keep their facilities running. Three Mile Island has a federal license to operate until 2034.

“Like New York and Illinois before it, [Pennsylvania] has an opportunity to take a leadership role by implementing a policy solution to preserve its nuclear energy facilities,” said Exelon Chief Executive Chris Crane. The company said it was taking one-time charges of up to $110 million for 2017 in connection with the planned closure.

Utilities have been closing U.S. nuclear-power plants at a rapid clip due to political pressure from critics and growing competition from other electricity sources, notably the increasing number of plants fired by natural gas as horizontal drilling and hydraulic fracturing unlock massive quantities of the fuel.

Power demand in the U.S. has been flat for nearly a decade, creating a battle for market share. Last year, natural gas generated 34% of the electricity in the U.S., according to federal data. Nuclear power generated 20%, and coal 30%. The rest came from renewable sources, including hydroelectric dams.

Three Mile Island would be at least the fifth U.S. nuclear facility set to close by 2025, including PG&E Corp.’s Diablo Canyon plant in California, and Entergy Corp.’s Palisades unit in Michigan and the Indian Point plant in New York.

Four other facilities have already closed in the past four years, including Dominion Resources Inc.’s Kewaunee plant in Wisconsin. The retirements would leave about 60 nuclear plants in the U.S.

A little more than a decade ago, the U.S. nuclear industry was talking about a rebirth. But the first new nuclear units being built in the country in years, facilities in Georgia and South Carolina, are years behind schedule and billions over budget.

Southern Co. and Scana Corp., the utilities behind the new plants, are now scrambling to determine how much it will cost to finish them after their builder, Westinghouse Electric Co., declared bankruptcy in March.

The fate of the new plants could help determine the future of U.S. nuclear power. Late last year, the Tennessee Valley Authority sold two unfinished nuclear units in northern Alabama for $111 million after spending billions since the 1970s on the project.

“We have to find a way to build these reactors in the U.S.,” Jose Gutierrez, Westinghouse’s interim chief executive, said last week. “Otherwise, the future is going to be compromised.”

Even if the nukes get built, their hardships underscore the fact that nuclear power remains a complex business full of booby traps, analysts say.

“The nuclear renaissance is dead for the foreseeable future,” said Steve Fleishman, managing director at Wolfe Research.

Exelon and other operators have sought state subsidies to keep plants running, arguing that they create high-paying jobs and do not emit air pollution or greenhouse gases.

Three Mile Island employs 675 people and contracts with another 1,500 workers. Exelon said Tuesday that nuclear power provides roughly 93% of the emissions-free electricity in Pennsylvania.

Exelon has succeeded in persuading some states to provide new financial incentives. Last year, Illinois lawmakers voted to allow Exelon to collect as much as $235 million annually from customers in exchange for keeping two nuclear power plants open.

But the deals have been controversial due to opposition from critics of nuclear power and from such independent power producers as Dynegy Inc. and NRG Energy Inc. that own coal-fired power plants and other sources of electricity.

Pennsylvania lawmakers in March formed a bipartisan caucus to discuss possible funding. State Rep. Dave Hickernell, who represents the area where Three Mile Island is located and is a member of the caucus, said he hopes Exelon’s decision can be reversed.

A spokesman for Pennsylvania Gov. Tom Wolf said the Democrat was concerned about potential layoffs from a Three Mile Island closure and was open to a conversation with state lawmakers about the future of nuclear power in the state.

Three Mile Island drew international attention in 1979 when a partial core meltdown in one of its two reactors led to five days of panic. The reactor involved was permanently shut down and the incident was followed by 14 years of expensive cleanup, heightening awareness of the potential safety problems of nuclear plants.

Shares of Exelon were up 0.5% at $36 around 3:30 p.m. EDT Tuesday.

—Miguel Bustillo contributed to this article.


Beyond Batteries: Other Ways to Capture and Store Energy

Storing electricity on a large scale has been a bigger challenge than generating it and keeping it flowing. PHOTO: STEVE HOCKSTEIN/BLOOMBERG NEWS

Unlike oil, which can be stored in tanks, and natural gas, which can be kept in underground caverns, electricity has been a challenge to bottle.

But that is starting to change.

These days, companies including Elon Musk’s Tesla Inc. are selling lithium-ion batteries, similar to those that power electric cars, to utilities, businesses and homeowners, who use them to store electricity, mostly for short periods.

But now, some nonbattery technologies are gaining traction as utilities continue to look for economical ways to capture and store power.

These alternatives have a longer lifetime than chemical batteries, which generally need to be switched out after about 10 years, and some can store and discharge more electricity.

Here’s a look at three of the technologies.

PUMPED HYDROPOWER: Pumped hydropower is a century-old technology that is getting a fresh look, as developers turn old mines into holding tanks for water. During periods when electricity is cheap and abundant, pumps are used to push large volumes of water uphill, where it is stored in giant basins. When extra power is needed on the grid, the water is released and gravity pulls it downhill and through generators that produce electricity.

Eagle Crest Energy Co. plans to build a $2 billion pumped-hydropower facility at an abandoned iron mine east of Palm Springs, Calif. The plant would have a capacity of 1,300 megawatts, enough to power nearly one million homes, and would be able to generate power for about 10 hours at a time. The plant could soak up excess power overnight, when demand is slack, and during the day, when California’s solar farms are churning out electricity, and then return the juice in the evening, after the sun sets and power use rises in cities and towns. Several similar projects are awaiting government permits.
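For a sense of what those numbers imply, here is a back-of-the-envelope sketch, not an Eagle Crest figure, of how much water such a plant would need to cycle. The capacity and duration come from the description above; the 400-meter drop and 90% generating efficiency are assumptions.

```python
# Rough estimate of the water a 1,300-MW, 10-hour pumped-hydro plant must move.
# The head (elevation drop) and efficiency below are assumptions, not project data.
RHO_WATER = 1000.0   # kg per cubic meter
G = 9.81             # m/s^2

power_watts = 1300e6        # plant capacity, from the article
hours = 10                  # discharge duration, from the article
head_m = 400.0              # assumed drop between upper and lower reservoirs
efficiency = 0.9            # assumed turbine/generator efficiency

energy_joules = power_watts * hours * 3600
volume_m3 = energy_joules / (RHO_WATER * G * head_m * efficiency)
print(f"About {volume_m3 / 1e6:.0f} million cubic meters of water per cycle")
# -> roughly 13 million cubic meters under these assumptions
```

Under those assumptions the basins would need to hold on the order of 13 million cubic meters of water, which helps explain why developers look to abandoned mine pits rather than building reservoirs from scratch.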

FLYWHEELS: Flywheels store electricity in the form of kinetic energy. The basic technology, in which a wheel spins at high speed, has been around for decades and used for various applications, including storing and discharging power in momentary spurts. Newer flywheels, such as those developed by Amber Kinetics Inc., based in Union City, Calif., can hold their rotation longer, creating electricity that can be discharged over four hours.

With Amber Kinetics’ technology, an electric motor turns a 5,000-pound steel rotor until it is spinning at thousands of rotations a minute, a process that takes a few hours. The rotor is housed inside a vacuum chamber—the air is sucked out to remove friction. An electromagnet overhead lifts the steel rotor off its bearings, which allows it to spin quickly without requiring a lot of electricity. Indeed, the company says the steel disks, which resemble giant hockey pucks, can maintain their rotation with the same amount of electricity as it takes to power a 75-watt lightbulb.

The flywheel stores the energy in its continuous motion. When power is needed on the grid, the flywheel connects to a generator, and its momentum turns the generator’s shaft to produce electricity.
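The energy involved follows from the standard formula for a spinning disc, E = ½Iω². The sketch below works through it for a rotor of the mass described; the radius and spin rate are illustrative guesses, not Amber Kinetics specifications.

```python
# Kinetic energy stored in a spinning steel rotor: E = 0.5 * I * omega^2,
# with I = 0.5 * m * r^2 for a solid disc. Only the 5,000-lb mass comes from
# the article; the radius and speed are assumed for illustration.
import math

mass_kg = 5000 * 0.4536   # 5,000-pound rotor converted to kilograms
radius_m = 0.5            # assumed rotor radius ("giant hockey puck")
rpm = 8000                # assumed speed, i.e. "thousands of rotations a minute"

inertia = 0.5 * mass_kg * radius_m ** 2       # solid-disc moment of inertia
omega = rpm * 2 * math.pi / 60                # angular speed in rad/s
energy_kwh = 0.5 * inertia * omega ** 2 / 3.6e6

print(f"Stored energy: roughly {energy_kwh:.0f} kWh")
# -> a few tens of kWh, i.e. several kilowatts sustained over four hours
```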

COMPRESSED AIR: Machines that use compressed air have been around for more than 100 years. Attempts to use compressed air to store electricity have been made over the past few decades, but high costs and technical challenges kept the technology from advancing, until now, according to Toronto-based Hydrostor. The company is building next-generation compressed-air energy-storage facilities in Canada and Aruba, and says it is in talks with two utilities for additional projects.

Hydrostor uses electricity, when it is cheap and abundant, to run an air compressor purchased off the shelf from General Electric Co. or Siemens. The compressor squeezes air into a pipeline and down into a hole, the length and width of a football field and up to four stories tall, that the company digs deep underground and fills with water. When the pressurized air is piped into the underground cavern, it displaces the water up a shaft and into an adjacent pond. Then the pipeline valve is shut. When electricity is needed, the valve of the air pipe is opened and the air rushes back up.

When air is compressed, it becomes hot. To be stored, the air needs to be cooled, but to be reused to generate electricity, it needs to be hot. In the old days, the heat from the compressed air would be vented, and later the air would be reheated using a natural-gas-fired motor. Hydrostor, however, removes the heat from the compressed air and stores it in a tank filled with waxes and salts. When the air is brought back up to the surface, it is reheated with the hot wax and salts, then pushed through a turbine, where it generates electricity.
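A rough way to gauge how much energy such a cavern can hold is to treat the air expansion as isothermal, so the recoverable work is about p·V·ln(p/p_atm). In the sketch below the cavern dimensions are paraphrased from the description above, and the depth, which sets the water pressure, is purely an assumption.

```python
# Isothermal estimate of the energy in a water-compensated compressed-air cavern:
# E ~ p * V * ln(p / p_atm). Cavern size is paraphrased from the article;
# the depth (and therefore the pressure) is an assumption.
import math

P_ATM = 101_325.0                    # atmospheric pressure, Pa
depth_m = 300.0                      # assumed depth of the cavern
cavern_m3 = 100 * 50 * 12            # roughly a football field, four stories tall

pressure = P_ATM + 1000.0 * 9.81 * depth_m        # hydrostatic pressure at depth
energy_mwh = pressure * cavern_m3 * math.log(pressure / P_ATM) / 3.6e9

print(f"Roughly {energy_mwh:.0f} MWh of storage before losses")
# -> on the order of a couple of hundred MWh under these assumptions
```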

Ms. Sweet is a writer in San Francisco. She can be reached at reports@wsj.com.

Appeared in the May 22, 2017, print edition as ‘Beyond Batteries.’


California set an ambitious goal for fighting global warming. Now comes the hard part.

When Stanford University energy economist Danny Cullenward looks at California’s policies on climate change, he sees a potential time bomb.

The state wants to slash greenhouse gas emissions so deeply in the coming years that oil refineries and other industries could face skyrocketing costs to comply with regulations, driving up gasoline prices until the system loses political support. If that happens, an effort touted as an international model for fighting global warming could collapse.

Not everyone agrees with Cullenward’s assessment, but it reflects how experts, officials and lawmakers are starting to reckon with the state’s steep ambitions and the understanding that its current policies may no longer be adequate. Although California has been gliding toward its initial goal of reducing emissions to 1990 levels by 2020, it must cut an additional 40% by 2030 under a law signed by Gov. Jerry Brown last year.

“It’s going to take bold proposals to get us to where we need to be,” said Cullenward, who has helped shape legislation in the Capitol.

Getting the details right means the difference between California burnishing its role as an incubator for innovation and proving itself to be a canary in the coal mine, and lawmakers are sorting through a flood of ideas this year. One proposal would accelerate the adoption of renewable energy and eventually phase out all fossil fuels for generating electricity. Some advocates want a regional power grid to share clean energy across state lines. Everyone is looking for ways to turn climate policies into jobs in their local communities.

“We’ve already decided as a state and as a Legislature that we want to dramatically reduce pollution and move forward toward a clean energy future,” Senate leader Kevin de León (D-Los Angeles) said. “That debate is over. Now we’re deciding how to get there.”

Those conversations, however, could prove just as contentious as previous debates, expose divisions among environmentalists and force lawmakers to make difficult decisions about squeezing a state with the world’s sixth-largest economy into a dramatically smaller emissions footprint.

“Everything is a huge question mark,” said Rob Lapsley, president of the California Business Roundtable, which represents the state’s largest corporations.

Front and center is the haggling over extending the cap-and-trade program, which requires companies to buy permits to release greenhouse gases into the atmosphere. Permits can also be traded on a secondary market. It’s the only system of its kind in the country, and it faces steep legal challenges that can only be fully resolved with new legislation to keep it operating.

Brown wants to settle the issue next month, and there’s wide consensus around keeping the program in some form. Oil companies that once launched an expensive ballot campaign to block it are now negotiating its extension — albeit on terms that are friendlier to their industry — and even some Republicans are on board with the idea.

But there are still disagreements over how to move forward, some of which were highlighted with the recent release of a proposal Cullenward worked on with one of his Stanford colleagues, environmental law professor Michael Wara.

The legislation, SB 775, which was written by Sen. Bob Wieckowski (D-Fremont) and backed by De León, would create a higher minimum price for emission permits and increase it annually to provide a steeper incentive for companies to clean up their operations.

There would also be a firm ceiling on how high prices could climb to guard against sticker shock as permits become more valuable while the state ratchets down emissions to meet its 2030 goal.

The legislation would make another significant change. Instead of sending cap-and-trade revenue only to projects intended to reduce emissions, such as subsidies for electric cars or affordable housing near mass transit, a chunk of the money would be distributed to Californians — much like a tax rebate.

Although it’s not yet clear how the rebates would function, the proposal is an acknowledgement that costs for gasoline and electricity are likely to rise, and lawmakers want to help insulate voters from the effects. The state’s transition toward low-emission technology could prove expensive over time, requiring the purchase of millions of electric vehicles and shuttering natural gas operations in favor of new solar plants.

“This is a massive infrastructure replacement program for California,” said Snuller Price, senior partner at E3, an energy efficiency consulting firm that has worked with state regulators. “We’re swapping out all of our things.”

Wara said California needs different policies to meet its new target.

“It’s a totally different animal. We need to acknowledge that,” he said. “It’s going to a level of [emissions] reductions that no one has ever achieved.”

The idea has been embraced by some policy wonks — “state of the art,” one writer proclaimed — but others see peril in this approach. No longer would companies be able to finance offsets — green projects intended to lower emissions anywhere in the country — to meet their obligations under the program, cutting the flow of cash from California industries to environmental efforts nationwide.

The legislation also includes a modification to the program that some environmental advocates fear would make it harder to ensure the state meets its goals. Under the new proposal, the state would sell an unlimited number of permits if prices reach the ceiling, rather than restricting how many are available as the program does now.

That adjustment would make cap and trade function more like a tax, said Nathaniel Keohane, vice president at the Environmental Defense Fund, which is critical of the proposal and doesn’t see the same threat of price spikes.

“It’s a fundamental, philosophical thing,” he said.
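A minimal sketch of the price-collar mechanic at issue, with made-up numbers rather than anything from SB 775: the floor rises each year, and once demand pushes the price to the ceiling, additional permits are simply sold at that fixed price, which is the sense in which the program would function more like a tax.

```python
# Illustrative price collar: an escalating floor, a hard ceiling, and unlimited
# permits sold at the ceiling. All numbers are invented for illustration.
def clearing_price(market_price: float, year: int,
                   floor0: float = 20.0, floor_growth: float = 0.05,
                   ceiling: float = 60.0) -> float:
    """Price at which permits actually sell in a given auction year."""
    floor = floor0 * (1 + floor_growth) ** year   # minimum price rises annually
    if market_price <= floor:
        return floor                   # nothing sells below the floor
    if market_price >= ceiling:
        return ceiling                 # extra permits offered at the fixed ceiling
    return market_price                # otherwise the auction clears normally

# A demand spike in year 8 is capped at the ceiling rather than climbing further.
print(clearing_price(market_price=85.0, year=8))   # -> 60.0
```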

Senate leader Kevin de León launches his push to phase out the use of fossil fuels for generating electricity at a solar farm in Davis on May 2. (Chris Megerian / Los Angeles Times)

De León wants to accelerate the process of reducing emissions from generating electricity. He launched new legislation, SB 100, to require the state to use renewable sources such as wind and solar for 60% of its power by 2030, up from the current target of 50%. By 2045, the use of fossil fuels such as coal and natural gas to generate electricity would no longer be allowed.

It’s a big climb — about 20% of the state’s electricity came from renewable sources in 2015, the latest figures available. The proposal has been embraced by labor groups who see jobs in building new infrastructure, but some are skeptical.
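Straight-line arithmetic gives a feel for the pace the bill implies; the targets and the 2015 baseline come from the figures above, and the per-year framing is simply division.

```python
# How fast the renewable share would have to grow under SB 100's targets,
# assuming straight-line progress from the 2015 baseline cited above.
baseline_year, baseline_share = 2015, 20      # percent of power from renewables
targets = {2030: 60, 2045: 100}               # percent renewable / fossil-free

year, share = baseline_year, baseline_share
for target_year, target_share in targets.items():
    rate = (target_share - share) / (target_year - year)
    print(f"{year}-{target_year}: about {rate:.1f} percentage points per year")
    year, share = target_year, target_share
# -> roughly 2.7 percentage points a year, sustained for three decades
```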

Brent Newell, legal director at the Center on Race, Poverty and the Environment, doesn’t want to see incentives for producing bio-gas from cow manure at industrial-scale dairies, which are a source of air and water pollution.

Although the legislation is “pointing the compass” in the right direction, he said, “that’s not clean energy.”

Reaching the point where fossil fuels aren’t used to keep the lights on will require new approaches to California’s electricity grid. Renewable sources can be difficult to manage because it’s impossible to control when the sun shines or the wind blows. The challenge is finding ways to soak up electricity when there’s too much, such as charging batteries or pumping water into reservoirs, and then releasing it when needed.

Another approach involves integrating California’s grid with other states, providing a wider market for excess solar energy that’s produced here on sunny days and allowing more wind energy to flow in from turbines elsewhere in the region.

“There’s a huge amount of efficiency to be gained,” said Don Furman, who directs the Fix the Grid campaign.

The idea would require California to share control of the electricity grid with other states, which unnerves some lawmakers and advocates. Unions also fear changes that would make building energy projects more attractive outside of California.

Debates over these issues are drawing the most attention in the Capitol, but other proposals are bubbling up as well, a sign that many lawmakers want to get involved in the issue.

One measure would make it easier for low-income Californians to access solar power. Another would create a system for tracking electricity consumption to help pinpoint areas for more efficiency.

“A lot of small steps create big momentum,” said Lauren Navarro, a senior policy manager at the Environmental Defense Fund. “These are pieces of what it takes to get to a clean-energy economy.”


How to dispose of nuclear waste

Finland shows the way with a project expected to span 100,000 years

A steep 5km ramp corkscrews down from the mouth of a tunnel into the bowels of the Earth. At the bottom, a yellow rig is drilling boreholes into the rock face, preparing it for blasting. The air is chilly, but within a few years, it may feel more like a Finnish sauna. Buried in holes in the floor will be copper canisters, 5.2 metres long, containing the remains of some of the world’s most radioactive nuclear waste. When the drilling is finished, in a century or so, 3,250 canisters each containing half a tonne of spent fuel will be buried in up to 70km of tunnels. Then the entire area will be sealed to make it safe for posterity.

The hundred-year timescale already means this is a megaproject. But that is just the beginning. The radioactive isotopes of plutonium produced in nuclear-power plants must be stored for tens of thousands of years before they are safe. Finland aims to isolate its stockpile in the Onkalo repository, a burial chamber beneath the small forested island of Olkiluoto, home to one of its two nuclear-power plants, for at least 100,000 years.

In geological terms, that is a heartbeat; Finland’s bedrock is 1.9bn years old. But in human terms, 4,000 generations are almost inconceivable. As Mika Pohjonen, the managing director of Posiva, the utility-owned Finnish company overseeing the project, says, no one knows whether humans, creatures (or machines) will rule the Earth above by then—let alone whether they will be able to read today’s safety manuals. A hundred thousand years ago, Finland was under an ice sheet and Homo sapiens had not yet reached Europe.

Posiva has commissioned studies on the possibility that in the intervening millennia the area could be inundated by rising seas caused by global warming, or buried beneath a few kilometres of ice once more. Scientists have studied Greenland as an analogue to ice-capped Finland. The firm’s assurance to future generations is that if, in tens of thousands of years, a future Finn digs a 400-metre-deep well and draws water contaminated with 21st-century nuclear waste, it will be safe to drink.

But Posiva’s immediate priority is to create disposal caverns far enough from rock fissures and groundwater that Finland’s nuclear authorities allow it to start moving the canisters to their tomb in the early 2020s. “This is drilling with silk gloves on,” Mr Pohjonen says, as the machine pounds the rock with a deafening roar. “It has to be done gently.”

Nuclear authorities around the world are watching with interest because in the past two years Finland has become the first country to license and start building a final repository for highly radioactive waste fuel from nuclear reactors. Experts at the International Atomic Energy Agency (IAEA), a global body, say other countries, such as Sweden and France, are close behind. In America, Donald Trump’s administration has included a budget request for $120m to restart construction of a high-level waste repository at Yucca Mountain in Nevada, chosen in 1987 but stalled since 2010.

Delayed gratification

The disposal of nuclear fuel is among the most intractable of infrastructure projects. And there are already 266,000 tonnes of it in storage around the world, about 70,000 tonnes more than there were a decade ago. As Markku Lehtonen, a Finnish academic at the University of Sussex, puts it, the costs are high; the benefits are about avoiding harm rather than adding value; and evaluation is not about assessing risk, but about dealing with “uncertainty, ambiguity and ignorance” over a protracted timescale. Not everyone is convinced that permanent disposal is urgent, either. Some argue that semi-cooled fuel could be kept in cement dry-storage casks, as much is in America, for generations until technologies are developed to handle it. A blue-ribbon commission in America in 2012 mentioned the benefits of keeping spent fuel in storage for a longer time in order to keep the options open. But it also said that final storage was essential.

For all the countries committed to burial, Finland represents an overdue step in the right direction. It offers two lessons. The first is to find a relatively stable geological area, and reliable storage technology. The second is to build a broad consensus that the waste can be handled and disposed of responsibly. Like other Nordic success stories, it will be hard to replicate. “Finland has a kind of unique institutional context: a high trust in experts and representative democracy,” says Matti Kojo, of Finland’s Tampere University. “You cannot just copy a model from Finland.”

Under solid ground

The geological part, though the timespan is greatest, is probably the least tricky. Finland began the search for a site in 1983, shortly after it began generating nuclear power, and chose Olkiluoto after reviewing 100 areas. It has mapped faults and fissures in the bedrock, and sited the repository in a seismic “quiet zone”. It says it will avoid burying canisters close to potential pressure points, to minimise the danger that rock movements would crush or tear the canisters and cause radioactive leakage. Finland’s Radiation and Nuclear Safety Authority (STUK) called Posiva’s analysis of the bedrock and groundwater “state of the art”.

Ismo Aaltonen, Posiva’s chief geologist, says that earthquakes cannot be ruled out, especially if the bedrock shifts upwards in the melting period after a future ice age. Olkiluoto is still rising as it rebounds from the pressure of the last one, which ended more than 10,000 years ago. Close to the repository’s entrance, he points to scratchmarks on the rocks—“footprints of the last ice age” left by the retreating ice cap. But whether in crystalline granite, as in Finland and Sweden, or clay, as in France, or volcanic rock, as in Yucca Mountain, nuclear experts are confident that deep geological disposal can be safe. “There is a great deal of evidence that we can find many sites in the world with adequate geological properties for the required safety,” says Stefan Mayer, a waste-disposal expert at the IAEA.

Technology is the next hurdle. As well as 400-500 metres of bedrock between the canisters and the surface, there will be several man-made layers: steel, copper, water-absorbent bentonite clay around the canisters, and bentonite plugs sealing the caverns and, eventually, the access tunnel.

A model in the visitor’s centre, with moving parts that replicate all this in miniature, makes the whole set-up look safer than Fort Knox. Posiva says it has modelled copper deposits in ancient rocks to assess the likelihood of corrosion. STUK, however, says it will need more study on the potential for the copper to deteriorate. Some academics, including Mr Kojo, are worried that the Finnish media have underplayed concerns about copper corrosion, compared with other countries with similar “multi-barrier” protection systems.

The trickiest challenge, though, is to build broader societal consent. Finland appears to have succeeded by starting early and sticking to its timetable. The decision to find a site and start disposing of nuclear waste in the 2020s was taken 40 years ago. In 1994 its parliament banned the import and export of spent nuclear fuel, which increased the pressure to find a home-grown solution. Few other countries have demonstrated the same determination. The good news is that, because waste needs to be cooled in tanks for 30-50 years before being disposed of, emerging nuclear powerhouses such as China have time to prepare.

Out of site, not out of mind

Finns’ trust in their nuclear industry has remained high, despite accidents elsewhere, such as those at Chernobyl in 1986 and Fukushima in 2011. Finland’s four nuclear reactors operate at among the world’s highest utilisation rates, and supply 26% of its electricity. Its two nuclear utilities, TVO and Fortum, which co-own Posiva, are themselves part of an electricity system in which Finnish industries and many municipalities have a stake, bolstering public support. The Onkalo repository is situated next door to TVO’s two working Olkiluoto reactors, which means people nearby are—in the phrase of academics—“nuclearised”, that is, convinced of the benefits of nuclear power. Surveys suggest positive attitudes to nuclear power nationally exceed negative ones.

Finns’ trust in government as a whole is high. Vesa Lakaniemi, the mayor of the 9,300-strong municipality of Eurajoki in which Olkiluoto lies (who once did a summer job at TVO), says it did not take much to persuade locals to support the site. Income from the nuclear industry gives them slightly lower taxes, good public services and a restored mansion for the elderly. They trust the waste will be handled safely and transparently. “It’s Finnish design. Finnish rock is solid rock. Regulation is strict everywhere in the world but Finnish people do these things very well,” he says.

Faith in the future

Some academics worry that Finland is taking waste disposal too much on faith. Any mishap could erode trust in an instant, as happened in Japan, another “high-trust” society, after the Fukushima disaster. TVO admits that negative attitudes towards nuclear power have risen as the construction of its third reactor at Olkiluoto has been plagued by delays, cost overruns and squabbles with the French-German contractors. The experience has shown that STUK tolerates no shortcuts, but some fear that its relationship with Posiva sometimes appears too close. Sweden and France have moved towards licensing repositories with far more criticism from NGOs and the media, suggesting more robust engagement.

Other countries, including America and France, follow principles of reversibility or retrievability, meaning they can reverse the disposal process while it is under way or retrieve waste after burial, if technologies and social attitudes change. Finland’s model is more closed; it would take a huge amount of digging to recover the waste once it has been sealed. But analysts say there is no single correct approach. Britain, for instance, has done things by the book but still failed to find a place for a repository.

Finally, there is the matter of cost. Finland’s nuclear-waste kitty, collected from the utilities, currently stands at €2.5bn ($2.7bn). By the time the repository is closed, the total cost is expected to be €3.5bn. That is reassuringly modest for a 100-year project, partly reflecting the fact that Finland’s nuclear industry, even when all five planned reactors are up and running, is relatively small. Other countries have higher costs, and less discipline. Yucca Mountain, for instance, was once estimated to cost $96bn to complete. In 2012 America had $27bn in its disposal fund, collected from ratepayers, none of which has gone towards nuclear-waste management.

It may be hard to replicate Finland’s exact model, but its sense of responsibility is seen as an inspiration. When visiting the Finnish repository, authorities from elsewhere, be they American, Chinese, Australian, Japanese or British, learn that safeguarding the future is not just a question of seismology, technology, sociology and cash. It is also an ethical one.


Is It O.K. to Tinker With the Environment to Fight Climate Change?


Scientists are investigating whether releasing tons of particulates into the atmosphere might be good for the planet. Not everyone thinks this is a good idea.

For the past few years, the Harvard professor David Keith has been sketching this vision: Ten Gulfstream jets, outfitted with special engines that allow them to fly safely around the stratosphere at an altitude of 70,000 feet, take off from a runway near the Equator. Their cargo includes thousands of pounds of a chemical compound — liquid sulfur, let’s suppose — that can be sprayed as a gas from the aircraft. It is not a one-time event; the flights take place throughout the year, dispersing a load that amounts to 25,000 tons. If things go right, the gas converts to an aerosol of particles that remain aloft and scatter sunlight for two years. The payoff? A slowing of the earth’s warming — for as long as the Gulfstream flights continue.

Keith argues that such a project, usually known as solar geoengineering, is technologically feasible and — with a back-of-the-envelope cost of under $1 billion annually — ought to be fairly cheap from a cost-benefit perspective, considering the economic damages potentially forestalled: It might do good for a world unable to cut carbon-dioxide emissions enough to prevent further temperature increases later this century.
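The figures above lend themselves to some simple arithmetic; the per-jet and per-ton numbers below follow directly from the stated fleet size, annual tonnage and budget, with nothing else assumed.

```python
# Back-of-envelope logistics implied by Keith's scenario: tonnage per jet per day
# and cost per ton dispersed. Fleet size, tonnage and budget are from the article.
fleet_size = 10              # Gulfstream jets
tons_per_year = 25_000       # aerosol precursor dispersed annually
annual_budget_usd = 1e9      # stated upper bound on yearly cost

tons_per_jet_per_day = tons_per_year / fleet_size / 365
cost_per_ton = annual_budget_usd / tons_per_year

print(f"About {tons_per_jet_per_day:.1f} tons per jet per day")   # ~6.8 tons
print(f"Under ${cost_per_ton:,.0f} per ton dispersed")            # ~$40,000
```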

What surprised me, then, as Keith paced around his Harvard office one morning in early March, was his listing all the reasons humans might not want to hack the environment. “Actually, I’m writing a paper on this right now,” he said. Most of his thoughts were related to the possible dangers of trying to engineer our way out of a climate problem of nearly unimaginable scientific, political and moral complexity. Solar geoengineering might lead to what some economists call “lock-in,” referring to the momentum that a new technology, even one with serious flaws, can assume after it gains a foothold in the market. The qwerty keyboard is one commonly cited example; the internal combustion engine is another. Once we start putting sulfate particles in the atmosphere, he mused, would we really be able to stop?

Another concern, he said, is “just the ethics about messing with nature.” Tall, wiry and kinetic, with thinning hair and a thick beard that gives him the look of the backcountry skier he is, Keith proudly showed me the framed badge that his father, a biologist, wore when he attended the landmark United Nations Conference on the Human Environment in Stockholm in 1972. Now 53, Keith has taken more wilderness trips — hiking, rock climbing, canoeing — than he can properly recall, and for their recent honeymoon, he and his wife were dropped off by helicopter 60 miles from the nearest road in northern British Columbia. “It was quite rainy,” he told me, “and that ended up making it even better.” So the prospect of intentionally changing the climate, he confessed, is not just unpleasant — “it initially struck me as nuts.”

It still strikes him as a moral hazard, to use a term he borrows from economics. A planet cooled by an umbrella of aerosol particles — an umbrella that works by reflecting back into space, say, 1 percent of the sun’s incoming energy — might give societies less incentive to adopt greener technologies and radically cut carbon emissions. That would be disastrous, Keith said. The whole point of geoengineering is not to give us license to forget about the buildup of CO₂. It’s to lessen the ill effects of the buildup and give us time to transition to cleaner energy.

Beyond these conceivable dangers, though, a more fundamental problem lurks: Solar geoengineering simply might not work. It has been a subject of intense debate among climate scientists for roughly a decade. But most of what we know about its potential effects derives from either computer simulations or studies on volcanic eruptions like that of Mount Pinatubo in 1991, which generated millions of tons of sunlight-scattering particulates and might have cooled the planet by as much as 0.5 degrees Celsius, or nearly 1 degree Fahrenheit. The lack of support for solar geoengineering’s efficacy informs Keith’s thinking about what we should do next. Actively tinkering with our environment — fueling up the Gulfstream jets and trying to cool things down — is not something he intends to try anytime soon, if ever. But conducting research is another matter.

A decade ago, when Keith was among the few American scientists to advocate starting a geoengineering research program, he was often treated at science conferences as an outlier. “People would sort of inch away or, really, tell me I shouldn’t be doing this,” he said. Geoengineering was seen as a scientific taboo and Keith its dark visionary. “The preconception was that I was some kind of Dr. Strangelove figure,” he told me — “which I didn’t like.”

Attitudes appear to have changed over the past few years, at least in part because of the continuing academic debates and computer-modeling studies. The National Academy of Sciences endorsed the pursuit of solar geoengineering research in 2015, a stance also taken in a later report by the Obama administration. A few influential environmental groups, like the Natural Resources Defense Council and the Environmental Defense Fund, now favor research.

In the meantime, Keith’s own work at Harvard has progressed. This month, he is helping to start Harvard’s Solar Geoengineering Research Program, a broad endeavor that begins with $7 million in funding and intends to reach $20 million over seven years. One backer is the Hewlett Foundation; another is Bill Gates, whom Keith regularly advises on climate change. Keith is planning to conduct a field experiment early next year by putting particles into the stratosphere over Tucson.

The new Harvard program is not merely intent on getting its concepts out of the lab and into the field, though; a large share of its money will also be directed to physical and social scientists at the university, who will evaluate solar geoengineering’s environmental dangers — and be willing to challenge its ethics and practicality. Keith told me, “It’s really important that we have a big chunk of the research go to groups whose job will be to find all the ways that it won’t work.” In other words, the technology that Keith has long believed could help us ease our predicament — “the nuclear option” for climate, as one opponent described it to me, to be considered only when all else has failed — will finally be investigated to see whether it is a reasonable idea. At the same time, it will be examined under the premise that it may in fact be a very, very bad one.

Climate change already presents a demoralizing array of challenges — melting ice sheets and species extinctions — but the ultimate severity of its impacts depends greatly on how drastically technology and societies can change over the next few decades. The growth of solar and wind power in recent years, along with an apparent decrease in coal use, suggests that the global community will succeed in curtailing CO₂ emissions. Still, that may not happen nearly fast enough to avert some dangerous consequences. As Keith likes to point out, simply reducing emissions doesn’t reverse global warming. In fact, even if annual global CO₂ emissions decrease somewhat, the total atmospheric CO₂ may continue to increase, because the gas is so slow to dissipate. We may still be living with damaging amounts of atmospheric carbon dioxide a half-century from now, with calamitous repercussions. The last time atmospheric CO₂ levels were as elevated as they are today, three million years ago, sea levels were most likely 45 feet higher, and giant camels roamed above the Arctic Circle.

Recently, I met with Daniel Schrag, who is the head of the Harvard University Center for the Environment, an interdisciplinary teaching and research department. Schrag, who helped recruit Keith to Harvard, painted a bleak picture of our odds of keeping global temperatures from rising beyond levels considered safe by many climate scientists. When you evaluate the time scales involved in actually switching our energy systems to cleaner fuels, Schrag told me, “the really depressing thing is you start to understand why any of these kinds of projections — for 2030 or 2050 — are absurd.” He went on: “Are they impossible? No. I want to give people hope, too. I’d love to make this happen. And we have made a lot of progress on some things, on solar, on wind. But the reality is we haven’t even started doing the hard stuff.”

Schrag described any kind of geoengineering as “at best an imperfect solution that is operationally extremely challenging.” Yet to Schrag and Keith, the political and technical difficulties associated with a rapid transition to a zero-carbon-emissions world make it sensible to look into geoengineering research. There are, however, a number of different plans for how to actually do it — including the fantastical (pumping seawater onto Antarctica to combat sea-level rise) and the impractical (fertilizing oceans with iron to foster the growth of algae, which would absorb more CO₂). Some proposals involve taking carbon out of the air, using either immense plant farms or absorption machines. (Keith is involved with such sequestration technology, which faces significant hurdles in terms of cost and feasibility.) Another possible approach would inject salt crystals into clouds over the ocean to brighten them and cool targeted areas, like the dying Great Barrier Reef. Still, the feeling among Keith and his colleagues is that aerosols sprayed into the atmosphere might be the most economically and technologically viable approach of all — and might yield the most powerful global effect.

It is not a new idea. In 2000, Keith published a long academic paper on the history of weather and climate modification, noting that an Institute of Rainmaking was established in Leningrad in 1932 and that American engineers began a cloud-seeding campaign in Vietnam a few decades later. A report issued in 1965 by President Lyndon B. Johnson’s administration called attention to the dangers of increasing concentrations of CO₂ and, anticipating Keith’s research, speculated that a logical response might be to change the albedo, or reflectivity, of the earth. To Keith’s knowledge, though, there have been only two actual field experiments so far. One, by a Russian scientist in 2009, released aerosols into the lower atmosphere via helicopter and appears to have generated no useful data. “It was a stunt,” Keith says. Another was a modest attempt at cloud brightening a few years ago by a team at the Scripps Institution of Oceanography at the University of California, San Diego.

Downstairs from Keith’s Harvard office, there is a lab cluttered with students fiddling with pipettes and arcane scientific instruments. When I visited in early March, Zhen Dai, a graduate student who works with Keith, was engaged with a tabletop apparatus, a maze of tubes and pumps and sensors, meant to study how chemical compounds interact with the stratosphere. For the moment, Keith’s group is leaning toward beginning its field experiments with ice crystals and calcium carbonate — limestone — that has been milled to particles a half-micron in diameter, or less than 1/100th the width of a human hair. They may eventually try a sulfur compound too. The experiment is called Scopex, which stands for Stratospheric Controlled Perturbation Experiment. An instrument that can disperse an aerosol of particles — say, several ounces of limestone dust — will be housed in a gondola that hangs beneath a balloon that ascends to 70,000 feet. The whole custom-built contraption, whose two small propellers will be steered from the ground, will also include a variety of sensors to collect data on any aerosol plume. Keith’s group will measure the sunlight-scattering properties of the plume and evaluate how its particles interact with atmospheric gases, especially ozone. The resulting data will be used by computer models to try to predict larger-scale effects.

But whether a scientist should be deliberately putting foreign substances into the atmosphere, even for a small experiment like this, is a delicate question. There is also the difficulty of deciding on how big the atmospheric plumes should get. When does an experiment become an actual trial run? Ultimately, how will the scientists know if geoengineering really works without scaling it up all the way?

Keith cites precedents for his thinking: a company that scatters cremation ashes from a high-altitude balloon, and jet engines, whose exhaust contains sulfates. But the crux of the problem that Harvard’s Solar Geoengineering Research Program wrestles with is intentionality. Frank Keutsch, a professor of atmospheric sciences at Harvard who is designing and running the Scopex experiments with Keith, told me: “This effort with David is very different from all my other work, because for those other field experiments, we’ve tried to measure the atmosphere and look at processes that are already there. You’re not actually changing nature.” But in this case, Keutsch agrees, they will be.


During one of our conversations, Keith suggested that I try to flip my thinking for a moment. “What if humanity had never gotten into fossil fuels,” he posed, “and the world had gone directly to generating energy from solar or wind power?” But then, he added, what if in this imaginary cleaner world there was a big natural seep of a heat-trapping gas from within the earth? Such events have happened before. “It would have all the same consequences that we’re worried about now, except that it’s not us doing the CO₂ emissions,” Keith said. In that case, the reaction to using geoengineering to cool the planet might be one of relief and enthusiasm.

In other words, decoupling mankind’s actions — the “sin,” as Keith put it, of burning fossil fuels — from our present dilemma can demonstrate the value of climate intervention. “No matter what, if we emit CO₂, we are hurting future generations,” Keith said. “And it may or may not be true that doing some solar geo would over all be a wise thing to do, but we don’t know yet. That’s the reason to do research.”

There are risks, undeniably — some small, others potentially large and terrifying. David Santillo, a senior scientist at Greenpeace, told me that some modeling studies suggest that putting aerosols in the atmosphere, which might alter local climates and rain patterns and would certainly affect the amount of sunlight hitting the earth, could have a significant impact on biodiversity. “There’s a lot more we can do in theoretical terms and in modeling terms,” Santillo said of the Harvard experiments, “before anyone should go out and do this kind of proof-of-concept work.” Alan Robock, a professor of atmospheric sciences at Rutgers, has compiled an exhaustive list of possible dangers. He thinks that small-scale projects like the Scopex experiment could be useful, but that we don’t know the impacts of large-scale geoengineering on agriculture or whether it might deplete the ozone layer (as volcanic eruptions do). Robock’s list goes on from there: Solar geoengineering would probably reduce solar-electricity generation. It would do nothing to reduce the increasing acidification of the oceans, caused by seawater absorbing carbon dioxide. A real prospect exists, too, that if solar geoengineering efforts were to stop abruptly for any reason, the world could face a rapid warming even more dangerous than what’s happening now — perhaps too fast for any ecological adaptation.

Keith is well aware of Robock’s concerns. He also makes the distinction that advocating research is not the same as advocating geoengineering. But the line can blur. Keith struck me as having a fair measure of optimism that his research can yield insights into materials and processes that can reduce the impacts of global warming while averting huge risks. For instance, he is already encouraged by computer models that suggest the Arctic ice cap, which has shrunk this year to the smallest size observed during the satellite era, could regrow under cooler conditions brought on by light-scattering aerosols. He also believes that the most common accusation directed against geoengineering — that it might disrupt precipitation patterns and lead to widespread droughts — will prove largely unfounded.

But Keith is not trained as an atmospheric scientist; he’s a hands-on physicist-engineer who likes to take machinery apart. There are deep unknowns here. Keutsch, for one, seems uncertain about what he will discover when the group actually tries spraying particulates high above the earth. The reduction of sunlight could adversely affect the earth’s water cycle, for example. “It really is unclear to me if this approach is feasible,” he says, “and at this point we know far too little about the risks. But if we want to know whether it works, we have to find out.”

Finally, what if something goes wrong either in research or in deployment? David Battisti, an atmospheric scientist at the University of Washington, told me, “It’s not obvious to me that we can reduce the uncertainty to anywhere near a tolerable level — that is, to the level that there won’t be unintended consequences that are really serious.” While Battisti thought Keith’s small Scopex experiment posed little danger — “The atmosphere will restore itself,” he said — he noted that the whole point of the Harvard researchers’ work is to determine whether solar geoengineering could be done “forever,” on a large-scale, round-the-clock basis. When I asked Battisti if he had issues with going deeper into geoengineering research, as opposed to geoengineering itself, he said: “Name a technology humans have developed that they haven’t used. I can’t think of any. So we can work on this for sure. But we are in this dilemma: Once we do develop this technology, it will be tempting to use it.”

Suppose Keith’s research shows that solar geoengineering works. What then? The world would need to agree where to set the global thermostat. If there is no consensus, could developed nations impose a geoengineering regimen on poorer nations? On the second point, if this technology works, it would arguably be unethical not to use it, because the world’s poorest populations, facing drought and rising seas, may suffer the worst effects of a changing climate.

In recent months, a group under the auspices of the Carnegie Council in New York, led by Janos Pasztor, a former United Nations climate official, has begun to work through the thorny international issues of governance and ethics. Pasztor told me that this effort will most likely take four years. And it is not lost on him — or anyone I spoke with in Keith’s Harvard group — that the idea of engineering our environment is taking hold as we are contemplating the engineering of ourselves through novel gene-editing technologies. “They both have an effect on shaping the pathway where human beings are now and where will they be,” says Sheila Jasanoff, a professor of science and technology studies at Harvard who sometimes collaborates with Keith. Jasanoff also points out that each technology potentially enables rogue agents to act without societal consent.

This is a widespread concern. We might reach a point at which some countries pursue geoengineering, and nothing — neither costs nor treaties nor current technologies — can stop them. Pasztor sketched out another possibility to me: “You could even have a nightmare scenario, where a country decides to do geoengineering and another country decides to do counter-geoengineering.” Such a countermeasure could take the form of an intentional release of a heat-trapping gas far more potent than CO₂, like a hydrochlorofluorocarbon. One of Schrag’s main concerns, in fact, is that geoengineering a lower global temperature might preserve ecosystems and limit sea-level rise while producing irreconcilable geopolitical frictions. “One thing I can’t figure out,” he told me, “is how do you protect the Greenland ice sheet and still have Russia have access to its northern ports, which they really like?” Either Greenland and Siberia will melt, or perhaps both can stay frozen. You probably can’t split the difference.

For the moment, and perhaps for 10 or 20 years more, these are mere hypotheticals. But the impacts of climate change were once hypotheticals, too. Now they’ve become possibilities and probabilities. And yet, as Tom Ackerman, an atmospheric scientist at the University of Washington, said at a recent discussion among policy makers that I attended in Washington: “We are doing an experiment now that we don’t understand.” He was not talking about geoengineering; he was observing that the uncertainty about the potential risks of geoengineering can obscure the fact that there is uncertainty, too, about the escalating disasters that may soon result from climate change.

His comment reminded me of a claim made more than a half-century ago, long before the buildup of CO₂ in the atmosphere had become the central environmental and economic problem of our time. Two scientists, Roger Revelle and Hans Suess, wrote in a scientific paper, “Human beings are now carrying out a large-scale geophysical experiment of a kind that could not have happened in the past nor be reproduced in the future.”

If anything could sway a fence-sitter to consider whether geoengineering research makes sense, perhaps it is this. The fact is, we are living through a test already.

Editor’s Note: In 2008 professors Keith, Schrag and Battisti participated in Novim’s first scientific study – “Geoengineering the Atmosphere”


The cost of California’s public pensions is breaking the bank. Here’s one reason this problem is so hard to fix

From left to right, former California attorneys general Bill Lockyer and Kamala Harris and current attorney general Xavier Becerra. (Rich Pedroncelli / Associated Press; Damian Dovarganes / Associated Press; Gary Coronado / Los Angeles Times)

The fate of reform measures hangs on ballot language written by the state attorney general, usually a Democrat elected with strong union support.

More than 20 times in the last 15 years, political leaders looking to control California’s fast-growing public pension costs have tried to put reform initiatives before the voters.

None of the proposals has made it onto the ballot.

Often, advocates could not raise enough money for signature gathering, advertising and other costs of an initiative campaign. Some of the most promising efforts, however, ran into a different kind of obstacle: an official summary, written by the state attorney general, that described the initiative in terms likely to alienate voters. Facing bleak prospects at the polls, the sponsors abandoned the campaigns.

Taxpayer advocates contend that the attorneys general — Democrats elected with robust support from organized labor — put a finger on the scale, distilling the initiatives in language that echoed labor’s rhetoric.

Labor leaders say the summaries were neutral and accurate, and that the problem lay with the initiatives — which, they contend, would have diluted benefits already promised to public employees.

The attorney general’s title and summary, which appear on petitions and in the official voter guide, can powerfully shape attitudes toward a ballot measure. The language has emerged as a battleground between those seeking to overhaul California’s public retirement system and those determined to defend it.

“It’s the one thing every voter will see, and it’s the last thing every voter will see,” said Thomas W. Hiltachk, a lawyer who specializes in California initiatives and has run campaigns in support of Republican ballot measures. “Whether you have a well-funded campaign or an underfunded campaign, those words are critically important.”

Retirement benefits are the fastest-growing expense in many municipal budgets. In Los Angeles and other cities, they account for 20% or more of general fund spending. The burden has pushed some cities to the edge of bankruptcy.

Yet a string of court rulings, known collectively as “the California Rule,” has posed a formidable barrier to change. It prohibits cuts in pension benefits already granted or promised. Under the rule, pensions are considered binding contracts protected by the state Constitution.

For that reason, many of the cost-saving measures passed by the Legislature in recent years, including later retirement ages and smaller monthly pension checks, did not affect employees already on the payroll. They applied only to newly hired workers. As a result, the savings will not kick in for many years.

Pension reform advocates say that achieving real relief in the near term will require reductions in benefits already granted to current employees. Because of the California Rule, that can be done only by amending the Constitution. And that requires a ballot initiative.

A wide majority of California voters surveyed have favored changing the pension system to save money. Support drops sharply when the change is framed as reducing benefits for teachers, police and firefighters.

That’s why the attorney general’s choice of words is so important. By law, the title and summary “shall be true and impartial” and not likely to “create prejudice for or against the proposed measure.”

In 2013, San Jose Mayor Chuck Reed, left, and San Diego City Councilman Carl DeMaio, right, proposed a constitutional amendment to allow government agencies to reduce current employees’ pension benefits — but only for future years of service. Benefits already earned would not have been affected. (Paul Sakuma / Associated Press; Lenny Ignelzi / Associated Press)

Disputes over the language have figured prominently in several major reform attempts. The most recent, in 2013-14, was led by then-San Jose Mayor Chuck Reed and former San Diego City Councilman Carl DeMaio.

Reed, a Democrat, and DeMaio, a Republican, proposed a constitutional amendment to alter the California Rule by targeting future benefits of current employees. Workers would keep retirement benefits they had earned, but future benefits would no longer be guaranteed; they would be determined through collective bargaining or public referendum.

A survey conducted for labor groups opposed to the initiative found that majority support for pension reform collapsed if it was described as “eliminating police, firefighters, and other public employees’ vested pension benefits” or “eliminating state constitutional protections.”

The word “eliminate” “fosters a visceral negative response from voters,” according to a memo by the labor coalition’s Washington pollsters.

The Sacramento Bee published an article about the memo in December 2013. Three weeks later, then-Atty. Gen. Kamala Harris issued her summary of the initiative.

It said the Reed-DeMaio measure “eliminates constitutional protections for vested pension and retiree healthcare benefits for current public employees, including teachers, nurses, and peace officers, for future work performed.”

Reed and DeMaio sued the attorney general, accusing her of modeling her ballot language on the labor survey. The suit suggested an alternative summary: “Amends California constitution to allow government employers to negotiate with government employees to modify pension and retiree healthcare benefits for future work performed.”

The court sided with Harris, ruling that Reed and DeMaio had not proved the summary was false or misleading and that the attorney general is afforded “considerable latitude” in crafting the language.

Reed and DeMaio dropped the initiative in March 2014 after concluding that it was unlikely to win with Harris’ ballot language.

“I personally didn’t think she would be so obviously, egregiously negative,” said Reed, now special counsel at Hopkins & Carley, a Silicon Valley law firm.

Sen. Kamala Harris, a former California attorney general, has enjoyed strong support from public employee unions during her political career. Above, Harris speaks at the Democratic State Convention in May 2015. (Patrick T. Fallon / For The Times)

Harris had been elected attorney general in 2010 with strong financial support from labor: more than $600,000 in donations to her campaign and to independent expenditure committees, according to the National Institute on Money in State Politics. She raised a total of $7.5 million that year.

Harris received an additional $400,000 from labor for her 2014 reelection effort, and she collected $73,102 from public employee unions in her successful $14-million campaign for the U.S. Senate last year.

Harris did not respond to requests to be interviewed for this article.

Steve Maviglio, a spokesman for Californians for Retirement Security, the labor coalition that opposed the initiative, said the campaign contributions to Harris don’t prove anything. He said the labor survey indicated that the initiative would lose “regardless of how the ballot language is written.”

Maviglio said recent pension initiatives have simply been too extreme for voters to support, and he dismissed complaints about the ballot language. “I think that’s just a lame excuse for their political malpractice,” he said.

A former senior advisor to Harris said the attorney general was keenly aware of how the title and summary could affect a ballot measure’s prospects.

Staff attorneys typically drafted multiple versions after consulting both advocates for and opponents of a particular initiative, the former advisor said. Staff lawyers would weigh the pros and cons of each, and Harris would approve the final wording.

Regarding the Reed-DeMaio initiative, the former advisor said the similarity between the attorney general’s summary and the labor memo reflected shared values, not a quid pro quo.

The Reed-DeMaio summary was the second time Harris had approved language that proponents of pension reform regarded as unfair.

California Pension Reform, a Republican-led advocacy group, proposed an initiative for the 2012 ballot that would have reduced benefits for both current and newly hired public workers. It called for imposing caps on how much government employers could contribute toward workers’ retirements.

The attorney general’s summary stated that the initiative “eliminates constitutional protections for current and future public employees’ vested pension benefits.”

California Pension Reform dropped the initiative, asserting that the “false and misleading title and summary make it nearly impossible to pass.” At the time, Harris’ office rejected the criticism, saying the title and summary accurately described “the initiative’s chief points and purposes.”

Dan Pellissier, president of the advocacy group and a former aide to Assembly Republicans, said there wasn’t enough time to challenge the attorney general in court and still collect enough signatures to meet the ballot deadline. He said the summary was unfair because it stated as fact that pension benefits are constitutionally protected when the issue is in dispute.

One of Harris’ predecessors, Democratic Atty. Gen. Bill Lockyer, was accused of writing politically charged language for a pension measure in 2005. The initiative, proposed by then-Gov. Arnold Schwarzenegger, would have given future state workers 401(k)-style retirement accounts instead of traditional pensions.

Schwarzenegger said in his State of the State Address that year that California’s pension obligations had risen from $160 million in 2000 to $2.6 billion that year, “threatening our state.”

But the Republican governor abandoned the initiative in April 2005, after Lockyer’s office issued a title and summary that said the measure would eliminate death and disability benefits for future public employees.

Schwarzenegger’s initiative did not mention death benefits. But the governor’s advisors appeared to have overlooked a key detail: death and disability benefits were tied to guaranteed pensions. Newly hired civil servants, who wouldn’t have such pensions, wouldn’t have the associated benefits either, unless they were provided separately.

Opponents of the measure seized on the issue and mobilized widows of slain police officers to speak out against Schwarzenegger’s proposal.

Schwarzenegger said at the time that he would never eliminate police death benefits, and that Lockyer had misinterpreted the initiative.

The governor’s communications director, Rob Stutzman, suggested that the attorney general was trying to curry favor with labor unions to mount a possible bid for governor. Lockyer received more than $1.5 million in campaign contributions from public employee unions during his two terms as attorney general.

Lockyer, now a lawyer with the firm Brown Rudnick in Orange County, said his staff’s analysis of the Schwarzenegger initiative was correct. “They complained about it, but it was a lot of political whining,” he said.

Jon Coupal, president of the Howard Jarvis Taxpayers Assn., which backed Schwarzenegger’s proposal, disagreed. He said nothing in the initiative would have prevented death and disability benefits from continuing. “They created ambiguity out of whole cloth,” he said.

Reed and other proponents of pension reform plan to put a new measure on the ballot next year. If they do, the title and summary will be written by California’s new attorney general, former U.S. Rep. Xavier Becerra, a Democrat from Los Angeles.

Becerra was nominated to serve the remainder of Harris’ term after she was sworn in as a U.S. senator in January. During his confirmation hearing, Becerra was asked about the attorney general’s obligation to write neutral summaries for ballot measures.

“I understand the importance of a word,” he said, adding: “The words I get to issue on behalf of the people of this state will be the words that are operative for everyone.”

After his confirmation, during his first news conference as attorney general, Becerra addressed the issue again. He said he recognized the need for “fiscally sound” pension policies, but added that his father was a retired union construction worker with a pension.

“Do I want to see someone like my father be told that he’s not going to get what he bargained for?” he said. “You drive on the roads that my dad built. I think anyone who works hard deserves to get what they bargained for.”


Judy Lin is a reporter at CALmatters, a nonprofit journalism venture in Sacramento.
