Novim News

The Murky Future of Nuclear Power in the United States

A view into Unit 4 at the Alvin W. Vogtle generating station in Georgia. The complex plans to use AP1000 reactors from Westinghouse. Credit via Georgia Power

This was supposed to be America’s nuclear century.

The Three Mile Island meltdown was two generations ago. Since then, engineers had developed innovative designs to avoid the kinds of failures that devastated Fukushima in Japan. The United States government was earmarking billions of dollars for a new atomic age, in part to help tame a warming global climate.

But a remarkable confluence of events is bringing that to an end, capped in recent days by Toshiba’s decision to take a $6 billion loss and pull Westinghouse, its American nuclear power subsidiary, out of the construction business.

The reasons are wide-ranging. Against expectations, demand for electricity has slowed. Natural-gas prices have tumbled, eroding nuclear power’s economic rationale. Alternative-energy sources like wind and solar power have come into their own.

And, perhaps most significantly, attempts to square two often-conflicting forces — the desire for greater safety, and the need to contain costs — while bringing to life complex new designs have blocked or delayed nearly all of the projects planned in the United States.

“You can make it go fast, and you can make it be cheap — but not if you adhere to the standard of care that we do,” said Mark Cooper of the Institute for Energy and the Environment at Vermont Law School, referring to the United States regulatory body, which is considered one of the most meticulous in the world. “Nuclear safety always undermines nuclear economics. Inherently, it’s a technology whose time never comes.”

In the process, the United States could lose considerable influence over standards governing safety and waste management, nuclear experts say. And the world may show less willingness to move toward potentially safer designs.

“I’m concerned that if the U.S. is not seen as a big player, and doesn’t have that kind of market presence, that we won’t be in a competitive position to bring those standards back up,” said Richard Nephew, a senior research scholar at the Center on Global Energy Policy at Columbia. “If you’ve got more lax safety standards worldwide, I think that’s a problem from an industry perspective as well as just a human standard.”

This may be an advantage for state-owned nuclear industries worldwide, which often benefit from long-term national policies in places like Eastern Europe, Asia and the Middle East.

By contrast, the Toshiba-Westinghouse withdrawal from nuclear construction shows how daunting it can be for the private sector to build these plants, even with generous government subsidies like loan guarantees and tax credits. Projects take decades to complete. Safety concerns change along the way, leading to new regulations, thousands of design alterations, delays and spiraling costs for every element.

In one case, even the dirt used to backfill excavated holes at the Westinghouse project in Georgia became a point of contention when it did not measure up to Nuclear Regulatory Commission standards, leading to increased costs and a lawsuit.

Thus far in the United States, only the Tennessee Valley Authority, itself a government corporation, has been able to bring a new nuclear reactor into operation in the last 20 years.

Of the dozens of new reactors once up for licensing with the Nuclear Regulatory Commission, only four are actively under construction. Two are at the Alvin W. Vogtle generating station in Georgia, and two at the Virgil C. Summer plant in South Carolina. Both projects, which plan to use a novel reactor from Westinghouse, have been plagued by delays and cost overruns, some stemming, paradoxically, from an untested regulatory system intended to simplify and accelerate their development.

The projects, more than three years late and billions over budget, are what pushed Westinghouse — one of the last private companies building nuclear reactors — and its parent, Toshiba, to the brink of financial ruin, resulting in Toshiba’s chairman stepping down.

The company has said that Westinghouse will complete the reactors for the projects it already has underway, including two in China. But the fate of other projects in the United States and abroad that plan to use the Westinghouse reactor, known as the AP1000, is in doubt, along with the role of the United States in the future of nuclear energy. It is also unclear how President Trump will approach nuclear energy development, which has broad and overlapping implications for tax and trade policies, economic development and national security.

The AP1000 is considered one of the world’s most advanced reactors, with simplified structures and safety equipment intended to make it easier and less expensive to install, operate and maintain. It has been designed with an improved ability to withstand earthquakes and plane crashes, and it is less vulnerable to a cutoff of electricity, which is what set off the triple meltdown at Fukushima.

The industry has lurched through boom and bust cycles before.

Nuclear construction had all but disappeared in the United States, particularly after the partial meltdown at Three Mile Island in Pennsylvania in 1979. Concerns over climate change led to renewed interest in building new plants under the administration of George W. Bush, however. The Bush-era energy policy acts authorized $18.5 billion in loan guarantees, plus tax credits like those available for wind and solar.

The Alvin W. Vogtle generating station in Georgia is home to two of the only four reactors still under construction out of the dozens once up for licensing with the Nuclear Regulatory Commission. The Vogtle project has been marred by delays and cost overruns. Credit via Georgia Power

Determined to avoid the delays and ballooning costs that were common as plants were built in the 1970s and ’80s, federal regulators had devised a new licensing process.

Under the old system, companies received construction permits based on incomplete plans and then applied for an operating license, often leading to rebuilding and lengthy delays. The idea for the new system was that companies would submit much more complete design plans for approval, and then receive their operating licenses as construction started. That way, as long as they built exactly what they said they would, the process could move more quickly.

In the meantime, companies like Westinghouse and General Electric were developing a new generation of reactors intended to operate more safely. With the AP1000, for instance, emergency cooling for the reactor mainly relies on natural forces, like gravity, to propel the coolant, rather than relying on mechanical pumps powered by electricity. The problem is that electricity can fail, as it did at Fukushima, which can lead to disastrous overheating in a damaged reactor of an older design.

In addition, Westinghouse was engineering its equipment so that large components of the plants could be made in sections at factories, then welded together and lifted into place with cranes at the construction site. In theory, this approach would save money and time, requiring far less skilled labor than the old, bespoke approach, in which workers assembled more parts onsite.

By 2008, Westinghouse had deals to expand two existing plants with the electric utilities Georgia Power and South Carolina Electric & Gas. Little went as hoped.

Because nuclear construction had been dormant for so long, American companies lacked the equipment and expertise needed to make some of the biggest components, like the 300-ton reactor vessels. Instead, they were manufactured overseas, adding to expense and delays.

One reactor vessel, headed for Georgia Power’s Vogtle plant from the Port of Savannah, almost slipped off a specialized rail car. That led to a weekslong delay before a second attempt was made to deliver it.

And, in a separate snafu, while working on the plant’s basement, contractors installed 1,200 tons of steel reinforcing bar in a way that differed from the approved design. That triggered a seven-and-a-half-month delay to obtain a license amendment.

To some extent, the unexpected delays were to be, well, expected, given the novelty of the design and the fact that builders were decades out of practice. Any large undertaking involving so many first-of-a-kind complexities would likely get tripped up somewhere, said Daniel S. Lipman, vice president of supplier and international programs at the Nuclear Energy Institute, which represents the industry.

“Whether you’re building a nuclear power plant or providing a new locomotive or a new fighter jet complex for the Defense Department, the first of a kind almost always takes longer to be deployed,” he said.

And then there was Fukushima, when an earthquake and tsunami knocked out both grid and backup emergency power at the plant, disabling its cooling systems and leading to the meltdown of three reactors. The plant remains shut down, and the decommissioning and cleanup process is projected to take as long as 40 years.

The Japan disaster prompted regulators to revisit safety standards, slowing approval of the Westinghouse designs and resulting in new requirements even after the Nuclear Regulatory Commission gave the go-ahead for the Georgia and South Carolina projects. That led to more costly delays as manufacturing orders had to be changed.

As all of that unfolded, Westinghouse was having trouble with the contractor it had chosen to complete the projects, a company that struggled to meet the strict demands of nuclear construction and was undergoing its own internal difficulties after a merger. As part of an effort to get the delays and escalating costs under control, Westinghouse acquired part of the construction company, which set off a series of still-unresolved disputes over who should absorb the cost overruns and how Westinghouse accounted for and reported values in the transaction.

Toshiba, which would like to sell all or part of its controlling interest in Westinghouse, has said it will continue to look into Westinghouse’s handling of the purchase.

“Certainly they underestimated the amount of liability or cost overruns that these projects were in,” Robert Norfleet, a managing director at Alembic Global Advisors who has followed the machinations, said of Westinghouse. “I don’t really know how they can’t take the blame for that. That’s something within their own due diligence that they needed to do.”

In the meantime, the main stage for nuclear development will move overseas to places like China, Russia, India, Korea and a handful of countries in the Middle East, where Westinghouse will have to find partners to build its designs.

In China, plants using an earlier model of the AP1000 are moving toward completion. If they are successful, that may stir up more interest in the technology, and future installations may go more smoothly. But Toshiba’s ambitions of installing 45 new reactors worldwide by 2030 no longer look feasible.

Indeed, despite the much-ballyhooed ingenuity of a new generation of reactors designed by the likes of Westinghouse and G.E., countries may stick with older technologies that they can produce and install more quickly and cheaply. “Until several of these new designs — including the AP1000 from Westinghouse — come online on time and on budget,” said Brent Wanner, an analyst at the International Energy Agency, “it will be an uphill battle.”


Storms Bring Relief to Drought-Stricken California, but Santa Barbara Misses Out: Central Coast community’s plight underscores geographic and logistic vagaries of state’s water distribution

SANTA BARBARA, Calif.—Tom Fayram admits to a fair amount of water envy.

“When you see flooding rivers in Northern California, you wish you could see some of that here,” said Mr. Fayram, deputy director of Santa Barbara County’s water resources division.

Yet Santa Barbara, about 100 miles north of Los Angeles along the state’s Central Coast, has been left high and dry compared with almost every other part of California this wet winter, showing the geographic and logistic vagaries of water distribution in this drought-plagued state.

California Drought Comparison: Intensity

About half the state has emerged from a six-year drought that prompted severe water restrictions on residents and businesses. Even Southern California, still stuck in drought conditions, has seen improvements as a parade of storms soaks the region.

But the rainfall has largely danced around Santa Barbara. Lake Cachuma, which serves as the largest supply of water for the county’s 450,000 residents, stood at just 14.4% of capacity as of Tuesday. By comparison, Northern California’s Shasta Lake stood at 82% of capacity and Southern California’s Castaic Lake at 86%.

Santa Barbara remains the only county in California listed as still mostly under extreme drought as of the end of January, according to the most recent estimates by the University of Nebraska’s National Drought Mitigation Center.

Just 2% of California was listed in that category—including parts of neighboring Kern and San Luis Obispo counties—compared with 64% a year earlier.

That has forced Santa Barbara to continue harsh water restrictions while the rest of the state is easing them.

Surfers enjoyed the large waves at the entrance to Santa Barbara, Calif., harbor on Jan. 21, 2017. A winter storm brought much higher than usual waves to the area but failed to alleviate the drought that is still hurting the community. PHOTO: MIKE ELIASON/SANTA BARBARA COUNTY FIRE DEPARTMENT/ASSOCIATED PRESS

On Jan. 1, the city of Santa Barbara put into effect a new ordinance that bans lawn watering, with limited exceptions, raising its water-use reduction target to 40%, up from 35% last year.

The water situation for the rest of California, meanwhile, is largely much improved this winter. Electronic readings Tuesday by the state Department of Water Resources measured the snow pack in the Sierra Nevada mountains at 182% of normal.

That snow pack accounts for as much as a third of California’s supply, meaning cities and farms that are part of a distribution network tied to it stand to receive most of their normal deliveries this year. Santa Barbara receives about a third of its water this way.

Water agencies in San Diego and Orange counties declared an end to local drought conditions in the past few weeks, and many are calling on the State Water Resources Control Board to suspend emergency regulations that were adopted in 2014. Agency board members, while acknowledging drought conditions have greatly eased, voted Wednesday to extend the regulations through at least May and to reassess the situation then, when California’s rainy season typically draws to a close.

Kira Redmond, executive director of Santa Barbara Channelkeeper, a local environmental group, is seen at Lake Cachuma in Santa Barbara. PHOTO: JAKE NICOL/THE WALL STREET JOURNAL

The U.S. Southeast experienced a severe drought last summer and fall that lowered lakes and rivers and hurt cattle ranchers and farmers. The extended drought sparked wildfires through southern Appalachia, including deadly blazes in and around Gatlinburg, Tenn.

Winter rains have erased the drought’s impact in much of the South, but extreme drought remains in some parts of Alabama and Georgia, according to the U.S. Drought Monitor. Bill Murphey, Georgia’s state climatologist, said Lake Lanier, which supplies water to Atlanta, is still below normal levels, as is the area around the headwaters of the Chattahoochee River. He said he hoped the region’s ground would be recharged with more rain in the coming months, before the drier, warmer spring and summer take hold.

Santa Barbara’s water troubles, in part, can be traced to its location on the rugged California coast, where communities often have to depend heavily on rain and other local water sources because of the difficulty of piping Sierra Nevada water there.

The 17.6 inches of rainfall measured near the reservoir is only mildly ahead of normal; another 30 inches would need to fall to refill it, said Mr. Fayram of the county water resources division.

“We have a long way to go,” he said.

Santa Barbara started at a deep water deficit: a mere 63 inches of rain fell at a measuring spot in the local mountains between 2011 and 2016—far below the 93 inches that fell there during the previous worst drought, from 1986 to 1991, said Joshua Haggmark, the city’s water resources manager.

A DC-10 tanker dropped fire retardant at a low altitude to help combat a wildfire near Santa Barbara, on June 17, 2016. PHOTO: NICK UT/ASSOCIATED PRESS

The city of Santa Barbara spent the last two years and millions of dollars banking unused state water in the San Luis Reservoir hundreds of miles away for an emergency. However, that reservoir has filled up so fast from recent rains that much of the banked supply will be lost by mid-February to spillage, Mr. Haggmark said. Under state law, banked supplies aren’t protected from spillage.

The city is completing the reactivation of a desalination plant it froze after another drought ended in the early 1990s. But that project has encountered cost overruns and is fiercely opposed by some local environmentalists, who call desalination too expensive and harmful to marine waters. Mr. Haggmark said the city turned to desalination as a last resort.

“It’s been like a perfect storm for us,” Mr. Haggmark said. “Things are really pretty dismal right now.”

Lake Cachuma in Santa Barbara PHOTO: JAKE NICOL/THE WALL STREET JOURNAL

Environmentalists say semiarid places such as Santa Barbara need to conserve and recycle more.

Santa Barbara Channelkeeper, a local environmental group, recently obtained a grant to deploy used wine barrels for rain capture. Kira Redmond, executive director of the group, said other techniques such as converting sewage water to potable use must be deployed.

“I think that’s the wave of the future,” Ms. Redmond said.

Standing on a boat ramp overlooking the nearly empty Cachuma one day last week, Lauri Rawlins-Betta took in the arid landscape.

“This is so sad,” said Ms. Rawlins-Betta, 65 years old, who grew up near here and was visiting from another part of the state.

The county’s population has more than quadrupled since the lake was built 60 years ago.

“To me,” Ms. Rawlins-Betta said, “it’s overgrowth, big time.”

—Cameron McWhirter in Atlanta contributed to this article.


Cutting jobs, street repairs, library books to keep up with pension costs: Generous retirement benefits for public safety employees could help push the Bay Area city of Richmond into bankruptcy

Richmond, a working-class city of 110,000 on the east shore of San Francisco Bay, has been struggling with the cost of employee retirement benefits. Pension-related expenses have risen from $25 million to $44 million annually in the last five years and could reach $70 million by 2021. (Robert Durell / CALmatters)

When the state auditor gauged the fiscal health of California cities in 2015, this port community on the eastern shore of San Francisco Bay made a short list of six distressed municipalities at risk of bankruptcy.

Richmond has cut about 200 jobs — roughly 20% of its workforce — since 2008. Its credit rating is at junk status. And in November, voters rejected a tax increase that city leaders had hoped would help close a chronic budget deficit.

“I don’t think there’s any chance we can avoid it,” said former City Councilman Vinay Pimple, referring to bankruptcy.

A major cause of Richmond’s problems: relentless growth in pension costs.

Payments for employee pensions, pension-related debt and retiree healthcare have climbed from $25 million to $44 million in the last five years, outpacing all other expenses.

By 2021, retirement expenses could exceed $70 million — 41% of the city’s general fund.

Richmond is a stark example of how pension costs are causing fiscal stress in cities across California. Four municipalities — Vallejo, Stockton, San Bernardino and Mammoth Lakes — have filed for bankruptcy protection since 2008. Others are on the brink.

“The truth is that there are cities all over the state that just aren’t owning up to all their problems,” said San Bernardino City Manager Mark Scott.

Increasingly, pension costs consume 15% or more of big city budgets, crowding out basic services and leaving local governments more vulnerable than ever to the next economic downturn.

Richmond is a racially diverse, working-class city of 110,000 whose largest employer is a massive Chevron oil refinery. Like many California municipalities, Richmond dug a financial hole for itself by granting generous retirement benefits to police and firefighters on the assumption that pension fund investments would grow fast enough to cover the cost.

That optimism proved unfounded, and now the bill is coming due.

City Manager Bill Lindsay insists that Richmond can avoid going off a cliff. Last year, financial consultants mapped a path to stability for the city by 2021 — but at a considerable cost in public services.

The city cut 11 positions, reduced after-school and senior classes, eliminated neighborhood clean-ups to tackle illegal trash dumping, and trimmed spending on new library books — saving $12 million total.

City officials also negotiated a four-year contract with firefighters that freezes salaries and requires firefighters to pay $4,800 a year each toward retirement healthcare. Until then, the benefit was fully funded by taxpayers.

“I’ve seen some of my good friends go through it in Vallejo and Stockton, and what we found out during those [bankruptcies] is that your union contracts aren’t necessarily guaranteed,” said Jim Russey, president of Richmond Firefighters Local 188.

Richmond’s consultants said the city had to find $15 million more in new revenue or budget cuts by 2021. Lindsay said the city has been looking hard for additional savings, and the police union recently agreed to have its members contribute toward retirement healthcare.

Tough sledding: Financial consultants with the National Resource Network spelled out the daunting challenges Richmond faces in righting its finances. (July 26, 2016)

“If you look at the five-year forecast, with reasonable assumptions, even with the growth in pension cost, it does start to generate a surplus,” Lindsay said.

Joe Nation, a former Democratic state legislator who teaches public policy at Stanford’s Institute for Economic Policy Research, is not so sanguine. He reviewed Richmond’s retirement cost projections and said they leave little room to maneuver.

Over the next five years, every dollar the city collects in new revenue will go toward retirement costs, leaving little hope of restoring city services, Nation said.

“If there is an economic downturn of any kind, I can imagine that they could be pushed to the brink of bankruptcy, if not bankruptcy,” Nation said.

Last month, the California Public Employees’ Retirement System (CalPERS), the state’s main pension fund, lowered its projected rate of return on investments from 7.5% to 7% per year. That means Richmond and other communities will have to pay more each year to fund current and future pension benefits.

Lower returns, higher cost: CalPERS told local governments on Dec. 21, 2016, that it was lowering its projected rate of return on investments, meaning taxpayers will have to pay more to fund retirement benefits.

The change is expected to increase local government pension payments by at least 20% starting in 2018, according to CalPERS spokeswoman Amy Morgan.
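Why does a half-point change in the assumed return matter so much? Because promised benefits are discounted at that rate: the lower the assumed return, the more money must be set aside today to cover the same promises. Here is a minimal sketch of the arithmetic; the benefit stream and payout horizon are hypothetical illustrations, not CalPERS figures.

```python
# Toy illustration (not CalPERS's actuarial model) of why a lower assumed
# investment return raises the measured cost of the same pension promises.

def present_value(annual_benefit, years, rate):
    # Present value of a level annual benefit paid at the end of each year.
    return sum(annual_benefit / (1 + rate) ** t for t in range(1, years + 1))

benefit = 1_000_000   # hypothetical: $1 million of benefits owed per year
horizon = 30          # hypothetical: benefits paid out over 30 years

pv_old = present_value(benefit, horizon, 0.075)  # old assumption: 7.5%
pv_new = present_value(benefit, horizon, 0.070)  # new assumption: 7.0%

print(f"Liability at 7.5%: ${pv_old:,.0f}")    # ~$11.8 million
print(f"Liability at 7.0%: ${pv_new:,.0f}")    # ~$12.4 million
print(f"Increase: {pv_new / pv_old - 1:.1%}")  # ~5%
```

Because a city pays down only the unfunded slice of the liability, even a roughly 5% rise in the total can translate into a 20%-plus rise in annual payments once existing assets are netted out.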

An analysis by the nonprofit news organization CALmatters indicates that Richmond’s retirement-related expenses could grow to more than $70 million per year by 2021. That represents 41% of a projected $174-million general fund budget.

Lindsay said the city’s estimates of future pension costs are lower because of different assumptions about salary increases and other costs.

The city of Richmond’s pension-related budget problems have taken a toll on public services, including street repair. (Robert Durell / CALmatters)

Voters approved a sales tax increase in 2014 to help stabilize the city’s finances. But in November, voters rejected an increase in the property transfer tax that was expected to bring in an additional $4 million to $6 million annually.

Lindsay said the city was never counting on the property transfer tax in its five-year plan. If the city needs more cash, he said, Richmond has properties it can sell.

“Budget management is much more difficult in Richmond than in Beverly Hills, but you still manage it,” Lindsay said. “To say it’s spiraling out of control into bankruptcy does incredible damage to our community and it’s just not accurate.”

Richmond is especially hard hit by personnel costs because of high salaries for public employees. The city’s average salary of $92,000 for its 938 employees was fifth highest in California as of 2015, according to the state controller. The city’s median household income is $54,857.

Police officers and firefighters in Richmond make more than $137,000 per year on average, compared with an average of $128,000 per year in Berkeley, where housing prices are more than 60% higher than in Richmond.

Public safety salaries averaged $115,000 in Oakland and $112,000 in Vallejo.

Mayor Tom Butt says of Richmond’s pension-related financial problems: “It’s a huge mess … One of these days, it’s just going to come crashing down.” (Robert Durell / CALmatters)

Richmond Mayor Tom Butt, an architect and general contractor who has served on the city council for two decades, says the city that was once among the state’s most dangerous has little choice but to pay higher salaries to compete for employees with nearby communities that are safer and more affluent.

“You can’t convince anyone here that they deserve less than anybody in any other city,” Butt said.

Lindsay said the decision to offer higher salaries for public safety employees was strategic.

“The city council made a conscious decision to put a lot into public safety, in particular reducing violent crime. And largely, we’ve been successful,” Lindsay said.

Violent crime has been declining in the city over the past decade, with homicides dropping to a low of 11 in 2014. But Richmond is experiencing an uptick, recording 24 homicides in 2016, according to the police department.

Part of the challenge with public safety costs dates to 1999, when Richmond, like many local governments, matched the state’s decision to sweeten retirement benefits for California Highway Patrol Officers.

CHP officers could retire as early as 50 with 3% of salary for each year of service, providing up to 90% of their peak salaries in retirement. Other police departments soon demanded and got similar treatment.

Richmond firefighters are eligible to retire at age 55 with 3% of salary for each year of service. Recent hires will have to work longer to qualify for a less generous formula under legislation passed in 2013.

Richmond’s actuarial pension report shows there are nearly two retirees for every police officer or firefighter currently on the job.

To cope with severe budgetary pressures, the city of Richmond put this Fire Department training facility up for sale. (Robert Durell / CALmatters)

In a way, Richmond is a preview of what California cities face in the years ahead. According to CalPERS, there were two active workers for every retiree in its system in 2001. Today, there are 1.3 workers for each retiree. In the next 10 or 20 years, there will be as few as 0.6 workers for each retiree collecting a pension.

Because benefits have already been promised to today’s workers and retirees, cuts in pension benefits for new employees do little to ease the immediate burden. It “means decades before the full burden of this will be completely dealt with,” said Phil Batchelor, former Contra Costa County administrator and former interim city manager for Richmond.

Today, Richmond’s taxpayers are spending more to make up for underperforming pension investments. CalPERS projects that the city’s payments for unfunded pension liabilities will more than double in the next five years, from $11.2 million to $26.8 million.

Now, the lower assumed rate of investment return is projected to add nearly $9 million to Richmond’s costs by 2021.

“It’s a huge mess,” said Mayor Butt. “I don’t know how it’s going to get resolved. One of these days, it’s just going to come crashing down.”

Judy Lin is a reporter at CALmatters, a nonprofit journalism venture in Sacramento covering state policy and politics.


Toshiba to Exit Nuclear Construction Business: Facing billions of dollars in losses after ill-fated bet, Westinghouse unit will limit future nuclear business to selling reactor designs

The Plant Vogtle nuclear power plant in Waynesboro, Ga., is one of the two U.S. facilities where Westinghouse is in the process of building additional reactors. PHOTO: JOHN BAZEMORE/ASSOCIATED PRESS

Toshiba Corp. plans to stop building nuclear power plants after incurring billions of dollars in losses trying to complete long-delayed projects in the U.S., a move that could have widespread ramifications for the future of the nuclear-power industry.

The Japanese industrial conglomerate is set to announce plans to exit nuclear construction by the middle of February, according to a Toshiba executive familiar with the matter. The executive also said Toshiba’s chairman, Shigenori Shiga, and Danny Roderick, a Toshiba executive and the former head of its Pittsburgh-based nuclear power unit, Westinghouse Electric Co., are expected to step down.

Toshiba’s decision deals a fatal blow to its ambitions to become a major player in the nuclear construction business. The company bet aggressively on Westinghouse’s AP1000 reactor design, which it hoped would anchor a new generation of nuclear power plants that were supposed to be easier to build and to deliver on time. But even as signs emerged that the AP1000 wasn’t as easy to build as hoped, Toshiba remained confident and took on added financial risk, according to legal filings and interviews with people involved with the construction process.

Toshiba declined to comment. The company previously said it would disclose the size of Westinghouse’s losses on Feb. 14. In December, it said it was likely to take a write-down of several billion dollars, and people familiar with the situation say the losses could approach $6 billion—plunging the company into a new crisis just as it was seeking to move away from an earlier accounting scandal.

Westinghouse will continue to design nuclear reactors, the Toshiba executive said, and is expected to complete construction work at two U.S. nuclear facilities it is still in the process of building—in Georgia and South Carolina, commissioned by utilities Southern Co. and Scana Corp., respectively.

Toshiba’s future involvement with nuclear plants will be limited to selling its designs; it will let other companies handle the risk of building the facilities, an approach it already takes in China.

“We are closely monitoring [Westinghouse’s] financial status, as well as that of Toshiba,” a Scana spokeswoman said.

Southern officials said they are confident shareholders and customers are protected through a $920 million letter of credit from Westinghouse and a fixed-price contract which transfers responsibility for cost overruns to Westinghouse.

In October 2015, as Toshiba faced a very public accounting scandal centered on its computer business, it was quietly dealing with another crisis in nuclear power-plant construction—and made a series of bold moves in an attempt to fix it.

The company bought out a partner in a nuclear-construction consortium, settled lawsuits and renegotiated contracts with Southern and Scana, which put Toshiba overwhelmingly on the hook if the two construction projects continued to run over budget.

Toshiba’s decision to exit the nuclear construction business could have widespread ramifications. Nuclear power appears to be “too big, too expensive, and most of all, too slow to compete effectively in what is an increasingly ferocious competition,” said Mycle Schneider, a nuclear expert based in Paris.

Toshiba’s Danny Roderick, left, and Shigenori Shiga, second right, are expected to step down when the company announces plans to exit nuclear construction in February. PHOTO: KAZUHIRO NOGI/AGENCE FRANCE-PRESSE/GETTY IMAGES

The nuclear construction business, led by a General Electric Co.-Hitachi Ltd. venture and France’s Areva SA, has been under pressure since the 2011 Fukushima nuclear-plant meltdowns in Japan.

Toshiba plunged into the business in 2006, when it won a bidding war to acquire Westinghouse. Analysts worried at the time that it had overbid. But within a couple of years the bet appeared to be paying off: Southern chose Westinghouse’s design for the first new nuclear plant to be built in the U.S. in 30 years, and the next month Scana also chose the AP1000 for a plant in South Carolina.

The U.S. government approved the designs in early 2012 and work began. Within a few months, legal disputes arose between Westinghouse, its construction consortium partner, Stone & Webster, and Southern over who would pay for unexpected costs resulting from tougher post-Fukushima safety standards, according to filings.

Relations between Westinghouse and Stone & Webster’s owner, Chicago Bridge & Iron NV, broke down by 2015, according to filings. William Jacobs, the independent construction monitor for the plant Southern is building, said Westinghouse and CB&I were “incurring very large costs beyond those being publicly reported” due in part to having so many employees for a project that was years behind schedule.

In March 2015, CB&I broached a possible sale of Stone & Webster to Toshiba. As the talks intensified, Toshiba became mired in the accounting scandal, prompting it to acknowledge it padded profits in its personal computer and other businesses.

Toshiba worried that if the lawsuits with Southern and CB&I over the Fukushima-related safety-cost overruns continued, it might have to acknowledge that Westinghouse faced big liabilities, according to company executives. A large write-down at that stage threatened to wipe out the company’s capital.

To end the litigation, Toshiba made several deals in October 2015. It acquired Stone & Webster for $229 million in deferred payments and became the only guarantor on the engineering contract, releasing CB&I. Scana agreed to push back the completion date for the South Carolina plant, but negotiated a deal under which it would pay Toshiba $505 million in exchange for switching to a fixed-price contract; Toshiba agreed.

Southern faced up to $1.5 billion in liability in the lawsuits over post-Fukushima safety-cost overruns, and settled for about $350 million in October 2015. The deal restricted Westinghouse’s ability to “seek further increases in the contract price,” Southern said—meaning that if the nuclear plant couldn’t be completed in a timely manner, Toshiba would shoulder the costs.

As problems continued, Westinghouse and CB&I last year sued each other in a dispute over the Stone & Webster sale. Then Toshiba said it might need to take a write-down of several billion dollars related to the value of Stone & Webster, caused by cost overruns.

While Southern said it is insulated from cost overruns, it is unclear whether the $920 million letter of credit from Westinghouse would be sufficient to complete its two generating units if Westinghouse’s financial problems prevent it from fulfilling its contract.

“I don’t see how Southern and Scana are confident they won’t be responsible for any further cost increases,” said Sara Barczak, a critic of the projects who works for the Southern Alliance for Clean Energy, a nonpartisan advocacy group.


Affordable water may soon dry up, especially if you live here

Water may become unaffordable for a third of American households within the next five years. Photo by Enid Martindale/via Flickr

Remember this number: $120. It’s the average monthly water bill in America.

Researchers at Michigan State University predict this figure will rise by $49 over the next five years. And if it does, water may become unaffordable for one-third of American households, according to a study, published recently in PLOS ONE, that maps the U.S. areas due to be hit hardest based on local incomes.

“The project deals with looking at the economic impacts of rising water prices on both households and regional economies,” said Elizabeth Mack, an MSU geographer who led the work. When she first pitched the research idea to her colleagues, some scoffed. While water unaffordability is common overseas, Mack said, most assume Americans have the resources and the willingness to do whatever it takes to pay for water.

But rising water prices are quickly eroding this line of reasoning, according to the investigation conducted by Mack and her colleague Sarah Wrase. Two years ago, a survey of 30 major U.S. cities found water bills rose by 41 percent between 2010 and 2015. The dilemma is well documented in Detroit, where 50,000 households have lost water access since 2014, and in Philadelphia, where 40 percent of the city’s 227,000 water bills are past due.

Mack drew on these reports and multiple others to examine how pocketbooks could be affected by escalating water prices on a national level. To do so, she peered into an American Water Works Association survey of water utilities across the U.S. to determine the water bill for an average consumer. The analysis settled on a monthly bill of $120 for the nation’s average monthly consumption of 12,000 gallons in 2014, based on figures from the U.S. Environmental Protection Agency.

The EPA also provides an estimate of how much Americans can afford to spend on water and wastewater services. If water prices rise above 4.5 percent of a household’s income, then “that means you’re going to have to take expenditures from other portions of your budget and allocate them to water,” Mack said.

To meet this affordability benchmark, a household must earn at least $32,000 per year, according to Mack and Wrase’s assessment. Based on their numbers, nearly 14 million American households — 11.9 percent — couldn’t afford water in 2014. If water prices continue to rise at the same rate (41 percent over five years), then a third of American households — 40 million — may lose access to affordable water, they found.
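The arithmetic behind those thresholds is easy to check. The sketch below uses only figures cited in this article: the EPA’s 4.5 percent benchmark, the $120 average monthly bill, and the projected 41 percent rise.

```python
# Back-of-the-envelope check of the study's affordability thresholds,
# using only figures cited in this article.

AFFORDABLE_SHARE = 0.045  # EPA benchmark: water/wastewater under 4.5% of income

def income_needed(monthly_bill):
    # Minimum annual household income at which the bill stays affordable.
    return monthly_bill * 12 / AFFORDABLE_SHARE

print(f"${income_needed(120):,.0f}")         # today's $120/month -> $32,000
print(f"${income_needed(120 * 1.41):,.0f}")  # after a 41% rise  -> $45,120
```

The second figure matches the study’s cutoff for at-risk communities, shown in the map below.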

The team examined median income data for individual areas in the U.S. to chart a map of the communities most at-risk for water poverty.

Census tracts at high risk (black) or at risk (grey) of losing water affordability. High-risk areas have a median income below $32,000, while at-risk communities have median incomes between $32,000 and $45,120. Image by Mack EA and Wrase S, 2017, PLoS ONE

The South, urban centers and low-income communities carry the most risk. For instance, 81 percent of high-risk and 63 percent of at-risk communities are concentrated in urban areas. Mississippi, Louisiana and Alabama topped the list with the largest numbers of county subdivisions — or census tracts — facing a high-risk of future water poverty. Many of the at-risk areas also have higher rates of disability, food stamp usage, unemployment and black and Hispanic residents, according to the study.

“Some regions are affected more than others in regards to rising water prices, but it’s unlikely that there are any regions that won’t see increases,” said Justin Mattingly, a research manager at the Water Environment & Reuse Foundation, who wasn’t involved in the study. “Aging infrastructure is a problem for everybody, and water scarcity is becoming a bigger problem in many regions as well. There have been years of disinvestment for water infrastructure, and it’s starting to come back to us now.”

Much of the nation’s water infrastructure dates back to World War II, if not earlier. Washington, D.C., still runs water through wooden pipes from the mid-1800s. On Tuesday, Senate Democrats unveiled a $1 trillion infrastructure plan that would allocate $110 billion to water and sewer rehabilitation. But water policy agencies predict a total overhaul of America’s water systems would itself cost $1 trillion. Tack on another $36 billion to adjust for drought, seawater intrusion into aquifers, flooding and other climate change-based shifts to water systems.

Most of the time, water pipes are installed by housing developers, said Theresa Connor of the One Water Solutions Institute at Colorado State University. But water utilities take over the costs of upkeep once they start serving a new neighborhood.

“Although the major cost is the pipes, you also have to keep your plants up-to-date. If there are any new regulations, you might have to do improvements of the pipes and plants,” said Connor, who wasn’t involved in the study.

Infrastructure replacement is the primary driver of the water price surge in America, Connor said. In Atlanta, which spends more on water than any other major U.S. city, there was a regulatory initiative to prevent stormwater from discharging into the wastewater system. The move prevented raw sewage from mixing into the streams used for drinking water. But this regulatory decision, plus the privatization of water services, bumped Atlanta’s water bills to $325 per month on average.

Recent water regulations — like the Drinking Water Protection Act — have forced some water utilities to update their systems to protect from emerging contaminants like agricultural nutrients. But both Connor and Mattingly said those costs are small relative to the spending needed to address aging infrastructure. “At the end of the day, it’s still aging infrastructure that’s driving much of the rise in water rates,” Mattingly said.

Urban flight is another factor in rising water prices. As populations decline in places like Detroit, water utilities are forced to spread their expenses across fewer people, which boosts rates. Meanwhile, cities like Phoenix and Las Vegas have low prices thanks to population growth.

“Many utilities are looking at alternative billing structures to take some of the burden off low-income households,” Mattingly said. One tactic involves higher charges for those who use more water, rather than a flat fee for everyone.
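One common version of that tactic is an increasing-block rate, in which each additional block of water costs more per gallon, so light users are effectively subsidized by heavy ones. A minimal sketch follows; the tier sizes and prices are invented for illustration and are not any utility’s actual rates.

```python
# Hypothetical increasing-block water rate: heavier users pay a higher
# price per gallon, shifting costs away from low-usage households.

TIERS = [  # (gallons in tier, price per 1,000 gallons) -- illustrative only
    (4_000, 5.00),          # first 4,000 gallons at $5 per 1,000
    (8_000, 10.00),         # next 8,000 gallons at $10 per 1,000
    (float("inf"), 15.00),  # everything beyond at $15 per 1,000
]

def monthly_bill(gallons):
    bill, remaining = 0.0, gallons
    for tier_size, price in TIERS:
        used = min(remaining, tier_size)
        bill += used / 1_000 * price
        remaining -= used
        if remaining <= 0:
            break
    return bill

print(f"Light user (3,000 gal):    ${monthly_bill(3_000):.2f}")   # $15.00
print(f"Average user (12,000 gal): ${monthly_bill(12_000):.2f}")  # $100.00
```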

Both Mattingly and Connor said Mack’s study is a solid first step in understanding water poverty, but noted that its resolution is limited given the data are based on a small portion of America’s 155,000 or so water systems.

“This reality underlies the problem of a lack of available data on water usage and other metrics in the United States,” Mattingly said. “Without proper data, decision making at the local and national level can be hindered.”

Billing rates can vary dramatically between water providers, even within a single city. In the future, Mack hopes to apply the same analysis on individual cities to offer more guidance on water affordability.

As Flint, Michigan, and other cities with water catastrophes have shown, the stakes for water infrastructure improvements are high. Delays can expose citizens to health hazards.

“While Flint has certainly garnered the most attention for its water infrastructure problems – and with good reason – they are certainly not alone,” Mattingly said.


John Arnold Made a Fortune at Enron. Now He’s Declared War on Bad Science

John Arnold’s tweet calling “A new study shows …” “the four most dangerous words” is a perfect crystallization of the philanthropist’s skeptical attitude toward a lot of scientific research. Photo: BRENT HUMPHREYS

Brian Nosek had pretty much given up on finding a funder. For two years he had sent out grant proposals for his software project. And for two years they had been rejected again and again—which was, by 2011, discouraging but not all that surprising to the 38-year-old scientist. An associate professor at the University of Virginia, Nosek had made a name for himself in a hot subfield of social psychology, studying people’s unconscious biases. But that’s not what this project was about. At least, not exactly.

Like a number of up-and-coming researchers in his generation, Nosek was troubled by mounting evidence that science itself—through its systems of publication, funding, and advancement—had become biased toward generating a certain kind of finding: novel, attention grabbing, but ultimately unreliable. The incentives to produce positive results were so great, Nosek and others worried, that some scientists were simply locking their inconvenient data away.

The problem even had a name: the file drawer effect. And Nosek’s project was an attempt to head it off at the pass. He and a graduate student were developing an online system that would allow researchers to keep a public log of the experiments they were running, where they could register their hypotheses, methods, workflows, and data as they worked. That way, it would be harder for them to go back and cherry-pick their sexiest data after the fact—and easier for other researchers to come in and replicate the experiment later.

Nosek was so taken with the importance of redoing old experiments that he had also rallied more than 50 like-minded researchers across the country to participate in something he called the Reproducibility Project. The aim was to redo about 50 studies from three prominent psychology journals, to establish an estimate of how often modern psychology turns up false positive results.

It was little wonder, then, that funders didn’t come running to support Nosek: He wasn’t promising novel findings, he was promising to question them. So he ran his projects on a shoestring budget, self-financing them with his own earnings from corporate speaking engagements on his research about bias.

But in July 2012, Nosek received an email from an institution whose name he didn’t recognize: the Laura and John Arnold Foundation. A Google search told him that the Arnolds were a young billionaire couple in Houston. John, Nosek learned, had made his first millions as a wunderkind natural gas trader at Enron, the infamous energy company, and he’d managed to walk away from Enron’s 2001 collapse with a seven-­figure bonus and no accusations of wrong­doing attached to his name. After that Arnold started his own hedge fund, Centaurus Energy, where he became, in the words of one hedge fund competitor, “the best trader that ever lived, full stop.” Then Arnold had abruptly retired at the ripe age of 38 to focus full time on philanthropy.

As Nosek tells it, John Arnold had read about the Reproducibility Project in The Chronicle of Higher Education and wanted to talk. By the following year, Nosek was cofounding an institution called the Center for Open Science with an initial $5.25 million grant from the Arnold Foundation. More than $10 million more in Arnold Foundation grants have come since. “It completely transformed what we could imagine doing,” Nosek says. Projects that Nosek had once envisioned as modest efforts carried out in his lab were now being conducted on an entirely different scale at the center’s startup-like offices in downtown Charlottesville, with some 70 employees and interns churning out code and poring over research. The skeletal software behind the data-sharing project became a slick cloud-based platform, which has now been used by more than 30,000 researchers.

The Reproducibility Project, meanwhile, swelled to include more than 270 researchers working to reproduce 100 psychology experiments—and in August 2015, Nosek revealed its results. Ultimately his army of volunteers could verify the findings of only about 40 percent of the studies. Media reports declared the field of psychology, if not all of science, to be in a state of crisis. It became one of the biggest science stories of the year.

But as it happens, Nosek is just one of many researchers who have received unsolicited emails from the Arnold Foundation in the past few years—researchers involved in similar rounds of soul-searching and critique in their own fields, who have loosely amounted to a movement to fix science.

John Ioannidis was put in touch with the Arnolds in 2013. A childhood math prodigy turned medical researcher, Ioannidis became a kind of godfather to the science reform crowd in 2005, when he published two devastating papers—one of them titled simply “Why Most Published Research Findings Are False.” Now, with a $6 million initial grant from the Arnold Foundation, Ioannidis and his colleague Steven Goodman are setting out to turn the study of scientific practice—known as meta-research—into a full-fledged field in its own right, with a new research center at Stanford.

British doctor Ben Goldacre also got an email from the Arnold Foundation in 2013. Famous in England as a sharp-witted scourge of “bad science,” Goldacre spent years building up a case that pharmaceutical companies, by refusing to reveal all their data, have essentially deceived the public into paying for worthless therapies. Now, with multiple grants from the Arnolds, he is leading an effort to build an open, searchable database that will link all publicly available information on every clinical trial in the world.

A number of the Arnolds’ reform efforts have focused on fixing nutrition science. In 2011 the science journalist Gary Taubes received an email from Arnold himself. Having spent more than a decade picking apart nutrition science, Taubes soon found himself cofounding an organization with a substantial grant from the Arnold Foundation, to rebuild the study of obesity from the ground up. And in 2015 the Arnold Foundation paid journalist Nina Teicholz to investigate the scientific review process that informs the US Dietary Guidelines. Just weeks before the federal guidelines were due for an update, Teicholz’s blistering report appeared in the prominent medical journal The BMJ, charging that the government’s panel of scientists had failed to consider evidence that would have done away with long-held worries about eating saturated fat.

And those are just a few of the people who are calling out iffy science with Arnold funding. Laura and John Arnold didn’t start the movement to reform science, but they have done more than anyone else to amplify its capabilities—typically by approaching researchers out of the blue and asking whether they might be able to do more with more money. “The Arnold Foundation has been the Medici of meta-research,” Ioannidis says. All told, the foundation’s Research Integrity initiative has given more than $80 million to science critics and reformers in the past five years alone.

Not surprisingly, researchers who don’t see a crisis in science have started to fight back. In a 2014 tweet, Harvard psychologist Daniel Gilbert referred to researchers who had tried and failed to replicate the findings of a senior lecturer at the University of Cambridge as “shameless little bullies.” After Nosek published the results of his reproducibility initiative, four social scientists, including Gilbert, published a critique of the project, claiming, among other things, that it had failed to accurately replicate many of the original studies. The BMJ investigation, in turn, met with angry denunciations from nutrition experts who had worked on the US Dietary Guidelines; a petition asking the journal to retract Teicholz’s work was signed by more than 180 credentialed professionals. (After an external and internal review, The BMJ published a correction but chose not to retract the investigation.)

The backlash against Teicholz also furnished one of the few occasions when anyone has raised an eyebrow at the Arnolds’ funding of science critics. On the morning of October 7, 2015, the US House Agriculture Committee convened a hearing on the controversy surrounding the dietary guidelines, fueled by the BMJ article. For two and a half hours, a roomful of testy representatives asked why certain nutrition studies had been privileged over others. But about an hour in, Massachusetts representative Jim McGovern leaned into his microphone. Aiming to defend the science behind the guidelines, McGovern suggested that the doubts that had been cast over America’s nutrition science were being driven by a “former Enron executive.” “I don’t know what Enron knows about dietary guidelines,” McGovern said. But “powerful special interests” are “trying to question science.”

McGovern’s quip about Enron, a company that hasn’t existed in 15 years, was a bit of a potshot. But given the long history of deep-pocketed business interests sowing doubt in research, his underlying question was a fair one: Who is John Arnold, and why is he spending so much money to raise questions about science?

Fortune Magazine once dubbed Arnold “one of the least-known billionaires in the US.” His profile in the public consciousness is almost nonexistent, and he rarely gives interviews. But among hedge funders and energy traders, Arnold is a legend. John D’Agostino, former head of strategy of the New York Mercantile Exchange, says that in Arnold’s heyday, people in the industry would discuss him in “hushed and reverent tones.” In 2006, Centaurus reportedly saw returns of over 300 percent; the next year Arnold became the youngest billionaire in the country. “If Arnold decided he wanted to beat hunger,” D’Agostino says, “I wouldn’t want to bet on hunger.”

For all the swagger of that description, Arnold himself has virtually none. He is universally described as quiet and introspective. At Enron, a company famous for its brash, testosterone-laced cowboy culture, the perennially boyish-looking trader was reportedly so soft-spoken that his colleagues had to gather in close to hear him at restaurants. “People would read into it, and they would say he’s just being cagey,” D’Agostino says. “And then, after a couple of years, people were like, oh, no, he’s actually like that.”

Arnold is still quiet. “Usually the division of labor in most of our work is that I talk,” Laura Arnold says in a phone interview. By all accounts, Laura, who attended Harvard College and Yale Law School and worked as an oil executive, has been equally influential in setting the direction for the foundation. But when I visit the Arnold Foundation’s Houston headquarters in June, Laura has been called away on a family emergency, leaving John to do the talking. Arnold is 5’10”, trim, and blandly handsome, his unusually youthful appearance now somewhat concealed by a salt-and-pepper beard.

Arnold grew up in Dallas. His mother was an accountant (she would later help manage the books at his hedge fund). His father, who died when Arnold was 18, was a lawyer. By kindergarten, Arnold’s talent for math was apparent. “I think I was just born with a natural gift for seeing numbers in a special way,” he says. Gregg Fleisher, who taught him calculus in high school, recalls an occasion when Arnold instantly solved a math puzzle that had been known to stump PhDs. But he also stood out for his skepticism. “He questioned everything,” Fleisher says.

By the time he was 14, Arnold was running his first company, selling collectible sports cards across state lines. Those were the early days of the internet, and he managed to gain access to an online bulletin board intended only for card dealers. The listings let him see that the same cards were sold at different prices in different parts of the country—which presented an opportunity for arbitrage. “Hockey cards didn’t have much of a market in Texas,” he tells me. “I would buy up all the premium hockey cards and send them to Canada or upstate New York.” He called the company Blue Chip Cards. Arnold estimates that he made $50,000 before he finished high school.

Arnold graduated from Vanderbilt University in 1995, taking only three years to finish his degree. He started working at Enron four days later. A year after that, at age 22, he was overseeing Enron’s Texas natural gas trading desk, one of the company’s core businesses.

Arnold’s work at Enron—seeking to capitalize on seasonal price differences in natural gas—wasn’t all that different from what he’d done as a teenager selling sports cards. In Hedge Hogs, a 2013 book about hedge fund traders, Jeff Shankman, another star trader at Enron, is quoted describing Arnold as “the most thoughtful, deliberate, and inquisitive person” he worked with on the gas floor. But Shankman recognized that he and Arnold were different in one key respect: Arnold had a greater appetite for risk, a quality that seemed at odds with his quiet demeanor. On some days at Enron, Arnold would trade more than a billion dollars’ worth of gas contracts. In 2001, even as Enron was collapsing amid an accounting scandal that covered up billions in debt, he was reported to have earned $750 million for the company. A former executive at Salomon Brothers later told The New York Times that there were very few incidents in the history of Wall Street comparable to Arnold’s success that year.

As Enron neared bankruptcy, executives scrambled to hold its operation together, offering bonuses to keep traders on board. Arnold was given $8 million, the biggest payout of all, just days before Enron filed for bankruptcy. He started Centaurus the next year, bringing along a small group of former Enron traders, who worked out of a single large room.

Arnold says he wasn’t sure if he could match the success he’d enjoyed as a futures trader at Enron. As a pipeline company, Enron had a direct view onto many of the factors that influence gas prices. Now he’d have to rely purely on his prowess with data. By law, natural gas pipelines had to make much of their information public, and around the time Centaurus was forming, more of that information began to appear online. “A lot of people didn’t know it was out there,” Arnold says. “People who did, didn’t know how to clean it up and analyze it as well as we did.”

It wasn’t long before Arnold had the answer to his doubts. In 2006, Centaurus reportedly generated a 317 percent return overall, after taking the opposite side of a risky bet that another hedge fund, Amaranth, had made on fluctuations in natural gas prices. Amaranth, which was gambling with money from large pension funds, suffered a $6 billion loss and collapsed. By 2009, Centaurus was managing over $5 billion and had more than 70 employees. In its first seven years, according to Fortune, the fund never returned less than 50 percent.

But Arnold had to come down to earth eventually. In 2010, Centaurus experienced its first annual loss. And though the fund bounced back the next year, tighter regulations on trading and a far less volatile market—thanks to a growing supply of natural gas from shale rock—made it unlikely that Arnold would again see the astonishing returns of only a few years earlier. And so, at age 38, Arnold walked away from it all. He announced that he was closing Centaurus in a letter to investors: “After 17 years as an energy trader, I feel that it is time to pursue other interests.”

Arnold tells me that he had lost some of his passion for trading. At the time, his net worth was estimated to be around $3 billion. In 2010 the Arnolds had signed the Giving Pledge, promising to give away at least half their wealth—and he wanted to be as strategic about that goal as he had once been about trading. Arnold has said that the first phase of his life was “100 percent trying to make money” and that it’s now “100 percent trying to do good.” As The Wall Street Journal noted, in “US history, there may have never been a self-made individual with so much money who devoted himself to philanthropy at such a young age.”


A Big Test for Big Batteries

The Aliso Canyon gas storage facility above the Porter Ranch section of Los Angeles, the site of a major gas leak in 2015. Credit Coley Brown for The New York Times

ESCONDIDO, Calif. — In Southern California in the fall of 2015, a giant natural gas leak not only caused one of the worst environmental disasters in the nation’s history, it also knocked out a critical fuel source for regional power plants.

Energy regulators needed a quick fix.

But rather than sticking with gas, they turned to a technology more closely associated with flashlights: batteries. They freed up the utilities to start installing batteries — and lots of them.

It is a solution that’s audacious and risky. The idea is that the batteries can store electricity during daylight hours (when the state’s many solar panels are flooding the grid with power), then release it as demand peaks (early evening, when people get home). In effect, the rechargeable batteries are like an on-demand power plant, and, in theory, able to replace an actual plant.
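That daily cycle can be captured in a few lines of code. The sketch below is illustrative only — the capacity, power rating and hourly load figures are assumptions, not numbers from any of the California projects — but it shows the basic dispatch rule: charge whenever solar output exceeds demand, discharge into the evening peak.

    # A toy battery-dispatch loop (Python). All numbers are invented.
    CAPACITY_MWH = 120.0   # assumed storage capacity
    POWER_MW = 30.0        # assumed maximum charge/discharge rate
    stored = 0.0           # energy currently in the battery, MWh

    # Hourly net load (demand minus solar, MW); negative = solar surplus.
    net_load = [2, 1, 1, 1, 2, 4, 8, 10, 6, -5, -20, -30,
                -35, -30, -20, -5, 10, 35, 40, 38, 30, 20, 10, 5]

    for hour, mw in enumerate(net_load):
        if mw < 0:   # midday surplus: charge, limited by rate and room left
            stored += min(-mw, POWER_MW, CAPACITY_MWH - stored)
        else:        # evening deficit: discharge, limited by rate and stock
            stored -= min(mw, POWER_MW, stored)
        print(f"{hour:02d}:00  net load {mw:+3d} MW  stored {stored:6.1f} MWh")

Real grid-scale systems layer market prices, forecasts and battery-health constraints on top of this, but the charge-low, discharge-high logic is the same.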

Utilities have been studying batteries nationwide. But none have moved ahead with the gusto of those in Southern California.

This idea has far-reaching potential. But the challenge of storing electricity has vexed engineers, researchers, policy makers and entrepreneurs for centuries. Even as countless technologies have raced ahead, batteries haven’t yet fulfilled their promise.

And the most powerful new designs come with their own risks, such as fire or explosion if poorly made or maintained. It’s the same problem that forced Samsung to recall 2.5 million Galaxy Note 7 smartphones in September because of fire risk.

After months of rushed construction, engineers here in California have brought three energy-storage sites close to completion; they are expected to begin serving the Southern California electric grid within the next month. They are made up of thousands of oversize versions of the lithium-ion batteries now widely used in smartphones, laptop computers and other digital devices.

One of the installations, at a San Diego Gas & Electric operations center surrounded by industrial parks in Escondido, Calif., 30 miles north of San Diego, will be the largest of its kind in the world, developers say. It represents the most crucial test yet of an energy-storage technology that many experts see as fundamental to a clean-energy future.

Here, about 130 miles southeast of Aliso Canyon, the site of the immense gas leak in 2015 — the global-warming equivalent of operating about 1.7 million cars over the course of a year — 19,000 battery modules the size of a kitchen drawer are being wired together in racks. They will operate out of two dozen beige, 640-square-foot trailers.

Batteries being wired in Escondido, Calif. Credit Coley Brown for The New York Times

Made by Samsung, the batteries are meant to store enough energy to serve as a backup in cases of fuel shortages. They are also designed to absorb low-cost energy, particularly solar power, during the day and feed it back to the grid after dusk. They in effect can fill in for the decades-old gas-fired plants that might lack the fuel to fully operate because of the disastrous leak.

“California is giving batteries the opportunity to show what they can do,” said Andrés Gluski, chief executive of AES, which is installing the storage systems.

AES is installing a smaller array for the electric utility in El Cajon, a suburb east of San Diego. And separately Tesla, the company perhaps better known for its electric cars, has built an array for a different utility on the grid, Southern California Edison, near Chino, Calif.

The stakes are high for both energy storage companies. If their projects struggle or fail, it could jeopardize not only the stability of Southern California’s grid but also interest in the technology over all.

After a smaller, but pioneering battery project at a wind farm on Oahu in Hawaii went up in flames in 2012, investment in battery storage all but dried up for a few years. That installation, which used 12,000 lead-acid batteries to help even out fluctuations in the power flow, caught fire three times in its first 18 months of operation. The storage developer, Xtreme, eventually went bankrupt. The wind farm turned to a different technology to smooth its output.

Keeping a close eye on the Southern California battery efforts is Susan Kennedy, who helped shepherd California’s energy policy for more than a decade as a state utility regulator and high-level operative for two governors — Gray Davis, a Democrat, and Arnold Schwarzenegger, a Republican. She now runs an energy storage start-up, one not involved in the battery-building response to the Aliso Canyon gas leak.

“The moment one fails,” Ms. Kennedy said of the big bet on batteries, “they won’t build any more.”

As soon as AES’s chief executive, Mr. Gluski, learned last June that San Diego Gas & Electric had awarded AES the big battery contract, he leapt out of his chair and interrupted a meeting in his board room at the company’s headquarters in Arlington, Va. As employees watched in astonishment, he barreled down two flights of stairs, grabbed a mallet and, with a ceremonial flourish, banged a gong that one of his executives kept on hand for big news.

Mr. Gluski had not had much occasion to celebrate since he had taken the AES reins five years earlier. The company was struggling with debt and trying to coax profits from far-flung fossil-fuel projects around the developing world that are buffeted by instability in politics, currency and commodity prices.

His first steps included an austerity program in which he gave up many of his own executive perks: No more country-club membership. No more corporate Audi A8, with driver. But the more far-reaching part of his plan would be AES’s then-fledgling battery division. The unit traced its roots to two midlevel executives who had been speculating about a Jetsons-like future over beers.

One of the energy-storage installations, at a San Diego Gas & Electric operations center surrounded by industrial parks in Escondido, Calif., 30 miles north of San Diego, will be the largest of its kind in the world, developers say. Credit Coley Brown for The New York Times

Those two men, John Zahurancik, a science fiction fan, and Chris Shelton, a former physics teacher, had started talking about batteries a decade ago, before electric cars became fashionable or even feasible. In 2006, Mr. Shelton had come across a professor’s paper that predicted a future dominated by electric cars that, when parked, could be connected to the power grid so their batteries could act as storage devices to help balance electricity demand.

He and Mr. Zahurancik bounced the idea off some AES colleagues, who said it was at least theoretically feasible. So the two continued their bull sessions but decided that stationary battery arrays might make more sense than relying on electric cars.

At the time, lithium-based batteries, the standard in consumer products, were widely in use in the transportation and power tool industries, but no one had paired them with the technology necessary to serve the power grid.

Earlier grid-scale experiments with lead-acid and other types of batteries worked only for a year or two before conking out. A different technology, “flow batteries,” which use chemicals dissolved in liquids in tanks, was considered even more experimental.

But lithium packs more energy per unit of weight than the metals in competing batteries, offering the promise of greater energy density and longevity. The trick would be to figure out how to harness all that power, which creates heat, while avoiding the fires such batteries have caused in any number of vehicles and gadgets, including Teslas, HP computers, hoverboards and, most recently, Samsung Galaxy Note 7s.

The two hit upon a design and persuaded executives to begin a pilot project in 2008. That eventually led to the first commercial lithium-ion battery on a grid. Mr. Zahurancik, who owns the gong, is now president of AES Energy Storage. Mr. Shelton is now the company’s vice president and chief technology officer.

AES does not actually make its batteries but buys them, along with other equipment, from manufacturers like Samsung, LG Chem and Panasonic. It designs and assembles the arrays, stacking the boxy batteries into racks inside locker-like containers.

In Escondido, where local radio stations still carry public service announcements about the natural-gas shortage, the AES battery packs are being installed at a critical spot on the regional electrical grid: the place where the giant wires from power plants and wind and solar arrays connect to the network of local wires.

The batteries are intended to relieve the pressure on the system. Mainly, they will serve as a kind of sponge, soaking up excess or low-cost solar energy during the day and then squeezing it back into the grid in the evening, when demand surges as the sun sets. There is enough capacity in the containers full of batteries to power about 20,000 homes for four hours.
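Working backward from that claim gives a sense of the array’s scale. The per-home draw below is an assumption (a common back-of-envelope figure for an average home during the evening peak), but the result — roughly 30 megawatts sustained for four hours, or 120 megawatt-hours — is in line with published descriptions of the Escondido array.

    # Back-of-envelope sizing from "20,000 homes for four hours."
    homes = 20_000
    kw_per_home = 1.5        # assumed average evening demand per home, kW
    hours = 4

    power_mw = homes * kw_per_home / 1_000   # 30.0 MW
    energy_mwh = power_mw * hours            # 120.0 MWh
    print(power_mw, energy_mwh)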

The idea is that they help the utility lessen its dependence on the type of natural gas plants known as “peakers,” which can turn on and off quickly to meet sudden peaks of demand but are generally used only for short periods and at great expense. And peakers, by burning fossil fuel, are also at odds with California’s green-energy goals.

Workers at a battery storage facility in Escondido. Credit Coley Brown for The New York Times

The project is also being watched closely by advocates for renewable energy. The reason: If utility-scale battery installations work as designed, they would help wind or solar generators to act more like conventional power plants by working steadily even when the sun isn’t shining or the wind isn’t blowing.

“Energy storage is really the tool to do renewables integration for a utility infrastructure company like us,” said Josh Gerber, advanced technology integration manager of SDG&E, as workers smoothed the thigh-high concrete pads that support the containers at the Escondido site. “Without it, you have more risk that the variability of renewables is going to cause reliability problems.”

Under the contract, AES is responsible for making sure the batteries perform for 10 years, after which SDG&E will take over. One potential downside is that if the batteries are fully charged and discharged each day, they could degrade more quickly.
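The arithmetic behind that worry is straightforward. Cycle-life ratings vary widely with chemistry, temperature and depth of discharge, so the rating below is an illustrative assumption, not a specification for these Samsung units.

    # How fast daily full cycling consumes an assumed cycle-life budget.
    years = 10
    cycles_per_year = 365
    assumed_cycle_life = 4_000   # full cycles, assumed for illustration

    total_cycles = years * cycles_per_year     # 3,650
    print(total_cycles / assumed_cycle_life)   # ~0.91 of rated life

One full cycle a day for the length of the contract consumes most of such a budget, which is why degradation matters so much over a 10-year performance guarantee.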

The executives involved expressed confidence in the design and reliability, despite Samsung’s recent smartphone problems. Not only are these batteries a different configuration from the smartphone units, executives said, but the larger footprint allows for sophisticated monitoring as well as industrial safety, cooling and ventilation equipment.

The project, along with the smaller array AES has installed in El Cajon, could provide the proof-of-concept leap Mr. Gluski has been striving for.

AES has a deal for an even bigger installation in a $1 billion project in Long Beach with Southern California Edison that is not part of the Aliso Canyon remediation effort; it is projected to go online by the end of 2020. The electric company plans to use batteries as part of a plan to replace an aging gas plant along the San Gabriel River.

Long term, Mr. Gluski plans to shift the company’s power-generation portfolio — still heavily based on coal and natural gas — toward more renewable energy. He sees the storage systems as vital components in turning solar and wind energy into a dominant power source in the parts of Latin America, Asia and Africa where AES is active.

Whatever progress it has made, AES still has its share of problems, with $20 billion in debt and a stock price at less than one-fifth of its value at the start of the century. It faces wary, if not outright skeptical, treatment by Wall Street utility analysts and energy experts, who say the technology AES is peddling on such a large scale in California remains untested and financially risky.

“The problem comes if there is a hiccup with the battery storage business in California,” said Charles Fishman, a utilities analyst at Morningstar. “You don’t have the deep-pocket parent that can push money to it and keep it out of trouble.”

Despite all the battery activity in California, executives around the utility industry remain cautious. “The reason we don’t have widespread batteries on our system is because it is not cost-effective for us,” said Alice Jackson, vice president for strategic revenue initiatives at Xcel, a giant electricity and gas utility serving eight Western and Midwestern states.

One of many battery-system components that, it is hoped, will help California bolster its power grid. Credit Coley Brown for The New York Times

Xcel has been testing batteries for about as long as AES has, but almost exclusively in small pilots. “It’s fair to say we don’t have long-range experience with this technology to say that it is perfect, or a nirvana,” Ms. Jackson said. “It’s something we’ll observe as California goes through its experience.”

California’s bet on batteries is but the latest bout in the state’s long struggle to match its energy needs with its environmental sensibilities.

In the early 2000s, after market deregulation and Enron’s notorious manipulation of gas supplies led to blackouts and financial instability among the power companies, state officials decided to lessen reliance on natural gas by encouraging the development of wind and solar.

Under Mr. Schwarzenegger, who was governor until 2011, officials pushed through a raft of overlapping regulations that created a boom in renewables, especially solar. But that upended the traditional patterns of supply and demand, making the overall energy system technically and economically difficult to manage.

Batteries were the logical solution. But the technology wasn’t fully developed and was still too expensive. In order for companies to make the necessary investments, they needed a signal that there would be a big enough market for their products.

So in 2010, the state approved one of the first energy-storage mandates, ultimately requiring utilities to install some form of storage equipment in their territories. That set off a flurry of new investment and innovation and, after the sudden closure of the San Onofre nuclear plant on the coast in northwest San Diego County in 2012, when a steam generator tube sprang a leak, new contracts for battery installations.

But the Aliso Canyon accident, which began on Oct. 23, 2015, when the Southern California Gas Company first detected the leak, put that process on fast-forward. The noxious-smelling gas and intermittent oily mist that spewed forth over almost four months traveled into the surrounding neighborhoods on the strong winds that sweep down from the Santa Susana Mountains. At the same time, it forced the battery strategy into its most high-profile test yet.

Now it’s showtime, and the pressure to succeed is high all around. For AES, it could signify an important step for a long-troubled conventional-energy relic that is seeking to revitalize itself as a powerhouse in battery storage and other advanced technologies.

For clean-energy advocates — including residents of the Porter Ranch section of Los Angeles, so picture-perfect that Steven Spielberg chose it as the setting for the 1982 movie “E.T.,” but where many still complain of the rashes, headaches and other debilitating symptoms that drove thousands from their homes during the leak — it could be a powerful weapon in the fight to keep the gas depot closed.

But the pressure may be highest for the Southern California utilities, their reputations still blackened by waves of forced electricity cuts that followed the Enron debacle. No one wants to contemplate a repeat of that chapter, when blackouts affected factories and even some hospitals.

“When the power goes out, people die,” said Ms. Kennedy, the former state official. “Failure is not an option here on any level.”

Correction: January 14, 2017
An earlier version of this article misstated the location of AES headquarters. It is in Arlington, Va., not Alexandria.

Correction: January 16, 2017
An earlier version of this article misstated the status of an array near Chino, Calif., for the utility Southern California Edison. It has already been built by Tesla.


As Rains Soak California, Farmers Test How To Store Water Underground

Helen Dahlke, a scientist from the University of California, Davis, stands in an almond orchard outside Modesto that’s being deliberately flooded. This experiment is examining how flooding farmland in the winter can help replenish the state’s depleted aquifers. Credit Joe Proudman / Courtesy of UC Davis

Six years ago, Don Cameron, the general manager of Terranova Ranch, southwest of Fresno, Calif., did something that seemed kind of crazy.

He went out to a nearby river, which was running high because of recent rains, and he opened an irrigation gate. Water rushed down a canal and flooded hundreds of acres of vineyards — even though it was wintertime. The vineyards were quiet. Nothing was growing.

“We started in February, and we flooded grapes continuously, for the most part, until May,” Cameron says.

Cameron was doing this because for years, he and his neighbors have been digging wells and pumping water out of the ground to irrigate their crops. That groundwater supply has been running low. “I became really concerned about it,” Cameron says.

So his idea was pretty simple: Flood his fields and let gravity do the rest. Water would seep into the ground all the way to the aquifer.

Don Cameron, general manager of Terranova Ranch, flooded his grapevines with floodwaters from a branch of the Kings River, southwest of Fresno, Calif. Credit Courtesy of Don Cameron

The idea worked. Over four months, Cameron was able to flood his fields with a large amount of water — equivalent to water three feet deep across 1,000 acres. It all went into the ground, and it didn’t harm his grapes.
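In the units California water managers use, that works out to about 3,000 acre-feet — an acre-foot being the volume that covers one acre a foot deep, or 325,851 gallons.

    # Converting "three feet deep across 1,000 acres" to familiar units.
    GALLONS_PER_ACRE_FOOT = 325_851

    acre_feet = 3 * 1_000                      # depth (ft) x area (acres)
    gallons = acre_feet * GALLONS_PER_ACRE_FOOT
    print(f"{acre_feet:,} acre-feet = {gallons / 1e9:.2f} billion gallons")
    # 3,000 acre-feet = 0.98 billion gallons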

These days, Cameron’s unconventional idea has become a hot new trend in California’s water management circles — especially this week, with rivers flooding all over the state.

“This is going to be the future for California,” Cameron says. “If we don’t store the water during flood periods, we’re not going to make it through the droughts.”

Helen Dahlke, a groundwater hydrologist at the University of California, Davis, is working with a half-dozen farmers who are ready to flood their fields this year. “We have test sites set up on almonds, pistachios and alfalfa, just to test how those crops tolerate water that we put on in the winter,” she says.

There are two big reasons for these experiments.

The first is simply that California’s aquifers are depleted. It got really bad during the recent drought, when farmers couldn’t get much water from the state’s surface reservoirs. They pumped so much groundwater that many wells ran dry. The water table in some areas dropped by 10, 20, or even 100 feet. Aquifers are especially depleted in the southern part of California’s Central Valley, south of Fresno. Flooding fields could help the aquifers recover.

The second reason to put water underground is climate change.

California has always counted on snow, piling up in the Sierra Nevada mountains, to act as a giant water reservoir. Water is released gradually as the snow melts.

But because of a warming climate, California now is getting less snow in winter, and more rain. The trend is expected to intensify. But heavy rain isn’t as useful because it quickly outstrips the capacity of the state’s reservoirs and just runs into the ocean. Meanwhile, the state gets very little rain during the summer, when crops need water.

“We really have to find new ways of storing and capturing rainfall in the winter, when it’s available,” says Dahlke.

There’s no better place to store water than underground. Over the years, California’s farmers have extracted twice as much water from the state’s aquifers as the total storage capacity of the state’s dams and man-made lakes. In theory, farmers could replace that water.

Peter Gleick, a water expert and co-founder of the Pacific Institute, says that after winter storms, there is enough water available to recharge those groundwater aquifers.

The hard part, he says, will be getting the state’s farmers and irrigation managers to go along with the plan, because it will require flooding hundreds of thousands — and possibly millions — of acres.

“I’m cautiously optimistic that we can do this,” he says. “But it’s going to require a different way of thinking. It’s going to require a lot of farmers and owners of ag land to be willing to flood land when the water’s available.”

And Gleick says, even if this large-scale flooding can be accomplished, it won’t be enough, by itself, to protect groundwater supplies. It will have to be accompanied by strict limits on how much water farmers can pump from aquifers. Groundwater — which until recently was almost completely unregulated — will have to be managed so that water is there when farmers really need it, when the rains don’t fall.


California storms add 350 billion gallons to parched reservoirs

A riverfront property in Guerneville, Calif., sits in waters up to its first-floor windows Monday, Jan. 9, 2017, after the Russian River crested above flood stage following Sunday’s big storm. (Karl Mondon/Bay Area News Group)

The powerful storms that soaked Northern California over the past week did more than trigger power outages, mudslides and flash floods.

They sent roughly 350 billion gallons of water pouring into California’s biggest reservoirs — boosting their storage to levels not seen in years, forcing dam operators to release water to reduce flood risks and all but ending the five-year drought across much of Northern California, even as drought conditions persist in the south, experts said Monday.
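That headline figure is easy to sanity-check in the acre-feet units water managers report (an acre-foot is 325,851 gallons); it comes out almost exactly to the 1.1 million acre-feet gain the state recorded, cited later in this article.

    # 350 billion gallons, expressed in acre-feet.
    GALLONS_PER_ACRE_FOOT = 325_851

    gallons = 350e9
    acre_feet = gallons / GALLONS_PER_ACRE_FOOT
    print(f"{acre_feet / 1e6:.2f} million acre-feet")   # ~1.07 million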

“California is a dry state and probably always will be in most years, but we certainly don’t have a statewide drought right now,” said Jay Lund, a professor of engineering and director of the Center for Watershed Sciences at UC Davis.

“We have to be careful about crying wolf here,” he said. “You have to maintain credibility with the public when there are critically dry years, so you have to call it like it is when conditions improve.”

On Monday much of the state began drying out from the weekend drenching that caused at least three fatalities and triggered flooding in Morgan Hill, Sonoma County, Yosemite and parts of the Sacramento Valley, even as forecasters said another storm system was coming in Tuesday.

That new storm system should bring 1 to 2 inches of rain around much of the Bay Area, and up to 6 inches in the Santa Cruz Mountains and Big Sur, with more rain in the North Bay, tapering off Wednesday.

“It’s not going to be as heavy,” National Weather Service forecaster Steve Anderson said. “But even though the amount of rainfall will be less, the impact will still be there.”

Despite concerns that the weekend storm’s warmer temperatures would deplete the Sierra Nevada snowpack, it grew substantially. Last Monday, it stood at 70 percent of the historic average. This Monday, it had climbed to a staggering 126 percent for this time of the year.

In fact, since Oct. 1, the key watersheds of Northern California — eight areas from Lake Tahoe to Mount Shasta that feed many of the state’s largest reservoirs — have received more precipitation than in any comparable period since 1922, according to state totals.

In a typical year, that “Northern Sierra eight-station index” receives 50 inches of precipitation. As of Monday it was already at 40 inches — 199 percent of the historic average for this date — and running slightly above 1982-83 and 1997-98, both of which were marked by severe El Niño flooding.
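Those percentages pin down how unusual the season has been: if 40 inches is 199 percent of the average for the date, the eight stations normally see only about 20 inches by early January — meaning the region has already received, with the wet season only roughly half over, 80 percent of a full year’s typical precipitation.

    # Backing out the normal-to-date figure from the index readings.
    observed = 40.0              # inches since Oct. 1
    pct_of_average = 1.99
    normal_to_date = observed / pct_of_average   # ~20.1 inches
    typical_full_year = 50.0
    print(normal_to_date, observed / typical_full_year)   # ~20.1, 0.8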

The rain and snow could shut off, as happened three years ago in January, although reservoirs in many areas are now so full that there wouldn’t be water shortages for several years.


Officially, California’s drought won’t end until Gov. Jerry Brown rescinds or revises the emergency drought declaration he signed in January 2014.

Lund, of UC Davis, said that because other parts of the state — particularly Santa Barbara and other parts of Southern California — are still well short of rain and suffering from low reservoir levels, Brown should issue an updated drought declaration that reflects the regional differences.

That is one of the options he is considering, said Nancy Vogel, a spokeswoman for the state Natural Resources Agency. But a decision may not be made until the end of the winter snow and rain season, she said.

“It’s early and the precipitation patterns could dry up at any time,” she said. “We’ll see where we are in March or April.”

Rain from Sunday’s storm fell in sheets at times, flooded roads and storm drains, and toppled trees. It fell most forcefully in the Big Sur area of Monterey County, dumping more than 12 1/2 inches over a 72-hour period. More than 9 3/4 inches fell in the Lexington Hills in Santa Clara County and more than 6 inches soaked areas of San Mateo County.

A vehicle drives through standing water along Gate 5 Road in Sausalito, Calif., on Tuesday, Jan. 10, 2017. (Robert Tong/Marin Independent Journal)

In Contra Costa County, 4 1/2 inches of rain fell atop Mount Diablo, and 3 1/4 inches fell in Orinda. San Francisco and parts of Oakland saw 2 1/2 inches of rain. Only 1.03 inches fell at Mineta San Jose International Airport, but that still set a record for Jan. 8.

More importantly, the recent storms have sent reservoirs swelling.

The 154 largest reservoirs tracked by the state Department of Water Resources added 1.1 million acre-feet of water from Jan. 1 to Monday, boosting their storage to 97 percent of the historic average, said Maury Roos, longtime state hydrologist.

“It’s excellent news,” said Roos. “I don’t make the decision on the official drought, but from the Bay Area north we are in good shape for this time of the season.”

Specifically, Loch Lomond, the main reservoir serving Santa Cruz, filled to capacity. All seven reservoirs that serve the Marin Municipal Water District were 100 percent full. Pardee Reservoir, the main reservoir that provides water to 1.3 million people in Alameda and Contra Costa counties, spilled on Monday.

Lexington Reservoir, near Los Gatos, has gone up 31 feet since New Year’s Day, surging to 93 percent full from 42 percent full a week ago.

Perhaps most dramatic was San Luis Reservoir, California’s fifth largest, located between Gilroy and Los Banos. Sitting at 10 percent full in August, it now is 66 percent full, having risen 134 feet. At current rates, it may fill to the top for the first time since 2011, said Roger George of Fresno, a professional guide who leads fishing trips for striped bass there.

“Back in August, it was scary. I was beginning to wonder if we were going to have a die-off of the fish,” he said. “Now it looks like an ocean.”

Similarly, the state’s second-largest reservoir, Oroville in Butte County, has risen 35 feet since New Year’s Day. It added 250,000 acre-feet of water over the weekend, enough for 1.3 million people’s needs for a year. It now stands at 64 percent full, or 102 percent of historic average.
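The equivalence between 250,000 acre-feet and 1.3 million people’s annual needs implies a per-person figure worth making explicit; note that such planning numbers bundle household, commercial and system uses, so they run far above what anyone drinks.

    # Per-person water use implied by the Oroville comparison.
    GALLONS_PER_ACRE_FOOT = 325_851

    acre_feet = 250_000
    people = 1_300_000
    af_per_person_year = acre_feet / people                       # ~0.19
    gallons_per_day = af_per_person_year * GALLONS_PER_ACRE_FOOT / 365
    print(round(af_per_person_year, 2), round(gallons_per_day))   # 0.19 172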

On Monday, officials at Yosemite National Park announced they would reopen Yosemite Valley Tuesday morning. The park suffered some damage when the Merced River jumped its banks, but the river crested only two or three feet above flood stage, less than had been feared.

The storm unleashed mud and rock slides throughout the Santa Cruz Mountains early Monday, halting traffic during the morning commute on Highway 17. A slide just north of Scotts Valley shut down northbound lanes and traffic was detoured onto the southbound side.

In Gilroy, two people were rescued Sunday night from the second story of their home after water surrounding the residence rose to about four feet. The San Jose Fire Department’s Urban Search and Rescue team had to use a boat to help the people out of the home, according to Cal Fire spokeswoman Pam Temmermand.

At least three people were killed in the weekend storm, including 57-year-old Jarnail Singh, who police said lost control of a white cab he was driving and crashed into an estuary near Oakland International Airport on Sunday morning.

An unidentified motorist also died in a crash on Interstate 880 in Fremont.

A San Ramon woman died Saturday after a tree fell on her at a golf course in San Ramon. Deborah McKeown, 56, was taking a walk when high winds knocked over a tree that landed on her. McKeown, a freelance writer for the Bay Area News Group who wrote under the pseudonym Kathleen Ford, was taken to a hospital from the Canyon Lakes Golf Course on Bollinger Canyon Way.

Staff writers Patrick May, Rick Hurd and Eric Kurhi contributed to this report.


How a Low-Carbon Economy Increases Cybersecurity Risks

WSJ Energy Expert Jason Bordoff explains the risks inherent in a more interconnected, electrified, and digitalized energy grid. PHOTO: ISTOCK PHOTO

Since the 1973 Arab Oil Embargo, America’s reliance on imported oil has been the nation’s primary energy security concern. Every president since Nixon has promised “energy independence,” and the goal of reducing oil imports has dominated the energy policy agenda. Transitioning to a low-carbon economy that moves off oil is thus not only necessary to address climate change; it also brings many energy security benefits. Around the world, renewable energy that is locally generated reduces the energy-security risks of fossil fuels that are globally traded.

Yet the transition to a low-carbon economy may bring new and different energy security risks of its own that have so far received relatively little attention. Among the most important is the threat of cyberattack, which grows as a low-carbon economy becomes more electrified and interconnected.

Electricity is vital to nearly every aspect of our daily life and the economy. Electricity is needed to produce food and purify water. Our financial and telecommunications systems do not function without it. It is key to transportation, energy production, hospitals and emergency services. Reliable electric power is also essential to our homeland security and national defense.

Moving away from fossil fuels is likely to require large-scale electrification, and then the generation of yet more electricity from low-carbon energy sources. The International Energy Agency predicts that the share of electricity in final global energy consumption will increase from 18% in 2014 to as high as 28% in the agency’s low-carbon scenario by 2050. Decarbonization of transportation very likely means more electric vehicles, which are falling in cost, growing rapidly in sales each month, and may even become mandated in many urban areas. Transitioning away from fossil fuels for heating in the residential and industrial sectors also means more electrification.

Along with increased electrification, the digital age also means that our electric devices—from household appliances to electric vehicles to increasingly ubiquitous smart sensors—are more and more interconnected through smart grids and the “Internet of Things.” Nearly 50 billion devices are projected to be connected to the internet by 2020, roughly twice as many as in 2015, according to an FTC report from January 2015. The economywide benefits of greater interconnectedness are enormous: A recent McKinsey report estimates that digitization of services and physical assets like cars and buildings could add more than $2.2 trillion to the annual GDP of the U.S. by 2025. And increased efficiency and renewable capacity can be achieved through new connected devices and sensors.

But an increasingly interconnected, digital and electrified energy system also poses vast new physical and cybersecurity risks that the next administration must prioritize in its new energy security agenda. A more digitalized electricity system is vulnerable to malicious attacks. Ukraine experienced this firsthand when a coordinated cyberattack targeting regional electricity distribution companies in December 2015 left 225,000 customers in the dark for hours. Cyberattacks aimed at the grid in the U.S. are also on the rise: reported incidents rose 20% in 2015 from the year prior, with the energy sector the second-largest target. Speaking of critical infrastructure like power generation, the head of U.S. Cyber Command told Congress “it is only a matter of when, not if, we are going to see something dramatic.”

While smart-grid systems promise energy efficiency gains to both consumers and grid operators as each gains greater control over the energy in homes and businesses, light regulatory oversight and a lack of clear security standards for new devices leave this digitally connected energy system vulnerable. Last month’s internet outage, triggered by a hack on internet-connected devices like cellphone cameras and baby monitors, was a potent reminder of the risks from billions of connected devices with little or no cybersecurity protections.

Consider the risks presented by electrification and digitalization of the transport sector. In a world of electric vehicles, the consequences of electricity outages would be far more severe, as electricity is more difficult to store than gasoline or oil. Moreover, concerns about car hacking—of both electric and nonelectric models—are real. In one widely publicized demonstration, hackers took control of a car’s steering wheel and accelerator from a laptop. Imagine the dire consequences in self-driving cars, which offer even more ways into a car’s networks—or the potential for malicious code to be installed and later activated in our vehicles. Will stringent safeguards exist in other countries like China that have ambitious plans to manufacture and export cheap electric vehicles?

Grid-connected renewable energy sources, such as utility-scale solar and wind power plants, and distributed energy resources, such as rooftop solar systems, are also vulnerable to cybersecurity risks. To be sure, distributed generation—combined with energy storage—can increase grid resiliency, especially during hurricanes and other weather-related power outages. But as long as these systems are connected to the grid (or a virtual power plant), they present new risks in the form of thousands of potential backdoors for hackers to the electricity grid.

Smart grids, which use intelligent gadgets and two-way communications between electric devices and the grid, are also potential soft targets for cyberattacks—and have received too little attention to date from local utilities. Without proper safety measures, these tools for improving energy efficiency can be used by malicious hackers to shut down entire electricity networks.
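What a “proper safety measure” can look like in miniature: the sketch below — a generic illustration, not any utility’s actual protocol — authenticates a grid control message with a shared-secret HMAC, so that a command forged by an attacker who can reach the network but does not hold the key is rejected. The key and message format are hypothetical.

    # Authenticating a demand-response command (Python standard library).
    import hashlib
    import hmac

    SECRET = b"per-device-key-provisioned-at-install"  # hypothetical key

    def sign(command: bytes) -> bytes:
        return hmac.new(SECRET, command, hashlib.sha256).digest()

    def verify(command: bytes, tag: bytes) -> bool:
        # Constant-time comparison avoids leaking tag information.
        return hmac.compare_digest(sign(command), tag)

    cmd = b"curtail:feeder-12:2.5MW:18:00"
    tag = sign(cmd)
    assert verify(cmd, tag)                      # authentic command accepted
    assert not verify(b"open:breaker-12", tag)   # forged command rejected

Real deployments add key management, replay protection and encryption on top, but even this much checking is more than many of today’s connected devices perform.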

Transitioning away from fossil fuels more quickly must be a priority to address the urgent challenge of climate change. And that means a more rapid shift to an electrified and digitalized energy system. While that shift reduces energy security risks we have faced for decades, it also presents many new ones that the incoming administration must address with greater urgency through cybersecurity standards and regulations for the new technologies and devices that will power the clean energy economy.

Jason Bordoff (@JasonBordoff), a former energy adviser to President Obama, is a professor of professional practice in international and public affairs and founding director of the Center on Global Energy Policy at Columbia University.
