Novim News

How a Funder and a Top News Agency Are Collaborating to Boost Science Journalism

There’s a lot of exciting work happening in science journalism lately, whether it’s the surge in science blogging or new publications like STAT, Undark, and Nautilus.

The news media is still struggling financially, however, and the industry is undergoing some major shifts. For some outlets, that’s led to shrinking or even disappearing science desks. Journalists are a scrappy bunch, though, always adjusting to keep the news business alive and kicking. Philanthropic resources have become an important part of that retooling.

The latest move by a funder to give a jolt to journalism comes from the Howard Hughes Medical Institute, one of the largest private funders of academic science research. HHMI also has a large science education program, giving $86 million in fiscal year 2016.

That funding has backed work like creating science resources for educators, providing research opportunities to college students, and engaging the public. The last area includes a film production arm, and collaborations with outlets like NOVA, the New York Times, and Science Friday. In fiscal year 2014, HHMI funded nonprofit news outlet The Conversation with $500,000 for science journalism, according to tax forms.

Now, a collaboration between HHMI and the AP is backing two yearlong projects to bolster science coverage. One is a series of stories, profiles, videos and graphics about genetic medicine; the second supports multimedia coverage aimed at putting scientific evidence in the context of subjects like the environment and public health.

Funds will increase the number of journalists on the AP’s team and the number of stories the service can publish. While the announcement states that HHMI will offer expert background information and educational materials, the AP assures that it will retain editorial control over what gets published.

While we’ve seen some funders back journalism in response to attacks from the Trump administration, HHMI’s Sean B. Carroll, vice president of science education, says this was in the works well before the election. The institute takes a longer view in its science education work, and this collaboration is more in response to larger struggles in journalism, he said via email.

“The pressures on newsrooms have led to a widespread reduction of science journalists. As the world’s largest news gathering organization, AP is perfectly positioned to provide its vast client base with more and deeper science reporting,” Carroll said.

The AP is a large news cooperative, so there’s potential for stories the funder is facilitating to have greater reach.

Generally speaking, there are a couple of kinds of journalism funders out there—those supporting the field on basic principle (Omidyar, Knight) and those enhancing coverage of a particular topic that’s been neglected for whatever reason. We’ve seen a lot of nonprofit environmental coverage, for example. RWJF backs health journalism in a big way. Gates is a major backer of education reporting. And on the science beat, the Sloan Foundation supports various media, including popular books like Hidden Figures and shows like Radiolab.

Media philanthropy isn’t new, but as it’s grown, funders are still feeling out some of the ground rules and best practices. There’s been controversy regarding what kind of influence funders have on the coverage of news outlets.

My only criticism of this particular initiative is that HHMI is not disclosing the amount of funding involved. I don’t believe there is anything sinister behind that decision, and HHMI staff say they generally refrain from sharing financials. Both partners publicized the collaboration and cited motivation to increase attention to and understanding of science. But one of the guiding principles in journalism philanthropy should always be a high level of transparency, so we’d like to see all the cards on the table when it comes to this kind of grant.

Posted on Categories News

California’s Drought May Be Over, But Its Water Troubles Aren’t

The structural issues at the overwhelmed dam at Lake Oroville are the latest chapter in California’s struggle with both droughts and flooding.

In the years before California’s civil engineers got around to confining the Sacramento River, it often spilled over its banks, inundating huge swaths of the Central Valley. Sometimes the floodwater would stand for a hundred days at a time. The botanist William Henry Brewer, writing in 1862, after a season of torrential rains, described the valley as “a lake extending from the mountains on one side to the coast range hills on the other.” The water was so deep, he reported, that cargo steamers could navigate it. “Nearly every house and farm over this immense region is gone,” Brewer wrote. “America has never before seen such desolation by flood as this has been, and seldom has the Old World seen the like.” Half a century later, to solve the problem, California built a number of flood-control systems, including the Sacramento Weir, a series of forty-eight hand-operated gates placed strategically along the Sacramento and American Rivers. When the waters rose, they would now be shunted into an unpopulated expanse known as the Yolo Bypass, a floodplain roughly equivalent in size to twenty Central Parks.

This winter, for the first time in a decade, and after five years of a crippling statewide drought, the Yolo Bypass is submerged again. Situated at the heart of the Pacific Flyway, a great migratory corridor stretching from Alaska to the tip of South America, the area teems with sandhill cranes, California brown pelicans, and dozens of other bird species. But its estuarine tranquillity is deceptive. In the past five months—the wettest since record-keeping began, in 1895—California has experienced widespread hydrological chaos. In January, after a series of heavy rainstorms, water managers activated the Sacramento Weir, filling the Yolo Bypass. In February, emergency releases from Anderson Lake Dam, in Santa Clara County, flooded hundreds of homes in San Jose. The rain also caused landslides near Big Sur, washing out several roads and bridges and leaving about four hundred people stranded. But it was the near-failure of the dam at Lake Oroville, three and a half hours north of San Francisco, that made the scale of the crisis clear. Oroville is the state’s second-largest reservoir but arguably its most important; it feeds the California Aqueduct, which supplies drinking water to twenty-five million residents across greater Los Angeles and irrigates millions of acres of Central Valley farmland.

Less than a year ago, Lake Oroville was a vivid symbol of the state’s prolonged drought. Aerial images showed a landscape of spider-webbed mudflats and desiccated tributaries as the reservoir fell to levels not seen in almost forty years. Starting in January, though, the lake rapidly filled to capacity. To prevent the water from breaching the dam, engineers began discharging it at a rate of 2.7 billion gallons per hour—about the same flow as at Niagara Falls. The frothing cascade, with its countless bubbles acting as tiny jackhammers, hollowed out a cavernous pit in the concrete spillway. The engineers diverted the flow to an earthen emergency spillway, but the torrent rapidly chewed away at that, too. With the integrity of the dam under threat, close to two hundred thousand local residents were evacuated. (A total failure of the structure, according to one water manager, would have sent a thirty-foot wave tearing through communities downstream.)

Ask most Californians, however, and they’ll tell you that the chaos is in service of a greater good. As of last week, according to the National Drought Mitigation Center, more than three-quarters of the state is out of the drought, with barely one per cent falling into the “severe” category—almost the reverse of the situation at this time last year. Already in 2017, many parts of California have received more than twice their average annual precipitation. The numbers would seem to paint a picture of watery salvation. But Peter Gleick, the chief scientist at the Oakland-based Pacific Institute, told me that one year of heavy precipitation, even a record-breaking one, will not undo the most serious repercussion of the drought: a severe deficit of groundwater. For years, Central Valley farmers have drawn liberally from the region’s aquifers to compensate for reduced supplies from canals and aqueducts. When a large enough volume of groundwater is pumped away, the land can slump like a punctured air mattress. Areas along the valley’s western edge have sunk by nearly thirty feet since the nineteen-twenties, and in some places the local infrastructure—roads, bridges, even the California Aqueduct itself—is at risk. Farmers and municipalities have responded by digging deeper wells, but such measures seem to be delaying the inevitable. In Tulare County, south of Fresno, where groundwater overdraft has been particularly severe, the number of reported well failures has continued to climb, almost quadrupling since 2014, in spite of last year’s above-average precipitation and this year’s deluge.

Climate change is a significant contributor to the problem. As the Stanford climatologist Noah Diffenbaugh noted in 2015, California’s reservoirs, aqueducts, and canals are vestiges of a cooler, less drought-prone past. The state’s model of water storage is snowpack-dependent, meaning that it works properly only when the bulk of the water in the system is locked up in mountain snow. These days, though, more precipitation falls as rain than as snow, placing stress on the reservoirs. And even though this year has seen record snowpack—a hundred and eighty-five per cent of the average, as of March 1st—California has also experienced dozens of so-called rain-on-snow events, which further hasten the melting. Meanwhile, warmer temperatures are projected to shift the snow line to higher altitudes, dramatically shrinking the over-all size of the state’s snow reservoir. At current rates of warming, the Sierra Nevada could lose a quarter of its snowpack by the middle of the century, according to the California Department of Water Resources.

Sudden swings between drought and flood have been part of California’s climatic history for a long time, Diffenbaugh told me, but those swings now stand to become more extreme. “This is exactly what climate scientists have predicted for at least the last thirty years,” he said. The solution, Gleick said, is to prioritize the aquifers—and quickly, because severe land subsidence can permanently eliminate storage space. “Unless a massive effort is made to both reduce overdraft and to artificially enhance recharge rates, California’s groundwater will continue to decline,” he wrote in an e-mail. Not only are there fewer regulatory hurdles involved in underground water-banking than, say, permitting a new reservoir or desalination plant, but the costs of groundwater storage are far lower than these other options, scientists at Stanford’s Bill Lane Center for the American West recently found.

Last Friday, the state’s Department of Water Resources reopened the patched-up concrete spillway at Lake Oroville. Hundreds of millions of dollars’ worth of repairs remain, but water managers must make room in the reservoir for the spring melt. At the moment, there is no large-scale engineering system that would allow the huge surge of surface water currently flowing across California to be delivered to the Central Valley’s aquifers. And more to the point, perhaps, the state lacks the sorts of regulations that would make such a system viable; a law passed in 2014 requires that government agencies “achieve sustainability” in how they apportion groundwater, but not until 2040. Ultimately, Gleick told me, California won’t pursue artificial recharge until it can keep better track of who is using what. “It’s like putting money into a bank account that anyone else can withdraw,” he said. “Until it’s monitored, no one will make a deposit.”

Jeremy Miller is a writer in Richmond, California.


How California Utilities Are Managing Excess Solar Power

As California ramps up renewable energy, utility companies are looking to batteries to solve a supply-demand mismatch, storing excess solar power and feeding it as needed to the grid. Here, a solar farm and wind turbines in Palm Springs, Calif. PHOTO: MOMENT EDITORIAL/GETTY IMAGES

‘Virtual power plants’ would store renewable energy in batteries by day and redistribute it when demand surges after sunset

California utilities including PG&E Corp., Edison International and Sempra Energy are testing new ways to network solar panels, battery storage, two-way communication devices and software to create “virtual power plants” that manage green power and feed it into the power grid as needed.

The Golden State is ramping up renewable energy as it pledges to be a bulwark against the Trump administration’s pro-fossil fuel policies. But first, it has to figure out what to do with all the excess power it generates when the sun is shining and the wind is blowing.

California’s solar farms create so much power during daylight hours that they often drive real-time wholesale prices in the state to zero. Meanwhile, the need for electricity can spike after sunset, sometimes sending real-time prices as high as $1,000 a megawatt-hour.
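The spread between those two prices is what makes battery storage attractive. A back-of-envelope sketch using the article’s price figures (the battery size and round-trip efficiency below are invented for illustration, not from the article):

```python
# Rough arbitrage math using the prices cited above: charge when
# solar pushes wholesale prices toward zero, discharge into the
# evening spike. Battery size and efficiency are assumptions.

midday_price = 0.0       # $/MWh, when solar floods the market
evening_price = 1000.0   # $/MWh, the after-sunset spike cited above

capacity_mwh = 100.0         # hypothetical battery capacity
round_trip_efficiency = 0.9  # assumed storage losses

energy_delivered = capacity_mwh * round_trip_efficiency
revenue = energy_delivered * evening_price - capacity_mwh * midday_price
print(f"${revenue:,.0f}")  # $90,000 for one charge/discharge cycle
```

Real revenues would be far lower on most days, since prices only occasionally hit those extremes, but the sketch shows why utilities see storage as a way to monetize the mismatch.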

Utility companies are looking to correct that supply-demand mismatch and ease the strain on the electric grid as California considers retiring its last nuclear plant in 2025 while nearly doubling the power it gets from renewable sources to 50% by 2030.

Last month, power company AES Corp. flipped the switch on a bank of 400,000 lithium-ion batteries it installed in Escondido, Calif., for Sempra Energy. Sempra’s San Diego utility plans to use the batteries, made by Samsung SDI Co. Ltd., to smooth out power flows on its grid.

Tesla Inc. is supplying batteries to a Los Angeles-area network serving Edison International that, when finished in 2020, would be the world’s largest of its kind, according to the developer, Advanced Microgrid Solutions. The network would spread across more than 100 office buildings and industrial properties.

When the Edison utility needs more electricity on its system, the batteries would be able to deliver 360 megawatt-hours of extra power to the buildings and the grid, enough to power 20,000 homes for a day, on short notice. At other times, the batteries would help firms hosting the arrays to cut their utility bills, said Susan Kennedy, chief executive of Advanced Microgrid Solutions, which is developing the project.
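The arithmetic behind that claim works out to a plausible daily household draw:

```python
# Checking the figures above: 360 MWh spread across 20,000 homes
# for one day.
network_mwh = 360
homes = 20_000

kwh_per_home_per_day = network_mwh * 1_000 / homes
print(kwh_per_home_per_day)  # 18.0 kWh per home per day
```

That 18 kWh per home per day is somewhat below the U.S. average household consumption, so the claim holds for modest-use homes or for most, but not all, of a typical day.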

“It will show how you can use communication and control technology to make a bunch of distributed energy assets act like one big one,” said J.B. Straubel, Tesla’s chief technical officer.

The companies declined to say how much the project would cost.

PG&E plans to use clean energy to replace the 2,200-megawatt Diablo Canyon nuclear power plant, which it is proposing to shut down in 2025. The San Francisco utility, which plans to invest about $1 billion through 2020 to modernize its grid, is testing batteries, software and other technologies.

“We are rethinking the grid and how it operates,” said Steve Malnight, PG&E’s senior vice president of strategy and policy.

An array of solar panels in Oakland, Calif. The Golden State often sells excess power at low prices or gives it away to other states. PHOTO: LUCY NICHOLSON/REUTERS

Virtual power plants remain a considerably more expensive option than building a traditional power plant to meet peak demand.

Stored power from lithium-ion batteries can do the work of a natural-gas peaker plant at an average cost of between $285 and $581 a megawatt-hour, according to a December report by Lazard Ltd. In contrast, electricity from a new gas peaker plant costs between $155 and $227 a megawatt-hour, according to Lazard.

But some of the equipment barely existed five years ago: As prices for technologies such as battery storage fall, utilities should be able to adopt more of them, said Michael Picker, president of the state Public Utilities Commission.

California currently has to sell excess solar power at low prices or give it away to utilities in Arizona and other states, through a real-time power market run by California’s Independent System Operator, which oversees the state grid.

Sometimes, offering the excess power at low prices isn’t enough and prices go negative, as a way for power suppliers to encourage other utilities to take power they can’t use. That happened on 178 days last year.

Utilities in Colorado, New York and other states that plan to get a higher percentage of their power from renewables are also experimenting with virtual power-plant technology.

Consolidated Edison Inc. is using solar panels, batteries and power conservation technologies at several dozen New York City buildings to reduce peak demand by as much as 52 megawatts. Because of the $200 million project, the utility can postpone installing more than $1 billion of conventional power equipment for another 20 years, said Matthew Ketschke, a Con Edison vice president.

Virtual power plants alone, however, may not solve problems created by boosting intermittent renewable energy.

In Arizona, regulators want to double the state’s renewable energy target to 30% by 2030. But some utilities worry that adding more solar power on top of California’s already-robust supply could be costly and wasteful, even with battery storage.

Tesla Inc. batteries, installed at office buildings in Los Angeles, are part of a virtual power plant providing electricity to the grid. PHOTO: ADVANCED MICROGRID SOLUTIONS

“Storage may help you within the day, but a battery isn’t designed to store energy from March until it’s needed in June,” said Jeff Guldner, senior vice president of public policy at Arizona Public Service Co. in Phoenix.


How Molten Salt Reactors Might Spell a Nuclear Energy Revolution

Liquid FLiBe salt. Credit: Wikimedia Commons

Since former NASA engineer Kirk Sorensen revived forgotten molten salt reactor (MSR) technology in the 2000s, interest in MSR technology has been growing quickly. Since 2011, four separate companies in North America have announced plans for MSRs: Flibe Energy (started by Sorensen himself), Transatomic Power (started by two recent MIT graduates), Terrestrial Energy (based in Canada, which recently partnered with the Department of Energy’s Oak Ridge National Laboratory), and Martingale, Inc., which recently made public the design for its ThorCon MSR.

In addition, there is now renewed interest in MSRs in Japan, Russia, France and China. China has announced that MSR development will be the focus of one of five innovation centers intended to “unite the country’s leading talents for research in advanced science and technology fields,” according to the Chinese Academy of Sciences.

Why this sudden interest in a nuclear technology that dates back to the 1950s? The answer lies in both the phenomenal safety of MSRs and their potential to help solve so many of today’s energy-related problems, from climate change to energy poverty to the intermittency of wind and solar power. In fact, MSRs can operate so safely, they may alleviate public fears about nuclear energy. Before looking at the potential of MSRs, though, it is useful to first take a high-level look at what they are and how they work.

What is a Molten Salt Reactor?

A molten salt reactor (MSR) is a type of nuclear reactor that uses liquid fuel instead of the solid fuel rods used in conventional nuclear reactors. Using liquid fuel provides many advantages in safety and simplicity of design.

Generation IV roadmap published by US Department of Energy

The figure above shows one type of MSR design. As shown towards the left, the reactor contains “fuel salt”, which is fuel (such as uranium-235) dissolved in a mixture of molten fluoride salts. After a fission chain reaction starts in the reactor, the rate of fission stabilizes once the fuel salt reaches around 700 degrees Celsius. If the reactor gets hotter than 700 degrees, the resulting expansion of the fuel salt pushes some of the fuel into the circulation loop; this, in turn, decreases the fission rate (since fission cannot be maintained in the loop), causing the fuel to cool.
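The loop described above is a classic negative-feedback system: any temperature excursion shrinks itself. A toy numerical sketch (the feedback coefficient is invented for illustration, not real reactor physics):

```python
# Toy negative-feedback model of the self-stabilizing behavior
# described above: salt hotter than the stable point expands,
# pushing fuel out of the core and reducing heat production.
# The gain value is invented for this sketch.

stable_temp_c = 700.0
temp_c = 760.0        # start with an excursion above the stable point
feedback_gain = 0.1   # invented expansion-feedback coefficient

for _ in range(100):
    # Each step, heat output drops in proportion to the excursion,
    # pulling the temperature back toward the stable point.
    temp_c -= feedback_gain * (temp_c - stable_temp_c)

print(round(temp_c, 1))  # settles back to 700.0
```

The same qualitative behavior holds for a starting temperature below 700 degrees: the salt contracts, more fuel stays in the core, and the temperature rises back to the stable point.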

Unlike conventional reactors, the rate of fission in an MSR is inherently stable. Nonetheless, should the fuel salt become too hot to operate safely, a freeze plug (made of salts kept solid by a cooling fan) below the reactor will melt and the liquid content of the reactor will flow down into emergency dump tanks where it cannot continue to fission and can cool safely.

The control rods at the top of the reactor provide further control of the rate of fission by absorbing neutrons that might otherwise cause a fission reaction. A test in the 1960s showed that an MSR can continue to run safely without operator intervention even after intentional removal of a control rod during full operation.

The fuel salt is circulated through a heat exchanger where it is cooled by another molten salt loop that is free of radioactive fuel and fission products. The heat from this second loop can be used to do work, such as heating water to turn a steam turbine to generate electricity.

The fuel salt is also circulated through a chemical processing plant. This plant is used to both remove undesired fission products and add more fuel to the reactor.

Why Molten Salt Reactors?

MSRs are a huge departure from the conventional reactors most people are familiar with. Key features include:

Unparalleled safety

MSRs are walk-away safe. They cannot melt down as conventional reactors can, because they are molten by design. An operator cannot even force an MSR to overheat. If for some reason an MSR were to overheat, the heat would melt a freeze plug at the bottom of the reactor vessel and the liquid fuel salts would drain into the emergency cooling tanks, where they would cool and solidify. Neither operator intervention nor emergency backup power is needed for this to happen.

Even a human-engineered breach (such as a terrorist attack) of an MSR cannot cause any significant release of radioactivity. The fuel salts in MSRs work at normal atmospheric pressure, so a breach of the reactor containment vessel would simply leak out the liquid fuel, which would then solidify as it cooled. (By comparison, a breach of a conventional reactor leads to the highly pressurized and radioactive water coolant spewing into the atmosphere and potentially leaking into surrounding bodies of water.) Additionally, radioactive byproducts of fission like iodine-131, cesium-134 and cesium-137 (such as those released into the atmosphere and ocean by the Fukushima meltdown) are physically bound to the hardened coolant and do not leave the reactor site.

A solution to nuclear waste and stockpiles of plutonium

Conventional reactors use solid ceramic fuel rods containing enriched uranium. The fission of uranium in the fuel releases gases, such as xenon, which cause the fuel rods to crack. This cracking, in turn, makes it necessary to remove and replace the fuel rods well before most of the actinides (elements that remain radioactive for thousands of years), such as uranium, have fissioned. This is why nuclear waste is radioactive for a very long time.

However, the actinides that remain in the cracked fuel rods are still an excellent source of fuel for reactors. France, for example, recycles the waste instead of burying it so that these actinides can be placed in new fuel rods and used to make more electricity.

Because MSRs use liquid fuel, released gases simply bubble up, typically to an off-gas unit in the coolant loop (not shown in the figure), where they can be removed. Since the liquid fuel is unaffected by the release of gases, the fuel can be left in the reactor until almost all the actinides are fissioned, leaving only elements that are radioactive for a relatively short time (300 years or less). The result is that MSRs have no long-term issue with regard to nuclear waste.

Not only do MSRs not have a long-term waste issue, they can be used to dispose of current stockpiles of nuclear waste by using those stockpiles as fuel. Even stockpiles of plutonium can be disposed of this way. In fact, conventional reactors typically use only 3 to 5% of the available energy in their fuel rods before the fuel rods must be replaced because of cracking. MSRs can use up most of the rest of the available fuel in these rods to make electricity.
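A quick calculation makes the scale of that leftover energy concrete:

```python
# If a conventional reactor fissions only 3-5% of a rod's available
# energy before cracking forces replacement, nearly all of the
# fuel's energy remains in the "spent" rod.
used_fraction = 0.05               # upper end of the 3-5% range above
remaining_fraction = 1 - used_fraction

print(round(remaining_fraction / used_fraction))  # ~19x the energy already used
```

In other words, even at the generous end of the range, a spent conventional fuel rod still holds roughly nineteen times as much fissionable energy as was extracted from it.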

Note: The reason that conventional reactors can’t use up all actinides in their fuel rods is a bit more complex than what is described above. The neutrons in conventional reactors only move fast enough to cause enriched uranium to fission. Fissioning most all of the actinides also requires much faster moving neutrons, which can be achieved in both MSRs and solid-fuel reactors, such as the GE Hitachi PRISM.

Abundant energy cheaper than energy from coal

How do we get all 7 billion people on the planet (perhaps 9 billion by 2050) to agree to drastically cut their CO2 emissions? The answer: make it in their immediate self-interest by providing cheap CO2-free energy, cheaper than the energy they can get by burning coal.

MSRs can be made cheaply because they are simple compared to conventional reactors, which have large pressurized containment domes and many engineered (rather than inherent) and redundant safety systems. Having far fewer parts than conventional reactors, MSRs are inherently cheaper. This simplicity also allows MSRs to be small, which in turn makes them ideal for factory-based mass production (unlike conventional reactors). The cost efficiencies associated with mass production further drive down the cost and can make the ramp-up of nuclear power much faster.

Load following solar and wind power

A significant limitation of solar and wind power is their intermittency and unreliability. Currently these issues are dealt with in the U.S. by quickly firing up natural gas plants to load follow solar and wind power. In other words, gas plants must ramp up quickly when power from wind and sun is scarce, and ramp down quickly when the sun is shining or the wind is blowing. Unfortunately, this is an inefficient way to burn natural gas, and can result in almost as much CO2 output from gas plants ramping up and down as when they simply run continuously. And, of course, continued use of natural gas requires continued fracking. (Although many hope that a grid-level energy storage technology will someday negate the need for natural gas plants, no economical energy storage is on the horizon.)

Unlike conventional nuclear reactors, MSRs are good candidates for CO2-free load following of solar and wind power. Slowing down the nuclear reactions in a reactor results in a buildup of xenon gas. When conventional reactors ramp down, they must wait several days to restart while the xenon gas decays. This extra xenon is not a problem for MSRs because of their off-gas system, which allows immediate removal of xenon; hence, no delay is needed after ramping an MSR up or down.
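The multi-day figure follows from the physics of xenon-135, whose half-life is about 9.1 hours. A rough check (this sketch ignores continued xenon production from iodine-135 decay, so it is only indicative):

```python
import math

# Estimate how long a conventional reactor must wait for xenon-135
# (half-life ~9.1 hours) to decay away before restarting. The 1%
# threshold is an assumption chosen for illustration.
half_life_h = 9.1
target_fraction = 0.01  # wait until ~1% of the xenon remains

wait_h = half_life_h * math.log2(1 / target_fraction)
print(round(wait_h / 24, 1))  # ~2.5 days
```

Even this simplified decay-only model lands in the "several days" range the text describes; accounting for ongoing production from iodine-135 stretches the delay further.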

Note that conventional reactors can be designed to load follow, but typically haven’t been for economic reasons (because more profit can be made by running conventional reactors at full power for base load applications).

Abundant energy for millions of years

Although it is sometimes claimed that nuclear power is not sustainable, the truth is that there is enough nuclear fuel on earth to provide humanity with abundant energy for millions of years. MSRs can run on uranium and existing stockpiles of plutonium and nuclear waste. A variant of an MSR, a liquid fluoride thorium reactor (LFTR), will be able to use abundant thorium as a fuel. In addition, breeder reactors (which include some types of MSRs) make it possible to use uranium-238 as fuel, which makes up 99.3% of all natural uranium. Conventional reactors use only uranium-235, which makes up a mere 0.7% of natural uranium.
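Natural uranium is roughly 99.3% uranium-238 and 0.7% uranium-235, so opening up U-238 as a fuel multiplies the usable resource enormously:

```python
# Ratio of U-238 (usable by breeder reactors, including some MSRs)
# to U-235 (the only isotope conventional reactors rely on).
u238_fraction = 0.993
u235_fraction = 0.007

print(round(u238_fraction / u235_fraction))  # ~142x more fuel available
```

That factor of roughly 142, on top of the far more complete burn-up MSRs achieve, is what underlies the "millions of years" claim.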

Replaces fossil fuels where wind and solar are problematic

MSR technology has potential far beyond generating electricity cheaply and without emitting CO2. For example, MSRs could be used to replace fossil fuels in high-heat industrial processes such as water desalination and the production of cement and aluminum. (In the U.S., industrial processes account for a little over 5% of greenhouse gases.) MSRs can even provide high heat for cheap production of feedstock for synthetic, CO2-free liquid fuels.

MSRs could also be used to power large container ships, which currently run on diesel. The 15 largest of these ships are reported to emit as much sulfur pollution as all of the cars on the planet.

Weapons Proliferation Concerns

No nuclear reactor can be made proliferation-proof, but MSRs have some significant advantages in proliferation resistance. First, the waste from MSRs is not useful for nuclear weapons, since MSRs fission almost all actinides. Second, MSRs can use up existing stockpiles of nuclear waste from conventional reactors, as well as existing stockpiles of plutonium, making these materials unavailable for use in nuclear weapons.

A Very Brief History of MSR Technology

A welder finishing up the Oak Ridge MSR over 40 years ago. Image: Public Domain

MSRs were first developed in the U.S. in the 1950s for use in a nuclear-powered bomber (the idea being that the bomber could remain in the air indefinitely). Though a small experimental reactor ran successfully, the program was canceled when it became clear that in-air refueling of bombers was viable.

Under the supervision of Alvin Weinberg in the 1960s, Oak Ridge National Laboratory built an experimental MSR that ran successfully for four years. Weinberg realized early on that MSRs were the ideal type of reactor for civilian use because they cannot melt down. He was eventually fired by the Nixon administration for this advocacy.

The NB-36 made a number of flights in the 1950s carrying an operating nuclear reactor. The crew worked from a lead-shielded cockpit. Image: ASME

In the 2000s, then-NASA engineer Kirk Sorensen, who was tasked with working out how to power a station on the moon, found that MSRs were the best solution. He also realized that MSRs are a great solution on earth. His tireless advocacy for MSRs has generated much interest.


The Intergovernmental Panel on Climate Change, the International Energy Agency, the United Nations, the Obama Administration and even over 70% of climate scientists agree that we must ramp up nuclear power if we are going to succeed in dealing with climate change. Because of its exceptional safety and low cost, perhaps MSR technology is a nuclear technology that almost everyone can embrace.

Corrections, additions and clarifications (January 25, 2015):

“cesium-137 and iodine-141” changed to: “iodine-131, cesium-134 and cesium-137”.
Added the following to section on load following: “Note that conventional reactors can be designed to load follow, but typically haven’t been for economic reasons (because more profit can be made by running conventional reactors at full power for base load applications).”
Added the following to section on using nuclear waste and plutonium as fuel: “Note: The reason that conventional reactors can’t use up all actinides in their fuel rods is a bit more complex than what is described above. The neutrons in conventional reactors only move fast enough to cause enriched uranium to fission. Fissioning most all of the actinides also requires much faster moving neutrons, which can be achieved in both MSRs and solid-fuel reactors, such as the GE Hitachi PRISM.“
Changed sentence on breeder reactors to: “In addition, breeder reactors (which include some types of MSRs) make it possible to use uranium-238 as fuel, which makes up 99.3% of all natural uranium. Conventional reactors use only uranium-235, which makes up a mere 0.7% of natural uranium.”
Added the following sentence to “Replaces fossil fuels where wind and solar are problematic”: “MSRs can even provide high heat for cheap production of feedstock for synthetic, CO2-free liquid fuels.”


Coal Industry Casts Itself as a Clean Energy Player

Carbon capture equipment at NRG’s power generating station southwest of Houston. Credit Michael Stravato for The New York Times

President Trump has dismissed the science behind climate change, calling it “a hoax,” in positioning himself as a champion of coal. The three largest American coal producers are taking a different tack.

Seeking to shore up their struggling industry, the coal producers are voicing greater concern about greenhouse gas emissions. Their goal is to frame a new image for coal as a contributor, not an obstacle, to a clean-energy future — an image intended to foster their legislative agenda.

Executives of the three companies — Cloud Peak Energy, Peabody Energy and Arch Coal — are going so far as to make common cause with some of their harshest critics, including the Natural Resources Defense Council and the Clean Air Task Force. Together, they are lobbying for a tax bill to expand government subsidies to reduce the environmental impact of coal burning.

The technology they are promoting is carbon capture and sequestration — an expensive and, up to now, unwieldy method of trapping carbon dioxide emitted from coal-fired power plants before the gas can blanket the atmosphere and warm the planet.

“We can’t turn back time,” said Richard Reavey, vice president for government and public affairs at Cloud Peak Energy. “We have to accept that there are reasonable concerns about carbon dioxide and climate, and something has to be done about it. It’s a political reality, it’s a social reality, and it has to be dealt with.”

Workers at NRG’s power generating station near Houston. Credit Michael Stravato for The New York Times

The coal executives say the steady gains of renewable energy — along with robust environmental regulations in recent years, many of which they still oppose — are not sufficient to stabilize the climate and still meet energy needs in the years to come. They reason that coal and other fossil fuels will still dominate the fuel mix for the next several decades, and that only capturing carbon from coal-fired and gas-fired power plants can meaningfully shift the world to a low-carbon future. Their argument is backed, at least in part, by many world energy experts and environmentalists.

A similar, at least partial, metamorphosis has taken place in the oil and gas and utility industries in recent years, with mixed results: there has been progress in expanding the deployment of renewables like wind and solar for power, and in capturing methane in oil fields to stem the release of a powerful greenhouse gas. The coal executives argue that, given the same incentives and subsidies as renewables, carbon capture and sequestration can also take off.

Support among coal executives for capturing carbon at power plants is not entirely new, but their increasingly vocal acknowledgment of climate science in support of the technology is a far cry from many of the views expressed in recent years.

“We need a low-carbon fossil solution,” said Deck S. Slone, senior vice president for strategy and public policy at Arch Coal. “The political landscape is always shifting and carbon concerns are certainly not going away. We think there is a solution out there in the form of technology that is an answer to the climate challenge and that quite frankly will be good for our business long term.”

Coal executives remain strongly opposed to the Obama administration’s blueprint for reducing dependence on coal for power, known as the Clean Power Plan, which is being contested in the courts. But they say that any rollback of Obama regulatory policies by the new administration may not be enough to keep utilities from switching from coal to low-cost natural gas and renewables, and that only assurances of government support for carbon capture and sequestration can give utilities certainty that coal has a long-term future and encourage them to retrofit old power plants to be cleaner burning.

Trucks lined up to load coal at Cloud Peak Energy’s mine in Rawlins, Wyo. Cloud Peak is one of three coal producers lobbying for a tax bill to expand government subsidies to reduce the environmental impact of coal burning. Credit Jim Wilson/The New York Times

Last year, total United States coal production was 18 percent lower than in 2015 and was the lowest level since 1978. Many companies were forced into bankruptcy. With gas prices rising in recent months, coal made a modest rebound at the end of last year, especially in the Powder River Basin of Montana and Wyoming, where the production economics are generally best.

Vic Svec, a Peabody senior vice president, said that his company was looking to make “a fresh start” as it comes out of bankruptcy, and that part of that fresh start was recognizing that fossil fuels “contribute to greenhouse gas emissions and concern regarding these emissions has become part of the global, societal regulatory landscape.” He added, “There is a market for low-carbon energy sources, and we want to be part of that future.”

Environmentalists say they believe that the coal industry, having dealt with a sharp downturn in recent years and facing an aggressive investor divestment movement, may be shifting its views on climate change more for its own business interests than any newfound love for the environment.

“To the extent that they are saying things that seem much more rational than in the past,” said David Hawkins, director of the climate program at the Natural Resources Defense Council, “they are trying to persuade skeptical investors that coal has a future.” Nevertheless, he added that his group was willing to work with the companies, even while it was suing them in court on other issues, “if they are willing to join in properly crafted legislation.”

The carbon legislation, introduced last year, would increase the federal tax credit for capture and sequestration to $50 per ton of carbon dioxide from $20. And it would expand available credits by more than a third for permanently storing carbon dioxide by flooding it into declining oil fields to coax out more production. The method, already popular in West Texas and supported by the oil and gas industry, gives utilities that deploy the technology an added revenue stream.

When introduced, the measure had broad support from senators as varied as Sheldon Whitehouse, a Rhode Island liberal who is active on climate issues, and Mitch McConnell, the Republican leader from Kentucky, who is one of the strongest backers of the coal industry in Congress. Proponents are preparing to reintroduce the legislation, and coal executives say they hope the Trump administration will get on board.

Senator Heidi Heitkamp, Democrat of North Dakota, who is a leading sponsor of the legislation and a former director of a coal-gasification company, said she had seen a shift in the stance of coal executives. “I see people at the table who weren’t at the table before,” she said. “As long as they see that the issue of CO2 is not going to go away, they are going to roll up their sleeves and try to find a way that works for the utility industry and the coal industry.”

One obstacle to the bill could be cost. Supporters have asked the Joint Committee on Taxation to evaluate the effect of the legislation on the federal budget but have not heard back yet.

Opponents say it would merely extend the life of the coal industry.

“For 40 years, I’ve been told clean coal is right around the corner, just give us another few subsidies,” said Dan Becker, director of the Safe Climate Campaign, an environmental group. “Carbon capture and sequestration may work someday in the distant future, but right now it barely works on a technical level. It’s way far away from working on a cost-effectiveness level.”

There are only a handful of commercial-scale operations for carbon capture and sequestration globally. But coal executives say proper permitting and legal protections, along with the tax credits, could bring a surge in construction in the United States within a decade. And as the technology improves and implementation becomes less expensive, the United States could export the technology and make coal-fired power cleaner around the world.

But developing commercial-scale carbon capture has been bedeviled by cost overruns and long delays. The operations not only are expensive to build but also require a lot of power, making plants less efficient. The federal government canceled one such project, called FutureGen, after it was granted more than $1 billion by the Obama administration.

Still, coal executives are staking much of their futures on the technology.

“We’re confident,” Mr. Svec of Peabody said, “that it needs to be a part of any serious effort toward reducing greenhouse gases from industrial sources.”

Correction: February 27, 2017
An earlier version of this article misstated the amount of a proposed increase in a federal tax credit for capture and sequestration of carbon dioxide, as well as the current amount. Legislation would raise the credit to $50 per ton (not $20), from the current $20 (not $10). The article also misstated a former role of Senator Heidi Heitkamp at a coal-gasification company. She was a director, not an executive.


The Murky Future of Nuclear Power in the United States

A view into Unit 4 at the Alvin W. Vogtle generating station in Georgia. The complex plans to use AP1000 reactors from Westinghouse. Credit via Georgia Power

This was supposed to be America’s nuclear century.

The Three Mile Island meltdown was two generations ago. Since then, engineers had developed innovative designs to avoid the kinds of failures that devastated Fukushima in Japan. The United States government was earmarking billions of dollars for a new atomic age, in part to help tame a warming global climate.

But a remarkable confluence of events is bringing that to an end, capped in recent days by Toshiba’s decision to take a $6 billion loss and pull Westinghouse, its American nuclear power subsidiary, out of the construction business.

The reasons are wide-ranging. Against expectations, demand for electricity has slowed. Natural-gas prices have tumbled, eroding nuclear power’s economic rationale. Alternative-energy sources like wind and solar power have come into their own.

And, perhaps most significantly, attempts to square two often-conflicting forces — the desire for greater safety, and the need to contain costs — while bringing to life complex new designs have blocked or delayed nearly all of the projects planned in the United States.

“You can make it go fast, and you can make it be cheap — but not if you adhere to the standard of care that we do,” said Mark Cooper of the Institute for Energy and the Environment at Vermont Law School, referring to the United States regulatory body, which is considered one of the most meticulous in the world. “Nuclear safety always undermines nuclear economics. Inherently, it’s a technology whose time never comes.”

In the process, the United States could lose considerable influence over standards governing safety and waste management, nuclear experts say. And the world may show less willingness to move toward potentially safer designs.

“I’m concerned that if the U.S. is not seen as a big player, and doesn’t have that kind of market presence, that we won’t be in a competitive position to bring those standards back up,” said Richard Nephew, a senior research scholar at the Center on Global Energy Policy at Columbia. “If you’ve got more lax safety standards worldwide, I think that’s a problem from an industry perspective as well as just a human standard.”

This may be an advantage for state-owned nuclear industries worldwide. Often they benefit from long-term national policies in places like Eastern Europe, Asia and the Middle East.

By contrast, the Toshiba-Westinghouse withdrawal from nuclear construction shows how daunting it can be for the private sector to build these plants, even with generous government subsidies like loan guarantees and tax credits. Projects take decades to complete. Safety concerns change along the way, leading to new regulations, thousands of design alterations, delays and spiraling costs for every element.

In one case, even the dirt used to backfill excavated holes at the Westinghouse project in Georgia became a point of contention when it did not measure up to Nuclear Regulatory Commission standards, leading to increased costs and a lawsuit.

Thus far in the United States, only the Tennessee Valley Authority, itself a government corporation, has been able to bring a new nuclear reactor into operation in the last 20 years.

Of the dozens of new reactors once up for licensing with the Nuclear Regulatory Commission, only four are actively under construction. Two are at the Alvin W. Vogtle generating station in Georgia, and two at the Virgil C. Summer plant in South Carolina. Both projects, which plan to use a novel reactor from Westinghouse, have been plagued by delays and cost overruns, some stemming, paradoxically, from an untested regulatory system intended to simplify and accelerate their development.

The projects, more than three years late and billions over budget, are what pushed Westinghouse — one of the last private companies building nuclear reactors — and its parent, Toshiba, to the brink of financial ruin, resulting in Toshiba’s chairman stepping down.

The company has said that Westinghouse will complete the reactors for the projects it already has underway, including two in China. But the fate of other projects in the United States and abroad that plan to use the Westinghouse reactor, known as the AP1000, is in doubt, along with the role of the United States in the future of nuclear energy. It is also unclear how President Trump will approach nuclear energy development, which has broad and overlapping implications for tax and trade policies, economic development and national security.

The AP1000 is considered one of the world’s most advanced reactors, with simplified structures and safety equipment intended to make it easier and less expensive to install, operate and maintain. It has been designed with an improved ability to withstand earthquakes and plane crashes, and is less vulnerable to a cutoff of electricity, which is what set off the triple meltdown at Fukushima.

The industry has lurched through boom and bust cycles before.

Nuclear construction had all but disappeared in the United States, particularly after the partial meltdown at Three Mile Island in Pennsylvania in 1979. Concerns over climate change led to renewed interest in building new plants under the administration of George W. Bush, however. The Bush-era energy policy acts authorized $18.5 billion in loan guarantees, plus tax credits like those available for wind and solar.

The Alvin W. Vogtle generating station in Georgia, one of only four of the dozens of new reactors once up for licensing with the Nuclear Regulatory Commission still under construction. The Vogtle project has been marred by delays and cost overruns. Credit via Georgia Power

Determined to avoid the delays and ballooning costs that were common as plants were built in the 1970s and ’80s, federal regulators had devised a new licensing process.

Under the old system, companies received construction permits based on incomplete plans and then applied for an operating license, often leading to rebuilding and lengthy delays. The idea for the new system was that companies would submit much more complete design plans for approval, and then receive their operating licenses as construction started. That way, as long as they built exactly what they said they would, the process could move more quickly.

In the meantime, companies like Westinghouse and General Electric were developing a new generation of reactors intended to operate more safely. With the AP1000, for instance, emergency cooling for the reactor mainly relies on natural forces, like gravity, to propel the coolant, rather than relying on mechanical pumps powered by electricity. The problem is that electricity can fail, as it did at Fukushima, which can lead to disastrous overheating in a damaged reactor of an older design.

In addition, Westinghouse was engineering its equipment so that large components of the plants could be made in sections at factories, then welded together and lifted into place with cranes at the construction site. In theory, this approach would save money and time, requiring far less skilled labor than the old, bespoke approach, in which workers assembled more parts onsite.

By 2008, Westinghouse had deals to expand two existing plants with the electric utilities Georgia Power and South Carolina Electric & Gas. Little went as hoped.

Because nuclear construction had been dormant for so long, American companies lacked the equipment and expertise needed to make some of the biggest components, like the 300-ton reactor vessels. Instead, they were manufactured overseas, adding to expense and delays.

One reactor vessel, headed for Georgia Power’s Vogtle plant from the Port of Savannah, almost slipped off a specialized rail car. That led to a weekslong delay before a second attempt was made to deliver it.

And, in a separate snafu, contractors working on the plant’s basement installed 1,200 tons of steel reinforcing bar in a way that differed from the approved design. That triggered a seven-and-a-half-month delay to get a license amendment.

To some extent, the unexpected delays were to be, well, expected, given the novelty of the design and the fact that builders were decades out of practice. Any large undertaking involving so many first-of-a-kind complexities would be likely to get tripped up somewhere, said Daniel S. Lipman, vice president of supplier and international programs at the Nuclear Energy Institute, which represents the industry.

“Whether you’re building a nuclear power plant or providing a new locomotive or a new fighter jet complex for the Defense Department, the first of a kind almost always takes longer to be deployed,” he said.

And then there was Fukushima, when an earthquake and tsunami knocked out both grid and backup emergency power at the plant, disabling its cooling systems and leading to the meltdown of three reactors. The plant remains shut down, and the decommissioning and cleanup process is projected to take as long as 40 years.

The Japan disaster prompted regulators to revisit safety standards, slowing approval of the Westinghouse designs and resulting in new requirements even after the Nuclear Regulatory Commission gave the go-ahead for the Georgia and South Carolina projects. That led to more costly delays as manufacturing orders had to be changed.

As all of that unfolded, Westinghouse was having troubles with the contractor it chose to complete the projects, a company that struggled to meet the strict demands of nuclear construction and was undergoing its own internal difficulties after a merger. As part of an effort to get the delays and escalating costs under control, Westinghouse acquired part of the construction company, which set off a series of still-unresolved disputes over who should absorb the cost overruns and how Westinghouse accounted for and reported values in the transaction.

Toshiba, which would like to sell all or part of its controlling interest in Westinghouse, has said it will continue to look into Westinghouse’s handling of the purchase.

“Certainly they underestimated the amount of liability or cost overruns that these projects were in,” Robert Norfleet, a managing director at Alembic Global Advisors who has followed the machinations, said of Westinghouse. “I don’t really know how they can’t take the blame for that. That’s something within their own due diligence that they needed to do.”

In the meantime, the main stage for nuclear development will move overseas to places like China, Russia, India, Korea and a handful of countries in the Middle East, where Westinghouse will have to find partners to build its designs.

In China, plants using an earlier model of the AP1000 are moving toward completion. If they are successful, that may stir up more interest in the technology, and future installations may go more smoothly. But Toshiba’s ambitions of installing 45 new reactors worldwide by 2030 no longer look feasible.

Indeed, despite the much-ballyhooed ingenuity of a new generation of reactors designed by the likes of Westinghouse and G.E., countries may stick with older technologies that they can produce and install more quickly and cheaply. “Until several of these new designs — including the AP1000 from Westinghouse — come online on time and on budget,” said Brent Wanner, an analyst at the International Energy Agency, “it will be an uphill battle.”


Storms Bring Relief to Drought-Stricken California, but Santa Barbara Misses Out: Central Coast community’s plight underscores geographic and logistic vagaries of state’s water distribution

SANTA BARBARA, Calif.—Tom Fayram admits to a fair amount of water envy.

“When you see flooding rivers in Northern California, you wish you could see some of that here,” said Mr. Fayram, deputy director of Santa Barbara County’s water resources division.

Yet Santa Barbara, about 100 miles north of Los Angeles along the state’s Central Coast, has been left high and dry compared with almost every other part of California this wet winter, showing the geographic and logistic vagaries of water distribution in this drought-plagued state.

California Drought Comparison: Intensity

About half the state has emerged from a six-year drought that prompted severe water restrictions on residents and businesses. Even Southern California, still stuck in drought conditions, has seen improvements as a parade of storms soaks the region.

But the rainfall has largely danced around Santa Barbara. Lake Cachuma, which serves as the largest supply of water for the county’s 450,000 residents, stood at just 14.4% of capacity as of Tuesday. By comparison, Northern California’s Shasta Lake stood at 82% of capacity and Southern California’s Castaic Lake at 86%.

Santa Barbara remains the only county in California listed as still mostly under extreme drought as of the end of January, according to the most recent estimates by the University of Nebraska’s National Drought Mitigation Center.

Just 2% of California was listed in that category—including parts of neighboring Kern and San Luis Obispo counties—compared with 64% a year earlier.

That has forced Santa Barbara to continue harsh water restrictions while the rest of the state is easing them.

Surfers enjoyed the large waves at the entrance to Santa Barbara, Calif., harbor on Jan. 21, 2017. A winter storm brought much higher than usual waves to the area, but has failed to alleviate the drought that is still hurting the community there. PHOTO: MIKE ELIASON/SANTA BARBARA COUNTY FIRE DEPARTMENT/ASSOCIATED PRESS

On Jan. 1, the city of Santa Barbara put into effect a new ordinance that bans lawn watering, with limited exceptions, with the goal of cutting water use by 40%, up from a 35% target last year.

The water situation for the rest of California, meanwhile, is largely much improved this winter. Electronic readings Tuesday by the state Department of Water Resources measured the snow pack in the Sierra Nevada mountains at 182% of normal.

That snow pack accounts for as much as a third of California’s water supply, meaning cities and farms that are part of a distribution network tied to it stand to receive most of their normal deliveries this year. Santa Barbara receives about a third of its water this way.

Water agencies in San Diego and Orange counties declared an end to local drought conditions in the past few weeks, and many are calling on the State Water Resources Control Board to suspend emergency regulations that were adopted in 2014. Agency board members, while acknowledging that drought conditions have greatly eased, voted Wednesday to extend the regulations through at least May, when California’s rainy season typically draws to a close, before reassessing the situation.

Kira Redmond, executive director of Santa Barbara Channelkeeper, a local environmental group, is seen at Lake Cachuma in Santa Barbara. PHOTO: JAKE NICOL/THE WALL STREET JOURNAL

The U.S. Southeast experienced a severe drought last summer and fall that lowered lakes and rivers and hurt cattle ranchers and farmers. The extended drought sparked wildfires through southern Appalachia, including deadly blazes in and around Gatlinburg, Tenn.

Winter rains have erased the drought’s impact in much of the South, but extreme drought remains in some parts of Alabama and Georgia, according to the U.S. Drought Monitor. Bill Murphey, Georgia’s state climatologist, said Lake Lanier, which supplies water to Atlanta, is still below normal levels, as is the area around the headwaters of the Chattahoochee River. He hoped the region’s ground would be recharged with more rain in coming months before the drier, warmer spring and summer take hold, he said.

Santa Barbara’s water troubles, in part, can be traced to its location on the rugged California coast, where communities often have to depend heavily on rain and other local water sources because of the difficulty piping the Sierra Nevada water there.

The 17.6 inches of rainfall measured near the reservoir is only slightly ahead of normal; another 30 inches would need to fall to refill it, said Mr. Fayram of the county water resources division.

“We have a long way to go,” he said.

Santa Barbara started at a deep water deficit: a mere 63 inches of rain has fallen at a measuring spot in the local mountains between 2011 and 2016—far below the 93 inches that fell there during the previous worst drought, from 1986 to 1991, said Joshua Haggmark, the city’s water resources manager.

A DC-10 tanker dropped fire retardant at a low altitude to help combat a wildfire near Santa Barbara, on June 17, 2016. PHOTO: NICK UT/ASSOCIATED PRESS

The city of Santa Barbara spent the last two years and millions of dollars banking unused state water in the San Luis Reservoir hundreds of miles away for an emergency. However, that reservoir has filled up so fast from recent rains that much of the banked supply will be lost by mid-February to spillage, Mr. Haggmark said. Under state law, banked supplies aren’t protected from spillage.

The city is completing the reactivation of a desalination plant it froze after another drought ended in the early 1990s. But that project has encountered cost overruns and is fiercely opposed by some local environmentalists, who call desalination too expensive and harmful to marine waters. Mr. Haggmark said the city turned to desalination as a last resort.

“It’s been like a perfect storm for us,” Mr. Haggmark said. “Things are really pretty dismal right now.”


Environmentalists say semiarid places such as Santa Barbara need to conserve and recycle more.

Santa Barbara Channelkeeper, a local environmental group, recently obtained a grant to deploy used wine barrels for rain capture. Kira Redmond, executive director of the group, said other techniques such as converting sewage water to potable use must be deployed.

“I think that’s the wave of the future,” Ms. Redmond said.

Standing on a boat ramp overlooking the nearly empty Cachuma one day last week, Lauri Rawlins-Betta took in the arid landscape.

“This is so sad,” said Ms. Rawlins-Betta, 65 years old, who grew up near here and was visiting from another part of the state.

The county’s population has more than quadrupled since the lake was built 60 years ago.

“To me,” Ms. Rawlins-Betta said, “it’s overgrowth, big time.”

—Cameron McWhirter in Atlanta contributed to this article.


Cutting jobs, street repairs, library books to keep up with pension costs: Generous retirement benefits for public safety employees could help push the Bay Area city of Richmond into bankruptcy

Richmond, a working-class city of 110,000 on the east shore of San Francisco Bay, has been struggling with the cost of employee retirement benefits. Pension-related expenses have risen from $25 million to $44 million annually in the last five years and could reach $70 million by 2021. (Robert Durell / CALmatters)

When the state auditor gauged the fiscal health of California cities in 2015, this port community on the eastern shore of San Francisco Bay made a short list of six distressed municipalities at risk of bankruptcy.

Richmond has cut about 200 jobs — roughly 20% of its workforce — since 2008. Its credit rating is at junk status. And in November, voters rejected a tax increase that city leaders had hoped would help close a chronic budget deficit.

“I don’t think there’s any chance we can avoid it,” said former City Councilman Vinay Pimple, referring to bankruptcy.

A major cause of Richmond’s problems: relentless growth in pension costs.

Payments for employee pensions, pension-related debt and retiree healthcare have climbed from $25 million to $44 million in the last five years, outpacing all other expenses.

By 2021, retirement expenses could exceed $70 million — 41% of the city’s general fund.

Richmond is a stark example of how pension costs are causing fiscal stress in cities across California. Four municipalities — Vallejo, Stockton, San Bernardino and Mammoth Lakes — have filed for bankruptcy protection since 2008. Others are on the brink.

“The truth is that there are cities all over the state that just aren’t owning up to all their problems,” said San Bernardino City Manager Mark Scott.

Increasingly, pension costs consume 15% or more of big city budgets, crowding out basic services and leaving local governments more vulnerable than ever to the next economic downturn.

Richmond is a racially diverse, working-class city of 110,000 whose largest employer is a massive Chevron oil refinery. Like many California municipalities, Richmond dug a financial hole for itself by granting generous retirement benefits to police and firefighters on the assumption that pension fund investments would grow fast enough to cover the cost.

That optimism proved unfounded, and now the bill is coming due.

City Manager Bill Lindsay insists that Richmond can avoid going off a cliff. Last year, financial consultants mapped a path to stability for the city by 2021 — but at a considerable cost in public services.

The city cut 11 positions, reduced after-school and senior classes, eliminated neighborhood clean-ups to tackle illegal trash dumping, and trimmed spending on new library books — saving $12 million total.

City officials also negotiated a four-year contract with firefighters that freezes salaries and requires firefighters to pay $4,800 a year each toward retirement healthcare. Until then, the benefit was fully funded by taxpayers.

“I’ve seen some of my good friends go through it in Vallejo and Stockton, and what we found out during those [bankruptcies] is that your union contracts aren’t necessarily guaranteed,” said Jim Russey, president of Richmond Firefighters Local 188.

Richmond’s consultants said the city had to find $15 million more in new revenue or budget cuts by 2021. Lindsay said the city has been looking hard for additional savings, and the police union recently agreed to have its members contribute toward retirement healthcare.

Tough sledding (July 26, 2016): Financial consultants with the National Resource Network spelled out the daunting challenges Richmond faces in righting its finances.

“If you look at the five-year forecast, with reasonable assumptions, even with the growth in pension cost, it does start to generate a surplus,” Lindsay said.

Joe Nation, a former Democratic state legislator who teaches public policy at Stanford’s Institute for Economic Policy Research, is not so sanguine. He reviewed Richmond’s retirement cost projections and said they leave little room to maneuver.

Over the next five years, every dollar the city collects in new revenue will go toward retirement costs, leaving little hope of restoring city services, Nation said.

“If there is an economic downturn of any kind, I can imagine that they could be pushed to the brink of bankruptcy, if not bankruptcy,” Nation said.

Last month, the California Public Employees’ Retirement System (CalPERS), the state’s main pension fund, lowered its projected rate of return on investments from 7.5% to 7% per year. That means Richmond and other communities will have to pay more each year to fund current and future pension benefits.

Lower returns, higher cost (Dec. 21, 2016): CalPERS told local governments it was lowering its projected rate of return on investments, meaning taxpayers will have to pay more to fund retirement benefits.

The change is expected to increase local government pension payments by at least 20% starting in 2018, according to CalPERS spokeswoman Amy Morgan.

An analysis by the nonprofit news organization CALmatters indicates that Richmond’s retirement-related expenses could grow to more than $70 million per year by 2021. That represents 41% of a projected $174-million general fund budget.

Lindsay said the city’s estimates of future pension costs are lower because of different assumptions about salary increases and other costs.

The city of Richmond’s pension-related budget problems have taken a toll on public services, including street repair. (Robert Durell / CALmatters)

Voters approved a sales tax increase in 2014 to help stabilize the city’s finances. But in November, voters rejected an increase in the property transfer tax that was expected to bring in an additional $4 million to $6 million annually.

Lindsay said the city was never counting on the property transfer tax in its five-year plan. If the city needed more cash, he said, Richmond has properties it can sell.

“Budget management is much more difficult in Richmond than in Beverly Hills, but you still manage it,” Lindsay said. “To say it’s spiraling out of control into bankruptcy does incredible damage to our community and it’s just not accurate.”

Richmond is especially hard hit by personnel costs because of high salaries for public employees. The city’s average salary of $92,000 for its 938 employees was fifth highest in California as of 2015, according to the state controller. The city’s median household income is $54,857.

Police officers and firefighters in Richmond make more than $137,000 per year on average, compared with an average of $128,000 for their counterparts in Berkeley, where housing prices are more than 60% higher than in Richmond.

Public safety salaries averaged $115,000 in Oakland and $112,000 in Vallejo.

Mayor Tom Butt says of Richmond’s pension-related financial problems: “It’s a huge mess … One of these days, it’s just going to come crashing down.” (Robert Durell / CALmatters)

Richmond Mayor Tom Butt, an architect and general contractor who has served on the city council for two decades, says the city that was once among the state’s most dangerous has little choice but to pay higher salaries to compete for employees with nearby communities that are safer and more affluent.

“You can’t convince anyone here that they deserve less than anybody in any other city,” Butt said.

Lindsay said the decision to offer higher salaries for public safety employees was strategic.

“The city council made a conscious decision to put a lot into public safety, in particular reducing violent crime. And largely, we’ve been successful,” Lindsay said.

Violent crime has declined in the city over the past decade, with homicides dropping to a low of 11 in 2014. But Richmond is experiencing an uptick, recording 24 homicides in 2016, according to the police department.

Part of the challenge with public safety costs dates to 1999, when Richmond, like many local governments, matched the state’s decision to sweeten retirement benefits for California Highway Patrol officers.

CHP officers could retire as early as 50 with 3% of salary for each year of service, providing up to 90% of their peak salaries in retirement. Other police departments soon demanded and got similar treatment.

Richmond firefighters are eligible to retire at age 55 with 3% of salary for each year of service. Recent hires will have to work longer to qualify for a less generous formula under legislation passed in 2013.

Richmond’s actuarial pension report shows there are nearly two retirees for every police officer or firefighter currently on the job.

To cope with severe budgetary pressures, the city of Richmond put this Fire Department training facility up for sale. (Robert Durell / CALmatters)

In a way, Richmond is a preview of what California cities face in the years ahead. According to CalPERS, there were two active workers for every retiree in its system in 2001. Today, there are 1.3 workers for each retiree. In the next 10 or 20 years, there will be as few as 0.6 workers for each retiree collecting a pension.

Because benefits have already been promised to today’s workers and retirees, cuts in pension benefits for new employees do little to ease the immediate burden. It “means decades before the full burden of this will be completely dealt with,” said Phil Batchelor, former Contra Costa County administrator and former interim city manager for Richmond.

Today, Richmond’s taxpayers are spending more to make up for underperforming pension investments. CalPERS projects that the city’s payments for unfunded pension liabilities will more than double in the next five years, from $11.2 million to $26.8 million.

Now, the lower assumed rate of investment return is projected to add nearly $9 million to Richmond’s costs by 2021.

“It’s a huge mess,” said Mayor Butt. “I don’t know how it’s going to get resolved. One of these days, it’s just going to come crashing down.”

Judy Lin is a reporter at CALmatters, a nonprofit journalism venture in Sacramento covering state policy and politics.


Toshiba to Exit Nuclear Construction Business: Facing billions of dollars in losses after ill-fated bet, Westinghouse unit will limit future nuclear business to selling reactor designs

The Plant Vogtle nuclear power plant in Waynesboro, Ga., is one of the two U.S. facilities where Westinghouse is in the process of building additional reactors. PHOTO: JOHN BAZEMORE/ASSOCIATED PRESS

Toshiba Corp. plans to stop building nuclear power plants after incurring billions of dollars in losses trying to complete long-delayed projects in the U.S., a move that could have widespread ramifications for the future of the nuclear-power industry.

The Japanese industrial conglomerate is set to announce plans to exit nuclear construction by the middle of February, according to a Toshiba executive familiar with the matter. The executive also said Toshiba’s chairman, Shigenori Shiga, and Danny Roderick, a Toshiba executive and the former head of its Pittsburgh-based nuclear power unit, Westinghouse Electric Co., are expected to step down.

Toshiba’s decision deals a fatal blow to its ambitions to become a major player in the nuclear construction business. The company has bet aggressively on Westinghouse’s AP1000 reactor design, which it hoped would anchor a new generation of nuclear power plants that were supposed to be easier to build and to deliver on time. But signs emerged that the AP1000 wasn’t as easy to build as hoped, and yet Toshiba remained confident and took on added financial risk, according to legal filings and interviews with people involved with the construction process.

Toshiba declined to comment. The company previously said it would disclose the size of Westinghouse’s losses on Feb. 14. In December, it said it was likely to take a write-down of several billion dollars, and people familiar with the situation say the losses could approach $6 billion—plunging the company into a new crisis just as it was seeking to move away from an earlier accounting scandal.

Westinghouse will continue to design nuclear reactors, the Toshiba executive said, and is expected to complete construction work at two U.S. nuclear facilities it is still in the process of building—in Georgia and South Carolina, commissioned by utilities Southern Co. and Scana Corp., respectively.

Toshiba’s future involvement with nuclear plants will be limited to selling its designs; it will let other companies handle the risk of building the facilities, an approach it already takes in China.

“We are closely monitoring [Westinghouse’s] financial status, as well as that of Toshiba,” a Scana spokeswoman said.

Southern officials said they are confident shareholders and customers are protected through a $920 million letter of credit from Westinghouse and a fixed-price contract which transfers responsibility for cost overruns to Westinghouse.

In October 2015, as Toshiba faced a very public accounting scandal centered on its computer business, it was quietly dealing with another crisis in nuclear power-plant construction—and made a series of bold moves in an attempt to fix it.

The company bought out a partner in a nuclear-construction consortium, settled lawsuits and renegotiated contracts with Southern and Scana, which put Toshiba overwhelmingly on the hook if the two construction projects continued to run over budget.

Toshiba’s decision to exit the nuclear construction business could have widespread ramifications. Nuclear power appears to be “too big, too expensive, and most of all, too slow to compete effectively in what is an increasingly ferocious competition,” said Mycle Schneider, a nuclear expert based in Paris.

Toshiba’s Danny Roderick, left, and Shigenori Shiga, second right, are expected to step down when the company announces plans to exit nuclear construction in February. PHOTO: KAZUHIRO NOGI/AGENCE FRANCE-PRESSE/GETTY IMAGES

The nuclear construction business, led by a General Electric Co.-Hitachi Ltd. venture and France’s Areva SA, has been under pressure since the 2011 Fukushima nuclear-plant meltdowns in Japan.

Toshiba plunged into the business in 2006, when it won a bidding war to acquire Westinghouse. Analysts worried at the time that it had overbid. But within a couple of years the bet appeared to be paying off: Southern chose Westinghouse’s design for the first new nuclear plant to be built in the U.S. in 30 years, and the next month Scana also chose the AP1000 for a plant in South Carolina.

The U.S. government approved the designs in early 2012 and work began. Within a few months, legal disputes arose between Westinghouse, its construction consortium partner, Stone & Webster, and Southern over who would pay for unexpected costs resulting from tougher post-Fukushima safety standards, according to filings.

Relations between Westinghouse and Stone & Webster’s owner, Chicago Bridge & Iron NV, broke down by 2015, according to filings. William Jacobs, the independent construction monitor for the plant Southern is building, said Westinghouse and CB&I were “incurring very large costs beyond those being publicly reported” due in part to having so many employees for a project that was years behind schedule.

In March 2015, CB&I broached a possible sale of Stone & Webster to Toshiba. As the talks intensified, Toshiba became mired in the accounting scandal, prompting it to acknowledge it padded profits in its personal computer and other businesses.

Toshiba worried that if the lawsuits with Southern and CB&I over the Fukushima-related safety-cost overruns continued, it might have to acknowledge that Westinghouse faced big liabilities, according to company executives. A large write-down at that stage threatened to wipe out the company’s capital.

To end the litigation, Toshiba made several deals in October 2015. It acquired Stone & Webster for $229 million in deferred payments and became the sole guarantor on the engineering contract, releasing CB&I. Scana agreed to push back the completion date for the South Carolina plant and to pay Toshiba $505 million in exchange for switching to a fixed-price contract.

Southern faced up to $1.5 billion in liability in the lawsuits over post-Fukushima safety-cost overruns, and settled for about $350 million in October 2015. The deal restricted Westinghouse’s ability to “seek further increases in the contract price,” Southern said—meaning that if the nuclear plant couldn’t be completed in a timely manner, Toshiba would shoulder the costs.

As problems continued, Westinghouse and CB&I last year sued each other in a dispute over the Stone & Webster sale. Then Toshiba said it might need to take a write-down of several billion dollars related to the value of Stone & Webster, caused by cost overruns.

While Southern said it is insulated from cost overruns, it is unclear if the $920 million letter of credit from Westinghouse would be sufficient to complete its two generating units if Westinghouse’s financial problems prevent it from fulfilling its contract.

“I don’t see how Southern and Scana are confident they won’t be responsible for any further cost increases,” said Sara Barczak, a critic of the projects who works for the Southern Alliance for Clean Energy, a nonpartisan advocacy group.


Affordable water may soon dry up, especially if you live here

Water may become unaffordable for a third of American households within the next five years. Photo by Enid Martindale/via Flickr

Remember this number: $120. It’s the average monthly water bill in America.

Researchers at Michigan State University predict this figure will rise by $49 over the next five years. And if it does, water may become unaffordable for one-third of American households, according to a study, published recently in PLOS ONE, that maps the U.S. areas due to be hit hardest based on local incomes.

“The project deals with looking at the economic impacts of rising water prices on both households and regional economies,” said Elizabeth Mack, an MSU geographer who led the work. When she first pitched the research idea to her colleagues, some scoffed. While water unaffordability is common overseas, Mack said, most assume Americans have the resources and the willingness to do whatever it takes to pay for water.

But rising water prices are quickly eroding this line of reasoning, according to the investigation conducted by Mack and her colleague Sarah Wrase. Two years ago, a survey of 30 major U.S. cities found water bills rose by 41 percent between 2010 and 2015. This dilemma is well-documented in Detroit, where 50,000 households have lost water access since 2014, or in Philadelphia, where 40 percent of the city’s 227,000 water bills are past due.

Mack drew on these reports and others to examine how household pocketbooks could be affected by escalating water prices at the national level. To do so, she peered into an American Water Works Association survey of water utilities across the U.S. in order to determine the annual water bill for an average consumer. The analysis settled on a bill of $120 for the nation’s average monthly consumption of 12,000 gallons in 2014, which is based on figures from the U.S. Environmental Protection Agency.

The EPA also provides an estimate for how much Americans can afford to spend on water and wastewater services. If water prices rise above 4.5 percent of a household’s income, then “that means you’re going to have to take expenditures from other portions of your budget and allocate them to water,” Mack said.

To meet this affordability benchmark, a household must earn at least $32,000 per year, according to Mack and Wrase’s assessment. Based on their numbers, nearly 14 million American households — 11.9 percent — couldn’t afford water in 2014. If water prices continue to rise at the same rate (41 percent over five years), then a third of American households — 40 million — may lose access to affordable water, they found.
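The study's arithmetic can be sketched in a few lines. This is only an illustration built from the figures reported above; the formula (annual bill divided by the EPA's 4.5 percent benchmark) is an assumption about how the thresholds were derived, not the authors' published code.

```python
# Income threshold implied by the EPA's 4.5%-of-income affordability benchmark,
# using the average-bill figures reported in the study.
AFFORDABILITY_SHARE = 0.045  # max share of income for water/wastewater services

def min_affordable_income(monthly_bill):
    """Annual household income needed to keep water under 4.5% of income."""
    return monthly_bill * 12 / AFFORDABILITY_SHARE

current = min_affordable_income(120)        # today's $120 average monthly bill
projected = min_affordable_income(120 + 49) # bill after the projected $49 rise

print(round(current))    # 32000 -> matches the study's $32,000 threshold
print(round(projected))  # 45067 -> close to the study's $45,120 at-risk cutoff
```

The small gap between $45,067 and the study's $45,120 at-risk cutoff suggests the authors used slightly different rounding or bill figures.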

The team examined median income data for individual areas in the U.S. to chart a map of the communities most at-risk for water poverty.

Counties (census tracts) with a high-risk (black) or at-risk (grey) of losing water affordability. High-risk is defined as areas with a median income below $32,000, while at-risk communities have median incomes between $32,000 and $45,120. Image by Mack EA and Wrase S, 2017, PLoS ONE

The South, urban centers and low-income communities carry the most risk. For instance, 81 percent of high-risk and 63 percent of at-risk communities are concentrated in urban areas. Mississippi, Louisiana and Alabama topped the list with the largest numbers of county subdivisions — or census tracts — facing a high risk of future water poverty. Many of the at-risk areas also have higher rates of disability, food stamp usage, unemployment and black and Hispanic residents, according to the study.

“Some regions are affected more than others in regards to rising water prices, but it’s unlikely that there are any regions that won’t see increases,” said Justin Mattingly, a research manager at the Water Environment & Reuse Foundation, who wasn’t involved in the study. “Aging infrastructure is a problem for everybody, and water scarcity is becoming a bigger problem in many regions as well. There have been years of disinvestment for water infrastructure, and it’s starting to come back to us now.”

Much of the nation’s water infrastructure dates back to World War II, if not earlier. Washington, D.C. still runs water through wooden pipes from the mid-1800s. On Tuesday, Senate Democrats unveiled a $1 trillion infrastructure plan that would allocate $110 billion to water and sewer rehabilitation. But water policy agencies predict a total overhaul of America’s water systems would itself cost $1 trillion. Tack on another $36 billion to adjust for drought, seawater intrusion into aquifers, flooding and other climate change-based shifts to water systems.

Most of the time, water pipes are installed by housing developers, said Theresa Connor of the One Water Solutions Institute at Colorado State University. But water utilities take over the costs of upkeep once they start serving a new neighborhood.

“Although the major cost is the pipes, you also have to keep your plants up-to-date. If there are any new regulations, you might have to do improvements of the pipes and plants,” said Connor, who wasn’t involved in the study.

Infrastructure replacement is the primary driver of the water price surge in America, Connor said. In Atlanta, where residents pay more for water than in any other major U.S. city, a regulatory initiative required preventing stormwater from discharging into wastewater. The move kept raw sewage from mixing into the streams used for drinking water. But that regulatory decision, plus the privatization of water services, pushed Atlanta’s average water bill to $325 per month.

Recent water regulations — like the Drinking Water Protection Act — have forced some water utilities to update their systems to protect from emerging contaminants like agricultural nutrients. But both Connor and Mattingly said those costs are small relative to the spending needed to address aging infrastructure. “At the end of the day, it’s still aging infrastructure that’s driving much of the rise in water rates,” Mattingly said.

Urban flight is another factor in rising water prices. As populations decline in places like Detroit, water utilities are forced to spread their expenses across fewer people, which boosts rates. Meanwhile, cities like Phoenix and Las Vegas have low prices due to population growth.

“Many utilities are looking at alternative billing structures to take some of the burden off low-income households,” Mattingly said. One tactic involves higher charges for those who use more water, rather than a flat fee for everyone.
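The alternative billing structure Mattingly describes is often called an increasing-block (tiered) rate. The sketch below shows how such a structure shifts cost toward heavy users; the tier sizes and per-gallon prices are hypothetical, chosen purely for illustration.

```python
# Hypothetical increasing-block water rates: each successive tier of monthly
# usage is billed at a higher price per 1,000 gallons, so heavy users pay a
# higher average rate than light users.
TIERS = [  # (gallons in tier, price per 1,000 gallons) -- illustrative only
    (6_000, 4.00),          # first 6,000 gallons at the cheapest rate
    (6_000, 7.00),          # next 6,000 gallons
    (float("inf"), 11.00),  # everything above 12,000 gallons
]

def monthly_bill(gallons):
    """Bill a month's usage across the tiers, cheapest tier first."""
    total = 0.0
    for tier_size, price in TIERS:
        used = min(gallons, tier_size)
        total += used / 1000 * price
        gallons -= used
        if gallons <= 0:
            break
    return round(total, 2)

print(monthly_bill(6_000))   # 24.0  -- a light user stays in the cheap tier
print(monthly_bill(12_000))  # 66.0  -- the average household's 12,000 gallons
print(monthly_bill(20_000))  # 154.0 -- heavy use reaches the top tier
```

Under this (made-up) schedule, a household using 20,000 gallons pays an average of $7.70 per 1,000 gallons versus $4.00 for a light user, which is the burden-shifting effect utilities are after.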

Both Mattingly and Connor said Mack’s study is a solid first step in understanding water poverty, but noted that its resolution is limited given the data are based on a small portion of America’s 155,000 or so water systems.

“This reality underlies the problem of a lack of available data on water usage and other metrics in the United States,” Mattingly said. “Without proper data, decision making at the local and national level can be hindered.”

Billing rates can vary dramatically between water providers, even within a single city. In the future, Mack hopes to apply the same analysis on individual cities to offer more guidance on water affordability.

As Flint, Michigan, and other cities with water catastrophes have shown, the stakes for water infrastructure improvements are high. Delays can expose citizens to health hazards.

“While Flint has certainly garnered the most attention for its water infrastructure problems – and with good reason – they are certainly not alone,” Mattingly said.
