As Hurricane Sandy approached Virginia Beach, I watched churning surf form a troublesome backdrop to two skateboarders harnessing the wind to propel themselves rapidly along the boardwalk. Those same winds were piling up water to form a dangerous storm surge and portended a powerful blow that would ultimately cause widespread devastation throughout the region. Since that moment, I have been asked many questions about Hurricane Sandy. Here are answers to the most common ones.
Can we expect more extreme weather with climate change?
For extreme events observed since 1950, the evidence for links with climate change is strongest for heat waves and coastal flooding, and strong for intense precipitation in some areas and drought in others. The current state of scientific understanding is less clear for hurricanes overall, though aspects of hurricane development are expected to be influenced by a warming planet. These include warmer sea surface temperatures during hurricane season, a warmer atmosphere concentrating precipitation, and higher storm surges compared to a century ago due to sea level rise.
Why was the storm surge in New York City so historic?
A nightmare combination made this storm surge even worse for lower Manhattan: a simultaneous high tide, full moon, and sea level rise. The first two are entirely natural; the third is influenced by climate change. To make matters worse, the rate of sea level rise off the Northeast coast is among the highest in the world.
Isn’t it unusual to have a hurricane so close to Halloween and so far north?
According to NOAA’s National Hurricane Center analysis, Sandy traversed sea surface temperatures that were far above average for this time of year. Warm ocean water fuels hurricanes and makes them more powerful. Sea surface temperatures along most of the storm track were above the threshold required for hurricane development, which allowed Sandy to remain a hurricane when it made landfall in New Jersey.
What can be done to better protect coastal communities?
I think of city planners as the first responders for climate change. Local communities, including some within the vast sphere of influence of Hurricane Sandy, are working with the best available science to form climate action plans. Unfortunately, not all communities have taken these first steps, and funding cuts are hitting NOAA, the very agency that provided accurate tracking of Hurricane Sandy.
(Dr. Ekwurzel is a climate scientist and assistant director of climate research and analysis at Union of Concerned Scientists. She has expertise on many aspects of climate variability including Arctic Ocean and sea ice, wildfires, groundwater, and coastal erosion. She holds a Ph.D. in isotope geochemistry from Columbia University – Lamont-Doherty Earth Observatory).
You may feel that your hands are simply too full with work or raising your kids to get into the “saving the planet” business. If you are curious enough to look through Cooler Smarter, though, you will still find valuable information. Many of the choices offered in the book won’t just lower your emissions of carbon dioxide; they can also improve the quality of your life, save you money and time, and even improve your health.
That’s what the people of Salina, Kansas, found when they entered a yearlong competition with neighboring cities in their state to see who could save the most on their energy bills. Many residents of Salina have doubts about the findings of climate science. Nonetheless, these Kansans say they don’t like their nation’s dependence on foreign oil; plus, like most Americans, they are thrifty and very much like saving money. During this contest, the entire city of Salina (population 46,000) was able to reduce its overall carbon dioxide emissions by 5 percent. Jerry Clasen, a local grain farmer, captured the prevailing sentiment, commenting, “Whether or not the earth is getting warmer, it feels good to be part of something that works for Kansas and for the nation.”
As the folks in Salina discovered, the inefficient use of energy in the United States makes it easy for anyone seeking to reduce emissions to reap quick rewards. Did you know, for instance, that fossil fuel power plants typically release roughly two-thirds of their energy as waste heat? Or that less than 20 percent of the gasoline a car burns goes toward propelling it down the road? Even without changing to renewable power sources that can generate electricity with zero carbon emissions, we can dramatically increase the efficiency of our use of fossil fuels with cost-effective, off-the-shelf technology. By one estimate, technologies to recover energy from waste heat and other waste resources in the United States potentially could harness almost 100,000 megawatts of electricity—enough to provide about 18 percent of the nation’s electricity.
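That 18 percent figure can be sanity-checked with round numbers. The sketch below assumes roughly 4,000 terawatt-hours of annual U.S. electricity generation and an 85 percent capacity factor for waste-heat-recovery plants; both of those numbers are illustrative assumptions, not figures from the text.

```python
# Rough check: what share of U.S. electricity could ~100,000 MW of
# waste-heat-recovery capacity supply?
CAPACITY_MW = 100_000          # potential capacity cited in the text
CAPACITY_FACTOR = 0.85         # assumed: heat-recovery units run near-continuously
US_GENERATION_TWH = 4_000      # assumed annual U.S. generation, round number
HOURS_PER_YEAR = 8_760

annual_twh = CAPACITY_MW * HOURS_PER_YEAR * CAPACITY_FACTOR / 1_000_000  # MWh -> TWh
share = annual_twh / US_GENERATION_TWH
print(f"{annual_twh:.0f} TWh/year, about {share:.0%} of U.S. generation")
```

With these assumptions the result lands in the high teens, consistent with the estimate quoted above.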
But we don’t have to wait for more efficiency to be built into the system. As end users of this energy, we have at our disposal a wide variety of simple techniques to squeeze much more out of our current energy use, saving money and reducing our emissions.
What this means for you is that you can probably make some simple changes that will yield real improvements in your energy efficiency. Not long ago, a Canadian utility company drove home this point in a much-lauded television commercial that urged its customers to conserve energy. The ad depicts individuals engaging in laughably wasteful behavior. One guy is wrapping his sandwich in aluminum foil, but instead of using one sheet, he keeps wrapping and wrapping until he has used the entire roll. A woman takes just one bite of an apple, then drops it on the ground and picks up a new one, repeating this mindless act until the camera zooms out to reveal the ground below her strewn with bitten apples. The spot ends with a family going out of their house without turning out any of its brightly burning lights. It leaves the viewer to ponder why this behavior isn’t every bit as preposterous as the others.
In many ways, the issue really is that simple. If you live in the United States, on average your activities emit a whopping 21 tons of carbon dioxide into the atmosphere annually. That’s one of the highest per-person emission rates in the world and some four times higher than the global average.
Compared with our counterparts around the world, we are responsible for outsized emissions and outsized costs. The emission levels of the average American are roughly four times the global average, as noted above, and they are also roughly 15 times those of the average citizen of India. To be sure, poverty in many parts of India, as in many countries, keeps personal consumption—and associated emissions—far below the level currently found in the United States. But on a per capita basis, even most industrialized European countries—with standards of living similar to those in the United States—emit less than half the carbon dioxide the United States does.
Do the math and it comes to just over 115 pounds of carbon dioxide daily for the average American. Think about that for a moment: your actions are responsible for sending a fair portion of your total body weight up smokestacks and out tailpipes every day. And the heat-trapping carbon dioxide each of us contributes is accumulating in the atmosphere, causing global warming.
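The arithmetic behind that daily figure is straightforward, assuming the 21 tons are U.S. short tons of 2,000 pounds:

```python
# Convert annual per-capita CO2 emissions to a daily figure.
ANNUAL_TONS = 21        # average U.S. per-person emissions (short tons/year)
LBS_PER_TON = 2_000     # U.S. short ton
DAYS_PER_YEAR = 365

daily_lbs = ANNUAL_TONS * LBS_PER_TON / DAYS_PER_YEAR
print(f"{daily_lbs:.0f} pounds of CO2 per person per day")  # ~115
```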
Can we reduce our global warming emissions? Of course we can.
Bear in mind, for instance, that just two decades ago the chemicals in many common products, from refrigerators to hair spray, were eating away at the protective ozone layer in the atmosphere. The resulting ozone hole seemed to present an insurmountable global problem. But with effective planning and innovation, we tackled the problem. Citizens, scientists, and government officials came together to phase out the harmful substances responsible for the problem. Today the stratospheric ozone layer is on a path to recovery.
An equally dramatic example is the story of the Cuyahoga River in Ohio. Today the Cuyahoga supports a wide variety of recreational opportunities, from kayaking to fishing, and boasts some 44 species of fish. Just a few decades ago, however, the Cuyahoga was one of the most polluted rivers in the United States. But finally, when debris and chemicals in the Cuyahoga infamously caught fire in 1969, people were galvanized into action. Some have even called the public reaction to the Cuyahoga River fire the start of environmentalism, for that catastrophe helped spur a legislative response that included the Clean Water Act, the Great Lakes Water Quality Agreement, and the creation of the U.S. Environmental Protection Agency.
The point is that difficult problems aren’t always as intractable as they seem. That doesn’t mean they are easy to solve, of course, as any of the concerned citizens, activists, and government officials who fought to clean up the Cuyahoga River could attest. In fact, the Cuyahoga actually caught fire more than a dozen times, the first time in 1868. It took until 1969—more than 100 years—to spur the necessary actions.
Let’s be clear: global warming is much greater in scope than a burning river and more complex than a hole in the ozone layer. But people caused the problem, and people can solve it. We already have many of the tools and technologies we need to address global warming. The key is for each of us to begin to work toward solutions.
Testifying before the U.S. Congress on Tuesday, Union of Concerned Scientists nuclear safety expert David Lochbaum dissected how rippling power outages forced the Fukushima nuclear power plant into a situation where workers couldn’t contain overheating radioactive fuel and spent fuel.
Lochbaum, director of the Nuclear Safety Project for the UCS, outlined how the U.S. Nuclear Regulatory Commission could require better backup power and fuel storage procedures at U.S. nuclear facilities to help prevent a breakdown similar to the one still unfolding at Fukushima. He recommended that spent fuel be moved into dry cask storage containers more quickly, since crowded spent fuel pools are prone to overheating.
Lochbaum also highlighted a weakness in battery backup power, which plants rely upon in the event of a blackout or loss of connection to the electrical grid. Many U.S. plants have only four hours of backup battery power, less than the Fukushima facility had.
Lochbaum’s full statement to the U.S. Senate Energy and Natural Resources Committee:
The Fukushima Dai-Ichi nuclear plant in Japan experienced a station blackout. A station blackout occurs when a nuclear power plant loses electrical power from all sources except that provided by onsite banks
of batteries. The normal power supply comes from the plant’s own main generator or from the electrical grid when the reactor is shut down. All the equipment needed to operate the plant on a daily basis as well
as the emergency equipment needed during an accident can be energized by the normal power supply.
When the normal power supply is lost, backup power is supplied from onsite emergency diesel generators. These generators provide electricity only to the smaller set of equipment needed to cool the
reactor cores and maintain the containments’ integrity during an accident.
At Fukushima, the earthquake caused the normal power supply to be lost. Within an hour, the tsunami caused the backup power supply to be lost. This placed the plant into a station blackout where the only
source of power came from batteries. These batteries provided sufficient power for the valves and controls of the steam-driven system—called the reactor core isolation cooling system— that provided
cooling water for the reactor cores on Units 1, 2, and 3. When those batteries were exhausted, there were no cooling systems for the reactor cores or the spent fuel pools. There are clear indications that the fuel in
the reactor cores of units 1, 2, and 3 and some spent fuel pools has been damaged due to overheating. Had either normal or backup power been restored before the batteries were depleted, we would not be
here today discussing this matter. The prolonged station blackout resulted in the inability to cool the reactor cores in Units 1, 2, and 3, the spent fuel pools for all six units, and the consolidated spent fuel
pool. There are lessons, learned at high cost in Japan, that can and should be applied to lessen the vulnerabilities at US reactors. And I cannot emphasize enough that the lessons from Japan apply to all US reactors, not just the boiling water reactors like those affected at Fukushima. None are immune to station blackout problems. All must be made less vulnerable to those problems.
As at Fukushima, US reactors are designed to cool the reactor core during a station blackout of only a fairly short duration. It is assumed that either the connection to an energized electrical grid or the repair of
an emergency diesel generator will occur before the batteries are depleted. Eleven US reactors are designed to cope with a station blackout lasting eight hours, as were the reactors in Japan. Ninety-three of
our reactors are designed to cope for only four hours. But unless the life of the on-site batteries is long enough to eliminate virtually any chance that the batteries would be depleted before power from another
source is restored, one lesson from Fukushima is the need to provide workers with options for dealing with a station blackout lasting longer than the life of the on-site batteries. In other words, the moment that any US reactor enters a station blackout, response efforts should proceed along three parallel paths: (1) restoration of the electrical grid as soon as possible, (2) recovery of one or more emergency diesel generators as soon as possible, and (3) acquisition of additional batteries and/or temporary generators as soon as possible. If either of the first two paths leads to success, the station blackout ends and the re-energized safety systems can cool the reactor core and spent fuel pool. If the first two paths lead to failure, success on the third path will hopefully provide enough time for the first two paths to achieve belated success.
The timeline associated with the third path should determine whether the life of the on-site batteries is adequate or whether additional batteries should be required. For example, the existing battery life may be
sufficient when a reactor is located near a facility where temporary generators are readily available, such as the San Onofre nuclear plant in California, which is next to the US Marine base at Camp Pendleton.
When a reactor is more remotely located, it may be necessary to add on-site batteries to increase the chance that the third path leads to success if the first two paths do not.
The second lesson from Fukushima is the need to address the vulnerability of spent fuel pools. At many US reactors, there is far more irradiated fuel in the spent fuel pool than in the reactor core. At all US
reactors, the spent fuel pool is cooled by fewer and less reliable systems than are provided for the reactor core. At all US reactors, the spent fuel pool is housed in far less robust structures than surround the
reactor core. This means that any release of radiation from the pool will not be as well contained as radiation released from the reactor core. It also means that spent fuel pools are more vulnerable to terrorist
attack than is the reactor itself. More irradiated fuel that is less well protected and less well defended is an undue hazard. There are two measures to better manage this risk: (1) accelerate the transfer of spent fuel
from spent fuel pools to dry cask storage, and (2) upgrade the guidelines for how to address an emergency and the operator training for spent fuel pool problems.
Currently, the US spent fuel storage strategy is to nearly fill the spent fuel pools to capacity and then to transfer fuel into dry cask storage to provide space for the new fuel discharged from the reactor core. This
keeps the spent fuel pools nearly filled with irradiated fuel, thus maintaining the risk level about as high as possible. Added to that risk is the risk from dry casks stored onsite, which is less than that from the
spent fuel pools but not zero.
A better strategy would be to reduce the inventory of irradiated fuel in the pools to the minimum amount, which would be only the fuel discharged from the reactor core within the past five years. Reducing the
spent fuel stored in the pools would lower the risk in two ways. First, less irradiated fuel in the pools would generate a lower heat load. If cooling of the spent fuel pool was interrupted or water inventory was
lost from the pool, the lower heat load would give workers more time to recover cooling and/or water inventory before overheating caused fuel damage. And second, if irradiated fuel in a spent fuel pool did
become damaged, the amount of radioactivity released from the smaller amount of spent fuel would be significantly less than that released from a nearly full pool. Reducing the amount of irradiated fuel in
spent fuel pools would significantly reduce the safety and security risks from a nuclear power plant.

Following the 1979 accident at Three Mile Island, reactor owners significantly upgraded emergency procedures and operator training. Prior to that accident, procedures and training relied on the operators quickly and correctly diagnosing what had happened and taking steps to mitigate the consequences. If the
operators mis-diagnosed the accident they faced, the guidelines could lead them to take the wrong steps for the actual accident in progress. The revamped emergency procedures and training would guide the
operators’ response to an abnormally high pressure or an unusually low water level without undue regard for what caused the abnormalities. The revamped emergency procedures and training represent significant
improvements over the pre-TMI days. But they apply only to reactor core accidents. No comparable procedures and training would help the operators respond to a spent fuel pool accident. It is imperative
that comparable emergency procedures and training be provided for spent fuel pool accidents to supplement the significant gains in addressing reactor core accidents that were made following the TMI accident.
The Nuclear Regulatory Commission has announced a two-phase response plan to Fukushima: a 90-day quick look followed by a more in-depth review. If the past three decades have demonstrated anything, it’s
that the NRC will likely come up with a solid action plan to address problems revealed at Fukushima, but will be glacially slow in implementing those identified safety upgrades. A comprehensive action plan
does little to protect Americans until its goals are achieved. We urge the US Congress to force the NRC to not merely chart a course to a safer place, but actually reach that destination as soon as possible.
(The following is an excerpt of a report by Ed Lyman of the UCS posted Sunday evening EDT.)
The nuclear crisis in Japan took a turn for the worse as serious problems developed at a second reactor at the Fukushima Dai-Ichi nuclear facility. Earlier concerns were focused on reactor Unit 1, but now the situation at Unit 3 is becoming serious.
Officials from Tokyo Electric reported that after multiple cooling system failures, the water level in the Unit 3 reactor vessel dropped 3 meters (nearly 10 feet), uncovering approximately 90 percent of each of the fuel rods in the core. Authorities were able to inject cooling water with a fire pump after reducing the containment pressure by a controlled venting of radioactive gas. As with Unit 1, they began pumping seawater into Unit 3. Seawater is highly corrosive and probably precludes any future use of the reactor, even if a crisis is averted.
However, Tokyo Electric recently reported that the water level in the Unit 3 reactor still remains more than 2 meters (6 feet) below the top of the fuel and company officials believe that water may be leaking from the reactor vessel. When the fuel is uncovered by water, it overheats and suffers damage. It is likely that the fuel has experienced significant damage at this point, and Japanese authorities have said they are proceeding on this assumption.
One particular concern with Unit 3 is the presence of mixed-oxide (MOX) fuel in the core. MOX is a mixture of plutonium and uranium oxides. In September 2010, plant operators loaded 32 fuel assemblies containing MOX fuel into this reactor. That amounts to approximately 6 percent of the core. MOX fuel generally worsens the consequences of severe accidents in which a large amount of radioactive gas and aerosol is released compared with non-MOX uranium fuel because MOX fuel contains greater amounts of plutonium and other actinides, which are highly toxic.
See the full report, and other updates on the situation in Japan at the UCS blog: All Things Nuclear.
A Boiling Water Reactor system (Source: Nuclear Information and Resource Service.)
An explosion at one of the Fukushima reactors on Saturday intensified concerns about possible escaping radiation.
But experts at the site reported that the threat of radiation was receding and that the explosion, which damaged an exterior concrete containment wall, had not compromised the metal containment surrounding the core.
“We’ve confirmed that the reactor container was not damaged. The explosion didn’t occur inside the reactor container. As such there was no large amount of radiation leakage outside,” Japan’s chief cabinet secretary, Yukio Edano, said in a news conference Saturday night. “At this point, there has been no major change to the level of radiation leakage outside, so we’d like everyone to respond calmly.”
Earlier, officials expanded the evacuation area around the facility to a 12-mile diameter.
The explosion resulted from a build-up of pressure inside the containment building for one of the plant’s reactors. Officials stressed, however, that it had not released additional radiation.
Reactors at Fukushima have been the focus of concern since the 8.9 magnitude earthquake and ensuing tsunami knocked out electric power, and backup diesel power, to the cooling systems. Those systems keep the radioactive fuel in the core from overheating its containment housing. The cooling systems have been operating on battery power, and plant operators said they planned to use seawater to cool the reactor.
News reports quoted sources as saying that background radiation had risen to 1,000 times normal in the reactor’s control room at one point after the quake. But officials tried to allay worries on Saturday, reporting that radiation leakage measured at the gate to the compound was relatively low.
From Green Right Now Reports (3-11-11)
Several news sources are reporting that the Fukushima Nuclear Power Plant, which lost power to its cooling station during the earthquake and tsunami that hit northeastern Japan, remains compromised.
The U.S.-based Union of Concerned Scientists reports that the threat remains of a “meltdown,” in which the cooling systems fail to keep the radioactive fuel from overheating.
This afternoon, the UCS released this update:
NUCLEAR CRISIS IN FUKUSHIMA: WHAT WE KNOW
The massive earthquake off the northeast coast of Japan has caused a potentially catastrophic situation at one of Japan’s nuclear power plants. The situation is still evolving, but below is a preliminary assessment based on the facts as experts at the Union of Concerned Scientists currently understand them.
The plant’s owner, Tokyo Electric Power Company (TEPCO), reported that at 2:46 p.m. local time (12:46 a.m. EST) “turbines and reactors of Tokyo Electric Power Company’s Fukushima Daiichi Nuclear Power Station Unit 1 … and Units 2 and 3 … automatically shut down due to the Miyagiken-oki Earthquake.”
These reactors are three of the six operating reactors at the Fukushima I nuclear facility. All are boiling water reactors. Unit 1 has a rated output of 460 megawatts, and Units 2 and 3 each have a rated output of 784 megawatts.
Houses burn off the coast of Japan. (Photo by Reuters) See National Geographic.com for more photos and information about the disaster in Japan.
TEPCO went on to state the shutdowns were caused by the loss of off-site power “due to malfunction of one out of two off-site power systems.” This loss of power triggered emergency diesel generators, which automatically started to provide backup power to the reactors.
However, at 3:41 p.m. local time (1:41 a.m. EST), the emergency diesel generators shut down “due to malfunction, resulting in the complete loss of alternating current for all three units,” according to TEPCO. The failure of the diesel generators was most likely due to the arrival of the tsunami, which caused flooding in the area. The earthquake was centered 240 kilometers off the coast of Japan, and it would have taken the tsunami approximately an hour to reach the Japanese islands.
This power failure resulted in one of the most serious conditions that can affect a nuclear plant — a “station blackout” — during which off-site power and on-site emergency alternating current (AC) power is lost. Nuclear plants generally need AC power to operate the motors, valves and instruments that control the systems that provide cooling water to the radioactive core. If all AC power is lost, the options to cool the core are limited.
The boiling water reactors at Fukushima are protected by a Reactor Core Isolation Cooling (RCIC) system, which can operate without AC power because it is steam-driven and therefore does not require electric pumps. However, it does require DC power from batteries for its valves and controls to function.
If battery power is depleted before AC power is restored, however, the RCIC will stop supplying water to the core and the water level in the reactor core could drop. If it drops far enough, the core would overheat and the fuel would become damaged. Ultimately, a “meltdown” could occur: The core could become so hot that it forms a molten mass that melts through the steel reactor vessel. This would release a large amount of radioactivity from the vessel into the containment building that surrounds the vessel.
The containment building’s main purpose is to keep radioactivity from being released into the environment. A meltdown would build up pressure in the containment building. At this point we do not know if the earthquake damaged the containment building enough to undermine its ability to contain the pressure and allow radioactivity to leak out.
According to technical documents translated by Aileen Mioko Smith of Green Action in Japan, if the coolant level dropped to the top of the active fuel rods in the core, damage to the core would begin about 40 minutes later, and damage to the reactor vessel would occur 90 minutes after that.
Concern about a serious accident is high enough that, while TEPCO tries to restore cooling, the government has evacuated a 3-km (2-mile) radius around the reactor.
Bloomberg News reported that the battery life for the RCIC system is eight hours. This means that the batteries would have been depleted before 10 a.m. EST today. It is unclear if this report is accurate, since it suggests that several hours have elapsed without any core cooling. Bloomberg also reported that Japan had secured six backup batteries and planned to transport them to the site, possibly by military helicopter. It is unclear how long this operation would take.
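The “depleted before 10 a.m. EST” estimate follows from the times reported earlier in this piece: the diesel generators failed at 3:41 p.m. Japan time, and Japan Standard Time runs 14 hours ahead of EST. A quick check of the arithmetic:

```python
from datetime import datetime, timedelta

# Station blackout began when the diesel generators failed at 3:41 p.m.
# Japan Standard Time on March 11, 2011 (per the TEPCO account above).
blackout_start_jst = datetime(2011, 3, 11, 15, 41)
battery_life = timedelta(hours=8)     # RCIC battery life reported by Bloomberg
jst_to_est = timedelta(hours=-14)     # JST is 14 hours ahead of EST

depleted_est = blackout_start_jst + battery_life + jst_to_est
print(depleted_est.strftime("%I:%M %p EST"))  # 09:41 AM EST, i.e. before 10 a.m.
```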
There also have been news reports that Fukushima I Unit 2 has lost its core cooling, suggesting its RCIC stopped working, but that the situation “has been stabilized,” although it is not publicly known what the situation is. TEPCO reportedly plans to release steam from the reactor to reduce the pressure, which had risen 50 percent higher than normal. This venting will release some radioactivity.
UCS will issue updates as more information becomes available.
With time running short to act on climate change this year, a group of environmental organizations has sent President Obama a letter, asking him to encourage movement on the gridlocked Senate negotiations on climate and energy legislation.
“We strongly urge you to produce a bill, in conjunction with key senators, that responds to the catastrophe in the gulf, cuts oil use, and limits carbon pollution while maintaining current health and other key legal protections,” the letter said. “White House leadership is the only path we see to success, just as your direct leadership was critical in the passage of the recovery plan, health care reform and other administration successes.”
Most of those who signed the letter have been slow to chide Obama for his unwillingness to intervene, but now appear to believe there is danger the moment is about to be lost.
“The president’s been great on this issue,” said Fred Krupp, president of the Environmental Defense Fund. “We need him and his staff to directly engage in the politics and the policy to actually produce a bill that can pass the Senate. And if he doesn’t do that, without his leadership, then everything else he has done so far will lead to nothing.”
In an effort to house all the info about efficient vehicles in one virtual garage, the Union of Concerned Scientists has created HybridCenter.org.
The Ford Fusion Hybrid achieves nearly 40 mpg.
At HybridCenter.org, you can read about hybrids, link to reviews of the latest models written by auto reviewers, find out about government rebates and most importantly, see which hybrids are the best value and least polluting vehicles in their class. The website scoops up all the pertinent information into a scorecard that can help you find the cleanest cars — and should help keep the automakers honest.
“Consumers should know that the best hybrids deliver emissions reductions and fuel efficiency at an affordable price without compromising vehicle utility. Many hybrids, however, are loaded with forced features that inflate the sticker price; knowing which forced features come with a car may save you money and provide a bargaining tool at the dealership,” note the UCS authors of the scorecard.
The Hybrid Scorecard, for instance, rates the Ford Fusion Hybrid as having “high” hybrid value because it gets great mileage (combined 39 mpg according to the EPA) and cuts emissions by nearly one-third compared with its non-hybrid counterpart. By contrast, Chevy’s Silverado gets a “low” rating for its hybrid value because more emissions reductions could have been squeezed out of the hybrid conversion. (The UCS, however, gives this vehicle a positive nod for being the first pickup of this size available as a hybrid.)
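Because tailpipe CO2 scales almost directly with fuel burned, an emissions cut can be estimated from fuel economy alone. The sketch below assumes a combined rating of about 26 mpg for the conventional Fusion; that figure is an illustrative assumption, not one given in the text.

```python
# Estimate the emissions reduction from a fuel-economy improvement.
# CO2 emitted per mile is proportional to gallons burned per mile (1/mpg).
HYBRID_MPG = 39        # EPA combined rating cited in the text
CONVENTIONAL_MPG = 26  # assumed rating for the non-hybrid counterpart

reduction = 1 - CONVENTIONAL_MPG / HYBRID_MPG
print(f"CO2 per mile cut by about {reduction:.0%}")  # roughly one-third
```

With these numbers the reduction comes out to about a third, in line with the scorecard’s claim.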
The UCS is an advocate for hybrid technology, because more efficient cars can help curb global warming. But the venerable scientists’ group isn’t out to promote hybrids at any cost. It wants to see the technology achieve the best results for the environment and not be perverted into a marketing or profiteering device that turns off potential adopters.
If consumers find good deals, and their car’s “hybrid technology is truly being used to maximize reductions in both global warming and smog-forming emissions,” then it’s all good.
This summer, the website is running a Green Travel Challenge asking readers to contribute their hybrid stories. The contest deadline is July 4, after which a drawing will be held for prizes that include a Garmin GPS and gift certificates to Southwest Airlines and Amtrak.
Seeing the pictures of the flooding in Nashville this past week may have reminded you of other recent U.S. floods — in Fargo, Iowa City and the Mississippi River Valley.
Nashville Flooding, May 2010 (Photo: ABC News)
And if you keep up with global warming, you may be wondering whether this trend bears out what scientists have been telling us: that extreme rain events are growing more severe and more frequent under climate change.
That question certainly came up in Nashville, according to Rich Hayes, deputy communications director at the Union of Concerned Scientists and a Nashville resident.
“A lot of my friends here have asked me if this disaster is related to global warming. The fact is that climate change increases the probability of some types of weather, including heavy rains and flooding. As average temperatures rise, more rain falls during the heaviest downpours. Unfortunately, that is exactly what we experienced in Nashville over the weekend,” Hayes said in a prepared statement on Thursday.
“Warmer air holds more moisture. We’ve all seen it. Next time you take a shower, notice how the water vapor hangs in the warm air after you turn off the hot water. When warm air holding moisture meets cooler air in the atmosphere, the moisture can condense onto tiny particles to form floating droplets. If those drops get bigger and become heavy enough, they fall as precipitation.”
Hayes went on to cite a report by the United States Global Change Research Program, a collaborative effort involving 13 federal agencies, which found that “one of the most pronounced precipitation trends in the United States is the increasing frequency and intensity of heavy downpours.”
As for those climate skeptics who might think the Nashville event is just an extreme example of an otherwise good thing (i.e., more rain), consider the rest of what that report predicts:
“More precipitation is falling during very heavy events, often with longer dry periods in between. Climate models project more heavy downpours and fewer light precipitation events.”
In other words, we get rain that doesn’t really work that well for us anymore. It comes in sudden, heavy downpours that produce a lot of runoff and erosion, and when the water can’t escape, flooding. In between, we get dry spells. Ask any farmer if this is a good thing.
It would be akin to having water to drink one week, but no water the next. Most organisms can’t live that way.
Hayes goes on to note that the rain that fell on Nashville was undoubtedly outside the norm. A record 13 inches fell on Saturday and Sunday, nearly double the previous record, set in 1979 in the wake of a hurricane.
All these flood events are different. Fargo and the Mississippi Valley had their own distinct issues. Fargo faced rapid melting of heavy snow. In Iowa, the flooding was exacerbated by monoculture farming that has left the flat land susceptible to runoff. But such events seem to be happening more frequently.
Increases in Very Heavy Precipitation 1958-2007 (Image: U.S. Global Change Research Program)
In Nashville, the sheer volume of rain in a short time overwhelmed the city. Unfortunately, under climate change models, what’s been outside the norm is becoming the norm.
The report cited by Hayes notes that “very heavy rain and snow events, defined as the heaviest 1 percent of all precipitation events, now drop 67 percent more water on the Northeast; 31 percent more on the Midwest and 20 percent more on the Southeast than they did 50 years ago.” (See image above from the USGCRP.)
“If the fossil fuel emissions that cause global warming continue unabated, scientists expect the amount of rainfall during the heaviest precipitation events across the country to increase more than 40 percent by the end of the century. Even if we dramatically curbed emissions, these downpours would still increase, but by only a little more than 20 percent,” according to the UCS statement.
“It’s going to take Nashville a long time to recover from the flooding,” says Hayes. “But when the flood waters do recede, and local officials turn to the question of how we plan for the future, they need to take climate change into account.”
You’d think in the era of the Weather Channel and 24-hour-news, Americans would be well informed about the difference between “the weather” and “the climate.”
Magnolia covered in rare Southern snow (Photo: Green Right Now)
And yet, people seem genuinely befuddled. This winter especially, with the Midwest up to its window sashes in snow and Texas through Florida experiencing protracted periods below freezing, people can be heard questioning climate change and global warming.
How can a warming world be so cold? they ask.
“It’s really confusing to a lot of people,” says climate scientist Brenda Eckwurzel. In particular, the term “global warming” seems to “paint a picture that everywhere on the planet should be hotter.”
But climate change does not follow consistent patterns everywhere. Some parts of the world, like North Africa and the U.S. Southwest, stand to get a lot hotter, incrementally. But other places, like the U.S. Midwest, will not warm as noticeably.
And within the broad changes brought by climate change, the weather will continue to fluctuate every few days or weeks, as it always has, though along a shifting bell curve in which “we lose” the lows and see more of the highs, Eckwurzel said.
Climate change does not negate weather variations. Cool spells still occur. What climate change does is march steadily in a direction (in this case toward warmer average temps) over a period of decades.
And as the world warms, summer heat waves become longer and more intense, hitting more extreme highs, said Dr. Eckwurzel, who is on staff with the non-profit Union of Concerned Scientists. The UCS, a 40-year-old group based in Boston, tracks climate change and advocates for solutions. (See the UCS’s Climate Change 101.)
Right now, the nation is in a predicted cooler period, similar to the 1970s, when temperatures dropped for a few seasons, a response to ocean and wind patterns that follow multi-year trends, Eckwurzel said.
But this smaller trend is a blip within the larger one; the world continues to warm, she said. The past decade was the warmest on record since reliable records were kept in the late 1800s, and the oceans are warmer than they’ve ever been. Arctic sea ice is vanishing, as are many glaciers.
Even amidst the chill of winter’s breath, certain effects of climate change can still be seen.
Climate change brings more precipitation to areas that already get rain and snow, and less to arid regions. These shifting precipitation patterns can be seen even against the short-term weather of 2010, and they help explain the recent heavy snowfalls across the U.S. Midwest (as well as the rains that played havoc with plans for the Winter Olympics in British Columbia).
Warmer oceans and warmer surface air have “revved up” the water cycle, and all around the globe oceans, lakes and soils are evaporating faster, Eckwurzel said. Warmer air can hold more water vapor, fueling more intense precipitation events.
“The wet places are getting wetter and the dry regions are getting drier,” she said.
Some critics of climate change policy say that the nation should just go with the flow.
On Monday, American Farm Bureau Federation President Bob Stallman lashed out against climate change-related reform, specifically against a plan to return open land to climate-mitigating forests in several states. Speaking at the group’s national convention in Seattle, he also criticized food experts and “activists” who would “drag agriculture back to the day of 40 acres and a mule.”
His remarks parallel those of some others in agriculture, government and agribusiness who muse that climate change could be a good thing for farmers in the nation’s heartland, bringing a longer growing season and ample rainfall. Bring it, they say.
But Eckwurzel warns that this type of thinking fails to take into account the monetary costs of shifting agriculture. Extended summer heat waves will stress livestock, and confined animals like dairy cows would produce less in a hotter climate. As winters grow milder, the lack of freezes would allow plant pests to proliferate, leading to higher costs to combat crop intruders. (The UCS has published a series of reports about the expected effects of climate change in the Midwest and every other region of the U.S.)
In the Southern states, where the soil is drying out as rains become more erratic, flash flooding has already become a bigger threat to agriculture and human welfare.
“To ignore that there might be costs within those changes would be too simplistic,” Eckwurzel said. “Unabated climate change will bring costs.”
The clock has just struck midnight on New Year’s Eve 2020, and your rooftop cocktail party is in full swing. An urban garden, with potted evergreens and fruit trees, carpets the top of your downtown apartment building. The structure itself is vintage, a 1960s brownstone that’s been retrofitted by city-wide mandate. It operates on the new multi-source national electrical grid, which is supplied by wind, solar and geothermal power, as well as by fossil fuels whose emissions are trapped underground.
Rooftop Garden (Photo: Adpower99/Dreamstime.)
In your apartment, appliances and plumbing fixtures are energy- and water-efficient – something you were able to afford with the help of government incentives that started in 2010.
As the New Year turns, friends sip mojitos with mint freshly cut from your herb garden, nibbling locally made goat cheese, accented by your own roof-grown tomatoes and cukes. A rainwater-collection system irrigates your vegetable garden, and the rooftop’s community compost fertilizes it. Solar-heated water percolates through your plumbing, and a mobile rooftop solar system heats and cools your home. Several stories below, in the building’s underground parking lot, the family car is getting its nightly re-charge.
It’s a smart, self-contained life, one that consumes no more than it requires, and produces some of its own food and energy on-site. And believe it or not, you are paying less for utilities, transportation – for life, in general – than you did a decade ago. That’s because U.S. policy-makers and legislators pushed so hard ten years before to put the country on an aggressive path toward a sustainable, renewable-energy future.
Imagine if they hadn’t pushed through the Energy and Power Bill in 2010, or the emissions Cap and Trade plan or later, the Carbon Tax bill… Imagine if progressive, quickly instituted policies and incentives hadn’t reassured manufacturers and factory owners that it was a good idea to retool and hire and train all those “green-economy” workers. …
This is the best-case scenario, the future we could see if the White House, the U.S. Congress and the rest of us act aggressively, starting now, to grow a green economy and reduce carbon emissions.
Is it Possible?
Most conservative think-tanks and government agencies foresee a longer-term conversion to green energy. According to one DOE report, the fastest we could move would be to attain 20 percent wind by 2030, while still relying on fossil fuels for up to 78 percent of our overall power as late as 2035.
Al Gore’s plan, by contrast, suggests the U.S. could be 100 percent renewable in 10 years, but that roadmap doesn’t give a breakdown of which types of energy would provide what percentage of our overall electricity needs. Gore’s and similar plans have been criticized as requiring the economy to travel at a warp speed not possible on this planet. They’ve also been challenged as risky, because they’d be based entirely on today’s technologies, at a moment when solar, geothermal and biofuels are rapidly improving and coming down in price. Falling prices could help us get there more quickly, but they also argue against locking in commitments too early.
In fact, if there’s one thing all parties agree upon, it’s that there is no single, truly reliable ten-year scenario that predicts how the energy pie would be divided: 20 percent solar? 30 percent wind? 40 percent conventional fossil fuels like natural gas? Where does nuclear power fit?
No one has a crystal ball.
According to Mark Z. Jacobson, a Stanford University civil engineering professor and co-author of a recent report in Scientific American, “Evaluating the Feasibility of a Large Scale Wind, Water and Sun Energy Infrastructure,” the United States in theory shouldn’t have a problem converting all new production of electricity to renewables by 2020. “The issue is, what’s ‘new’? It’s not going to be a high percent of the total. Each year you can replace a certain percent, but a (pre-existing) power plant can last 40 years.”
However, Jacobson adds, “It’s certainly feasible in ten years, if everybody put their minds to it, to say all new power has to be renewable. We could be at 50 percent wind, 40 percent solar and 10 percent everything else, including geothermal, hydro-electric, even some tidal wave power.”
But converting our total energy production to renewables in 10 years is not a likely scenario, he says, because that would require the U.S. government to “take away all the subsidies from fossil fuels and shift them over to renewables” – unlikely, even with a progressive President and Congress.
“These coal plants that are grandfathered in, the way to make those go out of business is to change the subsidies, change the laws, but we’ll have a battle! Getting rid of the old stuff is easier said than done. We have all these people working in the industry and they are going to complain that we’re costing the country jobs, putting their companies out of business. And we’d need a job training program to shift them into other industries.”
But is it technically possible to have all new energy be renewable by 2020?
Yes, says the professor, adding that we might already be at 25 percent renewable for new power now.
“Right now, wind is the second largest source of all new energy, after natural gas, and if we slowly get rid of the ‘old’ power, how fast that could occur depends on” introducing things like new laws and incentives, aggressive policies that don’t change with each election, as well as shifting subsidies to green power interests and ridding the powers-that-be of outmoded mindsets.
Jacobson concludes: “The scenario of 100 percent conversion to renewables in 10 years is very slim. A 90 percent conversion – maybe a little less slim. … That doesn’t mean we shouldn’t try. All forces should be aligned to do these things. But given there are so many conflicting interests – there are lobbyists, naysayers, competing financial interests, the economic cycles, the political cycles – so many potential roadblocks. … You can’t just shut down the existing plants and have new generation on-line in 10 years. You could imagine the lawsuits. The goal is there, but if you think about it as retiring existing things as they go down, there’s probably less of a fight on that front.”
And, as the civil engineer points out, “electric power is not the only thing you’re trying to change. You’re trying to change the entire infrastructure, so you want to go down the path of least resistance. It’s better to get 25 percent across the board – for everything, for other sectors, and not just (go for) 100 percent for electric power. Those other sectors include industrial, transportation, energy efficiency” for our built environment.
As for which type of renewable energy will create the largest chunk of power in America, no one can say. So let’s take a look at the three main ones consistently mentioned by renewable-energy proponents. First up, wind power.
As international climate treaty negotiations continue in Copenhagen amid debate over the potential economic impact of new standards, a new report shows that the costs for small business operating under California’s landmark climate law (AB 32) can be measured in pennies.
Border Grill in Santa Monica
Conducted by leading economists and released by the Union of Concerned Scientists, the report found that AB 32 policies will increase the share of small business revenue spent on energy by only 0.3 percentage points (from 1.4 to 1.7 percent) in 2020. In a case study of one small business, the Border Grill restaurant, the report found AB 32 will cost diners 3 cents extra per $20 meal in 2020.
The peer-reviewed analysis, The Economic Impact of AB 32 on California Small Businesses, used data on the cost characteristics of small businesses to estimate the economic impacts of AB 32. It was commissioned by UCS and conducted by The Brattle Group, an international economic consulting firm.
“Our report finds that the incremental cost impact of AB 32 on the average California small business will be relatively small and definitely manageable,” Jurgen Weiss of The Brattle Group and co-author of the report, said in a statement. “The AB 32 cost impact pales in comparison to the effect of inflation over ten years, and falls well within the range of historic cost variation most small businesses face every day regardless of climate policy.”
The Brattle Group projected the likely changes in electricity, natural gas, and gasoline prices due to the major AB 32 policies: cap and trade (which puts a price on carbon), a 33 percent renewable energy standard, increased energy efficiency measures, and a low-carbon fuel standard.
Other report highlights included:
Most small businesses will not be directly regulated under AB 32, therefore AB 32 policies will only impact them indirectly to the extent that these policies cause energy prices for electricity, natural gas and transportation to change.
The average small business spends less than 1.5 percent of revenues on energy-related costs. So any increase in the price of energy will have a modest financial impact.
Increases projected in electricity, gas and transportation fuel costs due to AB 32 are lower than recent increases in the same rates caused by factors wholly unrelated to environmental regulations.
Increased costs of other products used by small business — such as food, supplies and services — that result from higher energy prices also will have only a modest impact on small business.
The report, released last week, includes a case study of Border Grill, a Santa Monica-based Mexican restaurant. The report’s authors said restaurants are more energy intensive than the average small business and represent the largest share of employment in any small business category – 10 percent of total statewide employment. After auditing five years of the restaurant’s electricity and gas bills, The Brattle Group developed a 10-year business projection based on historical data, and used this projection, along with macro-economic assessment of change in energy prices, to develop the case study results.
According to the report, in 2020, Border Grill will be spending 2 percent of its revenue on energy. By investing in a robust set of efficient appliances, vehicles, and other equipment, the restaurant will be able to use even less energy and improve its productivity and competitiveness.
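A back-of-the-envelope sketch shows how the report’s per-meal figure relates to the restaurant’s energy share of revenue. The dollar figures below come from the report; the assumption that the 3-cent increase is passed through entirely from energy costs is my own illustrative simplification, not the Brattle Group’s methodology.

```python
# Back-of-envelope sketch: what energy-price increase would explain
# a 3-cent AB 32 cost on a $20 meal, if energy is 2% of revenue?
# Figures are from the report; the full pass-through assumption is mine.

meal_price = 20.00          # dollars, from the report's case study
ab32_cost_per_meal = 0.03   # dollars, from the report's case study
energy_share = 0.02         # 2% of revenue spent on energy in 2020

# AB 32 cost as a share of the meal price
cost_share = ab32_cost_per_meal / meal_price          # 0.0015, i.e. 0.15%

# Implied increase in energy prices, if fully passed through to diners
implied_energy_increase = cost_share / energy_share   # 0.075, i.e. 7.5%

print(f"AB 32 adds {cost_share:.2%} to the meal price")
print(f"Implied energy price increase: {implied_energy_increase:.1%}")
```

Under those assumptions, the 3-cent figure corresponds to roughly a 7.5 percent rise in the restaurant’s energy costs, well within the range of ordinary utility-price fluctuation the report describes.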
Border Grill is known for serving only sustainable seafood, as part of the Monterey Bay Aquarium Seafood Watch Program. It also uses organic long-grain rice, beans, and coffee, and developed a program called “Good for the Planet, Good for You,” that gives guests the opportunity to choose dishes made with at least 80 percent plant-based ingredients.
Not convinced that climate change matters? The Union of Concerned Scientists has concluded that if Americans adopt that stance, they’ll be gambling not just with their lungs, but with their pocketbooks.
It found that rising sea levels, intense hurricanes, flooding, impaired public health and strained energy and water resources would all add up to one monumental price tag.
“By late this century, the Midwest could be inundated with more torrential rainstorms, costing tens of billions of dollars [in crop and property damage]. California, Washington and Oregon could be hit with an additional billion dollars in property damage from wildfires every year. The Northeast and Northwest, meanwhile, could lose most of their snowpack, which would kill the ski industry,” said Lexi Shultz, deputy director of the Climate Program at UCS.
But wait, there’s good news: The U.S. Department of Energy’s Energy Information Administration says that developing clean energy and taking steps to slow global warming emissions would be affordable. The EIA estimates that fighting global warming would cost each American household only about $10 a month in higher energy bills by 2020.
The UCS wants us to stack that price tag of about $120 a year against the staggering costs of inaction. If climate change continues unchecked, with temperatures climbing by 7 to 11 degrees by 2100, the UCS report projects that:
The federal government could end up spending billions fighting wildfires (which could increase by as much as 53 percent by 2100), considering the feds spent $200 million fighting just three wildfires last year in California.
California would also suffer heat-related public health problems and billions in associated costs to mitigate the human effects of ground-level ozone, which would worsen under climate change.
The loss of snowpack would make many recreation areas in the Northeast and the Northwest unsuitable for skiing and snowmobiling, costing the ski industry, conservatively, $405 million in annual revenues.
Reduced snowmelt in all of the nation’s mountainous regions could affect water flow in streams and ultimately cost farmers; in New Mexico alone, the loss could reach $21 million a year by 2080.
Shrinking snowpack would have a huge impact on many industries in Oregon and Washington. Losses to the coldwater fishing (angling) industry alone could ultimately reach about $1 billion annually.
In the Northeast, sugar maples would lose habitat, meaning an annual loss of $5 million to $12 million to that industry alone.
Sea level rise all along the East Coast would require seawalls, at a possible cost of up to $1.2 billion in the Northeast, and more in the Southeast.
In the Southeast, where an 18-inch rise in sea level is projected, the beach recreation industry could incur $11 billion in cumulative damages by 2080.
Georgia alone could lose 5,000 tourism jobs.
Florida could be especially hard hit, experiencing residential real estate losses of as much as $60 billion a year by 2100, due to sea level rises. The tourism industry could be slapped with more than $175 billion in annual losses due to beach erosion. Property damage from hurricanes could top $100 billion annually by 2100.
In the Midwest, flooding and heavy downpours, predicted by a collaboration of 13 federal agencies, could cause billions of dollars in crop damage and exacerbate erosion, raising the cost of food production. In just one state, Illinois, the annual costs to agriculture could reach $9.3 billion.
Alaska, where warming is occurring disproportionately faster than in other states, would suffer continued damage to infrastructure as the permafrost melts, costing up to $6 billion just by 2030.
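To put the two sides of that ledger in one place, here is a small sketch using figures quoted above: the EIA’s $10-a-month household cost of action, against a sample of the report’s projected annual losses. The selection of losses to sum is my own illustration, not a total from the report.

```python
# Sketch comparing the cost of action vs. a few projected costs of
# inaction, using figures quoted in the article. Illustrative only;
# the per-household figure and the losses come from the sources cited.

monthly_household_cost = 10          # dollars/month (EIA estimate, by 2020)
annual_household_cost = monthly_household_cost * 12   # the ~$120/year figure
print(f"Cost of action: about ${annual_household_cost} per household per year")

# A few of the projected annual losses cited above (dollars/year)
projected_annual_losses = {
    "Northwest coldwater fishing": 1_000_000_000,
    "Northeast/Northwest skiing": 405_000_000,
    "Illinois agriculture": 9_300_000_000,
}
total = sum(projected_annual_losses.values())
print(f"Sample of projected annual losses: ${total:,}")
```

Even this partial sample of losses runs to more than $10 billion a year, which is the comparison the UCS is asking readers to make.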
As for those who might ask whether these projections are alarmist, a spokesman for the UCS notes that the report was based on “mainstream” studies and that scientists, if anything, tend to err on the side of conservatism.
“Most climate scientists acknowledge that current methods of predicting the consequences of climate change may underestimate the real impact and costs of climate change. More carbon dioxide is staying in the atmosphere as the ocean absorbs less and less over time. At the same time, ice sheets appear to be melting more rapidly than scientists had expected,” said Aaron Huertas, press secretary for the UCS, which is based in Massachusetts.
“…If these costs seem large it’s only because our dependence on the relatively stable climate of the past century or so is immense,” Huertas said. “Every home, every crop, every road — our entire civilization — has been built for today’s climate. A rapid shift in our climate will mean major disruptions for our way of life.”