What impact will climate change have on air quality?


The Sammis Power Plant near Steubenville, Ohio, which the PUCO agreed to allow FirstEnergy to continue operating through 2024 on the backs of ratepayers (courtesy of EarthJustice).

Though it’s hardly a secret that I view climate change as the preeminent issue of this generation, I usually try to bring some sobriety to the apocalyptic current that some of my fellow climate hawks bring to the table. Whether it’s casting a skeptical eye on the hype about climate change and conflict or challenging the use of the term “climate refugee,” I try to stay fairly level-headed.

So it would seem reasonable that I would be somewhat wary of the hype surrounding the major new report from the U.S. Global Change Research Program on the public health impacts of climate change. I mean, as Kyle Feldscher of the Washington Examiner tweeted, somewhat snarkily,

But, here’s the thing: sometimes when Chicken Little screams that the sky is falling, you really do need to look up. And that’s the case with climate change. As this report lays out in tremendous detail, the public health implications of inaction are staggering. Whether it’s the estimated 11,000 additional deaths per year from heat-related mortality, an increase in vector- and water-borne illnesses, or a spike in the frequency and intensity of disasters, things are going to suck unless we do something, like, yesterday.

Importantly, due to the lengthy atmospheric lifetimes of greenhouse gases (GHG), particularly CO2, some of these impacts are already baked into the cake. While the report makes it clear that we can stave off the worst effects on public health by taking immediate action to curb GHG emissions, the fact remains that we will inevitably have to adapt to that which we cannot mitigate and suffer that which we cannot adapt to. But since most of my focus is on air quality issues these days, I wanted to take a closer look at that chapter in the report.

Tracing improvements in air quality

First, it’s crucial that we note how much air quality has improved in the United States since the passage of the 1970 Clean Air Act Amendments (CAAA). According to the U.S. EPA, ambient levels of the six criteria air pollutants fell by a combined 63% from 1980 to 2014, including an astounding 99% for lead. All this occurred even as GDP grew by 147%.

This trend has paid significant dividends for Northeast Ohio. In Cleveland, for instance, the 3-year average for carbon monoxide (CO) from 1972-1974 was 17.3 parts per million (ppm), well in excess of the 9 ppm standard. From 2012-2014, this value had fallen to just 4.3 ppm, a 75% decrease. Back in 1978, the 3-year average level of sulfur dioxide (SO2), which is generated largely from burning coal, stood at a mind-boggling 497 ppb. In 2014, that level was down to 71 ppb, below the EPA’s 75 ppb standard.

The benefits of this dramatic improvement in air quality have been staggering. One study from the EPA found that, by 2020, the 1990 CAAA will prevent 230,000 premature deaths and generate benefits totaling $2 trillion. According to renowned University of Chicago economist Michael Greenstone, the 1970 CAAA extended the life expectancy of the average American by 1.6 years, totaling more than 336 million additional life-years. Here in Cleveland, we live, on average, 2.3 years longer because of this landmark piece of legislation.

But, as I’ve discussed before, a lot of people seem to think that these numbers mean we’ve moved beyond air pollution, that it’s something we’ve relegated to the past. That’s clearly not the case, given that a 2013 study estimated air pollution led to more than 200,000 premature deaths in 2005. In Cleveland, that number was 1,363, with the majority (62%) of deaths coming from electricity generation (466) and transportation (384). Clearly we have a long way to go, and incremental improvements in air quality will do a lot to winnow this number down further.

Will climate change affect this trend?

But that’s where climate change comes into play. The two primary bogeymen in the world of air quality are ground-level ozone and fine particulate matter (PM2.5). The formation of both of these pollutants depends heavily on meteorological conditions, particularly the former. When the conditions are right, ozone and PM2.5 levels can spike, with serious consequences for anyone who breathes air.

Now, obviously the most important thing that environmental officials can do is work to reduce emissions of ozone precursors, along with direct PM2.5 and its precursors. If there are simply fewer nitrogen oxide (NOx) and volatile organic compound (VOC) molecules floating around, there will inevitably be less ozone in the air.

And this is true – to a point. That’s why the EPA estimates that, thanks to existing regulations like the controversial Mercury and Air Toxics Standard (MATS) and the Tier 3 Vehicle Emissions standard, ozone and PM2.5 levels will continue to decline. The agency projects, for instance, that ozone levels in Cuyahoga County will fall to 59 ppb in 2025 from 75 ppb currently.

Unfortunately, this fails to account for the impacts of climate change. Global warming is likely to make the types of meteorological conditions conducive to ozone formation – hot, still summer weather – considerably more common going forward. As the report’s authors note, “consequently, attaining national air quality standards for ground-level ozone will also be more difficult, as climate changes offset some of the improvements that would otherwise be expected from emissions reductions.”

To illustrate this effect, let’s look at recent history in Cleveland. From 2008 (when the EPA finalized its 75 ppb standard) through 2011, there were an average of 9.5 days each year when ozone levels exceeded the standard. This number plummeted further during the previous two mild summers, with 1 day and 3 days in 2014 and 2015, respectively. But then there’s 2012, the hottest year on record in the region. During that summer, we had 28 exceedance days, the highest number since 2002.
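A quick arithmetic check on how anomalous that hot summer was (a minimal sketch; the day counts come straight from the figures above):

```python
# Cleveland ozone exceedance-day figures quoted above
avg_2008_2011 = 9.5          # average exceedance days/year, 2008-2011
days_2014, days_2015 = 1, 3  # the two mild summers
days_2012 = 28               # hottest year on record in the region

ratio = days_2012 / avg_2008_2011
print(f"2012 saw {ratio:.1f}x the 2008-2011 average")  # prints "2012 saw 2.9x the 2008-2011 average"
```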

What will climate change’s impact be on air quality?

So what, exactly, does the report project? Well, it uses data from a 2015 paper by a group of EPA scientists that aims to “quantify and monetize the climate penalty” from higher ozone levels tied to climate change through 2030. Because the effects of climate change on PM2.5 are so difficult to suss out, the report focuses exclusively on ozone.

The authors use two global climate change scenarios, known as Representative Concentration Pathways (RCPs), to estimate the effects. RCP 8.5 is a worst-case scenario, while RCP 6.0 is a slightly less pessimistic model that assumes we will take some action to curb emissions.

These models allow them to estimate the number of climate change-attributable, ozone-related premature deaths and illnesses in the US. While RCP 6.0 leads to somewhere between 37 and 170 premature deaths each year, RCP 8.5 could generate 420 to 1,900 additional early deaths. The authors find that “the economic value of these adverse outcomes ranges from $320 million to $1.4 billion for the RCP 6.0 scenario and from $3.6 to $15 billion for the RCP 8.5 scenario.”


The projected impacts of climate change on ozone levels and ozone-related mortality in the US for RCP 6.0 and RCP 8.5 (courtesy of Fann et al.)

These health impacts will not be distributed evenly, as the map above shows. Here in the Midwest, particularly along the Great Lakes, significant global warming could drive ozone levels up by more than 5 ppb, leading to dozens of additional deaths.

These findings are similar to those from a 2007 study (PDF) by Michelle Bell et al. in the journal Climatic Change, which examined the impact of significant climate change on ambient ozone levels in 50 US cities by 2050. Bell et al. concluded that ambient summertime ozone levels would jump by 4.4 ppb, and every city studied would see an increase in the number of exceedance days by 2050. The average city would experience 5.5 more exceedance days per year, a 68% increase compared to the 1990s, while Cleveland could see a spike of 140%, from 7.5 to 18 days per year. The study uses the 1997 ozone standard of 85 ppb, meaning that the number of exceedances would likely be much higher under the current 2015 standard of 70 ppb. All told, ozone-related mortality was projected to increase 0.11-0.27%.
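The Cleveland figure checks out arithmetically (a minimal sketch, using only the numbers quoted above from Bell et al.):

```python
# Bell et al. (2007) exceedance-day projections for Cleveland, as quoted above
cleveland_1990s = 7.5   # exceedance days/year, 1990s baseline
cleveland_2050 = 18.0   # projected exceedance days/year by 2050

pct_increase = (cleveland_2050 - cleveland_1990s) / cleveland_1990s * 100
print(f"Cleveland: +{pct_increase:.0f}%")  # prints "Cleveland: +140%"
```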

While this seems relatively insignificant, I should note that ozone is not a major cause of air pollution-related death here. If climate change were to have comparable impacts on particle pollution levels, these costs would increase by orders of magnitude. Unfortunately, this remains a real possibility. One study estimates that, while global PM2.5 concentrations may fall by up to 18%, they could increase by anywhere from 1 to 4 micrograms per cubic meter in the eastern US.

Ultimately, it’s not the projected number of additional deaths or asthma exacerbations that matters. What this report shows is that we have done an excellent job of cutting levels of harmful air pollutants, even as we increased emissions of a seemingly harmless one – CO2. But now, unless we take immediate action to slash the latter, all our great work on the former is at risk.

Idling cars are the tools of the devil


Vehicle exhaust contains a number of harmful pollutants, including fine particulate matter, and it is increasingly the primary source of urban air pollution (courtesy of Wikimedia Commons).

Thus far, El Niño has more or less kept winter at bay here in Cleveland. Well, that’s all changing this week. I guess once Mother Nature heard an overgrown rodent said we were getting an early spring this year, she got pissed.

Winter is back with a vengeance. We’re going to see temperatures drop to perhaps their lowest point of the year this weekend, and forecasters are calling for five or six separate fronts to bring snow over the next week or so. All of this should help to cut into our substantial snow deficit. As of Monday, the National Weather Service had recorded just 11.2 inches of snow this winter, roughly 26 inches below normal. That deficit has already shrunk by one-fifth, and it will continue to decrease.

The return of winter means a few things. First, our profuse application of road salt – with all its inherent environmental consequences – means that everything will adopt a fine coating of sodium chloride. Second, those of us walking through the city will trudge through unshoveled sidewalks and try to avoid the ubiquitous puddles of filthy, half-melted slush, which could either be an inch deep or the bottomless pit that Ozzie Smith fell into on The Simpsons. And third, people will idle their cars left and right. The other day, I walked past a St. Ignatius security guard who was idling his car in a parking lot on Lorain Avenue. When I came back an hour later, he was still idling his vehicle, all the while straddling three separate parking spots (including a handicapped space).

Now, I see the appeal of vehicle idling in the winter, but I don’t really understand the level of passion that idlers bring to the table. A few weeks ago, our local ABC affiliate, WEWS, reposted an article on why drivers should avoid idling their cars during the winter. Within a few hours, the pro-idling commenter horde descended to inform the reporters just how wrongheaded they were.

I know – never read the comments – but individuals insisted that “this is the dumbest thing I’ve ever read. Warming up car by idle (sic) is very good” and “cold oil will destroy your engine.” And, of course, the coup de grâce: “Stupid article with gas prices at a (sic) all time low I could careless (sic) if I waste gas warming up my car especially when the windows are frosted or frozen.”

Let’s assume for a minute that the U.S. Environmental Protection Agency and basically every other authority on this topic, including car-makers, know more about vehicle idling than random internet commenters. Can we marshal the available information to help dispel some of these pervasive myths about idling? Of course we can. So let’s do that.

“Warming up car by idle is very good”

This one is perhaps the most common idling myth, and, like all good myths, there is a kernel of truth here.

The EPA says that, on average, cars get 12% worse gas mileage during cold weather. However, this was a much bigger issue for older model years, particularly those that employed carburetors. Modern fuel injection systems automatically adjust to exterior weather conditions. Furthermore, cars warm up twice as quickly when driven as they do while stationary. It may be nice to sit in a car that you warmed up with your remote start on frigid winter mornings, but you’re not doing your car any favors.

“Cold oil will destroy your engine”

No, no it won’t. Again, this myth is ubiquitous, but it’s highly out of date. Modern, synthetic engine oils do not need to warm up first. They can flow properly at temperatures as low as -40°F. It may have been cold as hell last February, but we still live in Cleveland, not Barrow or Yellowknife.

Beyond this, idling is actually harder on your car than driving it normally. While cold temperatures commonly strain car batteries, idling does more long-term damage. As they idle, car batteries continue to supply energy to the car’s components. This leads to deeper battery cycling, which forces the battery to discharge more energy during normal engine operation. Discharged batteries, in turn, produce less power; this means that subsequent engine starts will require even more energy and take longer, which will shorten a battery’s lifespan.

Idling is hard on cars in other ways as well. It is true that a number of vehicle components, such as the starter, are designed to last a set number of starts. This would seem to suggest that idling your car would place less wear and tear on a vehicle over time. But again, this is not true. According to Natural Resources Canada, idling your car for just 46 seconds is worse and more costly than turning it off and back on. In addition to straining the battery, idling engines do not run at an optimal temperature, which leads to the incomplete combustion of gasoline. This leaves fuel residue in the engine – not to mention producing more pollution – and can cut fuel economy by around 5%.

“I could careless if I waste gas warming up my car”

Would that I were so wealthy. But let’s consider exactly how much gas this gentleman – who I assume is Rich Uncle Pennybags from Monopoly – is wasting by idling.

Every two minutes that a vehicle spends idling consumes the same amount of gas as driving two miles. The average vehicle spends 60-73 hours idling per year, which accounts for 5-7% of total fuel use. Based on information from the Argonne National Laboratory, if a person idles for 10 minutes per day, s/he can waste up to 30-50 gallons of gas per year.
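To see where a 30-50 gallon figure can come from, here is a minimal sketch. Only the 10 minutes/day comes from the text; the per-hour idle fuel-burn rates are my assumptions, roughly spanning compact cars to larger engines:

```python
# Rough reconstruction of an Argonne-style fuel-waste estimate.
# Assumption: idling burns roughly 0.5-0.8 gallons/hour depending on the vehicle.
minutes_per_day = 10
hours_per_year = minutes_per_day / 60 * 365   # about 61 hours of idling/year

for gal_per_hour in (0.5, 0.8):               # assumed idle fuel-burn rates
    gallons = hours_per_year * gal_per_hour
    print(f"{gal_per_hour} gal/hr -> ~{gallons:.0f} gallons wasted per year")
```

With those assumed burn rates, the annual waste lands right in the 30-50 gallon range cited above.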

The true costs of vehicle idling

But, if you’re familiar with anything I’ve ever written, you know I’m more interested in trying to figure out the social costs of our idling habits. If everyone in Cleveland idles so profligately, what are the effects at the larger scale? How might all of that extra, inefficient fuel use add to the costs of air pollution and climate change?

With that question in mind, I decided to do some back-of-the-envelope calculations. According to a 2009 study from Amanda Carrico and colleagues, the average American idles approximately 16.1 minutes per day. More than half of that idle time (8.2 minutes) occurs due to traffic lights, congestion, stop signs, and the like, so we’ll eliminate it. This leaves 7.9 minutes of idling per day – 4.2 minutes for warming up the car and 3.7 minutes while waiting (to pick someone up, in the drive-thru, etc.).

Next, we need to determine the population of passenger cars in the Cleveland area. According to data from the Ohio Bureau of Motor Vehicles, there were 2,130,794 passenger vehicles registered in the seven counties that make up Northeast Ohio last year. Of course, not all of those vehicles will idle that amount each day, so I will adjust these numbers to reflect the percentage of the general population that reported idling for warming (48%) and waiting (46%) for more than 30 seconds at a time in the Carrico et al. study.

We now need to figure out how much pollution and fuel cars consume while idling. Fortunately, the EPA has provided this information, though they do not have estimates for particulate matter emissions (which are, by far, the most harmful conventional pollutant in vehicle exhaust).


Annual vehicle idling emissions in the seven counties of Northeast Ohio (author’s estimates).

Using information from the US Department of Transportation (updated to 2015$) and the EPA’s social cost of carbon, we can estimate the total public health costs of these idling emissions per year.


Total annual costs of vehicle idling in Northeast Ohio (author’s estimates).
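For anyone who wants to see the shape of the calculation behind these tables, here is a minimal sketch. The vehicle count, idling shares, and minutes come from the text; every emission factor and damage cost below is a placeholder assumption for illustration, not the actual EPA/USDOT figure used in the tables:

```python
# Structure of the back-of-the-envelope idling-cost estimate.
# Fleet figures are from the text; the grams/minute emission factors and
# $/metric-ton damage costs are placeholder assumptions, NOT the real inputs.
vehicles = 2_130_794                        # passenger vehicles, NE Ohio
share_warming, share_waiting = 0.48, 0.46   # Carrico et al. idling shares
min_warming, min_waiting = 4.2, 3.7         # minutes/day per idling vehicle

# total fleet idling minutes per day
fleet_minutes = vehicles * (share_warming * min_warming +
                            share_waiting * min_waiting)

g_per_min = {"CO2": 90.0, "NOx": 0.05, "VOC": 0.08}        # assumed grams/minute
usd_per_ton = {"CO2": 40.0, "NOx": 7000.0, "VOC": 1800.0}  # assumed damages

total_cost = 0.0
for pollutant, rate in g_per_min.items():
    tons_per_year = fleet_minutes * rate * 365 / 1e6   # grams -> metric tons
    total_cost += tons_per_year * usd_per_ton[pollutant]

print(f"illustrative annual emissions cost: ${total_cost / 1e6:.1f} million")
```

Swap in the real per-minute emission rates and damage values and the same loop reproduces the table above.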

So, by my (admittedly rough) estimates, vehicle idling carries social costs of more than $58 million per year in Northeast Ohio alone. While the vast majority of these costs come from wasted fuel, there are still nearly $3.5 million in air pollution-related costs. I could go down the rabbit hole of trying to estimate the morbidity and mortality costs associated with these pollutants, but I’ll spare you the arm-waving wonkery. But let’s not pretend that this wasted fuel has no effects. Oil extraction has significant environmental consequences throughout the process from well to tank, and – given that oil is a nonrenewable resource – all of this valuable fuel could have been put to more productive uses. Waste is waste is waste.

As all the available evidence and my calculations show, vehicle idling is far from beneficial. On the contrary, it is wasteful, costly, and illegal in many places. If this one small component of driving carries this large of an impact on our region, can you imagine the aggregate costs of our cars? Comfort is important, but it’s not everything. So turn the damn engine off next time. Your lungs and wallet will thank you.

The 1948 Donora Smog and the birth of air quality regulations


Sixty-seven years ago today, residents of Donora, a town of around 14,000 along the Monongahela River some 24 miles upstream of Pittsburgh, woke up to find that a dense, yellow smog had blanketed the town. Donorans were accustomed to such smogs, as the town lay in a river valley ringed by hills reaching up to 400 feet high. During the “smog season,” pollution from the town’s industrial base – including a steel mill and a zinc works – would collect in this natural depression and develop into smog until changes in meteorological conditions (shifting winds, rainfall) dissolved the cloud.

But that didn’t happen on October 27. Or October 28, 29, or 30. Instead, a strong atmospheric inversion, which occurs when a blanket of lighter, warmer air flows in over heavier, colder air, sealed the smog in place. As this happened, emissions from the town’s factories, which included sulfuric acid, nitrogen dioxide, and fluorine gas, continued to accumulate near the surface, instead of dissipating into the atmosphere. As the days passed, this blanket of toxic smog engulfing the town continued to get thicker and more noxious.

Given the prevailing views of the day, which suggested that air pollution was just a necessary byproduct of industrial progress, Donorans continued to go on with their lives. The high school football team played its home game that Friday; the Donora and Monongahela teams simply adjusted their tactics, with neither team throwing the ball. And the town even carried on with its Halloween festivities as planned. Workers at the steel and zinc mills continued to show up to work, despite the fact that they were producing the toxic emissions enveloping the town. The owners of the zinc works and steel mill rejected initial requests to shut down the factory as the days went by, and only agreed to cut back production on Halloween. This step occurred just as a storm blew into the area, helping to break the inversion and clear the air of the pollution.

All told, at least 20 people died during the smog, and, in the coming months, 50 more people died in the town than would have been expected under normal circumstances. But almost no one escaped the legacy of the smog, even those who did not succumb to its immediate impacts. The official epidemiological study conducted in the aftermath of the event concluded that “15.5 per cent of the total populace in the area were mildly affected; 16.8 per cent, moderately affected; and 10.4 per cent, severely affected.” The town’s overall mortality rate remained elevated for a decade or more.

Relatively little changed for Donora or the country in the short term. The town’s steel and zinc plants largely avoided being held liable, as investigators placed the blame on the extreme meteorological conditions. Although residents sued the steel plant for more than $4.5 million, U.S. Steel eventually settled for just $256,000, less than 6% of the damages sought.

To this day, the Donora smog remains less well-known than the Great London Smog of 1952, which, given that it affected a major metropolis, killed far more people (perhaps 12,000) and garnered considerably more attention. But Donora did lay the groundwork for air quality regulations in the United States. According to the Pittsburgh Gazette, Allegheny County regulated pollution for the first time the following year, and the passage of the 1955 U.S. Air Pollution Control Act, “the first federal legislation to recognize pollution as a problem,” can be linked to Donora. (UPDATE: Per Ben Ross, author of The Polluters: The Making of Our Chemically Altered Environment, the 1955 Air Pollution Control Act was not the first federal bill to address air pollution. That can be traced back to the 1910 Organic Act, which created the Bureau of Mines. In fact, he noted, the 1955 act was a step backwards from the 1910 law in certain regards.)

The town’s museum commemorating the smog bears a sign proclaiming that “Clean Air Started Here,” while the town’s historical marker notes that “major federal clean air laws became a legacy of this environmental disaster.” Just as we think of the Cuyahoga River fire of 1969 as the impetus for the 1972 Clean Water Act (a story which is largely a fable), we should turn to Donora as we commemorate the 45th anniversary of the 1970 Clean Air Act Amendments that helped to end the legacy of these toxic smogs.

Climate change will lead to more deadly traffic accidents

A rendering of the proposed Cleveland Midway, a network of protected cycle tracks that would run across the city (courtesy of Bike Cleveland).


In recent years, there has been a considerable amount of attention paid to transportation issues in climate change circles. This makes sense, given that the transportation sector is the second largest source of greenhouse gas (GHG) emissions in the United States. Mobile sources produced 1,806 million metric tons of CO2 equivalent (MMtCO2e) in 2013 (27%), trailing only electricity generation, which accounted for 31% of total emissions (2,077 MMtCO2e). Emissions from the transportation sector have also grown by 16.4% since 1990, making it the second fastest growing emissions source behind agriculture.

Accordingly, the Obama administration has taken a number of steps to address the issue. These include corporate average fuel economy (CAFE) standards for passenger vehicles, new investments in electric vehicles (EVs), proposed stricter rules for emissions from heavy-duty trucks, and the recent endangerment finding for GHGs from air travel. Each of these steps will be important if the US is to meet its goal to cut overall GHGs by 26-28% by 2025, as outlined in the administration’s pledge for the upcoming Paris Conference.

How climate change affects transportation

But the other side of this equation – how climate change will affect the US transportation sector – has garnered far less focus. The 2014 National Climate Assessment included a detailed chapter on the transportation sector, and the Federal Highway Administration (FHWA) manages a pilot program to help transportation agencies assess their systems’ vulnerability to a changing climate. We know, for instance, that more extreme rainfall could wash out roads, that sea level rise endangers coastal transportation infrastructure, and that accelerated freeze-thaw cycles may increase the costs of road maintenance. But much research in this area remains to be done.

A few weeks ago, Resources for the Future, a leading environmental economics think tank, released a report that examines one as-yet-unexplored issue – how climate change may influence traffic accident rates. I’ll admit that the idea that climate change could affect the number of car accidents in the US seemed a bit far-fetched to me at first. People tend to jump through all sorts of hoops in order to connect everything to climate change these days. But this report provides a convincing case that, barring aggressive action both to cut carbon pollution and to become more resilient, climate change could make our roads even more dangerous.

The connection between weather and traffic accidents

In order to explore the relationship between climate and traffic accidents, economists Benjamin Leard and Kevin Roth first examined existing evidence on how changes in weather patterns affect accident rates. Using data from the National Highway Traffic Safety Administration (NHTSA) on the number of traffic accidents that result in property damage, injuries, and fatalities for 20 states, the authors identified the existing relationships between temperature and precipitation fluctuations and traffic accidents. When temperatures fall below 20°F, accidents that result in property damage increase by 9.3%. The relationship between temperature and accidents that lead to injuries is weak, but it appears highly significant for fatal traffic accidents. In contrast to property damage accidents, fatal accidents are 9.5% more likely on days when temperatures climb above 80°F.

The relationship between precipitation and traffic accidents is more complex. Both rainfall and snowfall increase the incidence of property damage accidents; when rain and snow totals exceed 3 centimeters, accidents increase by 18.8% and 43.3%, respectively. This effect changes when we consider accidents leading to injuries and fatalities. In the former category, rain and snow totals over 3 centimeters lead to 14.4% and 25.9% increases in accidents, a relative reduction of 23.4% and 40.2%, respectively, compared to property damage accidents. But Leard and Roth found that fatal traffic accidents are actually less common on days with rainfall. On days with 1.5-3 centimeters of rain, fatal accident rates fall by 8.6%; this result is highly statistically significant. In contrast, this same amount of snowfall leads to 15.5% more fatalities. According to the authors, these results indicate “that drivers behaviorally compensate for these conditions,” but these adjustments are not enough to reduce the elevated accident risk presented by snowfall.

Importantly, the study also finds a strong correlation between weather conditions and the number of trips people make by foot, bike, or motorcycle (the authors term these “ultralight duty vehicles,” or ULDs). Unsurprisingly, these ULD trips decrease significantly as temperatures dip below 40°F and as precipitation begins to fall. Put a different way, as the weather improves, an increasing number of people choose to walk, bike, or motorcycle. This increases their exposure to automobiles, elevating the risk that they may be the victim of an accident. Accordingly, when the authors removed pedestrians, cyclists, and motorcyclists from their models, fatality rates fell by roughly half.

Climate change will cause more traffic fatalities

The authors then used these observed relationships to project how climate change could affect traffic accident rates in the future. They utilize the IPCC’s A1B scenario – a middle-of-the-road scenario that assumes global temperatures will rise by around 4°C – to project changes in weather and traffic accidents through the end of the century. According to the Climate Action Tracker, we are currently on pace for 3.6-4.2°C of warming in the absence of further action, making A1B a good model for this study.

As global temperatures increase, precipitation will gradually shift from snowfall to rain. The authors find that this change will decrease the number of annual traffic fatalities by roughly 253. However, the changing climate will also induce an increase in the number of trips people take by foot, bike, and motorcycle – leading to an additional 849 traffic fatalities per year – which brings the net change to 603 additional deaths per annum. This spike in traffic fatalities will carry an annual cost of $515.7 million. All told, by 2090 climate change will lead to an additional 27,388 traffic-related fatalities in the US, carrying total costs of approximately $61.7 billion.

Now, I should note that this study does not explicitly address a few issues.

Research shows that as the number of pedestrians and cyclists increases, the chance that they will be struck by a car declines. Each time that the number of pedestrians and cyclists doubles, the risk that they will be injured in an accident falls by a third. But this decline in the relative risk of injury does not overcome the increase in the absolute number of injuries, which actually rises by a similar percentage. Leard and Roth’s study finds similar results. Furthermore, their use of fixed effects should account for this safety-in-numbers effect.
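The arithmetic behind that safety-in-numbers effect is worth spelling out (a minimal sketch; the one-third figure comes from the research cited above):

```python
# Safety-in-numbers: each doubling of pedestrians/cyclists cuts per-person
# injury risk by a third, yet the absolute number of injuries still rises.
people, risk = 1.0, 1.0          # normalized starting values
for _ in range(3):               # three successive doublings
    people *= 2
    risk *= 2 / 3                # per-person risk falls by a third
    total_injuries = people * risk
    print(f"{people:.0f}x people -> {total_injuries:.2f}x total injuries")
```

Each doubling leaves total injuries about a third higher (1.33x, then 1.78x, then 2.37x the baseline), even as every individual walker or cyclist is safer.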

Moreover, the study does not directly account for the fact that expanding bike- and pedestrian-friendly infrastructure tends to make roads safer and reduce the number of accidents. Protected bike lanes, for instance, can cut the risk of injury by up to 90%. To be fair, Leard and Roth admit that this is a potential shortcoming of their study, noting that failing to control for this effect “overstates the long-run impacts of climate change.” They also explicitly point out the important role that these types of interventions can play in climate adaptation planning:

Our results do not indicate that reliance on walking, biking, and motorcycling imply large fatality rates, as other developed English speaking and western European nations have per-capita fatality rates that are often less than half that of United States. Some countries like Sweden with extraordinarily low fatality rates have pursued a variety of urban design and legislative changes to reduce fatalities with policies such as replacing intersections with roundabouts to slow vehicles where they are likely to encounter pedestrians. Relatively simple changes like these may prove to be effective, although unglamorous, adaptation strategies to climate change.

How can this study inform climate policy?

I have two main takeaways from this study.

1. Climate change will affect nearly every aspect of our lives, and we will never be able to fully anticipate and prepare for it. That’s what happens when humanity performs a global science experiment on the planetary systems that facilitated the development of human civilization.

2. It provides even more evidence of the benefits of investing in better infrastructure for cyclists and pedestrians, particularly when accounting for climate change. It emerges as a win-win-win.

Promoting active transportation is a vital component of any mitigation strategy, as every mile we don’t drive keeps roughly one pound of CO2 out of the atmosphere.

This type of people-centric infrastructure also represents an important step that local governments can take to enhance their resilience to the impacts of climate change. We know that it may help to offset potential increases in fatal accidents due to climate change. But, more than that, it can also serve as a key lifeline to supplement existing road networks, which may be endangered by a changing climate. When roads are washed away and subway tunnels flooded, being able to ride your bike or walk to access resources and social services becomes that much more important.

Lastly, these types of investments would be valuable even in the absence of climate change, as they improve quality of life. Active transportation benefits air quality and public health, which reduces premature mortality and health care costs. Complete streets can also raise property values, increase business activity, create jobs, and make neighborhoods safer. All of these things make communities more vibrant and better able to withstand external shocks, whether from economic or climatic forces. In this way, pedestrian and cyclist-friendly infrastructure is exactly the type of no-regrets investment that climate resilience experts say we should be making now, regardless of the inherent uncertainties.

Karachi’s Heat Wave a Sign of Future Challenges to Pakistan’s Fragile Democracy

A man (R) cools off under a public tap, while others wait to fill their bottles, during intense hot weather in Karachi, Pakistan, June 23, 2015. A devastating heat wave has killed more than 400 people in Pakistan’s southern city of Karachi over the past three days, health officials said on Tuesday, as paramilitaries set up emergency medical camps in the streets (courtesy of Reuters).

Karachi, the world’s second largest city by population, is emerging from the grips of a deadly heatwave. A persistent low pressure system camped over the Arabian Sea stifled ocean breezes and brought temperatures in excess of 113°F (45°C) to the city of 23 million people in June. The searing heat disrupted electricity and water service, making life nearly unbearable. All told, officials estimate the heatwave killed at least 1,200 Pakistanis, more than twice as many as have died in terrorist attacks this year.

But meteorology alone cannot explain this turn of events. Rather, as with all disasters, Karachi’s heatwave is rooted in a complex web of natural and man-made factors. “The emergency is the product of a perfect storm of meteorological, political, and religious factors,” notes The New York Times.

Karachi’s rapid growth has heightened people’s exposure and vulnerability to heat. Since 2000, Karachi’s population has doubled, making it the fastest growing megacity in the world. This population explosion has overwhelmed the capacity of local government. At least half of all Karachiites live in informal settlements, with little access to infrastructure and vital services. Unplanned expansion has also led to widespread environmental degradation. Karachi’s annual concentration of fine particulate matter is 11.7 times World Health Organization standards (and more than double that of Beijing), making it the fifth most air-polluted city in the world. Karachi also faces an acute water crisis. Some of its poorest residents survive on just 10 liters per day, one-fifth of daily drinking requirements, while some estimates suggest more than 30,000 people die from water-related diseases every year.

Wide swathes of trees and other vegetation have been cleared for roads and buildings, limiting shade and exacerbating the urban heat island effect (the process by which urbanized areas absorb and retain solar radiation, significantly increasing local temperatures). Add to this the city’s construction boom which creates a major demand for manual labor and the onset of the holy month of Ramadan – during which Muslims can neither eat nor drink before sundown – and you have a recipe for disaster.

To read the rest, head over to the original post at New Security Beat.

El Niño is here. What will it mean for Great Lakes ice cover?

lake erie ice

Over the weekend, temperatures finally climbed over 40°F in Cleveland. Given the fact that the average temperature in February was all of 14.3°F – by far the coldest February in our history – the mid-40s felt like a heat wave.

My fiancée and I decided to venture outside and headed down to Edgewater Park on Cleveland’s West Side. Edgewater, as the name suggests, sits along Lake Erie. We wanted to take the opportunity to see the lake before the ice really began to melt. Due to the frigid winter, the Great Lakes were once again covered in a thick layer of ice this year. Though we will likely remain just shy of last year’s mark, ice cover reached a peak of 88.8% on February 28. With temperatures set to continue running at or above normal, this number should keep dropping until the lakes are ice free sometime in late Spring. It has already fallen by more than 20% in the past 10 days.

We were far from the only people with this idea. While neither of us planned to actually head out onto the ice, we eventually decided to follow the pack. Someone had even decided to set up a tent on the ice a few hundred feet off shore to serve soup and coffee to passersby. At the time, I had no idea how thick the ice we were walking on actually was. I flippantly estimated that it was several feet thick – a testament to my ignorance. I have since discovered, from the map below, that we were likely standing on a sheet of ice roughly 40 centimeters thick. Fortunately, that is thick enough to support a car.

lake erie ice thickness march 9, 2015

Courtesy of NOAA’s Great Lakes Environmental Research Laboratory

El Niño arrives – finally

Just as the forecast was beginning to take a turn for the better last week, NOAA made headlines by announcing that El Niño had finally arrived. Forecasters had been warning about its impending onset for more than a year, so the announcement wasn’t exactly a surprise. As I stood on the ice last weekend, I couldn’t help but wonder how this phenomenon might affect ice cover next winter.

El Niño is the warm phase of the El Niño Southern Oscillation (ENSO), during which a band of warm water forms in the mid-tropical Pacific Ocean. The phenomenon is characterized by high air pressure in the western Pacific and low air pressure in the eastern reaches of the ocean. As Eric Holthaus notes at Slate,

Technically, for an official El Niño episode, NOAA requires five consecutive three-month periods of abnormal warming of the so-called Nino3.4 region of the mid-tropical Pacific, about halfway between Indonesia and Peru. It usually takes a self-reinforcing link-up between the ocean and the atmosphere to achieve this, and it finally appears the atmosphere is playing its part.

Generally speaking, El Niño brings above average temperatures to the Great Lakes region. Moreover, because the oceans have been storing vast amounts of heat over the past decade-plus, helping to limit the rate of global warming, a particularly strong El Niño could lead to a dramatic transfer of stored heat from the oceans to the surface. As a result, many observers are predicting that 2015 will be the warmest year on record.

El Niño and Great Lakes ice cover

It would be logical to assume that the onset of El Niño will limit the amount of ice that forms on the lakes. According to a 2010 NOAA study, from 1963-2008, 11 out of 16 El Niño winters saw below average ice cover. During these 16 winters, ice covered an average of 47.8% of the Great Lakes, considerably lower than the long-term annual average of 54.7%. As Raymond Assel, a scientist with NOAA’s Great Lakes Environmental Research Laboratory (GLERL) wrote in 1998 (emphasis from original):

On average, the average annual regional temperature is likely to be higher (approximately 1.2°C) and the annual regional maximum ice cover is likely to be less extensive (approximately 15%) during the winter following the onset year of a strong warm ENSO event.

But the connection between El Niño and ice cover is not quite so straightforward. In fact, three winters – 1970, 1977, and 1978 – saw above average ice cover, despite occurring during El Niño events. Ice cover during the latter two years exceeded 80%.

So what else is at play? Well, according to the literature, three factors must combine to produce a particularly mild winter for the Great Lakes region and, by extension, lead to extremely low ice cover like we saw in 1998, 2002, and 2012: the strength of the El Niño event and the modes of the Arctic and Pacific Decadal Oscillations. Let’s take a look at these three indicators to get a sense of what might be in store.

El Niño strength

Multiple studies have found that the relationship between these two factors is highly nonlinear. As this chart from Bai et al. (2010) shows, the scatter plot for ice cover and El Niño strength follows a parabolic curve. Accordingly, El Niño does tend to limit ice formation, but its effect is only significant during strong events.

Relationship between El Niño strength and Great Lakes ice cover (from Bai et al. 2010).

But the current signs do not point to a strong event. As Brad Plumer explained for Vox,

Back in the spring of 2014, it really did look like a strong El Niño would emerge later in the year…

But then… things got messy. Atmospheric conditions over the Pacific Ocean didn’t shift as expected. Specifically, scientists weren’t seeing the change in atmospheric pressure over both the eastern and western Pacific that you’d expect during an El Niño.

As a result, NOAA appears to be tempering expectations about the strength and duration of this event. It is likely to be relatively weak and last through the summer, potentially limiting its impacts on the Great Lakes.

Arctic Oscillation

The Arctic Oscillation (AO) is among the most important factors that determine the severity of winter in the Great Lakes. The AO is “a climate pattern characterized by winds circulating counterclockwise around the Arctic at around 55°N latitude.” During its positive phase, strong winds around the North Pole effectively lock Arctic air in the polar region, helping to moderate winters. But in its negative phase, these westerly winds weaken, allowing Arctic air to travel farther south; this is the phenomenon behind the polar vortex outbreaks we have seen in the past two winters.

Accordingly, during the positive phase of the AO, less ice forms on the Great Lakes. From 1963-2008, positive AO winters were 0.9-1.8°C warmer than normal and saw a mean ice cover of 49.2%. The combination of an El Niño and a positive AO produced the five lowest ice cover totals during this period.

So where does the AO stand? Currently, it is in a positive phase. Unfortunately, it is difficult to determine whether this phase will persist, as the AO can fluctuate widely. But if this oscillation does remain in a positive phase next winter, it would amplify the effect of the weak El Niño.

Pacific Decadal Oscillation

Winter weather is also influenced by the Pacific Decadal Oscillation (PDO), “a long-lived El Niño-like pattern of Pacific climate variability” that helps determine sea surface temperatures in the North Pacific. Rodionov and Assel (2003) concluded that the PDO helps to modulate the impact of ENSO on the Great Lakes. Warm phases of the PDO tend to amplify the impact of El Niño and reduce ice cover.

Last year, the PDO emerged from its prolonged weak phase to reach record high levels. If it continues to remain strong, it will likely lead to warmer temperatures not just next winter, but potentially for the next 5-10 years. This would seem to suggest that the PDO will enhance the impact of the El Niño event next winter.

Conclusion

Overall, the picture is still a bit murky. It does not appear that this El Niño will be strong enough to produce the type of record-low ice cover we saw in 2012. Yet, at the same time, the combined effects of El Niño, a positive AO (should it remain that way), and a warm PDO (if the trend continues) will likely ensure that the Great Lakes region avoids another brutal winter like the ones we’ve seen two years running. If this is the case, lake ice cover should regress closer to the long-term mean of approximately 55%.

But if the indicators strengthen in the next several months, the winter weather could moderate even more; this would have clear impacts for lake levels, lake-effect snow, harmful algal blooms, and local temperatures during the Spring and Summer months.

Don’t blame it on the rain: On the root causes of Northeast Ohio’s flooding problems

Floodwaters submerged vehicles in the parking lot at Great Northern mall in North Olmsted on May 12 (courtesy of Cleveland.com).

“Après moi, le déluge” – King Louis XV (1710-1774)

Northeast Ohio has a flooding problem, as anyone affected by the severe storms last evening can attest. The region has experienced at least four major flooding events in the past few months, the most serious of which occurred five months ago on May 12, when torrential rains caused widespread flooding in several communities.

As the hydrographs below demonstrate, this severe deluge caused several rivers and streams to overflow their banks throughout the western and southern portions of Greater Cleveland. Flash floods also occurred in several areas; one raging flash flood nearly washed away a vehicle containing legendary meteorologist Dick Goddard, who apparently did not heed that famous National Weather Service saying: “turn around, don’t drown.”

This hydrograph displays the streamflow for three Northeast Ohio rivers – the Vermilion River (red), the Black Creek in Elyria (green), and the Rocky River in Berea (blue) – as measured by the US Geological Survey during May of this year. As you can clearly see, the streamflow in each of these rivers spiked drastically on May 12-13, due to the extreme precipitation during that period. Both the Vermilion and Black Rivers exceeded their respective flood stages (courtesy of USGS).

Who is to blame?

Since these floods occurred, people have been looking for answers or, in many cases, someone to blame. Those individuals whose property and peace of mind were damaged by the floodwaters have been understandably and justifiably upset, even angry. Many of these people have turned their anger at their municipal governments for failing, for one reason or another, to prevent the floods from occurring. This anger bubbled over in some instances, leading to highly contentious public meetings, such as the one in North Olmsted during which a resident got on stage to publicly rebuke officials and call for citizens to sue the city. Residents of other municipalities, including Olmsted Township and Strongsville, are also considering class action lawsuits, accusing their cities of negligence for not investing in adequate infrastructure upgrades.

City officials, for their part, have found a different scapegoat – the rain itself. And there can be no question that the rain in some areas in the past few months has been downright biblical. North Olmsted endured 4.44 inches of rain – more rain than it receives, on average, for the entire month of May – in a couple of hours on the 12th. Put another way, that amount of rain would be equivalent to roughly 44 inches of snow. Strongsville, in turn, saw 3.58 inches of rain that evening, just under its monthly average rainfall of 3.66 inches for May.* The following month, Cleveland suffered a similar fate. The 3.54 inches that fell on June 24 made it the fourth rainiest day for the city in the past century.

Yet, major rainfall events are not uncommon for Northeast Ohio during the summer months; in fact, they are the norm. On average, roughly 40% of the total precipitation in the Midwest each year falls during just 10 days; almost all of these days occur during the summer months, when high heat and humidity can lead to major convective storms. But, what is different is the frequency with which these types of flooding events are occurring. Residents in many of the affected communities have testified that they have experienced floods on a semi-regular basis over the past 10-15 years.

Don’t blame it on the rain…or the sewers

While it may be convenient to blame these floods on the rain, it’s not that simple. As the (handful of) people who have perused this blog in the past have no doubt grown tired of reading, there’s no such thing as a “natural” disaster. Rather, disaster risk is the combination of a natural hazard, our physical and economic exposure to the hazard, and our socioeconomic vulnerability. If 4 inches of rain falls in the middle of an uninhabited tract of some national park in Montana, it does not constitute a disaster. In a sense, for disasters, if a tree falls in the forest, and no one is there to see it, it really doesn’t make a sound.

So, while it may make sense for people to blame inaction by public officials or the heavens for floods, these simply represent the proximate causes of the disaster. We cannot hope to address the real issue at hand by focusing simply on these; that is the equivalent of treating the symptoms of the illness. Rather, we need to focus on the root causes, which one can identify through this disaster risk lens.

Since I cannot readily or adequately examine the various facets of disaster vulnerability for every community affected by this summer’s floods, I want to focus instead on the other two components of the disaster risk triad – natural hazards and exposure. Increases in extreme precipitation events due to climate change and Northeast Ohio’s ongoing sprawl problem, respectively, account for much of the apparent spike in flooding events throughout the region over the past several years. I explore each of these below.

Natural hazard: Climate change and precipitation in Northeast Ohio

Logically, the more rain that falls over an area, particularly within a limited period of time, the higher the likelihood that a flood will occur. We already know that, based on simple physics, as global temperatures increase, the amount of moisture in the air should also rise. According to the Clausius-Clapeyron equation, the atmosphere’s capacity to hold water vapor increases roughly 7% for each 1°C increase in atmospheric temperatures. This should lead to two general outcomes. First, it will take the atmosphere longer to reach its point of saturation, which may lengthen the periods between rain events for many areas, contributing to droughts. Second, because the amount of water vapor available for precipitation also rises, rainfall events should become more extreme. As Dr. Kevin Trenberth put it in a 2007 study (PDF),

Hence, storms, whether individual thunderstorms, extratropical rain or snow storms, or tropical cyclones, supplied with increased moisture, produce more intense precipitation events. Such events are observed to be widely occurring, even where total precipitation is decreasing: ‘it never rains but it pours!’ This increases the risk of flooding.
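The “roughly 7% per 1°C” figure falls straight out of the Clausius-Clapeyron relation. A minimal sketch, using standard textbook constants for water vapor (the exact percentage drifts slightly with temperature):

```python
import math

# Clausius-Clapeyron: saturation vapor pressure scales as
#   e_s(T) ∝ exp(-L / (R_v * T))
L = 2.5e6      # latent heat of vaporization of water, J/kg
R_v = 461.5    # specific gas constant for water vapor, J/(kg·K)

def capacity_increase(t_celsius, dt=1.0):
    """Fractional increase in the atmosphere's water-holding
    capacity for a dt-degree warming starting at t_celsius."""
    T = t_celsius + 273.15
    return math.exp((L / R_v) * (1 / T - 1 / (T + dt))) - 1

print(f"{capacity_increase(15):.1%} per °C")  # → 6.7% per °C
```

Near typical surface temperatures the value hovers between 6% and 7.5%, which is where the “roughly 7%” rule of thumb comes from.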

We are already witnessing this intensification of rainfall in the US, particularly in the Midwest. According to the latest National Climate Assessment (NCA), total precipitation has increased in the Midwest by 9% since 1991. Over the past century, certain parts of the region have seen precipitation totals climb by up to 20%. This increase is due largely to a spike in the frequency of extreme precipitation events. From 1958-2012, the amount of precipitation falling in very heavy downpour events jumped by 37% in the Midwest. This statistic helps to explain why, of the 12 instances in which Cleveland received more than 3 inches of rain in a day during the last century, 7 have occurred since 1994.

heavy downpours by region

One measure of heavy precipitation events is a two-day precipitation total that is exceeded on average only once in a 5-year period, also known as the once-in-five-year event. As this extreme precipitation index for 1901-2012 shows, the occurrence of such events has become much more common in recent decades. Changes are compared to the period 1901-1960, and do not include Alaska or Hawai‘i. (courtesy of Climate Central).

Unless we take action quickly to reduce our carbon emissions, this situation will only get worse in the coming decades. The NCA projects that, under a business as usual scenario (RCP 8.5), Ohio will see such extreme precipitation events four times more frequently by the end of the century.

extreme precipitation events projections

The increase in frequency of extreme daily precipitation events (a daily amount that now occurs once in 20 years) by the later part of this century (2081-2100) compared to the later part of last century (1981-2000) (courtesy of the National Climate Assessment).

Exposure: Sprawl and flooding in Northeast Ohio

I’ve also written extensively in the past about Northeast Ohio’s problems with sprawl-based development (see here for examples). As I wrote one year ago today,

Northeast Ohio has suffered from decades of sprawl and uncoordinated development patterns, leading to waves of suburbanization followed by exurbanization. In 1948, Cuyahoga County’s population stood at 1,389,532; just 26% of land in the county was developed at the time. Yet, by 2002, although the county’s population had grown by a mere 0.32% to 1,393,978, sprawl ensured that roughly 95% of the county’s land area had been developed.

cuyahoga county land use in 1948 & 2002

Changes in land use within Cuyahoga County from 1948 (left) to 2002 (right). Red shading indicates developed land, while the beige indicates land that is still undeveloped. The maps clearly demonstrate the decentralization of the county over the last six decades (courtesy of the Cuyahoga County Planning Commission).

We’ve come a long way since 2002. Sprawl appears to be on its last legs, as the combined effects of the Great Recession, the rise of the Millennial generation, and the gradual retirement of the Baby Boomers have led to a resurgence in the number of people living in walkable urban areas. Multiple sources have proclaimed the end of sprawl; this trend even appears to be taking root in Atlanta.

Cleveland has tried to position itself to follow this emerging trend. The city was recently ranked the 10th most walkable among the 30 largest metro areas, enjoys a 98.3% residential occupancy rate downtown, has unveiled a plan to double the amount of bike routes in the city by the end of 2017, and has seen a rise in transit-oriented development.

Given all of these positive indicators, why would I suggest that sprawl has increased the frequency and intensity of floods over the past decade-plus? Well, simply put, because it has. While one cannot deny these positive indicators, one also cannot ignore the facts.

In its Measuring Sprawl 2014 report, Smart Growth America ranked Cleveland 153rd of 221 metros on its sprawl index. The median score was 100; cities with scores over 100 were more compact, while those with scores under 100 were more sprawling. Cleveland scored an 85.62 (PDF), placing it below other regional metros, including Detroit (12th), Milwaukee (15th), Chicago (26th), Akron (111th), Dayton (116th), Toledo (117th), Pittsburgh (132nd), and Columbus (138th). Cleveland does outperform some other nearby metros, including Indianapolis (158th), Cincinnati (166th), and Youngstown (175th).

Moreover, a recent study out of the University of Utah suggests that from 2000-2010, the Cleveland metro area became even more sprawling (PDF). Using Smart Growth America’s sprawl index, the authors examined the rate of change for the 162 largest metro areas (paywalled) during this period. While Akron actually became 2.7% more compact, Cleveland sprawled by another 13.3%, the 10th worst change of any metro area. Though the city’s number improved since 2010, our 85.62 in 2014 is still lower than the 86.01 that we had 14 years ago.

So why does this all matter for flooding? Well, simply put, areas that follow sprawl-based development models are more likely to suffer from flooding problems. Sprawl increases the percentage of land area that is covered with impervious surfaces, such as parking lots, roads, and driveways. As the extent of impervious surfaces rises, so too does the amount of precipitation that winds up as surface runoff during storms. Forested areas are excellent at controlling stormwater (PDF); trees enable 50% of precipitation to infiltrate the soil and allow another 40% to return to the atmosphere through evapotranspiration. Urbanized areas, in contrast, drastically reduce the amount of water that can infiltrate into the soil, guaranteeing that 35-55% of precipitation ends up as runoff.
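To put those fractions in concrete terms, here is a rough illustration of what the same storm yields as runoff on forested versus urbanized land (the storm size and parcel area are hypothetical; the runoff fractions are the ones cited above, with the urban figure taken at the midpoint of the 35-55% range):

```python
# Runoff from one storm on forested vs. urbanized land.
# Storm depth and parcel area are hypothetical; runoff fractions
# follow the forested (~10% runoff) and urban (35-55%) figures above.
rainfall_in = 4.0                # inches of rain, hypothetical storm
acres = 100                      # hypothetical parcel
GALLONS_PER_ACRE_INCH = 27_154   # one acre-inch of water in gallons

total_gallons = rainfall_in * acres * GALLONS_PER_ACRE_INCH

forest_runoff = total_gallons * 0.10   # ~50% infiltrates, ~40% evapotranspires
urban_runoff = total_gallons * 0.45    # midpoint of the 35-55% range

print(f"forest: {forest_runoff:,.0f} gal; urban: {urban_runoff:,.0f} gal")
# → forest: 1,086,160 gal; urban: 4,887,720 gal
```

On these assumptions, the urbanized parcel sheds about four and a half times as much stormwater as the forested one, which is the basic mechanism linking impervious cover to flooding.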

As Hollis (1975) has shown, urbanization increases the incidence of small flooding events 10-fold (paywalled). Additionally, once roughly 30% of an urban area’s surface is paved, major flood events with return periods of 100 years or more tend to double in magnitude. Northeast Ohio has more than 48,000 acres of impervious surfaces, equivalent to approximately one-third of the region’s land area. Accordingly, we fall directly into that danger zone for major flood events due, in large part, to our development patterns.

Secondly, because so much of the county is already developed, many new developments are being built in existing flood zones. In December 2010, FEMA released its first comprehensive flood zone maps for Northeast Ohio since the 1960s. Unsurprisingly, these maps show a dramatic increase in the number of people living in flood zone areas, due to the outward expansion of development. Thousands of people woke up one day to find out that they had been living in a flood zone, and they were none too happy to learn that they would now have to shoulder some of the cost of that decision by purchasing federal flood insurance. Interestingly, the gentleman who filed the class action lawsuit against Strongsville over the flooding lives in a housing development in one of these flood plains.

Lastly, sprawl directly contributes to climate change by generating additional greenhouse gas emissions. Suburban areas account for 50% of the US’s total emissions, despite being home to less than half of the population. While households in downtown Cleveland produce just 26.5 tons of GHGs annually, that number skyrockets to 85.6 tons for Gates Mills residents. Because transportation accounts for such a high portion of the average family’s carbon footprint in this region, our sprawl problem has directly resulted in additional carbon pollution.

Conclusion

There is no question that flooding represents a real threat to the quality of life of people living in Northeast Ohio. Those individuals who have been directly affected by it have every right to be upset and to demand answers. Unfortunately, however, it appears that we are losing sight of the forest for the trees. Focusing exclusively on the proximate drivers of these floods may seem like a good idea, but it allows us to escape examining the real, underlying root causes. Until we step up and begin to shift our regional development patterns away from those centered on sprawl and rampant fossil fuel use, this flooding problem will only get worse.

 

*It’s worth noting that Strongsville is one of eight suburbs that have sued the Northeast Ohio Sewer District to fight the implementation of its stormwater management program. The case went before the Ohio Supreme Court on Tuesday. Obviously, this action runs directly counter to the city’s interests. While the stormwater management program will lead to an increase in rates, it is also the only chance we have to begin managing runoff as a region, which is essential not only for flood control but for improving our water quality and fighting harmful algae blooms. Additionally, a portion of the revenues from this fee would be made available for cleaning up after floods and helping to prevent future flooding. Perhaps that’s why, after the May 12 storms, North Royalton withdrew as one of the plaintiffs in the Supreme Court case. This region desperately needs the investment that will come from this program, through Project Clean Lake, though I strongly encourage NEORSD to invest a greater portion of the program’s funds into green infrastructure, which is vital for controlling floods and filtering water.

Sorry, Roger Pielke, climate change is causing more disasters

typhoon haiyan damage

Damage in Tacloban from super Typhoon Haiyan (courtesy of The Daily Mail).

Back in March, controversial political scientist Roger Pielke, Jr. published his first post for FiveThirtyEight. The piece centered on the argument that climate change is not contributing to an increase in the scale of disasters globally; rather, Pielke argued, “the numbers reflect more damage from catastrophes because the world is getting wealthier.”

The piece immediately drew consternation and criticism from a number of individuals and even prompted Nate Silver to commission a formal response from MIT climate scientist Kerry Emanuel. In particular, Emanuel and fellow climate scientist Michael Mann criticized Pielke’s decision to normalize disaster damages by GDP. As Emanuel wrote,

To begin with, it’s not necessarily appropriate to normalize damages by gross domestic product (GDP) if the intent is to detect an underlying climate trend. GDP increase does not translate in any obvious way to damage increase; in fact, wealthier countries can better afford to build stronger structures and to protect assets (for example, build seawalls and pass and enforce building regulations). A grass hut will be completely destroyed by a hurricane, but a modern steel office building will only be partially damaged; damage does not scale linearly with the value of the asset.

Pielke’s critics also noted that he used an oddly brief time span for his investigation (1990-2013), that his use of global data tends to cover up significant differences in disaster damages among regions, and that he does not account for disaster damages that have been avoided due to investments in disaster mitigation and risk reduction. There’s also the fact that he includes geological disasters (i.e. earthquakes and volcanic eruptions) in an analysis that purportedly dismisses climate change as a factor; would it really have been that hard to get the original data on climate-related disasters directly from EM-DAT?

Not surprisingly, Pielke and a number of his friends, colleagues, and allies defended the piece, portraying Pielke as the victim of a coordinated witch hunt by climate activists and radical environmentalist bloggers. In an interview with Pielke, Keith Kloor, someone with whom I have disagreed on many occasions but respect, wrote that various commenters had “used slanderous language in an attempt to discredit” Pielke’s work. The basic argument is that few people had any real qualms with the research itself; instead, Pielke’s critics could not escape their personal feelings toward him and allowed those to color their critiques of his work.

Disaster frequency in the Asia-Pacific region

All of this is just an excessively long introduction to a new study published this week in the journal Climatic Change. In the article, researchers Vinod Thomas of the Asian Development Bank, Jose Ramon G. Albert of the Philippine Institute for Development Studies, and Cameron Hepburn of Oxford University (herein known as TAH) “examine the importance of three principal factors, exposure, vulnerability and climate change, taken together, in the rising threat of natural disasters in Asia-Pacific” during the period from 1971-2010.

Now, there are three key reasons why this article piqued my interest and why its results are relevant to the topic at hand, particularly in contrast to Pielke’s research:

  1. The Asia-Pacific region typically accounts for at least a plurality of all disaster metrics – frequency, victims, and economic damages. From 2002-2011, according to EM-DAT (PDF, see page 27), Asia-Pacific was home to 39.6% of disaster events, 86.6% of disaster victims, and 47.9% of economic losses.
  2. The overwhelming majority of disaster events, losses, and victims in Asia result from climate-related disasters. For instance, the region accounts for 40% of all flood events (PDF, see page 6) over the last 30 years, and three-quarters of all flood-related mortality occurs in just three Asian countries – Bangladesh, China, and India.
  3. Both the region’s population and its economy have grown dramatically over the past 40 years. As the figure below demonstrates, East and South Asia have seen GDP per capita growth rates of 8.4% and 5.6%, respectively, easily outpacing other regions. Asia-Pacific is also rapidly urbanizing. From 1950 to 2010, the number of Asians living in urban areas grew sevenfold to 1.77 billion (PDF, see page 32). Many of these individuals live in areas highly exposed to disasters; for instance, 18% of all urbanized Asians live in low elevation coastal zones. Accordingly, if population growth and increased exposure to disaster risk were the ultimate drivers of increasing disaster occurrence, Asia would likely be the test case.

So, does this new research validate Pielke’s assertions that disasters are not becoming more frequent and, if they are (which they aren’t), it has nothing to do with manmade climate change?

In a word, no.

Unlike Pielke, who apparently believes that normalized economic losses represent an appropriate proxy for disaster occurrence, TAH actually examine the frequency of intense disasters over a four-decade period. And whereas Pielke considers damages from geological disasters, which – given the fact that we have not suddenly entered an age of earthquakes – are a function of increasing physical and economic exposure, these authors focus exclusively on climatological (droughts, heat waves) and hydrometeorological (floods, tropical storms, etc.) disasters, which can be influenced by a changing climate.

Moreover, TAH only consider the occurrence of intense disasters, which they define “as those causing at least 100 deaths or affecting the survival needs of at least 1,000 people.” The use of this metric ensures that any increase in the number of observed disasters is unlikely to be the result of better reporting mechanisms alone, countering Pielke’s assertion that any perceived increase “is solely a function of perception.”

TAH explore the frequency of climate-related disasters in 43 Asian-Pacific countries, using both random and country-fixed effects*, which allows them to check the robustness of their results. They use the log of population density as a proxy for population exposure, the natural log of real income per capita as a proxy for socioeconomic vulnerability, and both average annual temperature and precipitation anomalies as proxies for climate hazards. Additionally, they break the data into 5 subregions and the timeframe into decade-long spans as sensitivity tests.
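To make the fixed-effects idea concrete, here is a minimal sketch – not the authors’ actual model or data; the panel dimensions mirror their 43-country, 40-year setup, but the coefficient, noise levels, and data below are invented – of why demeaning each country’s series matters when unobserved country traits are correlated with a regressor:

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.2                      # true (invented) effect of the regressor on disaster frequency
n_countries, n_years = 43, 40   # mirrors the 43 countries x 1971-2010 panel

# Country-specific intercepts (unobserved heterogeneity) that are
# correlated with the regressor -- exactly the case where fixed effects matter.
alpha = rng.normal(0, 2, n_countries)
x = alpha[:, None] + rng.normal(0, 1, (n_countries, n_years))
y = alpha[:, None] + beta * x + rng.normal(0, 0.5, (n_countries, n_years))

def ols_slope(x, y):
    x, y = x.ravel() - x.mean(), y.ravel() - y.mean()
    return (x @ y) / (x @ x)

pooled = ols_slope(x, y)  # ignores country effects -> biased upward here
within = ols_slope(x - x.mean(axis=1, keepdims=True),
                   y - y.mean(axis=1, keepdims=True))  # fixed effects via demeaning

print(f"pooled OLS: {pooled:.2f}, fixed effects: {within:.2f} (true beta = {beta})")
```

The within-transformation subtracts each country’s own mean, wiping out anything time-invariant about that country before estimating the slope – which is why the fixed-effects estimate lands near the true coefficient while the pooled one does not.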

Climate change is increasing the frequency of disasters in Asia-Pacific

Results show that both population exposure and changes in climate hazards have a statistically significant influence on the frequency of hydrometeorological disasters. For each 1% increase in population density and the annual average precipitation anomaly, the frequency of such events increases by 1.2% and 0.6%, respectively. The authors then applied these results to historical trends in three Asia-Pacific states – Indonesia, the Philippines, and Thailand. Based on those coefficients, a moderate increase in the precipitation anomaly of 8 millimeters per month (well within the observed changes for Southeast Asia over the past decade) leads to 1 additional hydrometeorological disaster every 5-6 years for Indonesia, every 3 years for the Philippines, and every 9 years for Thailand.
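As a rough illustration of how an elasticity like that cashes out in extra disasters, here is the back-of-the-envelope arithmetic. The 0.6% elasticity is from the study; the baseline disaster frequency and the percentage change in the anomaly below are hypothetical numbers chosen only to show the calculation, not figures for any actual country:

```python
# Reading an elasticity estimate: % change in frequency per 1% change in anomaly.
elasticity = 0.6 / 1.0

baseline_per_year = 5.0    # hypothetical: disasters per year in some country
anomaly_pct_change = 10.0  # hypothetical: precipitation anomaly rises by 10%

freq_pct_change = elasticity * anomaly_pct_change           # -> 6% more disasters
extra_per_year = baseline_per_year * freq_pct_change / 100  # -> 0.3 extra per year
years_per_extra_disaster = 1 / extra_per_year               # -> one extra every ~3.3 years

print(f"+{freq_pct_change:.1f}% frequency, one extra disaster every "
      f"{years_per_extra_disaster:.1f} years")
```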

In contrast, the models suggest that only changes in precipitation and temperature anomalies affect the frequency of climatological disasters. While a 1 unit increase in the average precipitation anomaly (logically) produces a 3% decrease in the number of droughts and heat waves, an equivalent jump in the annual average temperature anomaly leads to a 72% increase.

It is not surprising that population exposure would play a role in determining the frequency of disaster events; if a tropical storm hits an uninhabited island, it doesn’t get recorded as a disaster. But what is surprising, if you take Pielke at his word, is the clear influence of our changing climate. As TAH conclude,

This study finds that anthropogenic climate change is associated with the frequency of intense natural disasters in Asia-Pacific countries. A major implication of this is that, in addition to dealing with exposure and vulnerability, disaster prevention would benefit from addressing climate change through reducing man made greenhouse gases in the atmosphere.

Ultimately, there can be no question that climate change has changed, is changing, and will continue to change the frequency and nature of disasters globally. That’s not to say that exposure and vulnerability are not playing an important role; we know they are. Bending over backwards to inject climate change into every event and subject, as some climate activists are prone to do, is misleading and irresponsible. But so is cherry picking data to downplay its role in shaping the nature and scope of disaster events, when the data tell us otherwise.

*Obviously I am, by no means, an economist or statistician of any repute. That said, here is how TAH define the difference between random and country-fixed effects: “In panel data analysis, while the random effects model assumes that individual (e.g. country) specific factors are uncorrelated with the independent variables, the fixed effects model recognizes that the individual specific factors are correlated with the independent variables.” Accordingly, because the country-specific factors are likely correlated with the independent variables, one cannot safely assume that the latter are completely exogenous.

Will climate change disasters really lead to more conflict? Maybe.

naval station pensacola
naval station pensacola

Damage to Naval Air Station Pensacola following Hurricane Ivan in 2004 (courtesy of Wikimedia Commons).

The US military has devoted a considerable amount of attention to climate change, which makes sense given the various risks it poses to military operations. These risks include potential increased demand for humanitarian responses to climatic disasters and the threat of climatic changes, such as stronger tropical storms and sea level rise, to existing military installations. For instance, Hurricane Ivan knocked one of the Navy’s key bases, Naval Air Station Pensacola, out of commission for a year.

Climate change’s most severe potential military threat – increasing the risk of violent conflicts – is also its least likely, by far. Yet, unsurprisingly, this has gotten the lion’s share of attention from the media.

Last week, Eric Holthaus at Slate published an interview with retired Navy Rear Admiral David Titley. The piece is worth a read. I would say the Rear Admiral’s comments accurately reflect the views of many military officials who are concerned about climate change.

Let me just preface this by noting that Rear Admiral Titley has forgotten a hell of a lot more about military strategy, history, and operations than I will ever learn. But I do take exception with the way that he framed the issue:

Let me give you a few examples of how that might play out. You could imagine a scenario in which both Russia and China have prolonged droughts. China decides to exert rights on foreign contracts and gets assertive in Africa. If you start getting instability in large powers with nuclear weapons, that’s not a good day.

Here’s another one: We basically do nothing on emissions. Sea level keeps rising, three to six feet by the end of the century. Then, you get a series of super-typhoons into Shanghai and millions of people die. Does the population there lose faith in Chinese government? Does China start to fissure? I’d prefer to deal with a rising, dominant China any day.

If you take Rear Admiral Titley’s comments at face value, you’d be forgiven if you came away believing that climate-related disasters may inevitably spawn violent conflict. This is an all-too-common perception, one to which I used to subscribe.

What can we say about disasters and conflict?

But the fact remains that nothing about disasters inherently leads to conflict. Quite the opposite, really. There have been a few studies that find such a connection, including one from 2008 by Philip Nel and Marjolein Righarts (PDF), who examined the connections between various forms of disasters and the risk of civil conflict onset. They found that disasters increase the likelihood that civil conflict will occur. Such disasters may create incentives for rebel groups to attack state institutions, or they can generate new grievances from heightened resource scarcity.

But an array of studies dispute these findings. Back in the 1960s, sociologist Charles Fritz suggested that disasters often alter social relations and help to mitigate pre-existing cleavages within communities. If securitization requires the existence of an “other” against which people can organize, the disaster itself may take that role, leading to the development of a “common community of sufferers” that promotes social cohesion and cooperation.

Ilan Kelman has further suggested that this ameliorative effect can take place at both intra and interstate levels, leading to “disaster diplomacy.” He has cataloged dozens of examples of disaster diplomacy, ranging from earthquakes in Greece and Turkey (PDF) to the aftermath of the 2004 Indian Ocean tsunami (paywall) in Aceh. Overall, the preponderance of evidence suggests that disasters do not inherently precipitate violence.

So can disasters lead to conflict?

Disasters, on their own, are highly unlikely to cause conflict. But the politics of the disaster response, or the lack thereof, is a different story. Think of Hurricane Katrina; it wasn’t the storm itself that caused so much outrage and discord, but the massive failure of the Bush administration to respond adequately to the needs of survivors.

Weak governments that are poorly equipped and lack sufficient international support are unlikely to respond effectively to disasters. This outcome could potentially anger survivors and provide them with incentives to take up arms, perhaps in an attempt to seize additional resources. But the real problem emerges when governments intentionally divert relief aid for their own gain or to serve their own political ends.

On December 23, 1972, a devastating earthquake rattled Managua, the capital of Nicaragua. The quake destroyed three-quarters of the city’s housing stock and killed at least 11,000 Nicaraguans as they slept. Strongman Anastasio Somoza immediately began to abuse his power to take advantage of the catastrophe. According to a 2010 Miami Herald article,

Somoza began directing reconstruction efforts from a family estate on the outskirts of Managua. Cabinet ministers, businessmen, foreign officials and international relief bosses — many of them addressing Somoza as “Mr. President” — trooped in and out all day long. It was Somoza with whom foreign diplomats negotiated aid packages; it was Somoza who decided Managua would be rebuilt.

While later independent investigations cast some doubt upon the scale and significance of the profiteering, it left an indelible mark upon Nicaraguans. As evidence of the corruption mounted, even the conservative Catholic Church turned on the regime. These events contributed to a resurgence of the Sandinista movement, which formally took up arms three years later.

managua earthquake damage

An aerial image of the damage to Managua following the devastating 1972 earthquake (courtesy of the US Geological Survey).

Evidence suggests that inadequate and/or politically motivated disaster responses may have fed into subsequent conflict in Bangladesh (following the 1970 Bhola cyclone), Guatemala (after the 1976 Guatemala earthquake), and Sri Lanka (after the Indian Ocean tsunami).

How else might disasters spawn conflict?

When conflict occurs in the wake of disasters, it is not always an unintended and unforeseen consequence. In fact, according to Travis Nelson, it can actually be a survival tactic (paywall) employed by weak states. Nelson suggested that weak leaders may be more likely to launch small, diversionary conflicts in order to distract from inadequate disaster responses and generate nationalistic solidarity.

In July 1959, severe flooding occurred along the Yellow River, killing approximately two million Chinese. The disaster occurred at a time when the Maoist regime was weak and dealing with several crises, including the catastrophic Great Leap Forward. In the midst of these crises, the regime was unprepared for the floods, and Chinese elites began openly to question Mao’s rule. In response, the regime launched a series of border skirmishes with India, which eventually fed into the 1962 Sino-Indian War. The war aroused nationalist fervor and distracted from other challenges.

But do disasters really cause conflict?

In a word, maybe. But now we’re wading into a difficult and highly complex area: endogeneity. In statistical modeling, a variable is said to be endogenous when it is itself affected by other variables within the model. In other words, we cannot truly isolate the variable from those other factors, making it difficult to pin down its independent effect.
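A toy simulation shows why this matters. Everything below is invented for illustration: we build a world where an unobserved factor (call it “fragility” – weak institutions) drives both disaster exposure and conflict, and where disasters have zero direct effect on conflict, then watch a naive regression find a spurious “disaster effect” anyway:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Toy model: 'fragility' raises both disaster vulnerability and conflict
# risk. The true direct effect of disasters on conflict is set to ZERO.
fragility = rng.normal(0, 1, n)
disaster = fragility + rng.normal(0, 1, n)        # disasters hit fragile states harder
conflict = 1.0 * fragility + rng.normal(0, 1, n)  # conflict driven by fragility alone

def slope(x, y):
    x, y = x - x.mean(), y - y.mean()
    return (x @ y) / (x @ x)

naive = slope(disaster, conflict)  # ~0.5: spurious "disaster causes conflict" effect

# Controlling for fragility (regressing it out of both sides) recovers ~zero.
resid_d = disaster - slope(fragility, disaster) * fragility
resid_c = conflict - slope(fragility, conflict) * fragility
controlled = slope(resid_d, resid_c)

print(f"naive: {naive:.2f}, controlling for fragility: {controlled:.2f}")
```

The catch, of course, is that in the real world we rarely observe “fragility” cleanly enough to control for it – which is exactly the endogeneity problem.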

As I’ve written before and will continue to say until I’m blue in the face, there’s no such thing as a natural disaster. Disaster events are inherently shaped and controlled by the extant political, economic, and social environments. As a result, disasters do not occur in a vacuum, and we can’t treat them as such. So even if it seems likely that a disaster helped cause a conflict, it would be difficult to say that it was an exogenous effect, as its effects would likely be influenced by existing political and social dynamics.

While it seems to be true that the 1976 Guatemala earthquake helped reignite civil war (paywall), it’s also true that the vulnerability of Mayan peasants to the earthquake’s effects was dictated by structural inequalities and existing violent conflict. So can we really say that the quake caused the subsequent return to war? Yes. No. Maybe. Honestly, it depends on how you define “cause.”

Did climate change cause Syria’s civil war then?

Keith Kloor hammered this point home in his recent post on the question of whether Syria’s drought caused its brutal civil war. Kloor takes Tom Friedman to task for suggesting that the Assad regime’s response to the drought helped fuel the war but failing to acknowledge that the regime’s actions also helped facilitate the drought. He quotes, at length, from an article last year where authors Jeannie Sowers and John Waterbury argue,

When terms such as ‘stressor’ or ‘threat multiplier’ are applied to drought, shifting rainfall patterns, floods, and other environmental events in the Middle East, they often obscure rather than illuminate the causes of uprisings and political change. There is perhaps no better illustration of this dynamic than Syria, where a closer examination shows that government policy helped construct vulnerability to the effects of the drought during the 2000s. State policies regarding economic development, political control in rural areas, and water management determined how drought impacted the population and how the population, in turn, responded.

So yes, the regime’s response to the drought – which may have been driven by climate change – helped incite the rebellion. But the regime’s policies also helped drive the drought in the first place. So did the drought and – by extension – climate change cause the rebellion? Yes. No. Maybe. It depends on how you define “cause.”

So will climate change really be different then?

Probably. In a 1987 article, Beverley Cuthbertson and Joanne Nigg consider under what circumstances a disaster may produce discord among survivors (paywall), which they term a “nontherapeutic community.” They find that, unlike with geological and weather disasters, victims do not see manmade disasters, like chemical spills, as natural. Accordingly, survivors often disagree as to whether a disaster has actually occurred and who is accountable. These disagreements can lead to the emergence of “victim clusters,” elevating tensions. In extreme circumstances, this could potentially lead to violence.

Given the fact that climate change is unequivocally manmade and that it has increasingly been linked to disasters, like droughts and heatwaves, it’s possible that climate-related disasters could be different. Disaster survivors could point to climate change’s fingerprints in the events that damage their livelihoods and use it as a call to take up arms. It seems unlikely, but it’s hard to be sure. Clearly, manmade climate change is different than anything we have dealt with in the past, and it is likely to change our calculus on these issues.

We may be able to say, to this point, that disasters probably don’t directly cause conflict, but as my grad school professor Ken Conca always says, you should be careful about driving forward by looking through the rear view mirror.

How global warming will cause more lake-effect snow

lake erie ice
lake erie ice

Ice engulfs most of the surface of Lake Erie on January 10, following the severe polar vortex event a few days prior (courtesy of Discover Magazine).

Today may be the first day of spring, but winter’s icy grip continues to linger for most of us in the Midwest. But as we move – we hope – into warmer weather, NOAA has provided an overview of the winter from which we have emerged. It released its latest monthly state of the climate data last week, which also included the data for this year’s meteorological winter (December-February).

Unsurprisingly, the report reveals that this winter was cold, but far from historically so. It was just the 34th coldest on record, and no state recorded its coldest winter ever. In contrast, California had its warmest winter ever, and Arizona had its fourth warmest. As the AP’s Seth Borenstein put it, this winter demonstrated a “bi-polar winter vortex.”

One climatological variable that did reach near-record levels was the extent of lake ice on the Great Lakes. Due to the spate of below-freezing temperatures in the Great Lakes region, ice cover reached a maximum of 91% this winter, far higher than the long-term average of 51.4%. Lake Erie, which typically has the highest level of ice cover of the five lakes, jumped from just 5% ice cover on New Year’s Day to more than 80% ten days later as a result of the polar vortex on January 6-7.

One year does not make a trend, though. According to the National Climate Assessment (PDF), surface water temperatures have increased dramatically in the Great Lakes since the mid-20th century. From 1973-2010, annual Great Lakes ice cover fell by 71%, a startling downward trend despite the noisy year-to-year variation. Additionally, the IPCC has noted the duration of lake ice throughout the entire Northern Hemisphere has decreased by approximately two weeks since the middle of the 20th century.

great lakes ice cover trend

The average percentage of the Great Lakes covered with ice decreased dramatically from 1973-2010 (courtesy of the National Climate Assessment).

Cleveland’s winter was largely in line with the overall regional trend. January-February temperatures were 7.1ºF below the historical average, making it the sixth coldest such span in the past 75 years. Cleveland also recorded 65″ of snow this winter, 36% higher than average. That number would likely have been higher, however, had the brutal temperatures not iced over the lakes, effectively shutting down the lake-effect snow conveyor belt.

That got me thinking – as global warming continues to warm the lakes, could the Great Lakes region actually see more lake-effect snow?

Lake-effect snow

As Kunkel et al note (paywall), the presence of the Great Lakes provides the necessary heat and moisture to generate precipitation where none would otherwise exist. Lake-effect snow constitutes a major part of winter in the Great Lakes region, where it can account for up to 50% of winter precipitation (PDF).

Because lake-effect snow occurs (paywall) as a result of “the destabilization of a relatively cold, dry air mass by heat and moisture fluxes from a comparatively warm lake surface,” ice cover can significantly influence the amount of lake-effect snow that falls in a typical winter. The existence of open water in the winter allows the development of a “significant surface-atmosphere temperature gradient,” which facilitates the development of lake-effect events. Accordingly, because we know that global warming has reduced and will continue to reduce lake ice extent, it should also generate more lake-effect snow in the Great Lakes region.

The evidence

So what does the evidence say? Do we have research to back up this seemingly counterintuitive outcome? In a word, yes.

In a 2003 study, Burnett et al examined long-term changes in lake-effect snowfall (paywall) and compared them with October-April snowfall totals and air temperatures from 1931-2001. According to the authors, lake-effect snowfall totals increased significantly at 11 of 15 sites studied. Their research also demonstrated that lake surface temperatures increased significantly at the majority of sites examined. Consequently, they concluded that

[I]ncreased lake-effect snowfall is the result of changes in Great Lakes whole-lake thermal characteristics that involve warmer lake surface waters and a decrease in lake ice cover.

While the Burnett et al piece is now more than a decade old, several additional studies have largely supported its findings. Vavrus, Notaro, and Zarrin examined how ice cover affected a subset of 10 heavy lake-effect snow (HLES) events in order to quantify the impact of the ice. They found that “the suppression of open water [i.e. expansion of ice cover] on the individual lakes causes over an 80% decline in downstream” HLES. Ice cover on Lake Erie lowered snowfall by 73%, while Lake Michigan saw a reduction of nearly 100%.

The authors note that ice cover reduces heat fluxes over the lakes, lowers atmospheric moisture, stymies cloud formation, and depresses near-surface air temperatures. All of these changes can suppress lake-effect. For all five lakes, complete ice cover reduces downstream snowfall by 85%. As a result,

The results of the current study suggest that this change toward more open water should favor significantly greater lake-effect snowfall.

Wright, Posselt, and Steiner conducted a similar study, examining the relative amount and distribution of snowfall under four different models: a control (observed lake ice in mid-January 2009), complete ice cover, no ice cover, and warmer lake surface temperatures. The authors also show that moving from complete ice cover to no ice cover dramatically increases lake-effect totals. The total areas seeing small (≤2mm) and large (≥10mm) lake-effect events increase by 28% and 93%, respectively. In contrast, while elevated lake surface temperatures do not increase the area affected by lake-effect, they do tend to increase the amount of heavy snowfall; areas that already experienced HLES saw 63% more snowfall.

It remains important to note that, while higher lake surface temperatures and reduced ice cover should lead to more lake-effect snow during the coming decades, a decrease in the number of cold-air outbreaks could work to counter this effect. But if lake-effect increases, as the preponderance of evidence suggests it will, it could carry major additional economic costs for Great Lakes states. According to a study of 16 states and two Canadian provinces from IHS Global Insight, snowstorms can cost states $66-700 million in direct and indirect losses per day if they render roads impassable. Great Lakes states had among the most significant losses, with Ohio forfeiting $300 million per day and New York leading the pack at $700 million.

Additional major lake-effect events will only serve to drive up this price tag even more, further constraining limited state and municipal budgets well into the future.