El Niño is here. What will it mean for Great Lakes ice cover?


Over the weekend, temperatures finally climbed above 40ºF in Cleveland. Given that the average temperature in February was all of 14.3ºF – by far the coldest February in our history – the mid-40s felt like a heat wave.

My fiancée and I decided to venture outside and headed down to Edgewater Park on Cleveland’s West Side. Edgewater, as the name suggests, sits along Lake Erie. We wanted to take the opportunity to see the lake before the ice really began to melt. Due to the frigid winter, the Great Lakes were once again covered in a thick layer of ice this year. Though we will likely remain just shy of last year’s mark, ice cover reached a peak of 88.8% on February 28. With temperatures set to continue running at or above normal, that number should keep dropping until the lakes are ice free sometime in late spring; it has already fallen by more than 20% in the past 10 days.

We were far from the only people with this idea. While neither of us planned to actually head out onto the ice, we eventually decided to follow the pack. Someone had even decided to set up a tent on the ice a few hundred feet off shore to serve soup and coffee to passersby. At the time, I had no idea how thick the ice we were walking on actually was. I flippantly estimated that it was several feet thick – a testament to my ignorance. I have since discovered, from the map below, that we were likely standing on a sheet of ice roughly 40 centimeters thick. Fortunately, that is thick enough to support a car.
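For what it’s worth, a rough back-of-the-envelope check supports that claim. A commonly cited rule of thumb for ice bearing capacity (Gold’s formula, P ≈ A·h², with h in centimeters and A an ice-quality factor of roughly 3.5–7 kg/cm² for sound, clear ice) puts the safe load on 40 centimeters of ice well above the weight of a typical car. Here is a minimal sketch of that arithmetic; the constant and the car weight are illustrative assumptions, not measurements.

```python
# Rough sanity check of ice bearing capacity using Gold's formula, P = A * h**2,
# where h is ice thickness in centimeters and A is an empirical quality factor
# (about 3.5 kg/cm^2 is a conservative value for sound, clear lake ice).
# All numbers below are illustrative assumptions, not measurements.

A_CONSERVATIVE = 3.5       # kg/cm^2, conservative factor for clear ice
ICE_THICKNESS_CM = 40      # thickness reported on the GLERL map
TYPICAL_CAR_KG = 1_500     # rough mass of a midsize car

safe_load_kg = A_CONSERVATIVE * ICE_THICKNESS_CM ** 2
print(f"Approximate safe load: {safe_load_kg:,.0f} kg")            # ~5,600 kg
print(f"Supports a typical car: {safe_load_kg > TYPICAL_CAR_KG}")  # True
```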

lake erie ice thickness march 9, 2015

Courtesy of NOAA’s Great Lakes Environmental Research Laboratory

El Niño arrives – finally

Just as the forecast was beginning to take a turn for the better last week, NOAA made headlines by announcing that El Niño had finally arrived. Forecasters had been warning about its impending onset for more than a year, so the announcement wasn’t exactly a surprise. As I stood on the ice last weekend, I couldn’t help but wonder how this phenomenon might affect ice cover next winter.

El Niño is the warm phase of the El Niño Southern Oscillation (ENSO), during which a band of warm water forms in the mid-tropical Pacific Ocean. The phenomenon is characterized by high air pressure in the western Pacific and low air pressure in the eastern reaches of the ocean. As Eric Holthaus notes at Slate,

Technically, for an official El Niño episode, NOAA requires five consecutive three-month periods of abnormal warming of the so-called Nino3.4 region of the mid-tropical Pacific, about halfway between Indonesia and Peru. It usually takes a self-reinforcing link-up between the ocean and the atmosphere to achieve this, and it finally appears the atmosphere is playing its part.
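In practice, that criterion amounts to a simple check on the Oceanic Niño Index (the three-month running mean of sea surface temperature anomalies in the Niño 3.4 region): an episode requires at least five consecutive overlapping seasons at or above +0.5ºC. Here is a minimal sketch of that check, assuming you already have the seasonal anomaly values; the sample numbers are placeholders, not actual NOAA data.

```python
# Minimal sketch of NOAA's El Nino criterion: at least five consecutive
# overlapping three-month seasons with Nino 3.4 SST anomalies (the ONI)
# at or above +0.5 C. The anomaly values below are placeholders, not real data.

THRESHOLD_C = 0.5
MIN_CONSECUTIVE_SEASONS = 5

def is_el_nino_episode(oni_values, threshold=THRESHOLD_C, min_run=MIN_CONSECUTIVE_SEASONS):
    """Return True if any run of min_run consecutive ONI values meets the threshold."""
    run = 0
    for value in oni_values:
        run = run + 1 if value >= threshold else 0
        if run >= min_run:
            return True
    return False

# Placeholder seasonal ONI values (DJF, JFM, FMA, ...) for illustration only
sample_oni = [0.2, 0.4, 0.5, 0.6, 0.6, 0.7, 0.5, 0.3]
print(is_el_nino_episode(sample_oni))  # True: five straight seasons at or above 0.5
```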

Generally speaking, El Niño brings above average temperatures to the Great Lakes region. Moreover, because the oceans have been storing vast amounts of heat over the past decade-plus, helping to limit the rate of global warming, a particularly strong El Niño could lead to a dramatic transfer of stored heat from the oceans to the surface. As a result, many observers are predicting that 2015 will be the warmest year on record.

El Niño and Great Lakes ice cover

It would be logical to assume that the onset of El Niño will limit the amount of ice that forms on the lakes. According to a 2010 NOAA study, from 1963-2008, 11 out of 16 El Niño winters saw below average ice cover. During these 16 winters, ice covered an average of 47.8% of the Great Lakes, considerably lower than the long-term annual average of 54.7%. As Raymond Assel, a scientist with NOAA’s Great Lakes Environmental Research Laboratory (GLERL) wrote in 1998 (emphasis from original):

On average, the average annual regional temperature is likely to be higher (approximately 1.2ºC) and the annual regional maximum ice cover is likely to be less extensive (approximately 15%) during the winter following the onset year of a strong warm ENSO event.

But the connection between El Niño and ice cover is not quite so straightforward. In fact, three winters – 1970, 1977, and 1978 – saw above average ice cover, despite occurring during El Niño events. Ice cover during the latter two years exceeded 80%.

So what else is at play? Well, according to the literature, three factors must combine to produce a particularly mild winter for the Great Lakes region and, by extension, lead to extremely low ice cover like we saw in 1998, 2002, and 2012: the strength of the El Niño event and the modes of the Arctic and Pacific Decadal Oscillations. Let’s take a look at these three indicators to get a sense of what might be in store.

El Niño strength

Multiple studies have found that the relationship between these two factors is highly nonlinear. As this chart from Bai et al. (2010) shows, the scatter plot for ice cover and El Niño strength follows a parabolic curve. Accordingly, El Niño does tend to limit ice formation, but its effect is only significant during strong events.

Relationship between El Niño strength and Great Lakes ice cover (from Bai et al. 2010).
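To make the parabolic shape concrete, a relationship like the one in the chart can be summarized by fitting a second-degree polynomial to winter Niño 3.4 anomalies and annual maximum ice cover. The sketch below shows the form of that calculation; the arrays and variable names are illustrative placeholders, not the study’s data.

```python
# Sketch of the kind of quadratic (parabolic) fit Bai et al. describe between
# winter Nino 3.4 anomalies and Great Lakes maximum ice cover. The arrays below
# are illustrative placeholders, not data from the study.
import numpy as np

nino34_anomaly = np.array([-1.5, -0.8, -0.2, 0.0, 0.5, 1.0, 1.8, 2.3])      # deg C
max_ice_cover = np.array([60.0, 65.0, 55.0, 58.0, 52.0, 50.0, 35.0, 25.0])  # percent

# Fit ice_cover = a*x^2 + b*x + c; the curvature term is what makes the effect
# of El Nino on ice cover meaningful mainly at the strong end of the range.
a, b, c = np.polyfit(nino34_anomaly, max_ice_cover, deg=2)
predicted_strong_event = np.polyval([a, b, c], 2.0)
print(f"quadratic coefficients: a={a:.2f}, b={b:.2f}, c={c:.2f}")
print(f"predicted max ice cover for a +2.0 C winter: {predicted_strong_event:.1f}%")
```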

But the current signs do not point to a strong event. As Brad Plumer explained for Vox,

Back in the spring of 2014, it really did look like a strong El Niño would emerge later in the year…

But then… things got messy. Atmospheric conditions over the Pacific Ocean didn’t shift as expected. Specifically, scientists weren’t seeing the change in atmospheric pressure over both the eastern and western Pacific that you’d expect during an El Niño.

As a result, NOAA appears to be tempering expectations about the strength and duration of this event. It is likely to be relatively weak and last through the summer, potentially limiting its impacts on the Great Lakes.

Arctic Oscillation

The Arctic Oscillation (AO) is among the most important factors that determine the severity of winter in the Great Lakes region. The AO is “a climate pattern characterized by winds circulating counterclockwise around the Arctic at around 55°N latitude.” During its positive phase, strong winds around the North Pole effectively lock Arctic air in the polar region, helping to moderate winters. But in its negative phase, these westerly winds weaken, allowing Arctic air to travel farther south; this is the phenomenon behind the polar vortex outbreaks we have seen in the past two winters.

Accordingly, less ice forms on the Great Lakes during the positive phase of the AO. From 1963-2008, positive AO winters were 0.9-1.8ºC warmer than normal and saw a mean ice cover of 49.2%. The combination of an El Niño and a positive AO produced the five lowest ice cover totals during this period.

So where does the AO stand? Currently, it is in a positive phase. Unfortunately, it is difficult to determine whether this phase will persist, as the AO can fluctuate widely. But if this oscillation does remain in a positive phase next winter, it would amplify the effect of the weak El Niño.

Pacific Decadal Oscillation

Winter weather is also influenced by the Pacific Decadal Oscillation (PDO), “a long-lived El Niño-like pattern of Pacific climate variability” that helps determine sea surface temperatures in the North Pacific. Rodionov and Assel (2003) concluded that the PDO helps to modulate the impact of ENSO on the Great Lakes. Warm phases of the PDO tend to amplify the impact of El Niño and reduce ice cover.

Last year, the PDO emerged from a prolonged negative phase to reach record high levels. If it remains strongly positive, it will likely lead to warmer temperatures not just next winter, but potentially for the next 5-10 years. This would seem to suggest that the PDO will enhance the impact of the El Niño event next winter.

Conclusion

Overall, the picture is still a bit murky. It does not appear that this El Niño will be strong enough to produce the kind of record-low ice cover we saw in 2012. Yet, at the same time, the combined effects of El Niño, a positive AO (should it remain that way), and a warm PDO (if the trend continues) will likely ensure that the Great Lakes region avoids another brutal winter like the ones we’ve seen two years running. If so, lake ice cover should regress closer to the long-term mean of approximately 50%.

But if the indicators strengthen in the next several months, the winter weather could moderate even further; this would have clear implications for lake levels, lake-effect snow, harmful algal blooms, and local temperatures during the spring and summer months.

How focusing on climate could make us miss the forest for the trees

mosul dam

Iraq’s Mosul Dam (courtesy of the AP).

If you haven’t read my last post on why we need to integrate climate change into disaster risk reduction, read that first. I’ll wait. And, while you’re at it, read my other post on integrating DRR into the sustainable development goals.

As you’ll recall from my last post, I outlined new research arguing that we need to integrate climate change into disaster risk reduction. In this post, I want to explore Syria within this context.

Last week, PNAS released a major study (paywalled) linking climate change to the historic drought that may have contributed to the ongoing violent conflict in Syria. Unsurprisingly, the study has generated a lot of attention, garnering significant coverage from The New York Times, National Geographic, Slate, Mother Jones, and the Huffington Post, among other outlets.

The debate over the Syria study

Given the highly contentious nature of the climate change and conflict debate (see more from me on this here and here), there has been some blowback, most prominently from Keith Kloor at Discover. In his second post on this debate, Kloor finds some dissenting voices on the study, including Edward Carr, director of the Humanitarian Response and Development Lab (HURDL) at the University of South Carolina. Carr objected to the general view within the media that this study represents proof of the connection between climate and Syria’s violence. As he noted,

I think the translation of this drought into conflict is pretty weak. Basically, they plumb the conflict literature to support really general statements like “The conflict literature supports the idea that rapid demographic change encourages instability.” No kidding – not sure a citation was needed there. But the causal chain between climate change, drought, displacement, and conflict is long and crosses several bodies of data/evidence, all of which are uncertain. The compounding uncertainty in this causal chain is never addressed, so I can’t tell if it is offsetting (that is, some parts of the causal chain address weaknesses in other parts, thereby making the connection throughout the chain stronger) or compounding. I doubt the authors know, either. Basically, I don’t understand how you can get any real understanding of the likely contribution of climate change to this conflict via this mechanism.

Some members of the media who covered the study objected to the criticisms leveled against them. And, to be fair, both sides make valid points. The media coverage of this study has been far more measured and accurate than in the past. At the same time, the critics are also correct that this study does not prove that climate change caused the Syrian civil war and that we need to be careful when saying it did.

Because I tweet entirely too much, I waded into this debate in the form of a lengthy exchange with Kloor, Neil Bhatiya from The Century Foundation, and Brian Kahn of Climate Central. In it, Kahn asked an important question: Does discussing the role of climate change really detract from focusing on the other drivers of the conflict?

It is in this context that I want to discuss the Kelman, Gaillard, and Mercer paper. In the paper, KGM argue that the extensive focus on climate change sometimes allows it to “dominate” other drivers of vulnerability and disaster risk. Climate change can drive both hazards and vulnerabilities, two of the components in the disaster risk triad, but the question of whether climate “is a more significant or a less significant contributor than other factors…depends on the specific context,” and we should not focus on it to the detriment of other contributors. We cannot miss the forest for the trees.

What KGM means for the Syria study

Here I want to turn to another issue – the policy implications of the PNAS study. For the most part, the media coverage of the paper does not discuss what policymakers are supposed to do with this information. How should it shape their interventions in Syria? What lessons should they glean for the future? Carr’s colleague at HURDL, Daniel Abrahams, noted the problem therein, saying “I would guess policy makers see this paper as a distraction; something that fills their inbox with people tangentially paying attention to climate issues.”

I’ve been thinking a lot about this question over the past week, and I would argue that it is here that the KGM study’s emphasis on placing climate change in its proper context can be particularly valuable. Let’s assume for a minute that USAID wanted to operationalize the Syria study as the basis for an intervention in the region. If the agency focused on the role that climate change played in driving the conflict, it might conclude that it should invest in projects that can provide reliable clean energy and drinking water to Syria’s crowded urban centers and irrigation water to its hard-hit farmers. What project meets all of those criteria? Why, a dam, of course.

USAID actually has a track record of funding the construction of dams in drought-affected, fragile states within the region, including Iraq’s Mosul Dam and the Kajaki Dam in southern Afghanistan. Accordingly, funding this type of project would not be out of the realm of possibility, and it would likely make sense when viewed through a climate lens. So what could go wrong?

Syria’s complicated hydropolitics

Well, in a word, lots. The climate lens fails to account for the geographic and political environment in which Syria sits. Syria is the midstream party for the Euphrates River, sitting between its upstream neighbor (Turkey) and its downstream neighbor (Iraq). Additionally, the Tigris River forms the border between Syria and Turkey as it heads southeast into Iraq. Disputes over water allocations from the rivers have undermined relations among the three parties for decades.

The complicated hydropolitics within the region often center on the Kurds. Turkey has embarked on a massive river basin development scheme, the Southeast Anatolia Project (GAP), which will see it complete 22 dams and 19 power plants. Turkey’s Kurdish minority sees GAP as just another attempt to drown their cultural identity and weaken the Kurdistan Workers’ Party (PKK). Turkey’s dam building has long been a point of contention for Syria and Iraq. Syria has supported the PKK as a proxy in its battles with Turkey over water allocations, while Turkey invaded northern Iraq in 1997 to attack Kurdish rebels stationed there. Syria and Iraq have also fought among themselves over water issues, with both countries dispatching troops to the border in 1975.

Clearly, the construction of one or more dams could further inflame the region’s hydropolitical tensions. Furthermore, the dam itself may become entangled in the conflict. The Taliban has launched a number of attacks against American and British forces at the Kajaki Dam. ISIS, for its part, has made Iraq’s dams major targets. Its capture of the Mosul Dam, which observers have dubbed “the moment IS ascended from a dangerous insurgent group to an existential threat to Iraq,” was among the major factors that drew the US into the conflict. Any militants who remained in Syria would likely see our hypothetical dam in this same light.

Lastly, new dam projects in the region would likely create widespread, deleterious consequences for Syrians and Iraqis living downstream. Large dams have displaced 40-80 million people worldwide and created a whole host of social and environmental problems. One need look no further than Iraq to see how dams can destroy livelihoods. Following the First Gulf War, Saddam Hussein used dams to drain the Mesopotamian Marshes in order to punish the Ma’dan people. The UN Environment Programme has called this episode “a major environmental catastrophe that will be remembered as one of humanity’s worst engineered disasters.”

While it’s true that climate change will alter conflict dynamics and act as a threat multiplier going forward, we cannot allow this risk to blind us to the other critical considerations at play.

Why we need to link disaster risk reduction to the sustainable development goals

disaster mortality since 1990

Trends in disaster mortality since 1990 (courtesy of the Global Assessment Report 2015).

I know I said that my next post would be on the Syria climate change & conflict paper; that’s coming next, I promise. But I wanted to finally get around to cross-posting this piece I wrote for the World Conference on Disaster Risk Reduction first, because it completes the logical chain I started in my last post – climate change feeds into DRR which feeds into sustainable development.

As we enter the year 2015, we approach the final target date for the Millennium Development Goals (MDGs). In many regards, the MDGs have been successful. The share of people living on less than $1.25 per day fell from 47% to 22% by 2010; the global burden from HIV/AIDS and malaria has been ameliorated significantly; and more than 2 billion people have gained access to clean water.

Despite these successes, the international community has been unable to halt environmental degradation. Though MDG 7 called for integrating the principles of sustainable development and reducing biodiversity loss, the destruction of critical ecosystems, such as wetlands and tropical forests, continues apace. Additionally, global carbon emissions have increased by 34% since 1990. Failing to stem this tide could reverse many of the gains made through the MDGs. As Secretary General Ban Ki-moon said last April, “Climate change is the single greatest threat to a sustainable future.”

It is for this reason that the international community proposed developing a set of so-called Sustainable Development Goals (SDGs) at the Rio + 20 Conference. These SDGs will pick up where the MDGs left off and further embed the principles of sustainable development and environmental protection.

But, just as we cannot hope to promote sustainable development without addressing climate change, we cannot expect to achieve the SDGs without tackling the threat posed by disasters. The number of disasters worldwide has spiked in recent years, increasing from roughly 100 disasters per decade during the first half of the 20th century to 385 per year from 2000-2010.

Strangely, a consensus appears to have emerged among some economists that disasters may have limited macroeconomic impacts and can actually be beneficial in the long run. According to this theory, disasters tend to have a stimulative effect on economies by sparking large-scale reconstruction efforts and attracting financial support from the international community.

From an economic perspective, there are several shortcomings to this theory. It assumes that disasters are exogenous events, rather than part of the normal political and economic order. It also assumes that disasters significantly alter the existing economic order, with destroyed capital being replaced by newer, more productive technology. Yet low- and middle-income states lack the capacity to replace their productive capital on a broad scale, and most developing countries lack the human capital needed to take full advantage of such new technologies. Accordingly, even when disasters provide a boost in the short term, economies should ultimately return to their pre-disaster state.

More importantly, this theory ignores the political economy of disasters, which disproportionately affect the poor and vulnerable. Disasters tend to undermine the social, political, and natural capital upon which vulnerable groups depend to make their livelihoods. For one, they can severely damage the natural capital upon which most low-income households rely. It can take years, if not decades, for this capital stock to regenerate. The specter of frequent disasters also makes low-income agricultural households more risk averse. In Ethiopia, this threat has reduced economic growth by more than one-third. And disasters can create significant, long-term consequences for the most vulnerable groups. Women living in camps for disaster survivors may find themselves at an elevated risk of physical and sexual violence. After Hurricane Katrina, for instance, the rate of rape among women living in FEMA trailer camps was 53.6 times higher than the rate before the storm.

Taken together, disasters can undermine the vital coping mechanisms of low-income households. Consequently, they may become locked into debilitating poverty traps. Households need to maintain a minimum asset threshold in order to provide for their needs and retain the ability to move up the economic ladder. Evidence suggests that a large number of households fell below this threshold as a result of asset destruction during Hurricane Mitch in Honduras and the 1998-2000 droughts that affected Ethiopia. Such environmental shocks may lock poor households into spirals of poverty and degradation from which they may never escape.

It is for these reasons that the international community must embed the principles of disaster risk reduction into the SDGs. Failing to account for the deleterious impacts of disasters would undermine this enterprise and risk stymieing further progress on poverty alleviation. Moreover, as we enter a greenhouse world, the risk from climate-related disasters and environmental change will only become more apparent. The time has come for the world to mainstream disaster risk reduction and climate adaptation into development planning. The risks of not acting are too great to ignore.

Climate overshadows disaster risk reduction. Here’s how to change that.

sendai earthquake tsunami damage

Fires rage in Sendai, Japan, following the devastating earthquake & tsunami that hit the area in 2011 (courtesy of the AP).

Next Saturday, the Third World Conference on Disaster Risk Reduction (WCDRR) kicks off in the coastal city of Sendai. Never heard of it? You’re not alone. Disaster risk reduction (DRR) has been on international agendas for decades, but it tends to get overshadowed by climate change. DRR is the broader, less famous, older sibling of climate change; think of it as the Frank to climate’s Sylvester Stallone.

The WCDRR is a follow-up to the 2005 conference in Kobe, Japan, which produced the Hyogo Framework for Action, a 10-year plan to reduce disaster risk and enhance resilience worldwide. That, in turn, served as a successor to the 1994 Yokohama Conference, the first international meeting on DRR, which led to the development of the landmark Yokohama Strategy and Plan of Action for a Safer World. (If you’re noticing a theme here, you’re right; Japan is more or less the center of the world when it comes to DRR. It is highly vulnerable to a number of natural hazards and has, accordingly, become a leader and innovator in this space. The 1995 Kobe earthquake, which killed more than 5,000 Japanese, served as a catalyst to place DRR onto policymakers’ agendas. Sendai, for its part, has the unfortunate distinction of being the major city closest to the epicenter of the 2011 earthquake/tsunami that triggered the Fukushima nuclear crisis, which is the costliest disaster in history at $235 billion.)

DRR in context & why 2015 matters

The Hyogo Framework defined DRR as a strategy to bring about “the substantial reduction of disaster losses, in lives and in the social, economic and environmental assets of communities and countries.” Minimizing the risks and damages wrought by disasters is critically important, given their dramatic costs in blood and treasure. According to the United Nations International Strategy for Disaster Reduction (UNISDR), the world lost a combined 42 million life-years annually as a result of disasters from 1980-2012. On average, disasters cause at least $250 billion in economic losses each year, a number that UNISDR expects will climb considerably due to economic growth, demographic changes, and climate change.

DRR encompasses more issues than climate change, generally speaking. While climate change will primarily influence climatic and hydrometeorological disasters, DRR covers all types of disasters, including geological ones like earthquakes and volcanic eruptions. While the former varieties tend to get a lot of the attention, the latter types are often far deadlier and more destructive. 2015 marks the anniversaries of a few of these severe disasters, including the aforementioned Kobe earthquake, the 2005 Kashmir earthquake, and the 2010 Haiti earthquake. These three disasters alone killed more than 306,000 people, affected over 9.3 million, and caused more than $113 billion in damages, according to EM-DAT.

More broadly, 2015 is shaping up to be a landmark year for the international community. The WCDRR is taking place in conjunction with this September’s UN Summit on the Sustainable Development Goals and December’s Paris Conference on climate change. Unsurprisingly, given the scale of what’s yet to come, the Sendai Conference has largely stayed below the radar. You won’t see any major world leaders giving speeches like you will in New York, and the conference won’t produce a document with binding targets like we may get out of Paris. Instead, as the zero draft makes clear, WCDRR will lead to a voluntary agreement that sets global metrics for disaster impacts, defines progressive international principles for DRR, and outlines actions that governmental and nongovernmental actors can take at all levels.

WCDRR & the climate question

Interestingly, even though many observers expect the Sendai Conference to help set the table for the SDGs and the Paris Conference, these issues, particularly climate change, have thus far been nearly absent from the conversation. As RTCC noted yesterday, climate has more or less morphed into Sendai’s version of Voldemort – that which shall not be named:

But compared to a planned UN climate change deal in Paris this December, or the Sustainable Development Goals process, this is not making headlines. In part that’s because the proposals – which are still under negotiation – are non-binding and will not require countries to make any financial pledges. It’s also due to the decision by the UN office for Disaster Risk Reduction – conscious or otherwise – to delink the talks from global warming, instead focusing on wider “natural disasters”. This UN body is desperate to avoid the toxic clash of developed and developing countries its climate cousin has suffered from since the early 1990s. Even mentions of the Intergovernmental Panel on Climate Change, a UN body which has a number of publications on disaster risk, are omitted from the draft text. Presenting the 2015 Global Assessment Report on Disaster Risk Reduction on Wednesday, its head Margareta Wahlström appeared at pains not to mention the ‘c’ word.

Decoupling DRR from climate change may prove to be something of a two-edged sword, however. While keeping the two issues separate may help shield WCDRR from some of the political controversy that has tended to overshadow the UNFCCC process, it also limits opportunities to link DRR to climate change and, more broadly, the SDGs. Keeping these three critical processes on parallel tracks that rarely, if ever, intersect reduces opportunities to find commonalities. As a result, we may miss ways to use interventions or funding streams to address them at the same time. Worse yet, keeping these topics siloed may lead us to pursue projects that appear beneficial in one area, but are actively harmful in another.

Making climate change part of DRR

What’s the international community to do? Fortunately, researchers Ilan Kelman, JC Gaillard, and Jessica Mercer have just released a paper (open access) that outlines a strategy to bring these three topics together. It’s well worth reading the whole thing.

In the article, Kelman, Gaillard, and Mercer (herein referred to as KGM) explain how some actors have framed social vulnerability as the result of individuals’ “double exposure” to the effects of globalization and climate change. But, as they note, these are just two of a variety of threats that people face on a daily basis; there are also poverty, inequality, social repression, gender roles, disaster risk, environmental degradation, and the burden of disease, among others. In this environment, our focus on globalization and climate change can crowd out these other crucial issues.

Moreover, as KGM note, some governments are actively using these two issues to pursue their own political ends:

Research in Maldives shows how climate change and globalization are being used as excuses by the government to force a policy of population consolidation (resettlement) on outer islanders. Yet the government has long been trying to resettle the outer islanders closer to the capital using other reasons, such as that it is hard to provide a scattered population with services including health, harbors, and education. Both arguments have legitimacy and can be countered, but climate change is used as an excuse to do what the government wishes to do anyway.

So what’s the solution? The international community needs to adopt a broader “multiple exposure” model that considers climate change as one challenge among many. And, according to KGM, the best way to do this is to treat climate change as a subset of DRR. While the authors note that linking climate change to DRR will not be easy to achieve in the current political environment – particularly before the end of this year – they stress the need to pursue this end. As such, they provide three key principles for considering climate change as a subset of DRR:

  1. The international community must treat climate change as one contributor to disaster risk, but not the only or even the most important one. Focus on climate cannot be allowed to dominate other factors, like population growth in floodplains or economic inequality.
  2. Climate change must be seen as one “creeping environmental change” among many, such as soil erosion and desertification. KGM define this term as “incremental changes in conditions that cumulate to create a major problem, apparent or recognized only after a threshold has been crossed.”
  3. We should harness climate change’s political salience as a tool to engage policymakers in a more comprehensive discussion on sustainable development. On this point, it’s best to see climate, DRR, and sustainable development as a nesting doll. Climate change fits within the larger topic of DRR, which, in turn, must be placed within the context of sustainable development. The authors provide a great example to illustrate this: “Little point exists in building a new school with natural ventilation techniques that save energy and that cope with higher average temperatures, if that school will collapse in the next moderate, shallow earthquake.” And, beyond this, building a green, earthquake-resilient school makes little sense if it is only open to boys or the children of the wealthy.

KGM explain how taking this three-in-one approach is the most effective way to harness the strengths of all three issues: the political salience and power of climate change, the historical perspective and theoretical strength of DRR, and the universal legitimacy of sustainable development. Climate may get all the attention – Lord knows I talk about it enough – but it’s important to recognize its proper place and role within a sustainable development agenda.

I think this framework holds a lot of practical power and value. In my next post, I will use it to consider the recent PNAS article on the role of climate change in the Syrian civil war.

It’s time to include climate change in the immigration debate

hanna lake dried up

A man walks through the desiccated remains of Hanna Lake in Balochistan, which dried up during a decade-long drought in the region (courtesy of Al Jazeera).

Last month, The New York Times released the results of a poll showing that Hispanics are far more likely to view climate change as a pressing issue that directly affects them. Fifty-four percent of Hispanics rated global warming as extremely or very important, compared to just 37% of non-Hispanic whites. Moreover, nearly two-thirds (63%) of Hispanics said that the federal government should do a lot or a great deal to tackle climate change.

There are a number of reasons to explain this high level of concern, such as the fact that Hispanic households are far more likely to live in neighborhoods adversely affected by pollution. Minority communities are also more acutely vulnerable to the impacts of climate change, such as heat-related mortality.

This increasing awareness about climate change among Hispanics may appear odd to some, at first glance. As Coral Davenport put it, “the findings of the poll run contrary to a longstanding view in politics that the environment is largely a concern of affluent, white liberals.” Timothy Matovina, executive director of the Institute for Latino Studies at the University of Notre Dame, voiced this conventional wisdom in January, arguing,

Many Spanish-speaking immigrants are worried about surviving from one week to the next. Going to the latest rally on climate change or writing letters to their local chamber of commerce about some environmental issue that sounds to me more like something a middle-class person would do with time on their hands.

Climate change and drought in the American Southwest & Central America

What this argument misses, however, is the myriad ways that climate change is intertwined with other key issues, like immigration. Recently, NASA scientists released a study examining how climate change will affect drought conditions throughout the American Southwest and Central Plains. The study also investigated the impacts on Central America, particularly Mexico. As the map below illustrates, under a business as usual emissions scenario, there is a greater than 80% likelihood that the region will experience a megadrought of at least 30 years between 2050-2099. The historical risk of this type of megadrought is less than 12%.

This study was the first of its kind to compare projected drought trends to the historical record for the past millennium. While droughts of this type did occur during the Medieval Climate Anomaly, a warmer-than-average period lasting from 1100-1300 CE, future droughts will be exceptional. Even if the world takes steps to dramatically curb carbon emissions by mid-century,  climate change will lead to drought conditions that are “unprecedented” in at least the last 1,000 years.

nasa drought projection 2095

The portions of the continental US and Mexico that will be affected by extreme drought this century under a business as usual scenario (courtesy of NASA).

This latest study supplements earlier research showing the looming specter of drought for the region in question. A 2012 Nature Climate Change study by Aiguo Dai, for instance, concluded there would be “severe and widespread droughts in the next 30–90 years” through much of the world, particularly the US and Central America. And a 2011 study from Michael Wehner et al. found that an ensemble analysis “exhibits moderate drought conditions over most of the western United States and severe drought over southern Mexico as the mean climatological state.”

Climate change, drought, and immigration

So what does this research have to do with immigration and Hispanic Americans? Well, we have considerable evidence that droughts are a major driver of migration. As I wrote last January, high temperatures and declining rainfall significantly increase rates of migration in Pakistan. Males living in rural parts of the country, for instance, are 11 times more likely to migrate during periods of extremely high temperatures, while both men and women are more likely to leave their villages under drought conditions.

But the evidence linking climate-induced droughts and migration is not confined to Pakistan. Because declining rainfall and elevated temperatures combine to lower crop yields in arid and semi-arid areas around the world, drought is likely to be a driver of out-migration in a number of regions. A 2010 study in PNAS found just such a link in Mexico. Declining yields of corn due to drought could increase rates of immigration from Mexico to the US by up to 9.6% through 2080.

Last week, Joe Romm connected the NASA drought study to US immigration policy. In a post somewhat inartfully titled “If We Dust-Bowlify Mexico And Central America, Immigration Policy Will Have To Change,” Romm writes:

But what are the implications for our poorer neighbors to the south? There will be virtually no part of their countries that are not in near-permanent Dust Bowl or severe drought. And of course their coastal areas (and ours) will be trying to “adapt” to sea level rise of perhaps 3 to 6 feet by 2100 (and likely faster rise after that). Again for all but the wealthiest coastal areas, the primary adaptation strategy will probably be abandonment.

Much of the population of Mexico and Central America — likely over 100 million people (Mexico alone is projected to have a population of 150 million in 2050!) — will be trying to find a place to live that isn’t anywhere near as hot and dry, that has enough fresh water and food to go around. They aren’t going to be looking south.

Romm calls this scenario “a humanitarian and security disaster of almost unimaginable dimensions.” Unfortunately, like all too many commentators before him, Romm makes broad statements about environmentally-induced migration, a topic that is incredibly complex and multi-layered. It’s exactly these types of sweeping generalizations that have led others to claim we would see up to 50 million “climate refugees” by the year 2010. Not quite.

Putting environmentally-induced migration in context

First of all, in a legal and academic sense, there’s no such thing as a climate refugee. But beyond that, it’s not helpful to reduce an issue as complex as migration to a string of simplified absolutes. Arguing that drought conditions will inevitably force people to abandon their villages, en masse, ignores a large collection of evidence to the contrary and effectively robs these people of their agency. We need to do better than that as a community of people who purport to care about the interests of individuals on the front lines of climate change.

Romm’s claim that abandonment will be the primary adaptation strategy has little support. Migration carries considerable costs and risks for individuals, so it is almost never the first choice people pursue. Environmental stress is one of many considerations that people have when deciding to migrate, but it is important to remember that this decision includes a number of social, economic, and political factors.

When examining migration patterns, we need to consider both the push and pull factors involved. Drought can be a major push factor that drives people from their homes, but there generally needs to be pull factors on the other end to attract people to destination communities. We have plenty of evidence of this from Mexico, where multiple studies from migration scholars at the University of Colorado have found that emigration to the US largely occurs among households that have previous experience with migration and/or have access to established migration networks. While declining rainfall does appear to drive households to migrate from Mexico to the US (especially for households living in dry portions of Mexico and during periods of extreme drought), the existence of social networks for potential migrants is “dominating” these flows. Whether or not households choose to migrate during dry spells is largely predicated on this factor.

None of this is to suggest that, as large portions of the Southwest and Central America enter persistent drought conditions, the number of people entering the US across the southern border (with or without legal approval) won’t increase. It almost certainly will. We have already seen spikes in migration from countries such as Guatemala, which is currently enduring an historic drought.

But, if we are proactive, things need not devolve into the worst case scenario Romm laid out. The US and our neighbors need to work together both to enhance the adaptive capacity of people living in Central America, so they can be better prepared to weather a changing climate in situ, and to reform immigration policies to facilitate the movement of people throughout the region.

Migration has always been a vital adaptation in the face of external stress, and we should consider it through that lens. It is likely time for the international community to begin including migration and displacement in the broader discussion about climate change policy. Perhaps it can be couched under the national adaptation plans or the work program on loss and damage. But we need to be very careful not to let migration get subsumed within climate change. As I’ve noted, there could be significant unintended consequences of creating a special protected class for climate migrants. What about internally displaced persons who cannot access international assistance? What about the 40-80 million people who have been displaced by large dam projects worldwide?

We must also be careful about hyping waves of climate refugees. There is already enough backlash against immigrants worldwide, and pushing such doomsday scenarios may just serve to heighten that opposition. Rather than building figurative and literal barriers to immigration, we need to begin upgrading our domestic and foreign policy to support and protect potential migrants of all stripes. In a greenhouse world, we can no longer afford to consider immigration policy in a vacuum.

Geoengineering makes climate change less polarizing! It’s still a bad idea.

sardar sarovar dam

India’s controversial Sardar Sarovar dam, located on the Narmada River (courtesy of Mittal Patel).

About 20 minutes after I posted my piece yesterday arguing that we are nowhere near ready to begin researching geoengineering, the Washington Post‘s new Energy and Environment section ran its own piece on the topic. But this post, by Puneet Kollipara, took a vastly different tone.

Rather than delving into the NRC report, it looked at a study in the Annals of the American Academy of Political and Social Science, which explored various tactics to make debate around climate change less polarizing. The researchers broke participants into three groups and laid out the reality about climate change. One was told that the best approach was to curb carbon pollution, the second heard a pro-geoengineering message, and the third group acted as a control. From the post:

Conventional wisdom might hold that telling people about geoengineering would make them less concerned about climate change’s risks by making them complacent about it; if geoengineering works, then maybe climate change isn’t such a big deal. But that’s not what the researchers found. The geoengineering group viewed climate change as posing a slightly higher risk than did the control group and a similar level of risk as did the anti-pollution group…

[W]hen it came to the scientific information on climate change that the researchers made all participants read, the geoengineering group was actually less polarized on whether the science is solid than the control group was…Not only was it a matter of conservative skepticism of climate science shrinking in the geoengineering group, but liberals in the geoengineering group became more likely to question the science.

This result really shouldn’t come as that much of a surprise, when you think about it. Conservatives have long been receptive to geoengineering as a way to address climate change. Newt Gingrich has pushed it as a solution for years, while the American Enterprise Institute endorsed solar radiation management as an “evolving climate policy option.” Geoengineering fits into the broader conservative mindset that pushes engineering solutions to environmental problems. As Clive Hamilton wrote in his book Earthmasters,

As the identity of conservative white males tends to be more strongly bound to the prevailing social structure, geoengineering is the kind of solution to climate change that is less threatening to their values and sense of self….they are consistent with the ideas of control over the environment and the personal liberties associated with free market capitalism.

Geoengineering represents just the latest iteration of this ethos. It’s the same worldview that says massive tree plantations can solve deforestation. Or that calls for building giant indoor, vertical farms as a way to address population and nutrition issues. Or that suggests we can address biodiversity loss by resurrecting species in a laboratory. Why address the causes of environmental degradation, when we can just use our ingenuity to treat the symptoms?

But while these kinds of engineering solutions for environmental problems may sound great in the short term, we really need to consider their long-term implications. As I noted in my previous post, geoengineering is different from these other examples for one key reason: once we go down that road, we are locked into it, forever.

Probably the closest analogue that I can come up with is dam building. Building large dams provides us an engineering solution to a variety of challenges – a lack of energy, unpredictable rainfall, disasters. We can create clean electricity to power cities and industry, easily irrigate our fields, mitigate the risk of drought, and hold back floodwaters. It seems like a great idea on the surface. Of course, megadams create a whole host of unintended consequences, from impeding the movement of fish to drowning villages. But, more than that, they lock us into the need to actively manage nature for the long run.

Jacques Leslie explores all of these issues, and more, in his book Deep Water: The Epic Struggle over Dams, Displaced People, and the Environment. (I highly recommend it if you’re looking for a user-friendly primer on the major controversies over big dam projects like Sardar Sarovar, Belo Monte, or Three Gorges.)

In the book, Leslie explores the consequences of Australia’s scheme to regulate the Murray River with thousands of dams, canals, and weirs. He spoke with Mike Harper, a former Australian natural resources manager turned activist, about the effort to save the endangered Chowilla floodplain. Before Australians began altering the landscape, the Chowilla experienced an irregular, but essential, cycle of floods and droughts that regulated the ecosystem. With the advent of the water management scheme, the water table rose several feet, bringing salt deposits to the surface and effectively poisoning the land. The only way to address this crisis without jettisoning the entire system is for the Australian government to actively replicate this flood/drought cycle in perpetuity:

The ecosystem will have to become dependent on an artificial regime that must be applied forever, [Harper] said. “You might get a good manager for ten years, but the one after him might be a bad one. If we have to manipulate the environment all the time, we’re going to fuck it up sometime.”

Precisely. When you endeavor to play God and actively manage the environment, you are placing the well-being of the system in the hands of a few bureaucrats. If Australia’s civil servants screw up, they may irreparably damage a critical ecosystem. If you think that’s bad, spread the risk to the entire planet and multiply it by a thousand.

This is where geoengineering stops being the libertarian panacea some conservatives apparently believe it to be. If we want to control our atmosphere to address climate change, we will need to amass an enormous array of scientists and civil servants devoted to this task. Because of the global nature of the endeavor and the risks of sparking a geoengineering “arms race” I noted in my last post, no one state or small group of states can be entrusted with this responsibility. We will need to create a supranational organization, perhaps akin to a vastly more powerful UN Environment Programme or World Meteorological Organization. Given how shaky our track record has been on global governance to this point, I’m not particularly convinced that we can successfully regulate our climate for several hundred years. You thought cap and trade was a recipe for big government.

At the risk of being labeled a Tea Partier, I am much more inclined to support a free market approach to climate change like a carbon tax. Hell, you can even use the revenues to offset income taxes. Ultimately, let’s just say I am highly skeptical that geoengineering constitutes a silver bullet for depoliticizing the debate around climate change, let alone for solving the climate crisis writ large.

The NRC is wrong – we’re nowhere near ready to research geoengineering

mr burns solar shade

Geoengineering: The Simpsons already did it.

Last week, the National Research Council released a lengthy, two-volume report on geoengineering. The crux of the report and the surrounding debate seems to be that, sure, geoengineering is a crazy idea, but we need to at least research it, because we’ve gotten ourselves into this mess and we need every tool available at our disposal. Even the IPCC has dipped a toe in the water, noting in its Fifth Assessment Report that we will most likely end up surpassing the 2ºC warming threshold if we exceed 450ppm of CO2. The only way to get back under that threshold is through the “widespread deployment of bioenergy with carbon dioxide capture and storage (BECCS) and afforestation in the second half of the century.” Given these realities, it seems logical to at least start researching geoengineering, right? It’s better to have that arrow in the quiver and never need it than need it and not have it.

The risks & rewards of geoengineering

The problem with geoengineering is that even researching it carries clear risks. In July 2013, the Woodrow Wilson Center’s Environmental Change and Security Program published a report titled “Backdraft: The Conflict Potential of Climate Change Adaptation and Mitigation.” The report includes a chapter on geoengineering (“climate engineering” in their parlance) by Achim Maas and Irina Comardicea of adelphi, a German think tank.

In their piece, Maas and Comardicea lay out the potential benefits and drawbacks of climate engineering. On the one hand, climate engineering would not need to upend our existing fossil fuel-based global energy system and may be a more appealing option for certain actors. This approach would also give developing states space to continue exploiting their fossil fuel reserves as a way to lift their citizens out of extreme poverty, helping to ease the potential trade-offs between tackling climate change and tackling global poverty.

On the other hand, tinkering with our climate system could generate some severe unintended consequences. First, it fails to tackle the root cause of climate change, so it is far from an actual solution. Second, it does nothing to curb the impacts of climate change beyond global warming, most notably ocean acidification. Third, the potential side effects of climate engineering could be widespread, and we cannot predict them with any certainty. We may end up altering the color of the sky or the chemistry of the oceans. Fourth, once we start playing God with our atmosphere, we can never stop. According to a study in the Journal of Geophysical Research, if we engaged in solar radiation management (SRM) for 50 years, then stopped, we could end up getting all of the delayed warming from that period in just 5 to 10 years. That type of warming would be unprecedented, and we would have no way to adapt.
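To get a feel for why that rebound would be so hard to adapt to, here is a back-of-the-envelope illustration; the background warming rate is an assumption chosen purely for the arithmetic, not a figure from the JGR study.

```python
# Back-of-the-envelope illustration of the SRM "termination shock" described above.
# The underlying warming rate is an illustrative assumption, not a projection.

MASKED_YEARS = 50                  # years of SRM masking the warming
ASSUMED_RATE_C_PER_YEAR = 0.03     # assumed background warming rate (deg C per year)

masked_warming_c = MASKED_YEARS * ASSUMED_RATE_C_PER_YEAR  # ~1.5 C held back
for rebound_years in (5, 10):
    rebound_rate = masked_warming_c / rebound_years
    print(f"Released over {rebound_years} years: ~{rebound_rate:.2f} C/yr, "
          f"about {rebound_rate / ASSUMED_RATE_C_PER_YEAR:.0f}x the background rate")
```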

The trouble with research

But none of the above specifically explains why researching climate engineering is, in and of itself, fraught with risks. Maas and Comardicea delve into this issue at length. Because of the scope and the scale of climate engineering, we cannot accurately replicate it in a lab, and there is no way the international community would sign off on a global chemistry experiment without knowing the real-world implications. And that necessitates experimenting outside of the lab.

Say we decided to inject sulfates into the atmosphere on a “small scale” in order to see if we could reduce the amount of solar radiation reaching the Earth’s surface. The risks of even a theoretically controlled experiment could be significant. Thanks to a study that also came out last week, we know that the increase in aerosol emissions in Europe and North America during the Industrial Revolution altered precipitation patterns in the northern tropics, contributing to a “substantial drying trend” after 1850.

The impacts of climate change and, by extension, climate engineering, are so distant in time and space that we would have no way of knowing exactly where, when, and how the potential consequences of this kind of experiment would play out. What if a prolonged drought occurred in the Caribbean or the monsoon shifted dramatically in South Asia? We would be unable to pinpoint the cause of such a change – whether natural or manmade – for several months or years.

And, during this period, all sorts of social and political consequences could occur. We already know, for instance, that aerosol pollution from India has intensified tropical cyclones in the Arabian Sea, including a category 4 cyclone that hit Pakistan in June 2010. Pakistan is already acutely aware of the impacts of drifting air pollution from India. What if the country experienced another disaster on the scale of the 2010 Indus River floods and decided that Indian interference in the atmosphere was to blame?

As Maas and Comardicea argue,

Even if there may be no direct connection between a state’s regional climate engineering scheme and the crop failure of another state, it may provide a convenient scapegoat and lead to increased tensions…The possibility of unilaterally implemented climate engineering, either via world powers or smaller coalitions of states, may thus lead to a “climate control race.” In the same way that states raced to develop arsenals of nuclear weapons during the Cold War, states may compete to develop and control climate engineering technology.

There is already a track record to suggest that states can use weather as a tool of war. The US employed cloud seeding techniques from 1967-1972 as part of “Operation Popeye” to try to interrupt movement along the Ho Chi Minh Trail. Such actions directly led to the 1978 Environmental Modification Convention (ENMOD), which barred the use of environmental modification to cause harm. But even ENMOD has not halted efforts to control the weather. China, for instance, has widely employed cloud seeding, most famously to clear the skies over Beijing in the run-up to the 2008 Olympics.

Governance before research

With all of these potential consequences in mind, what do Maas and Comardicea recommend to stave off the worst effects of climate engineering? In a word, governance:

To reduce the conflict potential of climate engineering, a transparent international dialogue on the research and applications of climate engineering technologies is crucial prior to any field research (emphasis added). Ongoing talks and deliberations should involve a wide variety of stakeholders, and critically evaluate the potential technological benefits and pitfalls, as well as the regulatory development of the range of climate engineering techniques…

In conclusion, the NRC is wrong here. We need governance, first and foremost, before undertaking research and absolutely before trying out any of these crazy ass schemes in the real world. The risks are too great to play it by the seat of our pants. I’m not here to say that we should never employ geoengineering at any point, for any reason. I think it’s a terrible idea with a dramatic downside, but there may come a time when we have no choice. Then, and only then – as a very last resort – should we be ready to employ it. But we are a long way from that point, and we are still a long way from the point at which we can even begin the research needed to prepare for that day. Let’s get through Paris first without saddling it with yet another intractable problem.

Minnesota’s DOT is ready for climate change. ODOT? Not so much.

duluth flood damage

Heavy damage to a road in Duluth, Minnesota following flooding from the Tischer Creek in June 2012 (courtesy of Minnesota Public Radio).

Climate change will have profound and diverse impacts upon infrastructure throughout the United States, including transportation infrastructure. Rising sea levels, stronger storm surges, more severe flooding, land subsidence, soil erosion, melting permafrost, and more frequent freeze-thaw cycles will all strain our already aging, deteriorating roads, bridges, and ports. The American Society of Civil Engineers has consistently given the country’s infrastructure a D or D+ on its report card since 1998, and the US slipped from fifth place in 2002 to 24th by 2011 in the World Economic Forum’s transportation infrastructure rankings. Throw in profound and unpredictable changes to the climate that facilitated the rise of human civilization, and you have a recipe for disaster.

It is for this reason that the Obama administration has attempted to drag the federal government into the 21st century on climate change planning, despite considerable institutional inertia, not to mention stalwart opposition from Congressional Republicans and special interests. Just last week, the President issued an executive order requiring all federal agencies to include long-term sea level rise projections in the siting, design, and construction of federal projects. Planning for such changes – and for the threat of 100- and 500-year flood events, which will become drastically more frequent in a greenhouse world – will be of vital importance for the US Department of Transportation. Critical facilities are already at risk from near-term climate change. Parts of Oakland International Airport, for instance, could wind up under water during the daily high tide if sea levels rise just 16 inches.

But, given the complexity of our federal system of government and the aforementioned institutional inertia, the administration’s actions will take time to trickle down to the state and local level, where many of the daily decisions on transportation infrastructure construction and maintenance occur. While this fragmented structure has enabled some progressive state and municipal governments to take steps to combat climate change in the absence of meaningful legislation from Congress, it can also create a highly uneven system in which certain locales are far more inclined than others to incorporate climate change considerations into transportation planning. Given that the roads, bridges, and ports we build today will likely still be in operation 30-50 years from now, each day we delay increases our adaptation deficit – that is, the gap between our current level of adaptive capacity to a changing climate and an ideal one.

To demonstrate this uneven level of adaptation, consider the different approaches to climate change planning from the Minnesota Department of Transportation (MnDOT) and the Ohio Department of Transportation (ODOT). This week, Minnesota Public Radio is featuring a series on climate change in the North Star State. An article this morning discussed how the state has adapted to more severe rainfall events in recent years.

In all three cases, whether officials say it in so many words or not, they are adapting their cities’ infrastructure to a changed climate, one that has been dumping more rain and bigger rains on Minnesota.

Warmer temperatures have an impact on infrastructure as well – more freeze-and-thaw cycles mean more potholes, for example. But because roads require relatively constant maintenance, road planners can adapt to a changing climate on the fly.

Not so with storm and wastewater systems, which are built to last as long as a century. That, say urban planners, is where the real challenge lies, and it is where some Minnesota cities have been focusing their efforts to adapt to climate change.

As the article notes, severe rain and flash floods have taken a drastic toll on infrastructure, including transportation systems, in recent years. Fortunately for Minnesotans, MnDOT has been leading the way in the effort to incorporate changing precipitation patterns and flood risks into planning. The agency recently completed an assessment of climate change risks to infrastructure in two districts, District 1 and District 6, which are located in the northeast and southeast portions of the state, respectively. In the introduction to the assessment, MnDOT writes,

Recognizing this, MnDOT planners and engineers have long considered minimizing the risk of flash flooding in the siting and design of the state’s roadway network. However, as has been the standard practice worldwide, they have traditionally assumed that future climate conditions will be similar to those recorded in the past. Climate change challenges this assumption and calls for new approaches to understanding vulnerabilities across the highway system and at specific transportation facilities so that appropriate actions, adaptations, can be taken to minimize expanding risks.

This project…represents a starting point for developing these new approaches. The focus of this pilot study is on flash flooding risks to the highway system. While flooding is not the only threat to the state’s highway system posed by climate change, it is likely to be one of the most significant and has already caused extensive disruptions to the transportation system in many areas.

If only Ohio had taken such a proactive approach to this issue. To be fair to ODOT, the agency does appear to be considering climate change in its planning process. There is a section devoted to the issue in Access Ohio 2040, the state’s long-term transportation planning vision. Perhaps strategically, the document refers to it as “climate variability” and completely bypasses the question of what is causing climate change. Now, the supplement to this section does touch on the fact that greenhouse gas emissions, including those from transportation, are driving the observed changes, though it does so somewhat halfheartedly. And then there’s the presentation on climate change infrastructure vulnerability that seems more focused on the potential benefits for the state from altering our extant climatic systems.

But, at least ODOT appears to have faced up to the issue. Access Ohio 2040 calls for the state to complete a Statewide Climate Variability Study “within the next two years.” If the state meets this target, the study should be finished by summer 2016, leaving Ohio roughly 18 months behind MnDOT. Now, I should note that MnDOT’s assessment, unlike anything ODOT has undertaken, was one of 19 pilot projects funded by the Federal Highway Administration through its 2013-2014 Climate Change Resilience program. Then again, states, metropolitan planning organizations, and other entities had to actually apply to secure FHWA funding, and I can find no evidence that Ohio bothered to apply. Additionally, I have searched through the State of Ohio’s FY 2014-2015 transportation budget, and I find no evidence that the legislature has ponied up the $250,000-500,000 that ODOT stated it would need to complete its climate variability assessment. So I question whether Ohio is on track to finish the assessment by next summer.

And, even if ODOT has made some commitment to climate change adaptation at the strategic level – a highly dubious proposition – there is absolutely no evidence that this commitment has worked its way down to the project level. Consider the Opportunity Corridor, one of the largest projects currently being funded in the state. A handful of individuals and organizations submitted comments to ODOT’s Draft Environmental Impact Statement (DEIS) on the project, imploring the agency to take climate change into account. Here’s how ODOT responded in its Final EIS:

[I]t is analytically problematic to conduct a project level cumulative effects analysis of greenhouse gas emissions on a global-scale problem… Because of these concerns, CO2 emissions cannot be usefully evaluated in the same way as other vehicle emissions. The NEPA process is meant to concentrate on the analyses of issues that can be truly meaningful to the consideration of project alternatives, rather than simply “amassing” data. In the absence of a regional or national framework for considering the implications of a project-level greenhouse gas analysis, such an analysis would not inform project decision-making, while adding administrative burden.

In other words, we think your request is stupid and a waste of time, so nope.

ODOT does not operate in a vacuum. I’m sure there are a lot of good civil servants trying their best to meet the needs of Ohioans at the agency, but its direction is ultimately shaped by the elected officials in power in Columbus. Governor Kasich may at least pay lip service to climate change, but he has shown no inclination to actually act on the issue. Quite the contrary – he is responsible for signing SB 310 into law last June. Attorney General DeWine, for his part, is currently suing the EPA to stop its efforts to regulate greenhouse gas emissions.

And then there’s the GOP-dominated statehouse. The only reason Senator Bill Seitz will ever leave the legislature is through term limits, regardless of how many bombs he tosses about enviro-socialist rent seekers or the Bataan death march. And Senator Troy Balderson, the person who sponsored SB 310 and serves on the committee that regulates electric utilities, was blissfully unaware of the EPA’s plan to regulate coal-fired power plants a year after it was announced. It’s not exactly a shock that ODOT is a laggard here.

Those of us in Ohio who want an agency that is responsive to our desires to create an equitable, low-carbon, fiscally responsible transportation system need to keep pressuring ODOT, but we also need to win elections. Until then, our civil servants and public officials will keep their heads firmly lodged in the sand.

Don’t blame it on the rain: On the root causes of Northeast Ohio’s flooding problems

Floodwaters submerged vehicles in the parking lot at Great Northern mall in North Olmsted on May 12 (courtesy of Cleveland.com).

“Après moi, le déluge” – King Louis XV (1710-1774)

Northeast Ohio has a flooding problem, as anyone affected by the severe storms last evening can attest. The region has experienced at least four major flooding events in the past few months, the most serious of which occurred five months ago on May 12, when torrential rains caused widespread flooding in several communities.

As the hydrographs below demonstrate, this severe deluge caused several rivers and streams to overflow their banks throughout the western and southern portions of Greater Cleveland. Flash floods also occurred in several areas; one raging flash flood nearly washed away a vehicle containing legendary meteorologist Dick Goddard, who apparently did not heed that famous National Weather Service saying: “turn around, don’t drown.”

This hydrograph displays the streamflow for three Northeast Ohio rivers – the Vermilion River (red), the Black River in Elyria (green), and the Rocky River in Berea (blue) – as measured by the US Geological Survey during May of this year. As you can clearly see, the streamflow in each of these rivers spiked drastically on May 12-13, due to the extreme precipitation during that period. Both the Vermilion and Black Rivers exceeded their respective flood stages (courtesy of USGS).

Who is to blame?

Since these floods occurred, people have been looking for answers or, in many cases, someone to blame. Those individuals whose property and peace of mind were damaged by the floodwaters have, in many cases, been understandably and justifiably upset, even angry. Many have directed that anger at their municipal governments for failing, for one reason or another, to prevent the floods. This anger has bubbled over in some instances, leading to highly contentious public meetings, such as the one in North Olmsted during which a resident got on stage to publicly rebuke officials and call for citizens to sue the city. Residents of other municipalities, including Olmsted Township and Strongsville, are also considering class action lawsuits, accusing their local governments of negligence for failing to invest in adequate infrastructure upgrades.

City officials, for their part, have found a different scapegoat – the rain itself. And there can be no question that the rain in some areas in the past few months has been downright biblical. North Olmsted endured 4.44 inches of rain – more rain than it receives, on average, for the entire month of May – in a couple of hours on the 12th. Put another way, that amount of rain would be equivalent to roughly 44 inches of snow. Strongsville, in turn, saw 3.58 inches of rain that evening, just under its monthly average rainfall of 3.66 inches for May.* The following month, Cleveland suffered a similar fate. The 3.54 inches that fell on June 24 made it the fourth rainiest day for the city in the past century.

Yet, major rainfall events are not uncommon for Northeast Ohio during the summer months; in fact, they are the norm. On average, roughly 40% of the total precipitation in the Midwest each year falls during just 10 days; almost all of these days occur during the summer months, when high heat and humidity can lead to major convective storms. But, what is different is the frequency with which these types of flooding events are occurring. Residents in many of the affected communities have testified that they have experienced floods on a semi-regular basis over the past 10-15 years.

Don’t blame it on the rain…or the sewers

While it may be convenient to blame these floods on the rain, it’s not that simple. As the (handful of) people who have perused this blog in the past have no doubt grown tired of reading, there’s no such thing as a “natural” disaster. Rather, disaster risk is the combination of a natural hazard, our physical and economic exposure to that hazard, and our socioeconomic vulnerability. If 4 inches of rain falls in the middle of an uninhabited tract of some national park in Montana, it does not constitute a disaster. In a sense, disasters are like the proverbial tree falling in the forest: if no one is there to hear it, it doesn’t make a sound.

So, while it may make sense for people to blame inaction by public officials or the heavens for floods, these simply represent the proximate causes of the disaster. We cannot hope to address the real issue at hand by focusing simply on these; that is the equivalent of treating the symptoms of the illness. Rather, we need to focus on the root causes, which one can identify through this disaster risk lens.

Since I cannot readily or adequately examine the various facets of disaster vulnerability for every community affected by this summer’s floods, I want to focus instead on the other two components of the disaster risk triad – natural hazards and exposure. Increases in extreme precipitation events due to climate change and Northeast Ohio’s ongoing sprawl problem, respectively, account for much of the apparent spike in flooding events throughout the region over the past several years. I explore each of these below.

Natural hazard: Climate change and precipitation in Northeast Ohio

Logically, the more rain that falls over an area, particularly within a limited period of time, the higher the likelihood that a flood will occur. We already know that, based on simple physics, as global temperatures increase, the amount of moisture in the air should also rise. According to the Clausius-Clapeyron equation, the atmosphere’s capacity to hold water vapor increases roughly 7% for each 1ºC increase in atmospheric temperatures. This should lead to two general outcomes. First, it will take the atmosphere longer to reach its point of saturation, which may lengthen the periods between rain events for many areas, contributing to droughts. Second, because the amount of water vapor available for precipitation also rises, rainfall events should become more extreme in nature. As Dr. Kevin Trenberth put it in a 2007 study (PDF),

Hence, storms, whether individual thunderstorms, extratropical rain or snow storms, or tropical cyclones, supplied with increased moisture, produce more intense precipitation events. Such events are observed to be widely occurring, even where total precipitation is decreasing: ‘it never rains but it pours!’ This increases the risk of flooding.
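That roughly 7% per degree figure isn’t hand-waving; it falls straight out of the Clausius-Clapeyron relation. Here is a minimal sketch in Python of the back-of-the-envelope calculation, using standard textbook constants and a reference temperature I chose for illustration (roughly the global mean surface temperature); the numbers are not taken from Trenberth’s paper.

```python
# Sketch: fractional increase in saturation vapor pressure per degree of warming,
# from the Clausius-Clapeyron relation d(ln e_s)/dT = L / (R_v * T^2).
# Constants are standard textbook values; the reference temperature is illustrative.

L_V = 2.5e6      # latent heat of vaporization of water, J/kg
R_V = 461.5      # specific gas constant for water vapor, J/(kg K)
T_REF = 288.0    # rough global-mean surface temperature, K (~15 C)

fractional_change_per_K = L_V / (R_V * T_REF**2)
print(f"~{fractional_change_per_K * 100:.1f}% more water vapor capacity per 1 C of warming")
# Prints roughly 6.5%, i.e. about 7% per degree, consistent with the figure above.
```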

We are already witnessing this intensification of rainfall in the US, particularly in the Midwest.  According to the latest National Climate Assessment (NCA), total precipitation has increased in the Midwest by 9% since 1991. Over the past century, certain parts of the region have seen precipitation totals climb by up to 20%. This increase is due largely to a spike in the frequency of extreme precipitation events. From 1958-2012, the amount of precipitation falling in very heavy downpour events jumped by 37% in the Midwest. This statistic helps to explain why, of the 12 instances in which Cleveland received more than 3 inches of rain in a day during the last century, 7 have occurred since 1994.

heavy downpours by region

One measure of heavy precipitation events is a two-day precipitation total that is exceeded on average only once in a 5-year period, also known as the once-in-five-year event. As this extreme precipitation index for 1901-2012 shows, the occurrence of such events has become much more common in recent decades. Changes are compared to the period 1901-1960, and do not include Alaska or Hawai‘i. (courtesy of Climate Central).
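To make that index a bit more concrete, here is a rough sketch of how one might count “once-in-five-year” two-day precipitation events from a daily station record. The synthetic data, the column handling, and the use of a simple empirical threshold over the 1901-1960 baseline are my own simplifications for illustration, not Climate Central’s or the NCA’s exact methodology.

```python
import numpy as np
import pandas as pd

# Synthetic daily precipitation record (inches), standing in for a real station series.
days = pd.date_range("1901-01-01", "2012-12-31", freq="D")
precip = pd.Series(np.random.gamma(shape=0.3, scale=0.4, size=len(days)), index=days)

# Two-day running totals, the quantity the index is based on.
two_day = precip.rolling(2).sum().dropna()

# Empirical threshold: the two-day total exceeded on average once per five years
# during the 1901-1960 baseline (i.e., 12 events in 60 years).
baseline = two_day["1901":"1960"]
n_events = int(len(baseline.index.year.unique()) / 5)
threshold = baseline.nlargest(n_events).min()

# Count exceedances by decade; a rising count in later decades mirrors the figure above.
# (A real analysis would also collapse consecutive wet days into single events.)
per_decade = (two_day >= threshold).groupby(two_day.index.year // 10 * 10).sum()
print(per_decade)
```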

Unless we take action quickly to reduce our carbon emissions, this situation will only get worse in the coming decades. The NCA projects that, under a business as usual scenario (RCP 8.5), Ohio will see such extreme precipitation events four times more frequently by the end of the century.

extreme precipitation events projections

The increase in frequency of extreme daily precipitation events (a daily amount that now occurs once in 20 years) by the later part of this century (2081-2100) compared to the later part of last century (1981-2000) (courtesy of the National Climate Assessment).

Exposure: Sprawl and flooding in Northeast Ohio

I’ve also written extensively in the past about Northeast Ohio’s problems with sprawl-based development (see here for examples). As I wrote one year ago today,

Northeast Ohio has suffered from decades of sprawl and uncoordinated development patterns, leading to waves of suburbanization followed by exurbanization. In 1948, Cuyahoga County’s population stood at 1,389,532; just 26% of land in the county was developed at the time. Yet, by 2002, although the county’s population had grown by a mere .32% to 1,393,978, sprawl ensured that roughly 95% of the county’s land area had been developed.

cuyahoga county land use in 1948 & 2002

Changes in land use within Cuyahoga County from 1948 (left) to 2002 (right). Red shading indicates developed land, while the beige indicates land that is still undeveloped. The maps clearly demonstrate the decentralization of the county over the last six decades (courtesy of the Cuyahoga County Planning Commission).

We’ve come a long way since 2002. Sprawl’s heyday appears to be behind us, as the combined effects of the Great Recession, the rise of the Millennial generation, and the gradual retirement of the Baby Boomers have led to a resurgence in the number of people living in walkable urban areas. Multiple sources have proclaimed the end of sprawl; this trend even appears to be taking root in Atlanta.

Cleveland has tried to position itself to follow this emerging trend. The city was recently ranked the 10th most walkable among the 30 largest metro areas, enjoys a 98.3% residential occupancy rate downtown, has unveiled a plan to double the number of bike routes in the city by the end of 2017, and has seen a rise in transit-oriented development.

Given all of these positive indicators, why would I suggest that sprawl has increased the frequency and intensity of floods over the past decade-plus? Well, simply put, because it has. While it’s impossible to deny these positive indicators, one also cannot ignore the facts.

In its Measuring Sprawl 2014 report, Smart Growth America ranked Cleveland 153rd of 221 metros on its sprawl index. The median score was 100; cities with scores over 100 were more compact, while those with scores below 100 were more sprawling. Cleveland scored an 85.62 (PDF), placing it below other regional metros, including Detroit (12th), Milwaukee (15th), Chicago (26th), Akron (111th), Dayton (116th), Toledo (117th), Pittsburgh (132nd), and Columbus (138th). Cleveland does outperform some other nearby metros, including Indianapolis (158th), Cincinnati (166th), and Youngstown (175th).

Moreover, a recent study out of the University of Utah suggests that from 2000-2010, the Cleveland metro area became even more sprawling (PDF). Using Smart Growth America’s sprawl index, the authors examined the rate of change for the 162 largest metro areas (paywalled) during this period. While Akron actually became 2.7% more compact, Cleveland sprawled by another 13.3%, the 10th worst change of any metro area. Though the city’s number improved since 2010, our 85.62 in 2014 is still lower than the 86.01 that we had 14 years ago.

So why does this all matter for flooding? Well, simply put, areas that follow sprawl-based development models are more likely to suffer from flooding problems. Sprawl increases the percentage of land area that is covered with impervious surfaces, such as parking lots, roads, and driveways. As the extent of impervious surfaces rises, so too does the amount of precipitation that winds up as surface runoff during storms. Forested areas are excellent at controlling stormwater (PDF); trees enable 50% of precipitation to infiltrate the soil and allow another 40% to return to the atmosphere through evapotranspiration. Urbanized areas, in contrast, drastically reduce the amount of water that can infiltrate into the soil, guaranteeing that 35-55% of precipitation ends up as runoff.
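To see what those percentages mean in practice, here is a quick back-of-the-envelope comparison. The runoff shares come from the figures cited above, while the 4-inch storm and the 1,000-acre drainage area are hypothetical round numbers chosen only for illustration.

```python
# Back-of-the-envelope runoff comparison for a single storm,
# using the cover-type shares cited above (forest ~10% runoff, urban 35-55%).
# The 4-inch storm and 1,000-acre drainage area are hypothetical illustration values.

STORM_DEPTH_IN = 4.0              # inches of rain
AREA_ACRES = 1_000                # drainage area
GALLONS_PER_ACRE_INCH = 27_154    # volume of one inch of water over one acre

total_gallons = STORM_DEPTH_IN * AREA_ACRES * GALLONS_PER_ACRE_INCH

for cover, runoff_share in [("forested", 0.10), ("urbanized (low)", 0.35), ("urbanized (high)", 0.55)]:
    runoff = total_gallons * runoff_share
    print(f"{cover:>16}: {runoff / 1e6:.1f} million gallons of runoff")

# The same storm over the same area produces roughly 3.5 to 5.5 times
# more runoff once the land is urbanized.
```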

As Hollis (1975) has shown, urbanization can increase the incidence of small flooding events 10-fold (paywalled). Additionally, when roughly 30% of a drainage basin is paved, major flood events with return periods of 100 years or more tend to double in magnitude. Northeast Ohio has more than 48,000 acres of impervious surfaces, equivalent to approximately one-third of the region’s land area. Accordingly, we fall directly into that danger zone for major flood events due, in large part, to our development patterns.

Secondly, because so much of the county is already developed, many new developments are being built in existing flood zones. In December 2010, FEMA released its first comprehensive flood zone maps for Northeast Ohio since the 1960s. Unsurprisingly, these maps show a dramatic increase in the number of people living in flood zone areas, due to the outward expansion of development. Thousands of people woke up one day to find out that they had been living in a flood zone, and they were none too happy to learn that they would now have to shoulder some of the cost of that decision by purchasing federal flood insurance. Interestingly, the gentleman who filed the class action lawsuit against Strongsville over the flooding lives in a housing development in one of these flood plains.

Lastly, sprawl directly contributes to climate change by generating additional greenhouse gas emissions. Suburban areas account for 50% of the US’s total emissions, despite being home to less than half of the population. While households in downtown Cleveland produce just 26.5 tons of GHGs annually, that number skyrockets to 85.6 tons for Gates Mills residents. Because transportation accounts for such a high portion of the average family’s carbon footprint in this region, our sprawl problem has directly resulted in additional carbon pollution.

Conclusion

There is no question that flooding represents a real threat to the quality of life of people living in Northeast Ohio. Those individuals who have been directly affected by it have every right to be upset and to demand answers. Unfortunately, however, it appears that we are losing sight of the forest for the trees. Focusing exclusively on the proximate drivers of these floods may seem like a good idea, but it allows us to avoid examining the real, underlying root causes. Until we step up and begin to shift our regional development patterns away from sprawl and rampant fossil fuel use, this flooding problem will only get worse.

 

*It’s worth noting that Strongsville is one of eight suburbs that have sued the Northeast Ohio Regional Sewer District to fight the implementation of its stormwater management program. The case went before the Ohio Supreme Court on Tuesday. Obviously, this action runs directly counter to the city’s interests. While the stormwater management program will lead to an increase in rates, it is also the only chance we have to begin managing runoff as a region, which is essential not only for flood control but also for improving our water quality and fighting harmful algal blooms. Additionally, a portion of the revenues from this fee would be made available for cleaning up after floods and helping to prevent future flooding. Perhaps that’s why, after the May 12 storms, North Royalton withdrew as one of the plaintiffs in the Supreme Court case. This region desperately needs the investment that will come from this program, through Project Clean Lake, though I strongly encourage NEORSD to invest a greater portion of the program’s funds in green infrastructure, which is vital for controlling floods and filtering water.

Sorry, Roger Pielke, climate change is causing more disasters

typhoon haiyan damage

Damage in Tacloban from Super Typhoon Haiyan (courtesy of The Daily Mail).

Back in March, controversial political scientist Roger Pielke, Jr. published his first post for FiveThirtyEight. The piece centered on the argument that climate change is not contributing to an increase in scale of disasters globally; rather, Pielke argued, “the numbers reflect more damage from catastrophes because the world is getting wealthier.”

The piece immediately drew consternation and criticism from a number of individuals and even prompted Nate Silver to commission a formal response from MIT climate scientist Kerry Emanuel. In particular, Emanuel and fellow climate scientist Michael Mann criticized Pielke’s decision to normalize disaster losses by GDP. As Emanuel wrote,

To begin with, it’s not necessarily appropriate to normalize damages by gross domestic product (GDP) if the intent is to detect an underlying climate trend. GDP increase does not translate in any obvious way to damage increase; in fact, wealthier countries can better afford to build stronger structures and to protect assets (for example, build seawalls and pass and enforce building regulations). A grass hut will be completely destroyed by a hurricane, but a modern steel office building will only be partially damaged; damage does not scale linearly with the value of the asset.

Pielke’s critics also noted that he used an oddly brief time span for his investigation (1990-2013), that his use of global data tends to cover up significant differences in disaster damages among regions, and that he does not account for disaster damages that have been avoided due to investments in disaster mitigation and risk reduction. There’s also the fact that he includes geological disasters (i.e. earthquakes and volcanic eruptions) in an analysis that purportedly dismisses climate change as a factor; would it really have been that hard to get the original data on climate-related disasters directly from EM-DAT?

Not surprisingly, Pielke and a number of his friends, colleagues, and allies defended the piece, portraying Pielke as the victim of a coordinated witch hunt by climate activists and radical environmentalist bloggers. In an interview with Pielke, Keith Kloor, someone with whom I have disagreed on many occasions but whom I respect, wrote that various commenters had “used slanderous language in an attempt to discredit” Pielke’s work. The basic argument is that few people had any real qualms with the research itself; instead, Pielke’s critics could not escape their personal feelings toward him and allowed those feelings to color their critiques of his work.

Disaster frequency in the Asia-Pacific region

All of this is just an excessively long introduction to a new study published this week in the journal Climatic Change. In the article, researchers Vinod Thomas of the Asian Development Bank, Jose Ramon G. Albert of the Philippine Institute for Development Studies, and Cameron Hepburn of Oxford University (hereafter TAH) “examine the importance of three principal factors, exposure, vulnerability and climate change, taken together, in the rising threat of natural disasters in Asia-Pacific” during the period from 1971-2010.

Now, there are three key reasons why this article piqued my interest and why its results are relevant to the topic at hand, particularly in contrast to Pielke’s research:

  1. The Asia-Pacific region typically accounts for at least a plurality of all disaster metrics – frequency, victims, and economic damages. From 2002-2011, according to EM-DAT (PDF, see page 27), Asia-Pacific was home to 39.6% of disaster events, 86.6% of disaster victims, and 47.9% of economic losses.
  2. The overwhelming majority of disaster events, losses, and victims in Asia result from climate-related disasters. For instance, the region accounts for 40% of all flood events (PDF, see page 6) over the last 30 years, and three-quarters of all flood-related mortality occurs in just three Asian countries – Bangladesh, China, and India.
  3. Both the region’s population and its economies have grown dramatically over the past 40 years. As the figure below demonstrates, East and South Asia have seen GDP per capita growth rates of 8.4% and 5.6%, respectively, easily outpacing other regions. Asia-Pacific is also rapidly urbanizing. From 1950 to 2010, the number of Asians living in urban areas grew sevenfold to 1.77 billion (PDF, see page 32). Many of these individuals live in areas highly exposed to disasters; for instance, 18% of all urbanized Asians live in low-elevation coastal zones. Accordingly, if population growth and increased exposure to disaster risk were the ultimate drivers of increasing disaster occurrence, Asia would likely be the test case.

So, does this new research validate Pielke’s assertions that disasters are not becoming more frequent and, if they are (which they aren’t), it has nothing to do with manmade climate change?

In a word, no.

Unlike Pielke, who apparently believes that normalized economic losses represent an appropriate proxy for disaster occurrence, TAH actually examine the frequency of intense disasters over a four-decade period. And whereas Pielke considers damages from geological disasters – which, given that we have not suddenly entered an age of earthquakes, are a function of increasing physical and economic exposure – these authors focus exclusively on climatological (droughts, heat waves) and hydrometeorological (floods, tropical storms, etc.) disasters, which can be influenced by a changing climate.

Moreover, TAH only consider the occurrence of intense disasters, which they define “as those causing at least 100 deaths or affecting the survival needs of at least 1,000 people.” The use of this metric ensures that any increase in the number of observed disasters is unlikely to be the result of better reporting mechanisms alone, countering Pielke’s assertion that any perceived increase “is solely a function of perception.”

TAH explore the frequency of climate-related disasters in 43 Asian-Pacific countries, using both random and country-fixed effects*, which provides them with a greater sense of the validity of their results. They use the log of population density as a proxy for population exposure, the natural log of real income per capita as a proxy for socioeconomic vulnerability, and both average annual temperature and precipitation anomalies as proxies for climate hazards. Additionally, they break the data into 5 subregions and the timeframe into decade-long spans as sensitivity tests.
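For readers wondering what “random versus country-fixed effects” looks like in practice, here is a minimal sketch in Python of the two specifications. Everything here is illustrative: the file names, the column names, the intense-disaster filter applied to EM-DAT-style records, and the simple log-linear functional form are my own assumptions, and TAH’s actual estimator may well differ (a count model such as a negative binomial would arguably be closer to their setup).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical EM-DAT-style records: one row per disaster, with country, year,
# deaths, and people affected. File and column names are placeholders.
events = pd.read_csv("asia_pacific_disasters.csv")

# Keep only "intense" disasters per TAH's definition: >=100 deaths or >=1,000 affected.
intense = events[(events["deaths"] >= 100) | (events["affected"] >= 1000)]

# Collapse to a country-year panel of disaster counts, then merge in the proxies
# described above (log population density, log income per capita, climate anomalies).
counts = intense.groupby(["country", "year"]).size().rename("disasters").reset_index()
panel = counts.merge(pd.read_csv("asia_pacific_covariates.csv"), on=["country", "year"])
panel["log_disasters"] = np.log1p(panel["disasters"])

formula = "log_disasters ~ log_pop_density + log_income_pc + temp_anomaly + precip_anomaly"

# Country-fixed effects: country dummies absorb time-invariant national factors
# that may be correlated with the regressors; errors clustered by country.
fixed = smf.ols(formula + " + C(country)", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["country"]})

# Random effects (random country intercepts): assumes those country-specific
# factors are uncorrelated with the regressors.
random_fx = smf.mixedlm(formula, data=panel, groups=panel["country"]).fit()

print(fixed.params.filter(like="anomaly"))
print(random_fx.params.filter(like="anomaly"))
```

In the fixed-effects run, the coefficients on the anomaly terms are the ones that correspond, loosely, to the sensitivities TAH report.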

Climate change is increasing the frequency of disasters in Asia-Pacific

Results show that both population exposure and changes in climate hazards have a statistically significant influence on the frequency of hydrometeorological disasters. For each 1% increase in population density and in the annual average precipitation anomaly, the frequency of such events increases by 1.2% and 0.6%, respectively. The authors then applied these results to historical trends in three Asia-Pacific states – Indonesia, the Philippines, and Thailand. By these estimates, a moderate increase in the precipitation anomaly of 8 millimeters per month (well within the observed changes for Southeast Asia over the past decade) leads to 1 additional hydrometeorological disaster every 5-6 years for Indonesia, every 3 years for the Philippines, and every 9 years for Thailand.

In contrast, the models suggest that only changes in precipitation and temperature anomalies affect the frequency of climatological disasters. While a 1 unit increase in the average precipitation anomaly (logically) produces a 3% decrease in the number of droughts and heat waves, an equivalent jump in the annual average temperature anomaly leads to a 72% increase.

It is not surprising that population exposure would play a role in determining the frequency of disaster events; if a tropical storm hits an uninhabited island, it doesn’t get recorded as a disaster. But what is surprising, if you take Pielke at his word, is the clear influence of our changing climate. As TAH conclude,

This study finds that anthropogenic climate change is associated with the frequency of intense natural disasters in Asia-Pacific countries. A major implication of this is that, in addition to dealing with exposure and vulnerability, disaster prevention would benefit from addressing climate change through reducing man made greenhouse gases in the atmosphere.

Ultimately, there can be no question that climate change has changed, is changing, and will continue to change the frequency and nature of disasters globally. That’s not to say that exposure and vulnerability are not playing an important role; we know they are. Bending over backwards to inject climate change into every event and subject, as some climate activists are prone to do, is misleading and irresponsible. But so is cherry-picking data to downplay its role in shaping the nature and scope of disaster events when the data tell us otherwise.

*Obviously I am, by no means, an economist or statistician of any repute. That said, here is how TAH define the difference between random and country-fixed effects: “In panel data analysis, while the random effects model assumes that individual (e.g. country) specific factors are uncorrelated with the independent variables, the fixed effects model recognizes that the individual specific factors are correlated with the independent variables.” Because those country-specific factors are likely correlated with the independent variables, it is unwise to assume that the regressors are completely exogenous.