Africa’s Great Lakes were central to human evolution


Cross-posted from Drink Local. Drink Tap., Inc.

great lakes of africa map

The Great Lakes region of Africa (courtesy of the Proceedings of the National Academy of Sciences).

If you’ve ever felt inexplicably drawn to Lake Erie or any of the other Great Lakes, you’re not alone. In fact, that attraction is hardwired into your genes.

Last month, two UK researchers published an article titled “Early Human Speciation, Brain Expansion and Dispersal Influenced by African Climate Pulses” in the online, open-access journal PLOS ONE. The piece explores a variety of close linkages between climatic variability and human evolution throughout Sub-Saharan Africa. It focuses, in particular, on the East African Rift System (EARS, for short), which is home to the bodies of water that make up the Great Lakes of Africa.

Africa’s Great Lakes region is home to several of the largest bodies of freshwater in the world. The lake system includes Lakes Victoria, Tanganyika, and Malawi, along with several smaller bodies of water. These lakes are the lifeblood of a region that also contains Victoria Falls, the world’s largest sheet of falling water, and the headwaters of the White Nile.

In the article, researchers Susanne Shultz and Mark Maslin sought to determine what factors contributed to the punctuated nature of human speciation and dispersal from East Africa. They focus on an especially important period for human evolution, roughly 1.9 million years ago, which gave rise to the Homo genus and witnessed a series of major migration events from East Africa into Eurasia.

Shultz and Maslin noticed that several of these major “pulses” in human evolution corresponded closely to the appearance and disappearance of the East African Great Lakes. As a result, their research probed this connection more deeply. Their results suggest a close relationship between the growth and decline of the EARS lakes and significant steps forward in human evolution:

Larger brained African hominins colonised Eurasia during periods when extensive lakes in the EARS pushed them out of Africa. Taken together, this suggests that small steps in brain expansion in Africa may have been driven by regional aridity. In contrast, the great leap forward in early Homo brain size at 1.8 Ma [million years ago] was associated with the novel ecological conditions associated with the appearance and disappearance of deep-freshwater lakes along the whole length of the EARS.

As this article suggests, Africa’s Great Lakes are more than simply natural resources that serve economic, social, political, cultural, and ecological purposes. They are, quite literally, ingrained in our DNA.

victoria falls

Victoria Falls lie along the border between Zimbabwe and Zambia (courtesy of Wikimedia Commons).

Yet, tragically, these lakes and the people who depend upon them face a host of threats. The region has experienced extremely high rates of deforestation in recent decades due to unsustainable economic development, ongoing conflict, illicit logging, and dam construction. Annual rates of deforestation in the Congo River Basin doubled between 2000 and 2005.

The ongoing conflict in the Democratic Republic of Congo (DRC) has displaced millions, forcing many of them to encroach upon protected areas. In Africa’s oldest park, Virunga National Park, rates of illegal logging have reached 89 hectares (220 acres) per day (PDF). And the Gibe III dam in Ethiopia is drying up Lake Turkana, threatening the livelihoods of tens of thousands of indigenous peoples.

Although the region holds 27% of the world’s freshwater, fewer than two-thirds of the people in the Great Lakes region have access to improved water sources. Climate change is expected to exacerbate this problem even further. The IPCC projects that the total number of Africans facing water stress will climb to 75-250 million by the 2020s and 350-600 million by the 2050s.

But you don’t need to sit by and watch these Great Lakes dry up. Drink Local. Drink Tap., Inc.™ has been working to provide access to clean water for children in Uganda for the last three years. This winter, the organization will undertake three new projects to ensure that the children at St. Bonaventure Primary School and the Family Spirit AIDS orphanage can take advantage of their human right to clean water.

Just as East Africa’s Great Lakes are a part of our DNA, so too is access to clean water and sanitation an integral part of human development. We can all take small steps to ensure that we are protecting this human right for people at home and around the world.

What separates a storm from a disaster?

two girls tornado destruction

Two girls look over the devastation left by the tornado that struck Moore, Oklahoma in May (courtesy of MSNBC).

My last post drew far more attention than I could have ever imagined. Unsurprisingly, it also garnered criticism, some of which was warranted. First, Typhoon Haiyan’s initial reported death toll of 10,000 appears – thankfully – to have been inflated. As of Monday morning, the Philippines’ national disaster agency had confirmed that the storm killed 3,976 people, with an additional 1,598 still missing.

Haiyan provides yet another reminder that, in the immediate aftermath of disasters, reports of the number of people killed are almost always wrong. A week after Hurricane Katrina, then-New Orleans Mayor Ray Nagin warned that perhaps 10,000 people had died; the final death toll stood at 1,833. Days after Cyclone Nargis crashed into Burma’s Irrawaddy Delta on May 2, 2008, the AP reported that the storm had killed roughly 350 people. After the floodwaters had receded, Cyclone Nargis emerged as one of the deadliest storms in modern history, killing 138,373 people in Burma.

Another individual noted that a storm as powerful as Typhoon Haiyan would have caused significant damage anywhere it hit, regardless of the level of development or political situation in the affected areas. This is probably true. Haiyan may have been a once-in-a-lifetime storm. As I noted, some forecasters believe it was the most powerful storm at landfall in recorded history.

Given that less powerful storms have wreaked havoc in significantly more developed parts of the world, it’s hard to imagine that Haiyan would not have become a severe disaster had it hit New York or London or Tokyo. Accordingly, and given that there is no such thing as a “natural” disaster, you may ask what separates a storm from a disaster.

In a word: capacity.

In my last post, I described the three variables that constitute disaster risk – a natural hazard, physical and economic exposure, and socioeconomic vulnerability. While these three define the risk that a disaster will occur, a fourth variable is missing from that picture.

The ability of an individual or a community to withstand and overcome the potentially destructive effects of an extreme event ultimately determines whether or not a natural hazard will become a disaster. Individuals with low levels of capacity and high levels of vulnerability often sit on the precipice of disaster on a daily basis. As Wisner et al. noted, for marginalized individuals with low levels of social, political, and financial capital, “the boundary between disaster and everyday life can be very thin.” Those of us who can afford health care and homeowner’s insurance may be able to overcome a minor car accident or house fire. For those who lack these assets, such events may constitute life-altering disasters that trap them in a permanent state of emergency.
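
The relationship among these four variables is often summarized as risk rising with hazard, exposure, and vulnerability and falling with capacity. Here is a minimal sketch of that idea; the multiplicative form, the 0-10 scales, and all of the example numbers are my own illustrative assumptions, not a formal model from the disaster-risk literature:

```python
# Toy disaster-risk score: hazard, exposure, and vulnerability push
# risk up, while capacity pulls it down. Scales and functional form
# are illustrative assumptions only.

def disaster_risk(hazard: float, exposure: float,
                  vulnerability: float, capacity: float) -> float:
    """Return a dimensionless risk score for a hazard event."""
    if capacity <= 0:
        raise ValueError("capacity must be positive")
    return (hazard * exposure * vulnerability) / capacity

# The same storm poses very different risks to two communities that
# differ only in their capacity to withstand and recover from it:
same_storm = dict(hazard=9.0, exposure=7.0, vulnerability=8.0)
well_resourced = disaster_risk(**same_storm, capacity=10.0)
marginalized = disaster_risk(**same_storm, capacity=2.0)
assert marginalized > well_resourced
```

The point of the sketch is simply that an identical hazard input yields very different outcomes once capacity enters the denominator.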

Vulnerability and capacity are determined by a cumulative set of decisions that can take place over a period of years, if not decades. These decisions are rooted in dominant social structures and ideologies, which unevenly distribute disaster risk among citizens. Anthony Oliver-Smith has described the May 1970 Ancash earthquake that struck Yungay, Peru as a “500 year earthquake (PDF).” By this, he meant that vulnerability to this seismic hazard was born of the destruction of Incan infrastructure and of land use policies that began with the Spanish conquistadors. While the proximate hazards that contributed to the disaster were local, the broader systems in which it occurred grew from a set of structures remote in both space and time, which together overwhelmed the limited capacity of the people of Yungay.

red cross hazard mapping india

An IFRC staff member conducts a participatory hazard mapping exercise with women in Varap, a village in India’s Maharashtra state (courtesy of the IFRC).

But focusing solely on vulnerability is not an effective strategy for reducing disaster risk. People facing disasters have developed a variety of coping mechanisms and strategies to help them survive. These form the heart of their adaptive capacities, and it is important for development and humanitarian actors to pay attention to them as well. The International Federation of Red Cross and Red Crescent Societies (IFRC) has been central in developing this concept. Its Vulnerability & Capacity Assessment (VCA) tool allows actors to use participatory assessments to identify both the vulnerabilities and the capacities of people living in harm’s way. Only through this process can we determine both the risks that must be mitigated and the existing assets that we can build upon.

In the wake of disasters like Haiyan, there exists a window of opportunity during which change can take place. Deluding ourselves by claiming that disasters are natural events only serves to ensure we maintain the status quo. But treating survivors as nothing more than victims who need our help and our solutions is just as dangerous.

People living on the island of Leyte understand the threat of typhoons far better than I ever will. Accordingly, they already had ways to endure them long before Haiyan made landfall. What they need is not for us to bring solutions to them. They need support in identifying what assets they already possess and the resources necessary to build upon and enhance them.

We’ll never be able to create a world in which extreme weather events no longer occur. And the science suggests that every ton of CO2 we pump into the atmosphere will only increase their frequency. But we already know that investing in disaster risk reduction pays dividends. If we make it a priority to invest in building upon existing capacities and minimizing vulnerabilities, we may be able to create a world in which disasters are far less common and less destructive.

There’s no such thing as a natural disaster

typhoon haiyan image

An image of Super Typhoon Haiyan as it appeared the morning of Friday, November 8, just before making landfall in the Philippines (courtesy of the Capital Weather Gang).

As we all know, Super Typhoon Haiyan devastated the Philippines over the weekend. At its peak, Haiyan was perhaps the strongest tropical cyclone ever recorded at landfall, packing sustained winds of at least 195 mph with gusts of 235 mph. The United Nations and the Philippine Red Cross are both warning that 10,000 people may have been killed in Tacloban alone; this would make Haiyan the deadliest disaster in the history of the Philippines (though President Aquino is revising those numbers down).

I should note, however, that the true scale of a disaster is measured not in the number of dead, but in the number affected and displaced. According to the UN Office for the Coordination of Humanitarian Affairs, Haiyan affected 11.3 million people and displaced at least 673,000 Filipinos. In contrast, the 2004 Indian Ocean tsunami killed perhaps 250,000 people throughout Southern Asia but affected roughly 5 million.

The scope and scale of the devastation in the Philippines is, for lack of a better term, biblical. If you are in a position to provide support, I encourage you to make a monetary donation to the Philippine Red Cross or one of InterAction’s partner organizations working on the ground. Please donate money only. Survivors and relief organizations know what is needed, and they can source materials much more quickly and cheaply from regional sources.

As individuals and media outlets have tried to grasp the sheer scale of the devastation, they have almost unanimously referred to Haiyan as the worst natural disaster in Philippine history. The search term “Haiyan natural disaster” brings back at least 49,300,000 hits on Google.

Let me be blunt: there is no such thing as a “natural” disaster. Disasters are complex, multifaceted, frequent, and overwhelming. We have a hard time fully grasping the nuance and complexity of each disaster – particularly one that strikes halfway across the world – so we turn to calling it a “natural” event. The term natural disaster is, in essence, a heuristic that we fall back upon in order to interpret the event.

In their landmark work, At Risk: Natural Hazards, People’s Vulnerability, and Disasters, Wisner, Blaikie, Davis, and Cannon refer to the tendency to view disasters in this light as the “myth of naturalness.” As Comfort et al. put it (PDF):

A disaster is widely perceived as an event that is beyond human control; the capricious hand of fate moves against unsuspecting communities creating massive destruction and prompting victims to call for divine support as well as earthly assistance.

But a tropical storm or a tornado does not a disaster make. Rather, the risk of a disaster is a product of three variables: a natural hazard (e.g. a fault line or damaging winds), physical and economic exposure to the hazard, and socioeconomic vulnerability. To borrow liberally once more from Wisner et al:

Disasters happen when hazards hit vulnerable communities whose inherent capacity is not enough to protect itself and easily recover from its damaging effects. Disasters are the product of the social, economic and political environment.

As I tried explaining to a colleague of mine last spring, Superstorm Sandy in DC was a hazard or an extreme weather event. Superstorm Sandy on the Jersey Shore or in Lower Manhattan was a disaster. Granted, most of this difference was due to the severity of the natural hazard as a result of weather dynamics. But Sandy may very well have been a disaster for someone living in a flood zone in Southwest DC (high levels of exposure) or for a homeless person without access to safe shelter from the storm (high levels of exposure and vulnerability).

Pressure and Release Model chart

The Pressure and Release Model, one way to depict the construction of disaster risk (courtesy of Wikipedia).

To their credit, a lot of journalists are starting to get it. Seth Borenstein has an excellent overview today of the social, economic, and political drivers of Haiyan.

Meteorologists point to extreme poverty and huge growth in population — much of it in vulnerable coastal areas with poor construction, including storm shelters that didn’t hold up against Haiyan.

More than 4 out of 10 Filipinos live in a storm-prone vulnerable city of more than 100,000, according to a 2012 World Bank study. The Haiyan-devastated provincial capital of Tacloban nearly tripled from about 76,000 to 221,000 in just 40 years.

About one-third of Tacloban’s homes have wooden exterior walls. And 1 in 7 homes have grass roofs, according to the census office.

Those factors — especially flimsy construction — were so important that a weaker storm would have still caused almost as much devastation, McNoldy said.

Over at Dot Earth, Andy Revkin offered a similar analysis of the massive tornado that ravaged Moore, Oklahoma in May. But, unfortunately, these types of reports are the exception that proves the rule. Most media coverage falls back upon the “myth of naturalness.” Other outlets obsess over debating whether or not we can attribute each individual disaster to climate change. The science of attribution is improving by leaps and bounds, and perhaps in a year or so scientists will be able to tell us whether they can identify the specific fingerprints of a changed climate in the DNA of Haiyan.

But taking such an all-or-nothing approach to disasters is irresponsible. Every disaster is different, not all natural hazard events are disasters, and whether or not climate change acts through individual extreme events is not the point – it’s our new baseline. Instead, we need to understand that, contrary to conventional wisdom, humans can and do influence all three of the disaster variables. And, as a result, the number of disasters has spiked over the last century. I’ll briefly explore how we have altered each variable below.

chart of disaster occurrence 1900-2011

The number of reported disasters increased dramatically from 1900-2011, from roughly 100 per decade during the first half of the 20th century to 385 per year from 2007-2011 (courtesy of EM-DAT).

Exposure

Population and economic growth and rapid urbanization have heightened our exposure to disasters significantly in recent years. According to the UN’s International Strategy for Disaster Reduction (ISDR), the number of people and total GDP exposed to flood risks increased by 28% and 98% (PDF), respectively, from 1990-2007. As economies develop and individuals build fixed assets like homes and infrastructure in disaster-prone areas (e.g. floodplains), economic exposure spikes. At least 3.4 billion people are now exposed to one or more hazards, while 25 million square kilometers of land is hazard-prone.

Rapid and unplanned economic development has taken its toll on ecosystems, which provide vital sources of natural protection against disasters. For instance, despite the fact that intact mangrove forests can reduce the flow rate of tsunamis by up to 90%, at least half of all mangrove forests have disappeared globally. In the Philippines, 70% of mangroves were destroyed (PDF) from 1918-1993. This destruction has substantially increased physical exposure to disasters and reduced the natural environment’s ability to mitigate the risk.

Vulnerability

Of the three disaster variables, vulnerability is the most closely linked to the social, economic, and political environment. Almost by definition, some groups are more vulnerable to disaster risks than others. Key intervening variables include class, occupation, caste, ethnicity, gender, disability, physical and psychological health, age, immigration status, and social networks. One’s ability to access the resources needed to cope with and adapt to stress – one’s “architecture of entitlements” (paywall) – is determined by these various factors, which shape social relations, political contexts, and structures of domination.

Differential vulnerability helps ensure that some individuals and groups weather (no pun intended) disasters better than others. During the 2004 Indian Ocean tsunami, for instance, women were three to four times more likely to die than men in affected areas. This outcome occurred for a variety of reasons. Due to cultural norms, most women wore bulky clothing that covered most of their bodies; when it got wet, it weighed them down. Women were also far less likely to know how to swim (PDF), given their social roles and religious mores.

Natural Hazards

Interestingly, even natural hazards – the most “natural” of the three variables – have undergone changes due to human actions. Global temperatures have increased 0.85°C since 1880. Since the 1970s, global average precipitation has decreased by 10 cm per year, but it has increased by more than 20% in certain regions (including the Northeastern and Midwestern US). Accordingly, the number of extreme events associated with climate change rose by 50% over this three-decade period. There is even evidence that human activities can alter seismic risks: researchers have connected a string of earthquakes in states from Ohio to Oklahoma to the high-pressure injection of wastewater into underground wells.

While it is difficult for reporters on a deadline to analyze the social, economic, and political drivers of various disasters, it is important that we begin to inch away from the myth of naturalness. Placing the blame for every disaster on the “capricious hand” of God or nature is dangerous and irresponsible.

First, it robs disaster survivors of their agency, reducing them to poor victims suffering from Acts of God. Secondly, in places where disasters are common (like the Philippines), it allows people who are disconnected from the events to blame the victims for not moving away from the threat. Thirdly – and perhaps most importantly – this mindset tends to make us complacent. If we accept disasters as natural events that we cannot control, what is our incentive to invest in disaster risk reduction strategies like curbing poverty, replanting mangrove forests, or hardening critical infrastructure? What is the hook for curbing carbon emissions to mitigate climate change?

The first step to addressing the rise in disasters worldwide is to admit that disasters aren’t natural. They’re manmade. Maybe if we do that we can get off our asses and do something about them.

It’s often more rational for people in disaster-prone areas not to move

Workers rebuild the boardwalk in Bay Head, New Jersey. The boardwalk was badly damaged by Superstorm Sandy (courtesy of The Atlantic Cities).

Over at The Atlantic Cities, Prof. Harvey Molotch has a good piece titled “Why Residents of Disaster-Prone Areas Don’t Move.” In it, he discusses some of the economic and emotional reasons why people choose to stay in vulnerable areas, even after suffering the devastating effects of disasters like Superstorm Sandy.

Consider, for example, that people are consumers of space in ways that go beyond having houses, apartments, back-yard gardens, and a place for the RV. They have friends, family and obligations nearby. Location, especially residential location, is the node around which people manage life with routines, like having specific shopping habits close to home. Even when they can be financially “made whole” in an offer to retreat, residents may not want the deal. Hence there are “hold-outs” who resist even robust financial inducements. Private developers trying to assemble large parcels know this all too well: people often want to remain for reasons that money can’t overcome.

Given the string of disasters that have occurred in the last few weeks – including earthquakes in Pakistan and the Philippines and massive tropical storms in China, India, and Japan – it is logical that people would want to understand the motives of those who choose to remain in harm’s way despite the inherent risks. This is particularly true given that each of these areas has been hit by similarly destructive disasters in the past.

While Prof. Molotch’s post provides valuable insight into this issue, it ignores a number of other key factors that play into the decision. Moreover, it is largely applicable only to the developed world. What may seem economically rational to a person living in the Korail slum of Dhaka does not necessarily comport with Western standards, and vice versa.

In many areas, government regulations and economic structures may create incentives for individuals and businesses to build in these high risk areas. Both the presence of certain initiatives – like the heavily indebted federal flood insurance program – and the absence of others – such as a requirement to incorporate climate change projections into infrastructure planning – can incentivize people to build in areas where, if externalities were properly accounted for, it would not make economic sense for development.

One can understand the reluctance of taxpayers to effectively subsidize such unsustainable development. Accordingly, it’s not surprising to see comments like “we ALL pay for their stupidity to remain in place,” which one person left in response to Prof. Molotch’s piece. Yet this mindset ignores the fact that, for many people (particularly in the developing world), staying in disaster-prone areas is actually the rational decision.

Given the threats posed by climate change, particularly that of sea level rise for low-lying areas, it’s common to hear about the need to resettle large populations of people from, say, Kiribati or Bangladesh. Yet, as we frequently see with resettlement programs related to large-scale development projects, people often find themselves worse off than before. The Hirakud dam, India’s first mega-dam, was completed more than 60 years ago; despite this fact, at least 10,000 people affected by the project still have not been rehabilitated fully.

While moving people from disaster-prone areas may minimize their physical vulnerability, it frequently maximizes their social vulnerability and sense of dislocation (PDF). Following major droughts in the 1980s, for instance, the Ethiopian government launched a large-scale, forced resettlement program of famine-affected households. The effort proved to be a catastrophe, and it soon turned into a state-sponsored disaster of its own.

Two vital forms of capital upon which people can draw to enhance their resilience to disasters are familiarity with the local climate and environment and social networks. Forced relocation can upset both in significant ways. Following massive flooding on the Zambezi and Limpopo Rivers in 2001 and 2007, the Mozambican government resettled a large number of people (PDF) from the affected floodplains. Unfortunately, the flood-safe regions to which they were moved suffer from recurrent drought; accordingly, many of the people who were resettled returned to the flood-prone areas after the disaster ended.

Moreover, forced relocation frequently breaks social bonds and interferes with various forms of social capital, leaving individuals highly vulnerable and prone to exploitation. After Hurricane Katrina, the rate of rape among women living in FEMA trailer camps was 53.6 times higher (PDF) than the rate before the storm. There is also ample evidence that Cambodian and Vietnamese families forced to migrate from their homes have sold their daughters into the sex trade (PDF) in order to generate income.

Lastly, particularly in urban areas, it is essential for poor households to have ready access to economic opportunities. Yet the majority of formal land that the urban poor can afford is located on the peri-urban fringe, far from the urban core. Accordingly, most poor households opt to live in marginal areas closer to the city center in order to have easier access to the economic resources upon which their livelihoods depend. This creates situations in which slums develop in highly vulnerable, disaster-prone areas, such as the Annawadi slum of Mumbai that Katherine Boo chronicles in Behind the Beautiful Forevers.

Two young Indian boys play with items they found in a garbage pile, while their mother sorts through the waste. In India, people who sort and sell trash for a living – an incredibly important job in a country with poor solid waste management – are overwhelmingly from low castes and are commonly known as “ragpickers” (courtesy of Don’t Waste People).

So the next time you want to criticize someone for living in the low-lying areas of New Orleans or in Orissa state in India, remember the economic constraints and rational calculations described above.


Welcome to tropical Cleveland, Part 2.5: Great Lakes ecosystem also vulnerable to climate change


I know I said my next post would be on what Cleveland can do/is doing to address its vulnerability to heat-related mortality related to climate change. But it’s my website, and I lied. I’ll get on that post as soon as I’m able.

But in the meantime, I came across this piece from Science Daily today on a new global study of vulnerability to climate change. The authors of the article in Nature Climate Change (paywalled) work to address weaknesses they have identified in previous vulnerability analyses by incorporating both the adaptive capacity of an ecosystem (which they measure by how intact its natural vegetation currently is) and its exposure to climate change (measured by the projected stability of the region’s climate going forward).

Climatic instability will be most pronounced at higher latitudes, since warming tends to be far more drastic near the North Pole, as the map below illustrates. Accordingly, while the Great Lakes region may not be Siberia, it will likely experience a temperature increase higher than the global average.

map of temperature anomalies from NASA

This map shows global temperature anomalies (averaged from 2008-2012) compared to the 20th century average. As you can see, temperature increases have been particularly extreme in the Arctic (courtesy of NASA).

Moreover, as I discussed in my last post, the built environment within Greater Cleveland (and the Rust Belt at large) amplifies the vulnerability of our ecosystems to climate change. While Cleveland is emblematic of the sprawl-based development that has paved over millions of acres of natural vegetation, it is far from the only city to pursue this model. Kansas City, for instance, has 54% more freeway lane miles per capita than Cleveland.

Accounting for these two key variables, the authors produce a global map of vulnerability to climate change. Interestingly, their results contrast sharply with those of most previous studies.

For example, when climate stability (as a measure of exposure) is combined with vegetation intactness (as a measure of adaptive capacity), ecoregions located in southwest, southeast and central Europe, India, China and Mongolia, southeast Asia, central North America, eastern Australia and eastern South America were found to be relatively climatically unstable and degraded. This contrasts sharply with other global assessments (based only on exposure to climate change) that show that central Africa, northern South America and northern Australia are most vulnerable to climate change.
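
The study’s two-axis framing can be sketched as a simple quadrant classification: exposure (climatic instability) crossed with adaptive capacity (vegetation intactness). The normalized 0-1 scales and the 0.5 cutoff below are my own illustrative assumptions, not values from the paper:

```python
# Toy quadrant classification of ecoregion vulnerability. Both inputs
# are normalized to [0, 1], where higher means more climatically
# stable / more intact vegetation. Threshold is illustrative only.

def classify(climate_stability: float, vegetation_intactness: float) -> str:
    """Place an ecoregion in one of three vulnerability classes."""
    stable = climate_stability >= 0.5
    intact = vegetation_intactness >= 0.5
    if not stable and not intact:
        return "high vulnerability"    # unstable climate AND degraded cover
    if stable and intact:
        return "low vulnerability"
    return "moderate vulnerability"    # strong on only one axis

# A degraded, climatically unstable ecoregion lands in the
# high-vulnerability quadrant; a stable, intact one does not:
assert classify(0.3, 0.2) == "high vulnerability"
assert classify(0.9, 0.8) == "low vulnerability"
```

The key design point mirrors the study: an ecoregion is flagged as most vulnerable only when it scores poorly on both axes at once, which is why the resulting map differs from exposure-only assessments.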

As the map below shows, the Great Lakes region falls within the area the authors identify as “central North America.” Accordingly, while climate change may not hammer the people of Greater Cleveland as hard as it will people elsewhere, the same cannot be said for our non-human neighbors. This study is just one more thing to keep in mind as we plan for how to make the region more resilient to the changes we know are coming.

map of climate vulnerability

The map displays the relationship between climatic stability and ecosystem intactness. Regions in pale green have low levels of both variables, indicating high levels of vulnerability to climate change. As the map illustrates, the Great Lakes ecosystem falls within such a zone (courtesy of Nature Climate Change).

Welcome to tropical Cleveland, part 2: The social & political roots of heat-related mortality

children at water park

Children attempt to escape from the heat during July 2012 in Louisville (courtesy of the AP).

In my last post, I explored some recent research projecting climate change in Cleveland and its potential to drive an increase in heat waves. But climate and weather are just one factor behind heat-related mortality; socioeconomic and political conditions are perhaps just as important, if not more so.

Just as Cleveland’s historic climate and the associated lack of acclimatization to heat waves will likely leave the region more vulnerable to extreme heat, so too do the region’s various socioeconomic and political pathologies leave it ripe for a public health crisis. (As I write this, it is 97° outside, and I just got an extreme heat advisory from the National Weather Service. On September 10.)

Last month, the Graham Sustainability Institute at the University of Michigan released a new mapping tool that explores the social and economic factors underlying climate change vulnerability in the Great Lakes region. This great new tool allows you to zero in on any county around the Great Lakes and see the extent to which its economy, infrastructure, and vulnerable citizens are likely to suffer in a greenhouse world. Unsurprisingly, Cuyahoga County (of which Cleveland is the seat) does not fare particularly well.

The Greater Cleveland area possesses a number of characteristics which, if they do not change, may create a perfect storm for heat-related mortality in a warmer world. I will explore four of these – the built environment, changing demographics, poverty, and racial segregation.

The Built Environment

Northeast Ohio has suffered from decades of sprawl and uncoordinated development patterns, leading to waves of suburbanization followed by exurbanization. In 1948, Cuyahoga County’s population stood at 1,389,532; just 26% of land in the county was developed at the time. Yet, by 2002, although the county’s population had grown by a mere 0.32% to 1,393,978, sprawl ensured that roughly 95% of the county’s land area had been developed.

cuyahoga county land use in 1948 & 2002

Changes in land use within Cuyahoga County from 1948 (left) to 2002 (right). Red shading indicates developed land, while the beige indicates land that is still undeveloped. The maps clearly demonstrate the waves of suburbanization in the county over the last six decades (courtesy of the Cuyahoga County Planning Commission).

According to data from the Cuyahoga County Planning Commission, 33.6% of the county (and 56.2% of Cleveland) is covered by impervious surfaces. These surfaces (e.g. asphalt) absorb and retain heat, contributing to the urban heat island effect. The EPA notes that urban areas can experience annual mean temperatures 1.8–5.4°F higher than their surroundings, while this difference can reach an astonishing 22°F during the evening.

Cuyahoga County’s sprawl-based development structure presents a number of other challenges, as well. As people have spread out throughout the region, we have become increasingly car-dependent. Car use has come to dominate our policy discussions – transportation commentators like to note that Ohio stands for “Only Highways In Ohio” – despite its myriad side effects.

According to the Northeast Ohio Sustainable Communities Consortium (NEOSCC), 86% of commuters in Northeast Ohio report driving alone to work. This car culture contributes to the development of chronic disease, which I discuss below. Additionally, combined with Cleveland’s industrial base and Ohio’s coal dependence, it significantly reduces air quality in the region. In its 2012 “State of the Air” report, the American Lung Association gave Cuyahoga County an F for ozone pollution and a failing grade for annual particle pollution.

Climate change will likely exacerbate this issue further. Last year, largely due to the abnormally warm summer, Northeast Ohio experienced 28 ozone action days – double the number from 2011. We know that high air temperatures increase concentrations of ground-level ozone, which can cause respiratory distress for vulnerable groups. Accordingly, Bell and colleagues have projected that ozone-related deaths will increase 0.11-0.27% in the eastern US by 2050. This issue adds to the risk of heat-related mortality in Greater Cleveland.

Changing Demographics

Like much of the Rust Belt, Cleveland has been shrinking and aging. From its peak in the 1950s, Cleveland’s population has plummeted. The city had 914,808 residents in 1950; by the 2010 census, the number had fallen to 396,815 – a 56.6% decrease in six decades.

This precipitous decrease in population has left large swaths of Cleveland abandoned and, increasingly, hollowed out. Even before the Great Recession and the housing crisis that precipitated it began in 2007-2008, Cleveland had foreclosure rates on par with those in the Great Depression. From 2005-2009, Cuyahoga County averaged roughly 85,000 foreclosure filings per year, and parts of Cleveland saw nearly half of their homes enter foreclosure. The destruction of neighborhoods undermines social capital, a key coping mechanism for surviving extreme events.

foreclosures in Cuyahoga County 1995-2012

The number of annual foreclosure filings in Cuyahoga County from 1995-2012. As the chart indicates, the number of filings spiked in 2005, two years before the housing crisis began (courtesy of Policy Matters Ohio).

As people have fled the region, particularly young people and people of means, those who remain are increasingly poor and disconnected. Accordingly, the region’s population has aged significantly. Nationally, approximately 13% of the total population is age 65 or older. In Ohio, the number is 14.3%, while it sits at 15.8% in Cuyahoga County.

Older persons are far more vulnerable to the deleterious effects of extreme heat, particularly those suffering from chronic illnesses, like diabetes, and those living alone. Unfortunately, 20.6% of people 65 years and over (PDF) in the county suffer from diabetes; this number climbs to over 35% in Cleveland. Additionally, more than one-third of older persons in the county live alone, adding further to their vulnerability.

Poverty

Given the region’s challenges, it’s perhaps unsurprising that Greater Cleveland struggles with high levels of poverty. Cleveland was named the poorest city in the country in 2004; it has remained at or near the top since that point. Roughly one-third (32.7%) of Cleveland’s residents live below the poverty level. Even worse, more than half of Cleveland’s children are growing up in poverty.

map of poverty rates in Northeast Ohio

Poverty rates and changes in poverty rates within Northeast Ohio from 2005-2009 (courtesy of Rust Wire)

Much of this poverty is concentrated in highly depressed portions of the inner city and, increasingly, in the inner-ring suburbs. It creates regions where public health suffers dramatically; the Plain Dealer recently reported that portions of Cleveland had infant mortality rates higher than most of the developing world, including Bangladesh, Haiti, Pakistan, and Rwanda.

As one would expect, poor people suffer disproportionately in disasters. Roughly 95% of disaster deaths occur in the developing world, and the same principle applies within the developed world (see: Hurricane Katrina).

Racial Segregation

Lastly, Cleveland suffers from high levels of racial segregation. It was the 8th most segregated city in the US in 2011, which likely does not surprise Cleveland natives. For decades, the Cuyahoga River has been seen as something akin to the Berlin Wall – African-Americans stay to the East of the river, while whites and Hispanics live on the West Side.

Recently, the Atlantic Cities posted a map that showed the location of every person in the country (color-coded by race), based on Census data. The close-up shot of Cleveland is below. It quite clearly illustrates the racial divide within the city: African-Americans (green dots) to the east, whites (blue dots) and Hispanics (orange dots) to the west. If you look closely, you can even see the small cluster of red dots – Asians – that makes up Cleveland’s Asia Town.

map of Cleveland showing racial divide

The map, a closeup from the Racial Dot Map, shows the racial divide in the city of Cleveland.

Now, such spatial segregation creates a host of problems, but it also has a connection to heat-related mortality. A study published in Environmental Health Perspectives suggests that persons of color are far more likely to live in areas at risk of suffering extreme heat waves than whites. The study found that a high risk of suffering from the urban heat island effect is more closely correlated with race than class. Accordingly, severe spatial segregation, as we find in Cleveland, will ensure that poor minority neighborhoods have yet another risk factor to account for in a greenhouse world.

Taken together, Cleveland’s combination of heavy, sprawl-based development; an aging, sickly population; high rates of concentrated poverty; and racial segregation may create a perfect storm for heat-related mortality in the coming decades. The fact that sea level rise isn’t going to drown us and that it snows six months a year doesn’t mean we can get complacent as the climate changes. Like I said in my last post, just because it won’t suck as much as Bangladesh doesn’t mean it won’t still suck here.

Now that I’ve thoroughly depressed everyone, I will use my next post to look at some of the things Cleveland can do to mitigate the threat of heat-related mortality, including some of the initiatives the region is already undertaking.

Welcome to tropical Cleveland, Part 1: Climate change & future heat waves

Snowball is ready for when Cleveland’s climate becomes more tropical.

For the most part, it would appear that Cleveland is poised to cope relatively well with the effects of climate change. When The Nature Conservancy developed a list of the cities which will be best positioned to adapt to climate change, Cleveland ranked first. Grist did a similar piece back in May, and it placed Cleveland 6th on a list of the 10 “best cities to ride out hot times.”

Without question, Cleveland has a lot of assets that will help it deal with climate change. First, Lake Erie. Unlike other water-stressed cities which will suffer from the crippling effects of drought and water shortages, Cleveland has ample water resources thanks to Lake Erie and our various rivers. And as a result of the Great Lakes Compact, we can be relatively sure that our freshwater resources cannot be diverted to other areas.

Secondly, Cleveland is (largely) immune to many types of disasters. Sure, we get a lot of snow (not as much as, say, Syracuse), and our skies are as gray as our steel 6 months per year, but we don’t have to fear hurricanes, wildfires, earthquakes (unless you live near Youngstown), or tornadoes (for the most part). Trulia lists Cleveland as the 2nd best place to live if you want to avoid disasters; our sister city to the south, Akron, came in at 4th.

All of this good news can lead some people in the region to get cocky (see cartoon below) and assume that Clevelanders will be sitting pretty in a greenhouse world. But let’s not get ahead of ourselves – climate change will still suck for Cleveland; it will just suck less, relative to other locations.

This cartoon appeared on Cleveland.com on July 8. While it’s certainly true that Baby Boomers have fled Cleveland in droves for cities that only exist due to the air conditioning & Manhattan financiers have screwed our city over, it won’t be they who suffer the most from climate change. It will be the poor, elderly, disabled, and persons of color (courtesy of Cleveland.com).

I’ve already explored how changing temperature and precipitation patterns will likely affect the levels of the Great Lakes and algal blooms in Lake Erie. But I also want to home in on one risk that a lot of people in the region appear to overlook – the risk of increasing heat-related mortality in Greater Cleveland.

Now, I know what you’re thinking – heat-related mortality in Cleveland?! Winter lasts 6 months per year! Our average annual temperature is a whopping 49.6°F, and the thermometer dips below the freezing point 122 days a year. All that’s true, and that’s actually part of the issue. Heat-related mortality risk is a combination of two sets of factors – environmental and socioeconomic (which I explore in my next post).

Recently, Environmental Research Letters published a peer-reviewed article that explored how climate change will drastically increase extreme heat waves globally. According to the authors, severe heat waves – those which fall at least 3 standard deviations above the mean (so-called 3-sigma events) – will quadruple from affecting roughly 5% of the world’s land area to around 20% by 2040. More disturbingly, the models project that 5-sigma events, which are “now essentially absent,” could cover 60% of land area by 2100 (under a high emissions scenario). In other words, we risk entering an entirely new climate reality, in which ever-increasing parts of the Earth may become uninhabitable.
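To make the sigma language concrete, here is a minimal Python sketch of why 3-sigma summers are rare in a stationary climate and why even a modest warming of the mean multiplies their frequency. It assumes summer temperature anomalies are normally distributed, and the 2-sigma mean shift below is purely illustrative, not a figure from the paper.

```python
from math import erfc, sqrt

def exceedance(sigma_level, mean_shift=0.0):
    """One-sided probability that a normally distributed summer
    anomaly exceeds `sigma_level` standard deviations, after the
    mean has warmed by `mean_shift` sigmas."""
    return 0.5 * erfc((sigma_level - mean_shift) / sqrt(2))

# Stationary climate: a 3-sigma summer is roughly a 1-in-740 event,
# and a 5-sigma summer essentially never happens.
print(f"3-sigma, no warming: {exceedance(3):.2e}")  # ~1.35e-03
print(f"5-sigma, no warming: {exceedance(5):.2e}")  # ~2.87e-07

# Shift the mean warmer by two sigmas (illustrative) and the same
# thresholds are crossed orders of magnitude more often.
print(f"3-sigma, +2-sigma shift: {exceedance(3, 2):.2e}")  # ~1.59e-01
print(f"5-sigma, +2-sigma shift: {exceedance(5, 2):.2e}")  # ~1.35e-03
```

The thresholds themselves don’t move; the distribution slides past them, which is the paper’s core point.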

Now, some people in Cleveland and other northern climates may brush this off, believing that shorter, milder winters are somehow a blessing for the region (“Now we can swim any day in November,” as The Postal Service put it).

But let’s not get ahead of ourselves. According to the Union of Concerned Scientists (PDF), the number of days with temperatures at or above 90°F in Cleveland will likely climb from an historical average of just 9 to 61 by the end of the century, under a high emissions scenario. More disturbingly, the city is projected to endure 21 days in excess of 100°F by 2100, a situation which could be catastrophic for public health in the city.

The number of days above 90°F and 100°F in Cleveland under low and high emissions scenarios (courtesy of the Union of Concerned Scientists).

We already know that such extreme heat waves can be deadly. The 2003 European heat wave was ultimately connected to the deaths of as many as 70,000 people. Such extreme heat waves are becoming more common and will continue to increase in frequency. According to the World Meteorological Organization, heat-related mortality jumped by more than 2000% during the last decade. Moreover, a 2011 study from Vorhees and colleagues projected that, as a whole, there will be an additional 21,000-27,000 heat-related deaths (paywalled) per year in the US by 2050 due to climate change.

heat wave

Acclimatization in action. Suck it, heat wave (courtesy of Creative Commons).

Many of these deaths will likely occur in cities like Cleveland, Chicago, and Cincinnati, which are not currently equipped to handle extreme heat. Because they do not have to deal with such high temperatures on a regular basis, most people in Cleveland and similar cities have not become acclimatized to significant heat waves. Air conditioning use is not nearly as prevalent as it is in Sun Belt cities, and municipal governments are unlikely to have sophisticated systems in place to help residents cope.

Scott Sheridan, a geographer at Kent State University (and, coincidentally, the program director of my semester in Geneva, Switzerland) published a study in 2011 that looked at the role of acclimatization and heat-related mortality in California. While the study predicts that mortality rates will spike in most of California’s cities, Sheridan notes that acclimatization can help reduce these rates by 37-56%.

Such reductions are likely to occur in cities that are used to extreme heat, like Los Angeles, but not necessarily in places like Cleveland with much milder climates. Accordingly, while Cleveland’s relatively cool climate and mild summers will provide a buffer against the punishing heat that’s likely headed for the Southwest and Plains states, it may, ironically, leave the city more vulnerable to extreme heat waves. Such radical changes will almost certainly undermine people’s coping strategies, which they’ve developed over decades of living in a fairly stable climatic regime.

But, as I noted, the climate is just one of two factors that determine the impact of heat waves on mortality rates; the other is socioeconomic (and political). I will explore those issues in my next post.

Ocean acidification will make global warming even worse

Most of the attention paid and words spilled on climate change tend to focus on the effects of increasing temperatures, changes in precipitation, and sea level rise. One issue that largely gets mentioned in passing or ignored altogether is ocean acidification.

Much like the way that the oceans have recently absorbed most of the heat trapped on Earth, our oceans have taken up roughly 25-50% of the CO2 which humans have released since 1850. Because carbon dioxide is soluble in water, it dissolves and reacts with water molecules to form carbonic acid (H2CO3). As a result, the IPCC noted in its Fourth Assessment Report (AR4) that the pH level of the ocean has dropped by at least 0.1 units. Under a high emissions scenario (SRES A2), Feely, Doney and Cooley (2009) project that the pH level of the ocean will fall from a pre-industrial level of 8.2 to roughly 7.8, equivalent to a 150% increase in acidity (PDF).
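Because pH is a logarithmic scale, the roughly 150% figure follows directly from that 0.4-unit drop. A quick sketch – the pH values are the source’s projection; the arithmetic is standard:

```python
# pH is -log10 of the hydrogen-ion concentration, so a drop from
# 8.2 to 7.8 multiplies [H+] even though the pH change looks small.
pre_industrial_pH = 8.2
projected_pH = 7.8  # high-emissions (SRES A2) projection cited above

ratio = 10 ** (pre_industrial_pH - projected_pH)  # ~2.5x more H+ ions
percent_increase = (ratio - 1) * 100              # ~151%, the "150%" in the text

print(f"[H+] rises by a factor of {ratio:.2f}")
print(f"i.e. roughly a {percent_increase:.0f}% increase in acidity")
```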

Graph of Ocean pH projections

The range of potential pH levels for the world’s oceans through 2100, based on the various IPCC emissions scenarios. Even under the best case scenario, which is becoming increasingly unlikely by the day, pH will fall below 8 for the first time in at least 800,000 years (courtesy of the IPCC).

We already know that ocean acidification will wreak havoc on ocean ecosystems by degrading coral reefs and dissolving aquatic creatures with calcium carbonate shells. These outcomes, in turn, will likely increase food insecurity for the more than one billion people who rely on fish for their primary source of protein. In other words, it’s going to suck – a lot.

image of dissolving aquatic snail shell

Side-by-side images of the aquatic snail Limacina helicina antarctica. The image on the left shows an intact snail shell, while the image on the right shows the same species in a severe state of dissolution from ocean acidification. Shelled aquatic organisms which live in the Southern Ocean, like Limacina helicina antarctica, are uniquely susceptible to acidification (courtesy of the British Antarctic Survey).

But thanks to a new article published this week (paywalled) in Nature Climate Change, we know that ocean acidification may produce another drastic outcome – amplifying global warming. According to the study, seawater that is more acidic is less saturated with dimethylsulphide (DMS), a compound of sulfur that is a byproduct of phytoplankton production. As the authors note (emphasis mine):

[It has been] observed that DMS, a by-product of phytoplankton production, showed significantly lower concentrations in water with low pH. When DMS is emitted to the atmosphere its oxidation products include gas-phase sulphuric acid, which can condense onto aerosol particles or nucleate to form new particles, impacting cloud condensation nuclei that, in turn, change cloud albedo and longevity. As oceanic DMS emissions constitute the largest natural source of atmospheric sulphur, changes in DMS could affect the radiative balance and alter the heat budget of the atmosphere.

Using data from a mesocosm study conducted in 2010 off the coast of Svalbard, Norway, the researchers attempted to analyze the relationship between ocean acidity and DMS levels and the associated impact upon radiative forcing in the Earth’s atmosphere. In their analysis, they project that DMS production will decrease 26% by 2100, leading to a drop in DMS emissions of roughly 12-24%.

Because DMS emissions can alter cloud dynamics, the authors project that ocean acidification will lead to an additional radiative forcing of 0.40 watts per square meter (remember that burning fossil fuels since the dawn of the Industrial Revolution has so far led to an additional 1.6 W m−2). They conclude (emphasis mine):

We find that even in a future CO2 emission scenario as moderate as the IPCC SRES A1B, pH changes in sea water are large enough to significantly reduce marine DMS emissions by the end of the twenty-first century, causing an additional radiative forcing of 0.40 W m−2. This would be tantamount to a 10% additional increase of the radiative forcing estimated for a doubling of CO2.

As the authors note, ocean acidification may increase global temperatures by an additional 10%, equivalent to perhaps 0.2-0.3°C by 2100. This study presents another important piece in the equation for estimating the potential scope and scale of the consequences of our meddling with the Earth’s climate. The climate is one of the most complicated phenomena in the universe, and our best scientists are only beginning to understand some of its most nuanced facets. For hundreds of years, we have been blindly pulling levers and turning knobs on the machine that controls the habitability of our planet, blissfully ignorant of the implications. We are the guinea pigs in a global experiment of our own making, and the time to stop is now.
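As a rough check on the authors’ 10% comparison, one can set their 0.40 W m−2 figure against the standard simplified expression for CO2-doubling forcing (Myhre et al. 1998, the formula the IPCC uses). This is my own back-of-envelope arithmetic, not a calculation from the paper:

```python
from math import log

# Simplified CO2 forcing expression: F = 5.35 * ln(C/C0) W/m^2,
# giving the canonical ~3.7 W/m^2 for a doubling of CO2.
doubling_forcing = 5.35 * log(2)   # ~3.71 W/m^2
dms_forcing = 0.40                 # W/m^2, the study's DMS-feedback figure

share = dms_forcing / doubling_forcing
print(f"CO2-doubling forcing: {doubling_forcing:.2f} W/m^2")
print(f"DMS feedback adds {share:.0%} on top")  # ~11%, in line with the authors' "10%"
```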

The next time the Plain Dealer writes about climate change, maybe it should interview an actual scientist

On July 23, Plain Dealer reporter and editor Cliff Pinckard published an article titled “A ‘pause’ in global warming keeps the climate-change debate in play.” As you can probably guess from the title, the post – which purported to report on recent research regarding the so-called “warming plateau” – ended up turning into a flawed, irresponsible piece that misrepresented climate science and gave climate deniers disproportionate footing and credibility.

The piece begins with a brief discussion of a recent series of three reports released by the Met Office Hadley Centre on the recent “pause” in global warming during the last 15 years. It is accurate to say that global surface temperatures have not increased at as rapid a rate since 1998 as they did in the previous 30 years. As the first of the three Met Office reports (PDF) notes, “Global mean surface temperatures rose rapidly from the 1970s, but have been relatively flat over the most recent 15 years to 2013.”

Climate deniers routinely use 1998 as the year to begin making their patently absurd claim that the Earth has been cooling over the past 15 years. This decision is strategic, as an abnormally active El Niño event that year led to a massive transfer of heat from the Pacific Ocean to the atmosphere. Since this point, the Pacific Ocean has largely remained in a neutral state, though a moderate La Niña period in the past few years has contributed to a moderate cooling trend in the region. Additionally, 1998 is no longer the warmest year on record. According to the World Meteorological Organization (PDF), 9 of the years from 2000-2010 were among the 10 warmest in recorded history, with 2010 and 2005 ranking first and second, respectively.

Decadal global average surface air temperatures for each 10-year period since 1891. As the chart illustrates, the period from 2000-2010 was the warmest decade on record, with a temperature anomaly of 0.84°C above the mean (courtesy of the WMO).

It is important to note, as Mr. Pinckard does briefly, that the Met Office and other climate scientists have attributed this purported “pause” in warming to a variety of potential causes, particularly the trapping of heat in the deep oceans. The first report continues:

Careful processing of the available deep ocean records shows that the heat content of the upper 2,000m increased by 24 x 1022J over the 1955–2010 period (Levitus, 2012), equivalent to 0.09°C warming of this layer. To put this into context, if the same energy had warmed the lower 10km of the atmosphere, it would have warmed by 36°C! While this will not happen, it does illustrate the importance of the ocean as a heat store.
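The Met Office’s 0.09°C figure survives a back-of-envelope check. The constants below are nominal textbook values I have assumed for illustration, not numbers taken from the report:

```python
# Rough check: energy / (mass * specific heat) for the upper 2,000 m of ocean.
ocean_area = 3.6e14       # m^2, approximate global ocean surface (assumption)
layer_depth = 2000.0      # m, the layer discussed in the report
density = 1025.0          # kg/m^3, typical seawater (assumption)
specific_heat = 3985.0    # J/(kg*K), typical seawater (assumption)

added_heat = 24e22        # J, the Levitus (2012) estimate for 1955-2010

layer_mass = ocean_area * layer_depth * density
delta_T = added_heat / (layer_mass * specific_heat)
print(f"Implied warming of the upper 2,000 m: {delta_T:.2f} C")  # ~0.08 C
```

That lands at roughly 0.08°C, consistent with the report’s 0.09°C given the nominal constants.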

The vast majority of global warming is stored in the oceans, particularly below 700 meters, due to sheer size of the oceans, compared to land area, and the ability of water to trap and store heat (courtesy of Skeptical Science).

Had Mr. Pinckard stopped there, his article would have been relatively accurate and innocuous. But instead, he ventured into false equivalence land, feeling the irrepressible need to provide “balance” by quoting climate deniers. James Fallows, who has spent far too much of his outstanding career at The Atlantic reporting on the media’s penchant for false equivalence, has settled on its definition:

False equivalence, the definition (courtesy of James Fallows & @natpkguy).

Mr. Pinckard devotes the next 329 words of his article – 36.4% of the whole piece! – to quoting at length from professional climate denier/right-wing columnist Rupert Darwall (who has no background in climate science) and someone named Nirav Kothari writing on a random Indian financial site. I’m not sure how these two gentlemen warrant mentioning or quoting at length, but actual climate scientists are shut out of the piece. Perhaps Mr. Pinckard can elaborate.

Even more disturbingly, Mr. Pinckard grants equal footing to the claims of these deniers. He argues in both the piece’s headline and the caption under its sole picture that the Met Office’s work means “the climate-change debate is in play” and that “some people [are] wondering if man-made emissions really have an impact on the environment.”

Mr. Pinckard’s decision to use a complex debate within the climate science community as a reason to launch these patently false and absurd claims is highly irresponsible, if not journalistic malpractice.

There is no debate within the scientific community as to whether anthropogenic greenhouse gas emissions “have an impact on the environment.” Svante Arrhenius first discovered, in 1895, that greenhouse gases, particularly carbon dioxide, could alter the heat budget of the atmosphere and lead to global warming.

We know for a fact that greenhouse gas emissions from human activities are increasing the heat-trapping potential of the atmosphere. Based on evidence from tree rings and ice cores, we know that the average concentration of CO2 in the atmosphere during the Holocene, the mild and fair geological age in which human civilization has developed, stood at a fairly steady 280ppm. This changed with the advent of the Industrial Revolution, and CO2 concentrations have spiked by more than 40%, reaching 400ppm in May for the first time in at least 3,000,000 years.

Historical concentrations of CO2 in the Earth’s atmosphere, as measured over the last 800,000 years. As the chart suggests, the historical measure stayed at or below 280ppm throughout this period, but the number spiked rapidly within the last 200 years (courtesy of the Scripps Institution of Oceanography).

During this period, the atmosphere has begun trapping an additional 1.6 watts per square meter of heat, and the net energy now accumulating in the Earth system is equivalent to roughly four Hiroshima-sized atomic bombs every second.
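The Hiroshima comparison popularized by Skeptical Science rests on the planet’s net energy imbalance (what actually accumulates after feedbacks), not the full 1.6 W m−2 forcing. A sketch of the arithmetic, using nominal round-number values that are my assumptions rather than figures from any single source:

```python
# Rough arithmetic behind the "Hiroshima bombs per second" comparison.
# All constants below are nominal assumptions for illustration.
earth_surface = 5.1e14     # m^2, total surface area of the Earth
net_imbalance = 0.6        # W/m^2, approximate planetary energy imbalance
hiroshima_yield = 6.3e13   # J, approximate energy released by the Hiroshima bomb

heating_rate = earth_surface * net_imbalance       # joules gained per second
bombs_per_second = heating_rate / hiroshima_yield
print(f"{bombs_per_second:.1f} Hiroshima bombs per second")  # ~4.9
```

With these round numbers the result is about five bombs per second, in the ballpark of the oft-quoted four; the exact figure depends on the imbalance estimate used.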

We know that the warming has primarily been caused by increasing concentrations of CO2 and, to a lesser extent, other heat-trapping gases like methane and nitrous oxide. Scientists are able to determine this by measuring the wavelengths of long-wave infrared radiation as it reaches the ground and as it leaves the Earth. Sure enough, the spectral measurements show peaks in radiation at the wavelengths absorbed and re-emitted by CO2 and other known greenhouse gases.

Spectrum measurements of the various wavelengths of greenhouse gas radiation at the surface of the Earth, drawn from Evans (2006). The overwhelming majority of radiation falls within the spectrum of CO2, with significant contributions from methane, ozone, and nitrous oxide, all of which are known greenhouse gases (courtesy of Skeptical Science).

Moreover, had Mr. Pinckard bothered to actually read the reports from the Met Office, he might have discovered that a decade or two of relatively flat temperatures has been predicted by climate models.

[T]he results show that a pause of 10 years’ duration is likely to occur due to internal fluctuations about twice every century.

The third Met Office report (PDF) also notes that the recent pause will not continue for long and will have almost no impact on the long-term trends in warming. The authors conclude first that “the physical basis of climate models and the projections they produce have not been invalidated by the recent pause.” Additionally, they argue “the recent pause in global surface temperature rise does not materially alter the risks of dangerous climate change.”

Mr. Pinckard fails to provide this important context to his readers or offer the additional evidence, besides average land surface temperatures, that global warming has continued apace. It is called global warming, not land warming, for a reason.

The next time that The Plain Dealer wants to cover an issue involving our global climate, which is easily one of the most complex and misunderstood topics in the world, I would suggest their reporter(s) do the following:

    • Go to Google and type the following: site:skepticalscience.com climate-related search term.
    • Watch the following video from NASA for further proof that, yes, the planet is warming.

PD editorial on Obama’s climate plan is lazy, wrong & shortsighted

President Obama wipes his brow while delivering his climate speech at Georgetown University on June 25 (courtesy of The Atlantic Wire).

Last Sunday (June 30), the editorial board of The Plain Dealer published an editorial titled “Don’t bypass Congress on climate-change policy,” which criticized President Obama’s climate policy speech at Georgetown on June 25. In the piece, the board argued that the President is acting inappropriately by taking executive action to tackle the US’s greenhouse gas emissions through the Environmental Protection Agency. They note that the proposed regulations on GHG emissions from existing coal-fired power plants would “drive many of them out of business.” They continued:

Such plant closures would disproportionately hurt coal-dependent states such as Ohio. It is unfair to expect one region or small group of states to shoulder the chief economic impacts of a radical policy shift without subsidies or offsets.

An extreme U.S. policy aimed at divesting the nation from coal-fired energy should not be decided by the White House alone.

Unfortunately for the PD editorial board (and the public in Northeast Ohio it’s supposed to inform), this argument is a house of cards that one can easily knock down. So allow me to do so.

First, the board refers to the proposal as one of the “mandates that need no congressional approval” of which Americans must be “wary.” Nowhere in the piece does the board mention the fact that in Massachusetts et al. v. EPA (2007, PDF) the US Supreme Court ordered the EPA to determine if carbon dioxide constitutes a danger to public health in the country, the so-called “endangerment finding”. Justice Stevens, writing for the majority, noted that:

Because greenhouse gases fit well within the [Clean Air] Act’s capacious definition of “air pollutant,” EPA has statutory authority to regulate emission of such gases…

On December 7, 2009, the EPA issued the results of its endangerment finding, noting that

the current and projected concentrations of the six key well-mixed greenhouse gases…in the atmosphere threaten the public health and welfare of current and future generations.

Yet, despite this ruling and the subsequent endangerment finding, both of which obligate the EPA to regulate GHGs, the editorial board makes no reference to the jurisprudence or the finding. It treats the President’s actions as if they were capricious and unexpected, rather than mandated by the highest court in the land.

Bipartisanship & consensus are to the PD editorial board as the ring was to Gollum (courtesy of Wikimedia Commons).

Secondly, the editorial board criticized the President for not working towards the consensus it reveres so highly. “Consensus” and “bipartisanship” are the buzzwords of the day for the Very Serious Persons who sit on editorial boards around the country. Yes, if only President Obama could reach out to Congressional Republicans and bring them to the table on climate action.

Of course, this belief is completely belied by reality. The modern Republican Party stands virtually alone among major political parties worldwide in steadfastly denying climate science. Moreover, the party remains completely beholden to the fossil fuel industry. According to a recent study from the Investigative Reporting Workshop at American University, 411 elected officials around the country have signed a pledge to the Koch brothers-funded Americans for Prosperity promising to avoid taking action on climate change.

Furthermore, while VSPs at the PD and The Washington Post continue to write ballads about their fantasy carbon tax, recent evidence suggests that the EPA route may be the better alternative. A report from Resources for the Future suggests that, depending on the details, EPA regulation would likely be more effective at reducing GHG emissions than a carbon tax. This is particularly true given that the only carbon tax likely to emerge from the current Congress is no carbon tax at all.

Thirdly, the PD editorial board asserts, without providing any evidence, that the President’s climate plan will necessitate “sweeping economic sacrifice” and will change the “lifestyles and energy sources” of Ohioans. Once again, the board refuses to let facts get in the way of a [not so] good argument.

For decades, industry shills and their supporters have cried out against EPA regulations, claiming they would destroy the American economy. Yet, in case after case, the benefits of these regulations have far exceeded estimates, while the costs have been vastly lower than projected. The Edison Electric Institute claimed (PDF) that the 1990 Clean Air Act (CAA) amendments would carry $4-5 billion in annual compliance costs. The actual annual cost? $836 million. They were only off by 81.4%. According to a 2010 study, the benefits of the CAA and the 1990 amendments outweighed the costs by a ratio of 32.1 to 1 ($23.42 trillion in benefits to $730 billion in costs).
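The cost and benefit figures cited above are easy to verify. A quick back-of-the-envelope check (assuming the midpoint of EEI’s $4-5 billion estimate) confirms both numbers:

```python
# Verify the Clean Air Act cost figures cited above (all values from the text).

# EEI's projected annual compliance cost for the 1990 CAA amendments,
# taking the midpoint of the $4-5 billion range, vs. the actual cost.
projected = 4.5e9   # $4.5 billion (midpoint of EEI's estimate)
actual = 836e6      # $836 million actual annual cost

overestimate = (projected - actual) / projected
print(f"EEI overestimated compliance costs by {overestimate:.1%}")  # 81.4%

# Benefit-cost ratio from the 2010 study:
# $23.42 trillion in benefits vs. $730 billion in costs.
benefits = 23.42e12
costs = 730e9
print(f"Benefit-cost ratio: {benefits / costs:.1f} to 1")  # 32.1 to 1
```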

The monetized costs and benefits of the Clean Air Act and its 1990 amendments. As the table shows, the benefits of the CAA have vastly outweighed its costs (courtesy of Small Business Majority).


A recent study from the Natural Resources Defense Council suggests that EPA regulations on GHG emissions will once again provide a significant net benefit. In December, NRDC put together a proposed set of regulations for EPA to implement. This plan would set state-by-state emissions reduction standards, allowing coal-dependent states like Ohio to make a more gradual shift to renewable energy sources. According to their assessment, the plan would reduce GHG emissions by 26% by 2020; its benefits would be roughly 6 to 15 times greater (PDF) than its associated costs.

NRDC recently had a respected firm run an economic assessment of this plan (PDF). The firm, Synapse Energy Economics, found that, contrary to the warnings of the naysayers at the PD, this plan would create 210,000 jobs and reduce electric bills by $0.90 per month through 2020.

Graph from Synapse Energy Economics’ report on the NRDC policy proposal. As the graph shows, Ohio is projected to gain the second-most jobs from EPA action (courtesy of Synapse Energy Economics).

Ohio, one of the 14 states included in the analysis, would particularly benefit. The state would gain an additional 12,000 jobs – second only to Florida – and households would pay $1.03 less per month for electricity. Moreover, these regulations would simply speed up the transition away from coal that the state is already making. Under SB 221, Ohio is already obligated (PDF) to improve its energy efficiency by 22.2% and get 12.5% of its energy from renewable sources. Rather than increasing prices or killing jobs, a study from Ohio State has concluded that the policy saved ratepayers $170 million on their electric bills from 2008 to 2012 and created 3,200 jobs in the state.

Lastly – and unsurprisingly, given Ohio’s fealty to the coal industry – the editorial fails to mention any of the serious consequences of the state’s dependence on coal. Myriad studies show that coal carries significant costs for public health and well-being. According to a 2011 research article (PDF),

the life cycle effects of coal and the waste stream generated are costing the U.S. public a third to over one-half of a trillion dollars annually.

If we were to internalize these externalities, the authors estimate that the price of coal-fired electricity would double or triple, making it noncompetitive with renewables. The Clean Air Task Force has concluded (PDF) that coal plants are responsible for 13,200 premature deaths, 20,400 heart attacks, and 217,600 asthma attacks annually in the US. Given Ohio’s dependence on this filthy fuel, the state ranked 2nd in 2010 for coal-related mortality risk, hospital admissions, and heart attacks. The Cleveland metro area ranked 8th for mortality. All in all, evidence suggests that, for every $1 in economic benefits coal provides, it imposes $2 in costs on the public.

Mortality per 100,000 people from coal-fired power plants. As the map illustrates, coal-dependent states and their neighbors, including Ohio, suffer substantially from its effects (courtesy of the Clean Air Task Force).


The Plain Dealer’s editorial is just the latest in a long line of inaccurate claims that EPA regulations will doom the American economy. Such claims have proven wrong, time and again, and the PD will almost certainly be wrong here. The editorial is inaccurate, shortsighted, and – to be frank – extremely lazy. As President Obama said in his climate speech,

[T]he problem with all these tired excuses for inaction is that it suggests a fundamental lack of faith in American business and American ingenuity. These critics seem to think that when we ask our businesses to innovate and reduce pollution and lead, they can’t or they won’t do it. They’ll just kind of give up and quit. But in America, we know that’s not true.

The next time the PD wants to write about climate policy, I suggest the editorial board actually do its homework, rather than relying on a tired set of easily disproved talking points.