Watch this year’s crazy winter unfold in 64 seconds

polar vortex image

Surface air temperatures over the contiguous United States on January 6, the day the polar vortex slammed into the eastern half of the country.

The vernal equinox, which marks the official start of spring in the Northern Hemisphere, may not come until next Thursday (March 20), but according to meteorologists, this year’s winter ended on February 28. You may not agree if you live in the Midwest or Northeast and look outside today, but meteorological winter runs from December through February, the three coldest and snowiest months of the year for the contiguous United States.

Thanks to this handy new video from the National Center for Atmospheric Research (NCAR), you can now watch the crazy winter weather from the last two months unfold in just 64 seconds:

As you can see, there was a stark difference between surface air temperatures in the eastern and western US. While persistent influxes of frigid Arctic air have made it bitterly cold in the eastern US, the western US has experienced a remarkably mild winter, reflected in the frequent appearance of orange and red hues. While Detroit is experiencing its most miserable winter on record, both Las Vegas and Tucson have seen their warmest winters ever, contributing to the persistent megadrought in the Southwest.

All told, the December-January period was just the 33rd coldest on record for the lower 48 states, and it’s unlikely that the entire December-February period will even break into the top 10 coldest winters on record. And, as NCAR notes, the Arctic spells have been interspersed with periods of unseasonably mild weather. Baltimore saw its mildest winter night on record, while Cleveland reached a balmy 63°F on December 21.

So while it’s been cold as hell for those of us east of the Rockies, things could have been a whole lot worse.

Maintaining road quality is good for the climate & the budget

cleveland potholes

Driving down West 117th has been a real adventure this winter (courtesy of Cleveland.com).

It’s not exactly a secret that Cleveland’s roads are in rough shape right now. The city’s streets are pockmarked with potholes of all shapes and sizes, most of them enormous. The Scene recently parodied the issue, writing

After driving into a massive pothole at Clifton Boulevard and West 117th Street last week, Lakewood resident Harold Dreifer has now begun to live there. He tells Scene, “There was just nowhere else to go. It was a long fall down here; I decided that I may as well set up shop.”

While the City claims that this year has been relatively average, it does seem to be admitting it has been overwhelmed by the problem, as evidenced by the fact that it is paying a private “pothole killer” $225 per hour to patch city streets. I have no doubt that I had to repair two flat tires last week in large part because driving down my street is like driving down a Connect Four board laid on its side.

connect four

Connect Four: great for leisure, not so much for road surfaces (courtesy of Wikimedia Commons).

Obviously, much of this road quality issue stems from this year’s relatively brutal winter. January’s polar vortexes gave way to Cleveland’s 5th coldest February since 1964. This winter has produced 10 days with temperatures below 0°F, the most in three decades, and 66 days with at least a trace of snow (through February 28). The persistent cold and snow, followed by continual freeze-thaw cycles, provide prime conditions for potholes: they weaken the pavement, necessitate continued plowing, and prevent road crews from repairing potholes in a timely manner.

But other factors beyond the weather have conspired to create this problem. Since taking office in 2011, Governor Kasich has balanced the state budget on the backs of local governments. Over the past four years (two biennial budget cycles), the State of Ohio has pilfered roughly $1 billion from the Local Government Fund, the primary pool of state tax revenues available to municipalities. The 2012-2013 state budget alone cost the city of Cleveland $45 million in foregone tax revenues that could fund vital city services.

In this era of slashing government revenue to provide tax breaks for the wealthiest Ohioans, it’s not surprising that road maintenance has gotten short shrift. While investing in public infrastructure construction and maintenance can create jobs and generate a wide array of other benefits, it’s also extremely expensive. According to data from a 2008 report by Nicholas Lutsey, maintaining road surface quality, or “strategic management of pavement roughness” in academ-ese, is much less cost-effective than other transportation sector options, as shown in the table below.

transportation sector cost effectiveness

Cost-effectiveness of various transportation sector policies. Note that lower numbers – particularly negative values – indicate more cost-effective options (courtesy of Wang, Harvey & Kendall).

But incorporating the value from reducing greenhouse gas emissions can change these ratios. According to Ting Wang, John Harvey, and Alissa Kendall, authors of a recent article with the incredibly captivating title “Reducing greenhouse gas emissions through strategic management of highway pavement roughness,” maintaining road quality is an excellent strategy for tackling climate change.

Road maintenance can reduce greenhouse gas (GHG) emissions from the transportation sector because pavement roughness causes vehicles to lose energy as waste heat. As the authors note,

Because an improvement in smoothness immediately affects every vehicle traveling over the pavement, the cumulative effects on GHGs can be substantial in the near term.

According to their research, implementing optimal road maintenance strategies in California could reduce GHG emissions by 0.57-1.38 million metric tons of CO2 equivalent (MMTCO2e) per year in the state. The maximum value is similar to the annual GHG emissions of 300,000 passenger vehicles or, using the State Department’s extremely flawed analysis, the Keystone XL pipeline. Accounting for GHG reductions and total user costs, the cost effectiveness of maintaining pavement quality goes from $416 per ton of CO2 equivalent (tCO2e), a net loss, to -$710 to -$1,610/tCO2e, a major net benefit. Accordingly, incorporating the climate benefits of proper road maintenance can make the practice 2.75-4.9 times more cost effective for governments.
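The arithmetic behind that last figure is easy to check. The short Python sketch below is my own back-of-the-envelope calculation from the dollar figures quoted above, not the authors’ method; it roughly recovers the 2.75-4.9x range:

```python
# Back-of-the-envelope check of the cost-effectiveness figures quoted above.
# baseline: $/tCO2e for pavement maintenance before counting GHG and user-cost
# benefits; adjusted: the range once those benefits are included.

baseline = 416                 # $/tCO2e (a net loss)
adjusted = [-710, -1610]       # $/tCO2e (a net benefit)

for value in adjusted:
    swing = baseline - value   # total improvement in $/tCO2e
    factor = swing / baseline  # how many times more cost-effective
    print(f"{value:>6} $/tCO2e -> {factor:.1f}x more cost-effective")
```

Running this prints roughly 2.7x and 4.9x, in line with the range the authors report.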

Last fall, President Obama ordered federal agencies to incorporate climate change into their planning and policymaking activities. As this research demonstrates, that approach is a sound one, and municipal governments should follow suit.

Could climate change actually increase winter mortality?

great lakes ice cover

Ice engulfs much of the Great Lakes in this image from February 19 (courtesy of NASA Earth Observatory).

If you already thought that the impacts of climate change were incredibly complicated and, often, downright confusing, I’ve got bad news for you – things just got even more complex.

For years, researchers focusing on climate change concluded that increases in heat-related mortality would, by and large, be accompanied by decreasing cold-related mortality. As winter temperatures warm – which they have at an extremely fast rate – the health risk posed by extreme cold is assumed to decrease in a nearly inverse proportion. In its Fourth Assessment Report (AR4), for instance, the IPCC highlighted research that projected cold weather deaths would decrease by 25.3% in the United Kingdom from the 1990s to the 2050s.

But a new study in Nature Climate Change calls this assumption into question (paywall). As the study’s authors note:

An extensive literature attests to the fact that changes in daily temperature influence health outcomes at the local levels and that [excess winter deaths] are influenced by temperature. However, our data suggest that year-to-year variation in EWDs is no longer explained by the year-to-year variation in winter temperature: winter temperatures now contribute little to the yearly variation in excess winter mortality so that milder winters resulting from climate change are unlikely to offer a winter health dividend.

In order to explore the potential effects of climate change on winter mortality rates, the authors analyzed the factors that contributed to the number of excess winter deaths (EWDs) in the UK from 1951-2011. They found that, across this entire span, housing quality, heating costs, the number of cold winter days, and influenza accounted for 77% of the variation in annual EWDs.
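For readers curious about the mechanics, this kind of analysis is an ordinary least-squares regression, with the share of explained variance reported as R². The sketch below is a minimal illustration using entirely synthetic data; the variable names and coefficients are invented for the example, and the real study used UK records:

```python
import numpy as np

# Minimal illustration of a variance-explained analysis with synthetic data.
# One row per winter; predictors loosely mirror the drivers named in the
# study (cold days, flu activity, heating costs). All numbers are invented.
rng = np.random.default_rng(0)
n = 60
cold_days = rng.normal(40, 10, n)
flu_activity = rng.normal(100, 25, n)
heating_cost = rng.normal(50, 8, n)

# Synthetic excess winter deaths (EWDs) driven by the predictors plus noise.
ewd = 200 + 3.0 * cold_days + 1.5 * flu_activity \
      + 2.0 * heating_cost + rng.normal(0, 30, n)

# Ordinary least squares, then R^2 (share of variance explained).
X = np.column_stack([np.ones(n), cold_days, flu_activity, heating_cost])
beta, *_ = np.linalg.lstsq(X, ewd, rcond=None)
residuals = ewd - X @ beta
r_squared = 1 - residuals.var() / ewd.var()
print(f"share of variation explained: R^2 = {r_squared:.2f}")
```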

cold weather death correlations

These charts depict the correlation between excess winter deaths and either the number of cold days (left) or influenza activity (right) in the UK. As the charts suggest, the number of cold days drove excess mortality until around 1976, when the flu became the dominant factor.

But, when they further broke the data down into three 20-year timeframes (1951-1971, 1971-1991, and 1991-2011), they concluded that, while housing quality and the number of cold days were the primary drivers of winter mortality from 1951-1971, this effect disappeared after that point. Instead, flu activity became the only significant driver from 1976-2011. Accordingly, as they argue,

[W]e show unequivocally that the correlation between the number of cold winter days per year and EWDs, which was strong until the mid 1970s, no longer exists.

But, the authors don’t stop here. They continue by explaining that climate records actually suggest that “winter temperature volatility has increased in the UK over the past 20 years,” despite global warming. As I discussed in a previous post on heat-related mortality, the ability of people to acclimate to local weather patterns is a key determinant in temperature-related mortality rates.

As winters continue to warm, people will slowly see their comfort baseline shift; accordingly, when extreme cold snaps, like the Polar Vortex that hit the Eastern US in January, occur,

The nefarious effects on EWDs could be substantial, with especially the vulnerable being caught off-guard by abrupt changes in temperature.

Due to this increasing volatility in winter temperatures, population growth, and the continued graying of populations (people aged 65 and over are far more susceptible to influenza), it’s entirely possible that global warming could actually increase cold weather mortality rates.

A similar study from fall 2012 (paywall), also published in Nature Climate Change, lends further credence to this research. The article examined the influence of climate change on mortality rates from extreme temperatures in Stockholm; the authors compared mortality rates from 1900-1929 to those from 1980-2009.

mean winter temperatures stockholm

This chart depicts the distribution of the 26-day moving average for mean winter temperatures in Stockholm. The black bars, which show data from 1980-2009, suggest that baseline winter temperatures have increased over the last century.

The study, which examined changes in mortality rates from both extreme cold and extreme heat, found increases in both. The number of extreme cold events increased from 220 during 1900-1929 to 251 in 1980-2009, a change that led to an additional 75 deaths.

Significantly, this study echoed two key findings from the UK article. First, cold weather extremes appear to have increased in frequency over the last century, likely as a result of global warming. Secondly, little evidence exists to suggest that people have adapted to the changing climate. According to the authors of the Stockholm article,

The stable and constant mortality impact of cold and heat over the past three decades, independent of the number of extreme events, shows the difficulties in adapting to changing temperatures…Future changes in the frequency and intensity of heat waves might be of a magnitude large enough to overwhelm the ability of individuals and communities to adapt. The expected increase in the number of elderly and other potentially vulnerable groups, in absolute numbers and as a proportion of the population, could make the impact of temperature extremes on human health even more severe.

And just when you thought it couldn’t get any more complicated, it can. Two studies published in 2009, one focusing on Sweden and one focusing on Italy (paywall), established an inverse relationship between weather-related mortality rates in the winter months and mortality rates during the following summer.

In other words, because the vulnerability factors for both cold- and heat-related mortality overlap to such a degree, any decrease in winter mortality due to global warming will likely be offset by a corresponding increase in excess mortality during the summer months. As the authors of the Italian study wrote,

Low-mortality winters may inflate the pool of the elderly susceptible population at risk for dying from high temperature the following summer.

So, to all of the climate deniers or “skeptics” who claim that global warming will somehow be beneficial – I’m looking at you, Congresswoman Blackburn – please take note: climate scientists keep discovering new ways that life is going to get drastically worse, unless we act now to slash carbon emissions and prepare for the warming that’s already locked in.

Extreme heat increases migration from rural areas

hanna lake dried up

A man walks through the desiccated remains of Hanna Lake in Balochistan, which dried up during a decade-long drought in the region (courtesy of Al Jazeera).

The link between extreme weather and migration remains ambiguous, despite the hype surrounding so-called climate refugees, but new research appears to bolster the connection.

A new study published this week in Nature Climate Change (paywall) explores the effects of different disasters on human migration patterns in rural Pakistan. In light of the severe floods that have affected Pakistan in recent years, particularly the historic 2010 floods that affected 20 million people, the authors focused on the impact that extreme rainfall and temperatures have on patterns of migration in the country. The study examines the relationship over a 21-year period (1991-2012), relying on data from three longitudinal surveys.

The authors analyze several key weather variables, including rainfall during the monsoon season, average temperatures during the Rabi (winter wheat) season, flood intensity, and a 12-month moisture index measurement.

The various measures of rainfall have no significant effect on the mobility of men or women, either within or outside of the villages surveyed. In fact, the results suggest that periods of high rainfall actually decrease out-migration within the villages, perhaps due to the fact that farm and non-farm incomes increase significantly during these periods.

These results correspond with previous studies examining the relationship between rainfall and migration. Afifi and Warner examined the influence of 13 different forms of environmental degradation on patterns of international migration. They found that only one of the 13 – flooding – failed to increase international migration flows. In addition, Raleigh, Jordan, and Salehyan (PDF) concluded in 2008 that Bangladeshis affected by flooding migrated just two miles from their homes, on average, and that the vast majority of those displaced returned home shortly after the flood waters receded.

In contrast to flooding, this study did find a robust relationship between extreme heat and out-migration flows. The authors note that males in rural Pakistan are 11 times more likely to leave their villages when exposed to extremely high temperatures. These results hold for both land-owners and non-land owners, as well as asset-rich and asset-poor Pakistanis. This outcome likely stems from the fact that extreme heat decreases farm and non-farm incomes by 36% and 16%, respectively.

The authors also find that both men and women appear far more likely to migrate during periods of extremely high temperatures combined with low rainfall. This result indicates that out-migration flows are likely to spike during extreme droughts.

While droughts often appear to develop due largely to below-average rainfall, they actually originate through a much more subtle interaction of precipitation and temperature. Less rainfall tends to lower soil moisture levels, which, in turn, increases heat transfer from the soil to the air and elevates surface albedo. These effects drive up temperature further, often creating a positive feedback cycle by which lower rainfall and higher temperatures work together to drive prolonged droughts.
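To make that feedback concrete, here is a deliberately toy numerical sketch of the loop described above: drier soil warms the air, and warmer air dries the soil further. This is not a physical model; every coefficient is an arbitrary illustration value.

```python
# Toy feedback loop: lower soil moisture -> more heating -> lower moisture.
# All coefficients are arbitrary illustration values, not physical constants.
moisture = 1.0     # relative soil moisture (1.0 = normal)
temp_anom = 0.0    # temperature anomaly, arbitrary units

for week in range(6):
    temp_anom += 0.5 * (1.0 - moisture)   # drier soil warms the air
    moisture -= 0.05 + 0.02 * temp_anom   # heat dries the soil further
    moisture = max(moisture, 0.0)
    print(f"week {week}: moisture={moisture:.2f}, temp anomaly={temp_anom:.2f}")
```

Each pass through the loop amplifies the previous one, which is the essence of a positive feedback.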

The results of the study have important implications for governments, donor organizations, and NGOs operating in a greenhouse world. As global temperatures continue to rise, we already know that the likelihood of extreme heatwaves will spike dramatically. This outcome will likely increase rural out-migration in the developing world. Moreover, the authors suggest that their work will require donors and aid agencies to reconsider how they respond to and plan for disasters in the future.

Existing flood relief programs may potentially crowd out private coping mechanisms such as migration, particularly for the poor and risk-averse living in flood-prone areas. Our results also show the important role of heat stress — a climate shock which has attracted relatively less relief — in lowering farm and non-farm income and spurring migration. Sustainable development will require policies that enhance adaptation to weather-related risks for farmers and for enterprises tied to the rural economy. Shifting relief towards investments in heat-resistant varieties, producing and disseminating better weather forecasting data and weather insurance, and policies that encourage welfare-enhancing migratory responses might improve individual abilities to adapt to an array of weather-related risks.

January is the vanguard of climate change in the US

map mean temperature anomalies january 2014

Mean temperature anomalies for the continental US from January 1-26 (courtesy of the National Weather Service).

It’s been freaking cold in the Eastern half of the US, and it’s only gotten colder in the past 12 hours or so.

Another section of the dreaded Polar Vortex has broken off and is hovering over the Midwest. This morning, temperatures hovered around -9°F in Cleveland, just shy of the record low for the date. Further inland, however, temperatures plummeted to -14°F or lower.

There’s no question that this January has been abnormally cold and snowy for the region. Through yesterday, the average temperature this month was 22.6°F, which is 5.5°F below the long-term average of 28.1°F. The only way for the monthly temperature to reach that mark would be if the next 5 days were, on average, 63°F. Given that it’s currently 5°F and tomorrow’s high will be 12°F, that isn’t going to happen.

Yet, by most measures, this January has been far from record-breaking in Northeast Ohio. To date, it is only the 16th coldest January since 1964, and the temperature anomaly is not statistically significant (for fellow nerds, the z score is -0.723). Furthermore, just 5 years ago in 2009, the average monthly temperature for January was 19.4°F, the third coldest on record.
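For anyone unfamiliar with z scores: the value quoted above is just the anomaly divided by the standard deviation of January mean temperatures. A quick sketch, using the monthly figures from above and a hypothetical 7.6°F standard deviation rather than the actual Cleveland record:

```python
# z score = (observed - long-term mean) / standard deviation.
# The 7.6F standard deviation below is a hypothetical stand-in.
def z_score(value, mean, std):
    return (value - mean) / std

print(round(z_score(22.6, 28.1, 7.6), 3))  # -> -0.724
```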

But, as we know, one cold month or even winter does not a trend make; the world is warming steadily. And, in the US, January has warmed at a faster rate than any other month. It has been the vanguard of warming.

From 1970-2013, January warmed at a rate of 1.14°C per decade, nearly twice as fast as any other month. [Interestingly, this trend does not hold worldwide. October, which has the lowest rate of warming in the continental US, has warmed at the greatest rate (0.33°C per decade) globally. This result is likely due to the fact that global temperatures include data from both sides of the equator.]
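Rates like these come from fitting a straight line to the monthly temperature series and scaling the yearly slope to a decade. A minimal sketch with synthetic data; the series below is generated, not NOAA’s actual record:

```python
import numpy as np

# Fit a linear trend to a synthetic January temperature series and report
# the slope per decade. The 0.114 C/yr used to generate the data mirrors
# the 1.14 C/decade figure quoted above; the noise is invented.
rng = np.random.default_rng(1)
years = np.arange(1970, 2014)
temps = 0.114 * (years - 1970) + rng.normal(0, 1.0, years.size)

slope_per_year = np.polyfit(years, temps, 1)[0]
print(f"trend: {slope_per_year * 10:.2f} C per decade")
```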

monthly average temperature anomalies

Monthly average temperature anomalies for the United States for 1970-2013 (Data from NOAA National Climatic Data Center).

Cleveland has followed a similar trend. Over the last 50 years, January has demonstrated the greatest rate of warming, with average monthly temperatures increasing by 1.371°F per decade. This number is nearly two-thirds larger than second-place February, which has warmed at a rate of 0.826°F per decade.

cleveland monthly temperature anomalies

Monthly average temperature anomalies for Cleveland from 1970-2013 (data from Northeast Ohio Media Group).

Moreover, the number of days on which temperatures dip below 10°F has fallen steadily during this period, decreasing by 2.31 days per decade. This winter has clearly bucked that trend, as there have already been 12 days below 10°F since the beginning of December. That’s the most since we had 15 such days in 2009, and we aren’t even into February yet.

number of frigid nights in cleveland

The number of frigid nights in Cleveland per year, 1970-2013 (courtesy of Climate Central).

Interestingly, as far below average as temperatures have been in the Midwest, they’ve been even further above average throughout the West. While mean temperatures have been 4-5°F below average throughout much of the country, nearly all of California, Montana, and Nevada have seen temperatures upwards of 8-9°F higher than normal. This disparity doesn’t even account for Alaska’s abnormally warm winter weather. Fairbanks, for instance, has been three times warmer than normal this January.

Climate change deniers have consistently tried to use the cold snap blanketing most of the eastern US as evidence that, as noted climatologist Donald Trump put it, global warming is “bulls#*t.” Cold weather in January doesn’t disprove climate change. In fact, January has been the proverbial canary in the coal mine for global warming, and that trend hasn’t changed in the last 28 days.

2013 made 1988 look downright frigid by comparison

James Hansen 1988 testimony

Dr. James Hansen testifying before the Senate Committee on Energy and Natural Resources in 1988 (courtesy of The Washington Post).

In his landmark testimony (PDF) before the US Senate Committee on Energy and Natural Resources on June 28, 1988, Dr. James Hansen, then director of the NASA Goddard Institute for Space Studies, said,

The present temperature is the highest in the period of record…The four warmest years, as the Senator mentioned, have all been in the 1980s. And 1988 so far is so much warmer than 1987, that barring a remarkable and improbable cooling, 1988 will be the warmest year on record.

Hansen’s testimony proved to be accurate. 1988 ended up as the warmest year on record at that time, dating back to 1880, according to data from NOAA. The average global temperature in 1988 was 0.34ºC above the 20th century average, just edging out the 0.33ºC temperature anomaly from 1987.

global annual temperatures 1880-1988

Annual temperature anomaly records, worldwide, from 1880-1988 (courtesy of NOAA).

Flash forward to today. Yesterday, NOAA reported that, globally, 2013 tied 2003 as the fourth warmest year on record. Despite abnormally cool temperatures in the continental US during November and December, those months proved to be the warmest and third warmest on record worldwide, respectively. Overall, 2013 was 0.62ºC above the 20th century average. Accordingly, the temperature anomaly for 2013 was 82% larger than that for 1988.
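That last comparison is simple percent-change arithmetic on the two anomalies:

```python
# Percent difference between the 2013 and 1988 global temperature anomalies.
anomaly_1988 = 0.34   # deg C above the 20th century average
anomaly_2013 = 0.62

pct_larger = (anomaly_2013 - anomaly_1988) / anomaly_1988 * 100
print(f"2013 anomaly is {pct_larger:.0f}% larger than 1988")  # -> 82%
```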

Seth Borenstein, the great AP reporter on weather/climate issues, made a remarkable observation on Twitter yesterday after NOAA released its data.

The great warming of 1988, which sparked Hansen’s testimony and put climate change on the map as a political issue, is now so ordinary that it no longer ranks among the 20 warmest years on record. The rate of warming may have slowed slightly since 1988, but the total warming trend continues to plow ahead. It took all of 25 years to push 1988 out of the top 20.

annual global temperatures 1988-2013

Annual global temperature anomalies from 1988-2013 (courtesy of NOAA).

Looking at the overall trend, you can really get a sense of how quickly warming has accelerated since the 1940s. Due to a string of cooler than normal years from the 1880s-1930s, the average temperature anomaly between 1880 (when record keeping began) and 1988 was actually -0.08ºC. Since 1988, the average anomaly has skyrocketed to 0.49ºC.

Twenty-five years ago, 1988 stood out for its searing, abnormal heat. Unless we take action, 25 years from now we may end up looking back nostalgically at that year and praying for temperatures that cool.

What separates a storm from a disaster?

two girls tornado destruction

Two girls look over the devastation left by the tornado that struck Moore, Oklahoma in May (courtesy of MSNBC).

My last post drew far more attention than I could have ever imagined. Unsurprisingly, it also garnered criticism, some of which was warranted. First, Typhoon Haiyan’s initial reported death toll of 10,000 appears – thankfully – to have been inflated. As of Monday morning, the Philippines national disaster agency had confirmed that the storm killed 3,976 people, with an additional 1,598 still missing.

Haiyan provides yet another reminder that, in the immediate aftermath of disasters, reports of the number of people killed are almost always wrong. A week after Hurricane Katrina, then-New Orleans Mayor Ray Nagin warned that perhaps 10,000 people had died; the final death toll stood at 1,833. Days after Cyclone Nargis crashed into Burma’s Irrawaddy Delta on May 2, 2008, the AP reported that the storm had killed roughly 350 people. After the floodwaters had receded, Cyclone Nargis emerged as the third most destructive storm in modern history, killing 138,373 Burmese.

Another individual noted that a storm as powerful as Typhoon Haiyan would have caused significant damage anywhere it hit, regardless of the level of development or political situation in the affected areas. This is probably true. Haiyan may have been a once in a lifetime storm. As I noted, some forecasters believe it was the most powerful storm at landfall in recorded history.

Given the fact that less powerful storms have wreaked havoc in significantly more developed parts of the world, it’s hard to imagine that Haiyan would not have become a severe disaster had it hit New York or London or Tokyo. Accordingly, and given the fact that there is no such thing as a “natural” disaster, this may lead you to ask what separates a storm from a disaster.

In a word – capacity.

I described the three variables that form disaster risk – a natural hazard, physical and economic exposure, and socioeconomic vulnerability. While these three define the risk that a disaster will occur, there is a fourth variable missing.

The ability of an individual or a community to sustain and overcome the potentially destructive effects of an extreme event ultimately determines whether or not a natural hazard will become a disaster. Individuals with low levels of capacity and high levels of vulnerability often sit on the precipice of disaster on a daily basis. As Wisner et al noted, for marginalized individuals with low levels of social, political, and financial capital, “the boundary between disaster and everyday life can be very thin.” Those of us who can afford health care and homeowner’s insurance may be able to overcome a minor car accident or house fire. For those who lack these assets, such events may constitute life-altering disasters that trap them in a permanent state of emergency.

Vulnerability and capacity are determined by a cumulative set of decisions that can take place over a period of years, if not decades. These decisions are rooted in dominant social structures and ideologies, which unevenly distribute disaster risk among citizens. Anthony Oliver-Smith has described the May 1970 Ancash earthquake that struck Yungay, Peru as a “500 year earthquake (PDF).” By this, he meant that the vulnerability to this seismic hazard was born of the destruction of Incan infrastructure and land use policies that began with the Spanish conquistadors. While the proximate hazards that contributed to the disaster were local, the broader systems in which it occurred grew from a set of structures remote in both space and time, which overwhelmed the limited capacity of the people of Yungay.

red cross hazard mapping india

An IFRC staff member conducts a participatory hazard mapping exercise with women in Varap, a village in India’s Maharashtra state (courtesy of the IFRC).

But focusing solely on vulnerability is not an effective strategy for reducing disaster risk. People facing disasters have developed a variety of coping mechanisms and strategies to help them survive. These form the heart of their adaptive capacities, and it is important for development and humanitarian actors to pay attention to these as well. The International Federation of Red Cross & Red Crescent Societies (IFRC) has been central in developing this concept. Its Vulnerability & Capacity Analysis (VCA) tool allows actors to use participatory assessments to identify both the vulnerabilities and capacities of people living in harm’s way. Only through this process can we determine both the risks that must be mitigated and the existing assets that we can build upon.

In the wake of disasters like Haiyan, there exists a window of opportunity during which change can take place. Deluding ourselves by claiming that disasters are natural events only serves to ensure we maintain the status quo. But treating survivors as nothing more than victims who need our help and our solutions is just as dangerous.

People living on the island of Leyte understand the threat of typhoons far better than I ever will. Accordingly, they already had ways to endure them long before Haiyan made landfall. What they need is not for us to bring solutions to them. They need support in identifying what assets they already possess and the resources necessary to build upon and enhance them.

We’ll never be able to create a world in which extreme weather events no longer occur. And the science suggests that every ton of CO2 we pump into the atmosphere will only increase their frequency. But we already know that investing in disaster risk reduction pays dividends. If we make it a priority to invest in building upon existing capacities and minimizing vulnerabilities, we may be able to create a world in which disasters are far less common and less destructive.

There’s no such thing as a natural disaster

typhoon haiyan image

An image of Super Typhoon Haiyan as it appeared the morning of Friday, November 8, just before making landfall in the Philippines (courtesy of the Capital Weather Gang).

As we all know, Super Typhoon Haiyan devastated the Philippines over the weekend. At its peak, Haiyan was perhaps the strongest tropical storm ever recorded at landfall, packing sustained winds of at least 195 mph with gusts of 235 mph. The United Nations and the Philippine Red Cross are both warning that 10,000 people may have been killed in Tacloban alone; this would make Haiyan the deadliest disaster in the history of the Philippines (though President Aquino is revising those numbers down).

I should note, however, that the true scale of a disaster is measured not in the number of dead, but in the number affected and displaced. According to the UN Office for the Coordination of Humanitarian Affairs, Haiyan affected 11.3 million people and displaced at least 673,000 Filipinos. In contrast, the 2004 Indian Ocean tsunami killed perhaps 250,000 people throughout Southern Asia but affected roughly 5 million.

The scope and scale of the devastation in the Philippines is, for lack of a better term, biblical. If you are in a position to provide support, I encourage you to make a monetary donation to the Philippine Red Cross or one of InterAction’s partner organizations working on the ground. Please donate money only. Survivors and relief organizations know what is needed, and they can source materials much more quickly and cheaply from regional sources.

As individuals and media outlets have tried to grasp the sheer scale of the devastation, they have almost unanimously referred to Haiyan as the worst natural disaster in Philippine history. The search term “Haiyan natural disaster” returns at least 49,300,000 hits on Google, and headline after headline frames the storm that way.

Let me be blunt: there is no such thing as a “natural” disaster. Disasters are complex, multifaceted, frequent, and overwhelming. We have a hard time fully grasping the nuance and complexity of each disaster – particularly one that strikes halfway across the world – so we resort to calling it a “natural” event. The term natural disaster is, in essence, a heuristic that we fall back upon in order to interpret the event.

In their landmark work, At Risk: Natural Hazards, People’s Vulnerability, and Disasters, Wisner, Blaikie, Davis, and Cannon call the tendency to view disasters in this light the “myth of naturalness.” As Comfort et al put it (PDF):

A disaster is widely perceived as an event that is beyond human control; the capricious hand of fate moves against unsuspecting communities creating massive destruction and prompting victims to call for divine support as well as earthly assistance.

But a tropical storm or a tornado does not a disaster make. Rather, the risk of a disaster is a product of three variables: a natural hazard (e.g. a fault line or damaging winds), physical and economic exposure to the hazard, and socioeconomic vulnerability. To borrow liberally once more from Wisner et al:

Disasters happen when hazards hit vulnerable communities whose inherent capacity is not enough to protect itself and easily recover from its damaging effects. Disasters are the product of the social, economic and political environment.

As I tried explaining to a colleague of mine last spring, Superstorm Sandy in DC was a hazard or an extreme weather event; Superstorm Sandy on the Jersey Shore or in Lower Manhattan was a disaster. Granted, most of that difference was due to the severity of the natural hazard itself, a product of weather dynamics. But Sandy may very well have been a disaster for someone living in a flood zone in Southwest DC (high exposure) or for a homeless person without access to safe shelter from the storm (high exposure and vulnerability).
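The three-variable framing above is often summarized as risk ≈ hazard × exposure × vulnerability. A minimal sketch of that idea (the function, place names, and scores below are hypothetical, purely to illustrate why the same storm can be a non-event in one community and a catastrophe in another):

```python
# Sketch of the hazard x exposure x vulnerability framing described above.
# All names and scores are hypothetical illustrations, not a real index.
def disaster_risk(hazard, exposure, vulnerability):
    """Risk as the product of the three variables (each scored 0-1)."""
    return hazard * exposure * vulnerability

# Identical hazard severity, very different outcomes:
inland_suburb = disaster_risk(hazard=0.6, exposure=0.2, vulnerability=0.2)
coastal_settlement = disaster_risk(hazard=0.6, exposure=0.9, vulnerability=0.9)
print(inland_suburb, coastal_settlement)  # 0.024 vs 0.486: roughly 20x higher risk
```

Because the terms multiply, driving any one of them toward zero – say, by hardening construction (exposure) or reducing poverty (vulnerability) – collapses the overall risk even when the hazard itself cannot be controlled.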

Pressure and Release Model chart

The Pressure and Release Model, one way to depict the construction of disaster risk (courtesy of Wikipedia).

To their credit, a lot of journalists are starting to get it. Seth Borenstein has an excellent overview today of the social, economic, and political drivers of Haiyan.

Meteorologists point to extreme poverty and huge growth in population — much of it in vulnerable coastal areas with poor construction, including storm shelters that didn’t hold up against Haiyan.

More than 4 out of 10 Filipinos live in a storm-prone vulnerable city of more than 100,000, according to a 2012 World Bank study. The Haiyan-devastated provincial capital of Tacloban nearly tripled from about 76,000 to 221,000 in just 40 years.

About one-third of Tacloban’s homes have wooden exterior walls. And 1 in 7 homes have grass roofs, according to the census office.

Those factors — especially flimsy construction — were so important that a weaker storm would have still caused almost as much devastation, McNoldy said.

Andy Revkin had a similar analysis of the massive tornado that ravaged Moore, Oklahoma in May over at Dot Earth. But, unfortunately, these types of reports are the exception that proves the rule. Most media coverage falls back upon the “myth of naturalness.” Others obsess over debating whether or not we can attribute each individual disaster to climate change. The science of attribution is improving by leaps and bounds, and perhaps in a year or so, scientists will be able to tell us whether or not they can identify the specific fingerprints of a changed climate in the DNA of Haiyan.

But taking such an all-or-nothing approach to disasters is irresponsible. Every disaster is different, not all natural hazard events are disasters, and whether or not climate change acts through individual extreme events is not the point – it’s our new baseline. Instead, we need to understand that, contrary to conventional wisdom, humans can and do influence all three of the disaster variables. And, as a result, the number of disasters has spiked over the last century. I’ll briefly explore how we have altered each variable below.

chart of disaster occurrence 1900-2011

The number of reported disasters increased dramatically from 1900-2011, from roughly 100 per decade during the first half of the 20th century to 385 per year from 2007-2011 (courtesy of EM-DAT).

Exposure

Population growth, economic growth, and rapid urbanization have significantly heightened our exposure to disasters in recent years. According to the UN’s International Strategy for Disaster Reduction (ISDR), the number of people and total GDP exposed to flood risks increased by 28% and 98% (PDF), respectively, from 1990-2007. As economies develop and individuals build fixed assets like homes and infrastructure in disaster-prone areas (e.g. floodplains), economic exposure spikes. At least 3.4 billion people are now exposed to one or more hazards, while 25 million square kilometers of land is hazard-prone.

Rapid and unplanned economic development has taken its toll on ecosystems, which provide vital sources of natural protection against disasters. For instance, despite the fact that intact mangrove forests can reduce the flow rate of tsunamis by up to 90%, at least half of all mangrove forests have disappeared globally. In the Philippines, 70% of mangroves were destroyed (PDF) from 1918-1993. This destruction has substantially increased physical exposure to disasters and reduced the natural environment’s ability to mitigate the risk.

Vulnerability

Of the three disaster variables, vulnerability is the most closely linked to the social, economic, and political environment. Some groups are, almost by definition, more vulnerable to disaster risks than others; key intervening variables include class, occupation, caste, ethnicity, gender, disability, physical and psychological health, age, immigration status, and social networks. One’s ability to access the resources needed to cope with and adapt to stress – one’s “architecture of entitlements” (paywall) – is determined by these factors, which shape social relations, political contexts, and structures of domination.

Differential vulnerability means that some individuals and groups weather (no pun intended) disasters far better than others. During the 2004 Indian Ocean tsunami, for instance, women were three to four times more likely to die than men in affected areas. This outcome occurred for a variety of reasons. Due to cultural norms, most women wore bulky clothing that covered most of their bodies; when this got wet, it weighed them down. Women were also far less likely to be able to swim (PDF) given their social roles and religious mores.

Natural Hazards

Interestingly, even natural hazards – the most “natural” of the three variables – have undergone changes due to human actions. Global temperatures have increased 0.85°C since 1880. Since the 1970s, global average precipitation has decreased by 10cm per year, but it has increased by more than 20% in certain regions (including the Northeast and Midwestern US). Accordingly, the number of extreme events associated with climate change rose by 50% over this three-decade period. There even appears to be evidence that human activities can alter seismic risks: researchers have connected a string of earthquakes in states from Ohio to Oklahoma to the high-pressure injection of wastewater into underground wells.

While it is difficult for reporters on a deadline to analyze the social, economic, and political drivers of various disasters, it is important that we begin to inch away from the myth of naturalness. Placing the blame for every disaster on the “capricious hand” of God or nature is dangerous and irresponsible.

First, it robs disaster survivors of their agency; they become just poor victims suffering from Acts of God. Secondly, in places where disasters are common (like the Philippines), it allows people who are disconnected from the events to blame the victims for not moving away from the threat. Thirdly – and perhaps most importantly – this mindset tends to make us complacent. If we accept disasters as natural events that we cannot control, what is our incentive to invest in disaster risk reduction strategies like curbing poverty, replanting mangrove forests, or hardening critical infrastructure? What is the hook for curbing carbon emissions to mitigate climate change?

The first step to addressing the rise in disasters worldwide is to admit that disasters aren’t natural. They’re manmade. Maybe if we do that we can get off our asses and do something about them.

Spoiler alert: Climate change is still happening

So NOAA is out with its monthly update of climatic conditions in the contiguous United States. The latest report covers the month of May. Here are some of the key takeaways:

  • The average temperature for the contiguous US during May was 61.0°F, 0.9°F above the 20th-century average for the month. May was the 339th consecutive month in which the average temperature in the contiguous US was higher than the 20th century average. If you were born after February 1985 (as I was), you have never known a climate not altered by the hand of man.
May 2013 was the 40th warmest May since we began keeping records 119 years ago (courtesy of the NOAA National Climatic Data Center).

  • May was warmer than average in the Northeast and the West but considerably cooler than average in the South; Florida and Georgia had their 11th and 12th coolest Mays on record, respectively. Relative to its own May history, California ranked as the warmest of any state, recording its 18th warmest May.
While temperatures were higher than average in the Northeast, Great Lakes, and West, they were significantly cooler in the Southeast (courtesy of NOAA NCDC).

  • As you could probably tell by looking outside, May was extremely wet. It was the 17th wettest May on record; the total precipitation average (3.34 inches) was 0.47 inches above the 20th century average. May was particularly damp in the Midwest and Great Plains, where Iowa had its wettest May on record, and North Dakota had its second wettest May.
May continued our above average precipitation trend for 2013. Total precipitation has only increased during the first third of June (courtesy of NOAA NCDC).

  • Despite the above-average rainfall (and snowfall, for that matter), much of the country west of the Mississippi River continues to experience at least moderate drought. While the total land area under drought fell by 2.8% during May, 44.1% of the US (including Alaska and Hawaii) remains in drought conditions. This prolonged drought is contributing to historic wildfires in Colorado and California.
While the total land area experiencing drought fell in May, nearly half of the country is still experiencing drought conditions (courtesy of US Drought Monitor).
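The “339 consecutive months” claim in the first bullet is easy to sanity-check with a little month arithmetic. A minimal sketch (the helper function is mine, purely for illustration): counting 338 months back from May 2013 lands in March 1985, so anyone born after February 1985 has never lived through a cooler-than-average month in the contiguous US.

```python
# Sketch: check the "born after February 1985" claim in the NOAA bullet.
# May 2013 was the 339th consecutive above-average month, so the streak
# began 338 months earlier.
def months_back(year, month, n):
    """Step n months back from (year, month); returns (year, month)."""
    total = year * 12 + (month - 1) - n
    return total // 12, total % 12 + 1

start = months_back(2013, 5, 338)
print(start)  # (1985, 3): the streak began in March 1985
```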

The moral of the story: it may not be as insanely hot as last year, but climate change is here to stay, folks.