Thursday 31 August 2017

Climate Change and Cascading Disasters

Flooding in central Bangladesh. (photo: DA)

Once again, disasters are topical. As usual, why they are topical rather depends on what else is featuring in the news at the same time. Floods in the southern USA and South Asia throw into sharp relief the possibility that climate change may already be causing extreme events to be larger and more destructive. Perhaps in the images of destruction and inundation we have a graphic illustration of an outcome that needs to be shown to people for them to believe it. Experts prognosticating in front of television cameras are not enough to convince the sceptics about climate change (let alone the hard-line deniers): what is needed is a good, solid floodwave.

But let me introduce a new element: cascading disasters. In essence, a primary impact, such as rising floodwaters, leads to a series of knock-on effects. Nor does it stop there: the interaction of different sources of vulnerability means that effects can be transformed into new causes.

In 2002, flooding on the Vltava (Moldau) River severely inundated the city of Prague, but also struck the Spolana chemical factory, causing an explosion and a toxic cloud. As I write, something similar is expected at the Arkema factory in Crosby, Texas, as a consequence of flooding caused by Tropical Storm Harvey. Primary and back-up systems for cooling volatile chemicals have failed. Explosive or combustive reactions are expected. What will be their consequences? Time will tell.

On the other side of the world in the Indian sub-continent, commuters are being prevented from getting to work and children are being deprived of schooling by flooding that is greater in magnitude and impact than its American counterpart. A building has collapsed in Mumbai, killing and trapping its occupants, and prompting a rescue effort that must be mounted alongside the response to the floods and intense rainfall.

It may be that all future disasters above a certain size will be cascading events to a greater or lesser extent. This is because the degree of mutual dependency within society, and its growing complexity, make such an outcome inevitable.

So what can we do about cascading disasters? First, we must recognise that the game has changed. The idea of disaster as simple cause-and-effect must be abandoned. Planning based on this assumption is likely to lead to the wrong remedies, or at least to inefficiency, with respect to both disaster risk reduction and disaster response.

Secondly, in developing strategies, tactics, plans and procedures, we must place the emphasis squarely on understanding vulnerability in all its forms. Commonly it is broken down into categories: physical, environmental, social, psychological, institutional, and so on. However, it also includes elements such as the risks of dependency upon technology, corruption, failure to innovate, and social polarisation. Vulnerability is therefore best viewed as a complex, multi-faceted phenomenon, and we must understand its mechanisms and the interactions between its facets. As has been written many times, disaster is socially constructed: it is the result of decisions made by individuals or groups, for it is they who put people and their possessions in harm's way.

The study of cascading disasters involves the search for escalation points, at which vulnerability becomes compound and creates new "disasters within disasters". Remember that the Japanese magnitude-9 earthquake of 11 March 2011 was not the real cause of the Tōhoku disaster: that was the resulting tsunami and its effect on the Fukushima Dai'ichi nuclear plant. It remains one of the largest examples of a cascading disaster.

Thirdly, we must investigate the 'disaster pathways' along which impacts propagate, and locate the 'escalation points' that lie upon them. This will give us the basis for anticipating the effects of a primary agent of disaster and either reducing them a priori or intervening to limit the damage.
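To make the idea concrete, here is a minimal sketch, entirely my own invention rather than an established tool from the disaster literature. It represents a cascade as a directed graph: nodes are impacts, edges are pathways of propagation, and an escalation point is treated, crudely, as any impact that branches into more than one knock-on effect. The event names and the structure of the graph are purely illustrative.

```python
# Toy model of disaster pathways: impacts as nodes, propagation as edges.
# The events and their connections are illustrative, not empirical.
from collections import deque

pathways = {
    "flood":                  ["chemical_plant_failure", "transport_disruption"],
    "chemical_plant_failure": ["toxic_release"],
    "toxic_release":          ["evacuation", "health_emergency"],
    "transport_disruption":   ["delayed_response"],
    "evacuation":             [],
    "health_emergency":       ["delayed_response"],
    "delayed_response":       [],
}

def trace_cascade(primary):
    """Breadth-first walk of every impact reachable from a primary event."""
    seen, queue = {primary}, deque([primary])
    while queue:
        impact = queue.popleft()
        for knock_on in pathways[impact]:
            if knock_on not in seen:
                seen.add(knock_on)
                queue.append(knock_on)
    return seen

# Escalation points: impacts whose effects branch into several new pathways.
escalation_points = [node for node, out in pathways.items() if len(out) > 1]

print("Cascade from flood:", trace_cascade("flood"))
print("Escalation points:", escalation_points)
```

Even a toy like this shows where analysis should concentrate: on the branching nodes, where a single intervention can cut off several downstream impacts at once.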

In the twentieth century, disaster was conceived very largely in terms of static relationships. From the 1950s onwards, empirical studies of equilibrium were fashionable, and if a system failed to achieve it, 'homeostasis' could be invoked: the system was assumed to tend to return from perturbations towards its equilibrium, and thus to possess a 'central tendency'.

The agenda has changed, and so should the outlook upon disasters. Physically, we have climate change; socially we have population growth and socio-economic polarisation of wealth and opportunity. We also have rapid changes in the global outlook coupled with increasing international interdependency. Seldom has vulnerability looked less stable.

The current floods in the USA and South Asia reveal the gaps and weaknesses in planning, with respect to both risk reduction and disaster response. Rather than cutting budgets and turning away from disaster risk reduction, decision makers need to devote far more resources to the problem and, of course, to take cascading into account. This will require a shift from a 'technofix' approach rooted in hazard reduction to one based on vulnerability reduction. Many of us in the disaster studies community have been saying this for at least three and a half decades, vox clamantis in deserto (a voice crying in the wilderness). It is now, more than ever, economically and politically advantageous to listen to us.

Saturday 5 August 2017

In Europe we're all going to die in disasters - or are we?


The top news on the BBC website this morning was that "deaths in Europe from extreme weather events could increase 50-fold by 2100". In my opinion, there are two lessons to be drawn from this.

The first is that the authors of the study (Forzieri et al. 2017) were very clever to release it at the time of maximum impact. As I write, the temperature outside my room is in the 40s Centigrade. The article was embargoed until 11.30 last night and pre-distributed to the mass media. Small wonder that today it got maximum exposure.

The second is that the research is pretty much worthless. It is misleading and highly unlikely to offer an accurate forecast. It is a hazards-driven study that effectively uses exposure as a surrogate for vulnerability, about which the authors have remarkably little to say (see my comments in Davis 2017). And yet it has been demonstrated all over the world that vulnerability defines death tolls - i.e., people can live in highly hazardous zones and not die if they are not vulnerable (Wisner 1993). Various African countries, India and Bangladesh have all had some notable successes in reducing disaster mortality in areas of high population growth (e.g. Paul et al. 2010). Moreover, one of the effects of the International Decade for Natural Disaster Reduction was to hold the line on death tolls (it would have been nicer if they had gone down, but anyway, it was an achievement of sorts).

By way of illustration, the current heat wave is probably going to be comparable to that of 2003, during which it is estimated that there were 70,000 excess and premature deaths (Lagadec 2004). The figure is highly contentious, but, leaving that aside, since then measures have been put in place to avoid a repetition (Boyson et al. 2014, Pascal et al. 2012). These are mainly early warning systems to detect and assist vulnerable people. In Tuscany, where I am writing this, they have been highly effective, and I believe they have been in France and Spain, too. In the United States, as population rose, heat-related mortality declined (Sheridan et al. 2009). In contrast, Forzieri et al. (2017, p. e206) forecast that heatwave deaths in southern Europe will go up by 7,000 per cent in a century. If that were so, perhaps our work in disaster risk reduction would be a waste of time.
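To see what such a figure implies, here is a back-of-envelope check of my own (simple arithmetic, not the authors' methodology), converting the headline percentage into the compound annual growth in mortality that it would require:

```python
# Back-of-envelope check: illustrative arithmetic only, not the authors' model.
increase_pct = 7000                  # forecast rise in southern European heatwave deaths
multiplier = 1 + increase_pct / 100  # a 7,000% increase is a 71-fold multiplication
years = 2100 - 2017                  # rough forecast horizon

# The compound annual growth rate that such a rise implies:
annual = multiplier ** (1 / years)
print(f"{multiplier:.0f}-fold rise over {years} years ≈ "
      f"{100 * (annual - 1):.1f}% growth per year, every year")
# -> 71-fold rise over 83 years ≈ 5.3% growth per year, every year
```

Mortality compounding at more than five per cent per year for over eight decades is a very strong assumption to leave unexamined.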

People put faith in figures because they seem precise and scientific, even when the reasoning that supports the figures is a hollow shell. The good side of the article is that it draws attention to the problem - or to part of it (and what a pity it does not draw enough attention to the extreme dynamism of vulnerability!). The bad side is that policy may end up being based on projections that are largely fantasy. There may indeed be massive increases in mortality in weather disasters in Europe, but that would be a function of many other factors: whether there is conflict, the impact of cascades, the functionality of antibiotics, emerging threats and hazards, dependency on critical infrastructure, the status of emergency preparedness, exotic diseases, the wealth differential, and so on.

References

Boyson, C., S. Taylor and L. Page 2014. The National Heatwave Plan: a brief evaluation of issues for frontline health staff. PLoS Currents Disasters 13 January 2014.

Davis, N. 2017. Extreme weather deaths in Europe 'could increase 50-fold by next century'. The Guardian 5 August 2017.
https://www.theguardian.com/science/2017/aug/04/extreme-weather-deaths-in-europe-could-increase-50-fold-by-next-century

Forzieri, G., A. Cescatti, F. Batista e Silva and L. Feyen 2017. Increasing risk over time of weather-related hazards to the European population: a data-driven prognostic study. Lancet Planetary Health 1(5): e200-e208.
http://www.thelancet.com/journals/lanplh/article/PIIS2542-5196(17)30082-7/fulltext

Lagadec, P. 2004. Understanding the French 2003 heat wave experience: beyond the heat, a multi-layered challenge. Journal of Contingencies and Crisis Management 12(4): 160-169.

Pascal, M., K. Laaidi, V. Wagner, A.B. Ung, S. Smaili, A. Fouillet, C. Caserio-Schönemann and P. Beaudeau 2012. How to use near real-time health indicators to support decision-making during a heatwave: the example of the French heatwave warning system. PLoS Currents Disasters 16 July 2012.

Paul, B.K., H. Rashid, M.S. Islam and L.M. Hunt 2010. Cyclone evacuation in Bangladesh: tropical cyclones Gorky (1991) vs. Sidr (2007). Environmental Hazards 9(1): 89-101.

Sheridan, S.C., A.J. Kalkstein and L.S. Kalkstein 2009. Trends in heat-related mortality in the United States, 1975-2004. Natural Hazards 50(1): 145-160.

Wisner, B. 1993. Disaster vulnerability: scale, power and daily life. GeoJournal 30(2): 127-140.

Tuesday 1 August 2017

Seven Rules for the Application of Operations Research to Disaster Management


It is currently very fashionable to apply the methodologies of operations research to disaster mitigation, management and response. Is this a durable development or a passing fad? Will the algorithms be used and appreciated, or are they merely wasted effort? Do the algorithm makers understand what conditions are like in a disaster, and what the real needs of managers and responders are?

In disaster management there is a well-founded hostility towards over-sophisticated routines and equipment. Managing emergencies will always be a rough-and-ready process, in which most of what is done is a kind of approximation. Such is the nature of uncertainty and rapid change in the field that it could never be otherwise.

If operations research is to make a useful contribution to disaster management, it will have to take account of these principles:

1.    In emergencies, 'optimisation' is a very relative term. Pre-planned activities require considerable scenario modelling in order to take account of the real needs that will be generated during a future emergency (a toy sketch of such scenario-based reasoning follows this list).

2.    Optimisation based on an assessment of pre-disaster conditions is unlikely to be relevant to the post-disaster situation. Infrastructure will be damaged, inefficient and probably partly non-functional.

3.    Optimisation that assumes perfect knowledge of the situation is bound to fail. During major emergencies, the common operating picture is constructed slowly and with difficulty. One cannot optimise a situation that is partially unknown.

4.    Algorithms designed for use in emergencies must be capable of being deployed under emergency conditions. This means that, at the height of a crisis, time cannot be expended on collecting data or running lengthy analyses.

5.    To make an algorithm credible, evidence should be provided that it is acceptable to field commanders, who would use it or act upon the results that it provides. Optimisation is not an objective held by most emergency managers and field commanders. An algorithm that does not take account of their needs and ways of thinking is highly unlikely to be appreciated or utilised by them.

6.    Decision support systems are welcomed if they really do support decision making. No sensible emergency manager will trust an algorithm unless its results clearly demonstrate that it works and visibly improves the situation.

7.    Flexibility is an essential ingredient of any algorithm. In disasters, conditions on the ground can change abruptly and without warning. Algorithm makers need to understand the difference between 'agent-generated demands' and 'response-generated demands', as described in the classical literature on the sociology of disasters.
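By way of illustration of rules 1 to 3, here is a deliberately crude sketch, with invented numbers and no claim to represent any real planning tool. Instead of optimising the pre-positioning of relief teams against a single assumed picture of the disaster, it evaluates each allocation against several damage scenarios and prefers the one whose worst case is least bad:

```python
# Illustrative sketch of scenario-robust allocation; all figures are invented.
from itertools import product

REGIONS = ["north", "centre", "south"]
TEAMS = 4  # relief teams available for pre-positioning

# Each scenario assigns a demand (teams needed) per region. In practice these
# would come from scenario modelling of the hazard and of damaged infrastructure.
scenarios = [
    {"north": 3, "centre": 1, "south": 0},  # river flood upstream
    {"north": 0, "centre": 2, "south": 2},  # coastal surge
    {"north": 1, "centre": 1, "south": 2},  # widespread intense rainfall
]

def coverage(allocation, demand):
    """Fraction of total demand met by a given pre-positioning of teams."""
    met = sum(min(allocation[r], demand[r]) for r in REGIONS)
    return met / sum(demand.values())

def all_allocations(teams):
    """Every way of dividing the available teams among the regions."""
    for combo in product(range(teams + 1), repeat=len(REGIONS)):
        if sum(combo) == teams:
            yield dict(zip(REGIONS, combo))

# Robust choice: maximise the worst-case coverage across all scenarios,
# rather than optimising against one assumed picture of the disaster.
best = max(all_allocations(TEAMS),
           key=lambda a: min(coverage(a, s) for s in scenarios))
print("Robust pre-positioning:", best)  # e.g. {'north': 1, 'centre': 1, 'south': 2}
```

A real application would, per rules 4 to 6, also have to run quickly on incomplete data and present its output in terms that field commanders recognise and trust.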