Thursday 15 December 2016

'Optimising' Failure in Disaster Response


At the end of the 18th and beginning of the 19th centuries, mechanisation of the weaving industry in Manchester, England, began to destroy the cottage industry of handloom weaving. The Luddites were people who sought to destroy the machines that were destroying their livelihoods. Things are different now: to be a latter-day Luddite is akin to being a modern King Cnut, the ruler who sat on his throne amid the waves to show his subjects that he had no power to stop the tide coming in (and was later misinterpreted as the king who thought he could control the tides but failed). Technology is the tide, indeed the unstoppable tsunami. It brings good and bad things with it.

Regarding the modern advance of technology, I have no desire to be a Luddite of any kind. However, I am concerned about how know-how and equipment are being misused in disaster risk reduction (DRR).

Thirty-six years ago, when I began to study disasters, there were only a select few of us in this new area of scholarship. DRR is now a crowded field. That, of course, is good, because it means that the importance of DRR to humanity is recognised and research institutions are taking the field seriously. However, there are several branches of 'disasterology' in which many of the proponents appear not to have an adequate background in the field, nor an appreciation of the reality behind the problems they tackle. The consequence is that the ceaseless application of technology becomes a problem, not a solution. It can create vulnerability by inducing reliance on routines or equipment that in a disaster may function badly or not at all.

Humanitarian logistics is a field in which there is a vigorous tendency to produce algorithms. One can imagine cohorts of mathematicians and computer scientists casting around for problems to solve. Suddenly, they see the delivery of humanitarian aid as the answer to their prayers: a field in need of algorithms. A proper critical analysis would examine the key question of whether the algorithms have any value in the field. I strongly suspect that more often than not the answer is 'no'. The models are based on assumptions; in some cases these are inadequate, and in many they are untested. Few attempts have been made to find out whether the models function during real disasters, whether they would improve the situation, and whether they are attractive to emergency managers. In my experience, they are not.

Consider the very fashionable problem of how to optimise the location of critical facilities. In a recent earthquake that I studied in the field, facilities such as emergency operations centres and warehouses were located in the only places that were available, accessible and functional. There was no element of choice or optimisation. It was a typical event of its kind, and optimisation algorithms would not have helped in any way. Nor would they have been accepted by the emergency managers on site, who overwhelmingly wanted to simplify their decision making, not complicate it with over-sophisticated routines.

Algorithms designed to optimise facilities presuppose that the data will be available to carry out the optimisation during the critical phase of a disaster aftermath. That is unlikely. Alternatively, they tend to presuppose that decisions can be made before disaster strikes. That ignores the range of geographical variation in possible impacts.
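
To make the point concrete, here is a minimal sketch, in Python and with wholly invented sites, towns and demand figures, of the simplest location routine of this kind: a 1-median choice among candidate warehouse sites. Computed before the event, it returns a different 'optimal' site for each hypothetical impact scenario, so any single pre-computed answer must be wrong for at least one of them.

# A minimal sketch (all data invented) of a 1-median warehouse choice.
# Candidate warehouse sites and affected towns, as (x, y) map coordinates.
sites = {"A": (0, 0), "B": (10, 0), "C": (5, 8)}
towns = {"north": (5, 10), "east": (12, 2), "west": (-2, 1)}

# Two hypothetical impact scenarios: relief demand (tonnes) per town.
scenarios = {
    "quake_in_north": {"north": 90, "east": 5, "west": 5},
    "quake_in_east":  {"north": 5, "east": 90, "west": 5},
}

def distance(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def best_site(demand):
    # The textbook 1-median criterion: choose the candidate site that
    # minimises total demand-weighted distance to the towns.
    def cost(site):
        return sum(d * distance(sites[site], towns[t])
                   for t, d in demand.items())
    return min(sites, key=cost)

for name, demand in scenarios.items():
    print(name, "->", best_site(demand))
# Prints 'C' for the northern scenario and 'B' for the eastern one:
# the 'optimum' depends entirely on which impact is assumed.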

I have a strong feeling that humanitarian logistics and common sense have parted company. For instance, consider a recently published model that optimally located shelters and efficiently assigned evacuees to the nearest shelter site. The work contains no notion of how to preserve social cohesion, yet my field research, and that of many others, shows that this is a critical factor in the success or failure of shelter strategies.
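
The flaw is easy to demonstrate even in a toy version of such a model. The following sketch, again in Python and with invented data rather than anything from the published work, assigns each household to its geographically nearest shelter. Nothing in the objective knows that the three households are one street, one extended family, one social unit.

# A toy nearest-shelter assignment (all data invented).
shelters = {"school": (0.0, 0.0), "stadium": (4.0, 0.0)}

# Households of a single neighbourhood, strung out along one road.
households = {"h1": (1.0, 0.0), "h2": (1.8, 0.0), "h3": (2.4, 0.0)}

def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

# Assign each household to the nearest shelter, which is all the
# 'efficient' objective asks for.
assignment = {
    h: min(shelters, key=lambda s: dist(pos, shelters[s]))
    for h, pos in households.items()
}
print(assignment)
# {'h1': 'school', 'h2': 'school', 'h3': 'stadium'}
# Distance-optimal, yet the neighbourhood is split between two shelters:
# social cohesion never appears in the objective function.

Even at this scale the arithmetic is trivial; what is missing is not computing power but any representation of the social fabric that the shelters are supposed to hold together.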

If it occurs at all, model testing is usually hypothetical, or it is carried out under highly artificial conditions. There is very little research on the effectiveness of algorithms in real crises, or on whether they are able to improve the situation. This is hardly surprising. Most emergency decision makers are not interested in optimisation routines and are not equipped to understand or use them. They would be very suspicious of any attempt to replace informed judgement with automated routines. The key question is not "how can we optimise the location of a warehouse?", but "are there any suitable warehouses in the area that we can commandeer and use?"

Increasing dependency upon electronic routines may indeed increase vulnerability to disaster. Any failure of sophisticated electronics and software (or batteries, or connections) risks causing serious problems: will there be electricity after the disaster has struck? This explains the widespread reluctance of emergency managers and responders to use the algorithms. Indeed, there is significant and well-founded opposition to the 'technofix' approach to emergency management. At the very least, users will have to assume the burden of memorising yet another plethora of mnemonics and initials. Complex training will need to be allied with the ability to fix software and hardware glitches with extreme rapidity. Rarely if ever do the papers that present algorithms discuss how dependency upon them may be dangerous, or what would happen if the algorithm fails. When an algorithm is described, seldom is any redundancy offered.

'Optimal' and 'optimisation' are misnomers when applied to a problem that has only been thought through partially. Rather than supporting decisions, the algorithms may thwart them by encouraging decision makers not to think problems through.

Many have argued that the 'technofix' approach to disaster is wasteful and damaging. In an age in which technology has become the world's obsession, it cannot easily be dismissed, nor should it be. Indeed, it holds the answer to many complex and intractable problems, but only if it is used with intelligence and insight. What technology has to offer to disaster management depends on its robustness, its ease of use in a crisis, the redundancy available if it becomes blocked, its cultural and technical acceptability to users, and its ability to provide useful answers quickly. Papers that offer technological solutions to disaster problems should address these issues as a matter of course. They should demonstrate that the technology is attractive to users, that it will indeed be adopted and that it will make a positive difference to disaster management.

Disaster Risk Reduction in the Post-Truth World


There is a footnote in my book Confronting Catastrophe (Alexander, 2000), which runs as follows:
 "Thus the alarm bells rung by Vance Packard in his 1950s book The Hidden Persuaders (Packard 1981 - a classic critique of advertising) have largely been ignored. His death in 1997 was greeted with a resounding silence on the part of the chattering classes and the mass media he criticized so aptly. Perhaps that was their revenge."
This small intellectual aside has become more, not less, important since I wrote it.

Advertising is the creation of illusion in order to further the consumption of goods and services. My lifetime, 63 years, is a period in which the population of the world, that army of consumers, has more or less tripled. Over the same six decades we have become inured to advertising: it has gradually become more and more insidious, until it has a finger in, and sometimes a stranglehold over, virtually all of our activities.

Advertising has developed a remarkably strong synergy with the entertainment industry, another great producer of illusion. Many people have grown up in a world in which these giants of human fantasy are two of the principal points of reference to which our model of the human condition is anchored. This is fertile terrain for the growth of post-truth politics. Politics these days is heavily dependent on advertising. Once it loses its moral compass, and once the electorate no longer expects truthfulness, then any old lie will suffice.

Now the last thing we need in disaster risk reduction is illusion. And the first thing we need to do is face up to reality, however brutal it may be. Is DRR perhaps the antidote to advertising? Perhaps it is, or should be, but how is it served by "post-truth politics"?

In 1918 Hiram W. Johnson, a Republican senator from California, is reputed to have coined the phrase "The first casualty when war comes is truth." His observation is not surprising when one considers that politics and economics are behind war. But is the same true of disaster? Observers of the immediate aftermath of the Fukushima nuclear disaster would argue that truth was very much the first casualty there (Kushida 2012). Moreover, the effect of this was to damage or destroy the bond of trust between government, science and the people.

Is this part of a trend or an isolated incident? I suggest that it is neither. In 1986 the Soviet Government struggled to conceal the Chernobyl disaster until the plume of radiation spreading across Europe meant that the pretence could no longer be sustained (Moynagh 1994, p. 724). In this it followed a long tradition of secrecy after disaster that had been prevalent in the USSR and China (which until 1986 did not allow foreign scrutiny of the 1976 Tangshan earthquake). Yet in both Russia and China the situation has changed: it could hardly be otherwise in a world dominated by instantaneous electronic communication and 'citizen journalism'. Moreover, secrecy and outright lies are slightly different traits.

If indeed politics have entered a new era in which truth no longer sways electorates, will this situation extend to disaster risk reduction and the management of disaster impacts? I think there are grounds for limited optimism. Pluralism is the antidote to Great Historical Lies, and pluralism cannot be suppressed by disaster, nor, in the modern age, can it so easily be suppressed by dictatorship.

References

Alexander, D.E. 2000. Confronting Catastrophe: New Perspectives on Natural Disasters. Dunedin Academic Press, Edinburgh, and Oxford University Press, New York, 282 pp.

Kushida, K.E. 2012. Japan’s Fukushima Nuclear Disaster: Narrative, Analysis, and Recommendations. Shorenstein APARC Working Paper. The Walter H. Shorenstein Asia-Pacific Research Center, Stanford University, Stanford, California, 73 pp.

Moynagh, E.B. 1994. The legacy of Chernobyl: its significance for the Ukraine and the world. Boston College Environmental Affairs Law Review 21(4): 709-751.

Packard, V.O. 1981. The Hidden Persuaders. Pocket Books, New York, 288 pp.