Sunday, 15 October 2017

Why the Hazards Paradigm Remains Stronger Than the Vulnerability Approach

One of the great paradoxes of disaster studies is the dominance of the hazards paradigm over the vulnerability approach. In 1983, Kenneth Hewitt and his colleagues published Interpretations of Calamity (Hewitt 1983), which cogently set out the arguments for regarding hazard as the trigger of disaster and vulnerability as the essence of the phenomenon. More recent attention to the underlying risk drivers (Blaikie et al. 2003) and to disaster risk creation (FORIN Project 2011) has reinforced that view. But what do we see? Hazards-based approaches continue to dominate the field. Indeed, they continue to strengthen their dominance. There are ten reasons why this is so, as follows.

1. It is easier to blame disasters on a neutral agent, such as an extreme natural event, than on human decision making. Having stated this, it is becoming less easy as the full force of human-induced climate change becomes more and more apparent.

2. People, including scientists, tend to shy away from root causes, which can be complex, agonising and therefore intimidating. Vulnerability as a root cause is often a particularly difficult phenomenon to get to grips with as it tends to be multi-faceted, complex and insidious.

3. Political decision making is a major root cause of vulnerability to disaster. It is all too often divorced from rational advice and wedded to ideology. In the face of political forms of 'rationality', it is hardly surprising that it seems more attractive to study natural phenomena than the vagaries of human behaviour.

4. For many decades there have been massive investments in 'hard' science and no corresponding levels of support for endeavours to understand vulnerability.

5. There is a widespread and enduring belief in the 'technofix' approach to disasters. The bigger the problem, the more technology is needed to fix it. This is, of course, an ideological position in its own right. As it seldom succeeds, but remains wildly popular (especially among those who make a living out of selling technology), the result is that worsening conditions engender yet more dependence on technological solutions, and vulnerability continues to rise.

6. In many parts of the world, libertarianism dominates over regulation. Yet the conditions that produce vulnerability need to be regulated if it is to be brought under control.

7. The position of the social sciences is subordinate to that of the physical sciences in the world's academic systems. There is still considerable prejudice in scientific quarters against the 'softness' of social sciences, which are regarded as lacking in rigour because they often do not produce concrete or precise results.

8. There is a particular view of magnitude and frequency that acts as a framework for responding to disaster. I refer to the physical magnitude and frequency of events, not the magnitude of vulnerability.

9. Physical development (such as urban development and the building trade) is a juggernaut that often crushes dissent and restraint. It has enormous political support and it creates vulnerability by putting more and more assets in harm's way.

10. Finally, vulnerability is a paradoxical phenomenon. Like friction, it only really exists when it is mobilised (by impact) and therefore it must be studied either hypothetically before it manifests itself or post hoc after it has been converted into damage. It is thus much less tangible than the physical forces of hazards that can be measured in the field.

Taken together, these ten observations go a long way to explaining why the disaster problem is such a long way from being solved and, indeed, why it continually gets worse. Of course, there is no guarantee that a better understanding of vulnerability would lead to better management of it, but it is nevertheless clear that more and more knowledge of physical hazards does less and less for the process of reducing disaster.


Blaikie, P., T. Cannon, I. Davis and B. Wisner 2003. At Risk: Natural Hazards, People's Vulnerability and Disasters (2nd edition). Routledge, London.

FORIN Project 2011. Forensic Investigations of Disasters. Integrated Research on Disaster Risk, Beijing, 29 pp.

Hewitt, K. (ed.) 1983. Interpretations of Calamity from the Viewpoint of Human Ecology. Unwin-Hyman, London, 304 pp.

Sunday, 1 October 2017

On Integrity

The spectacle of President Donald Trump endeavouring to belittle the mayor of San Juan over aid to Puerto Rico after the devastation wrought by Hurricane Maria prompts me to a rather personal reflection about the breadth of people's attitudes. The argument over aid is a squalid one and it betokens a squalid outlook by the dominant opponent.

Many years ago I formed a close friendship with a man who was 30 years older than myself, whom I shall refer to by his title and first name, Don Rocco. He was a retired medical doctor, of considerable stature in his profession. During his career he founded a clinic for the treatment of tuberculosis and established a hospital in an area that at the time lacked the most basic medical amenities. Don Rocco was a modest man in everything except his concern for the safety and well-being of his people. I came to know him because he lived in a region that suffered badly from natural hazards and he was keen to encourage researchers to come and study there, and to provide some answers to the problem of disasters.

Don Rocco was a man of remarkable integrity. Others enriched themselves and gained status out of their work with the poor and needy, or their efforts against hazards: he did not. He would always listen to people's concerns and, wherever he could, he would try to help. Not all those around him were as admirable. He and I got on well and we would take daily walks and tell each other our secrets. On one occasion, I met him coming out of the hospital he had founded decades earlier. His expression was grim and I asked him what was up. He replied, "I feel like a father who has just learned that his daughter is a prostitute." I did not ask him what he had learned that day in the hospital but I did what I could to revive his spirits. As others succumbed to base instincts, his stature simply grew. People from places near and far admired and respected him. The more squalid the behaviour of others became, the more Don Rocco was admired. He won a presidential gold medal, but in his study the only item he showed off was a facsimile of the Magna Carta, which was for better or worse the symbol of his faith in democracy.

Don Rocco lived on into his nineties and was finally buried in the small cemetery of his home town, on the hill, at the bend in the road, overlooking the valley where once, a thousand years ago, the Saracens passed by on their way towards conquest. When he died, the hospital and the clinic were named after him. Outside the latter, there is a fairly lifelike statue of him, the man of faith and integrity, the man who always set an example but without showing the slightest pretence or ostentation. Don Rocco will live on in my heart until I too cease to exist. In the meantime, I must confess that it is very difficult to come to terms with the fact that there is now a public monument to my close friend. Such is the human condition.

Thursday, 31 August 2017

Climate Change and Cascading Disasters

Flooding in central Bangladesh. (photo: DA)

Once again, disasters are topical. As usual, why they are topical rather depends on what else is featuring in the news at the same time. Floods in the southern USA and South Asia throw into sharp relief the possibility that climate change may already be causing extreme events to be larger and more destructive. Perhaps in the images of destruction and inundation we have a graphic illustration of an outcome that needs to be shown to people for them to believe it. Experts prognosticating in front of television cameras are not enough to convince the sceptics about climate change (let alone the hard-line deniers): what is needed is a good, solid floodwave.

But let me introduce a new element: cascading disasters. In essence, a primary impact such as rising floodwaters leads to a series of knock-on effects. But it does not stop there. The interaction of different sources of vulnerability means that effects can be transformed into new causes.

In 2002 flooding on the Moldava or Vltava River severely inundated the city of Prague, but also impacted the Spolana chemical factory, causing an explosion and a toxic cloud. As I write, something similar is expected at the Arkema factory in Crosby, Texas, as a consequence of flooding caused by Tropical Storm Harvey. Primary and back-up systems for cooling volatile chemicals have failed. Explosive or combustive reactions are expected. What will be their consequences? Time will tell.

On the other side of the world in the Indian sub-continent, commuters are being prevented from getting to work and children are being deprived of schooling by flooding that is greater in magnitude and impact than its American counterpart. A building has collapsed in Mumbai, killing and trapping its occupants, leading to a relief effort that must be added to that mounted against the effects of the floods and intense rainfall.

It may be that all future disasters above a certain size will be cascading events to a greater or lesser extent. This is because both the degree of mutual dependency and the growing complexity of society make such an outcome inevitable.

So what can we do about cascading disasters? First, we must recognise that the game has changed. The idea of disaster as simple cause-and-effect must be abandoned. Planning based on this assumption is likely to lead to the wrong remedies, or at least to inefficiency, with respect to both disaster risk reduction and disaster response.

Secondly, in developing strategies, tactics, plans and procedures, we must place the emphasis squarely on understanding vulnerability in all its forms. Commonly it is broken down into categories: physical, environmental, social, psychological, institutional, and so on. However, it also includes elements such as the risks of dependency upon technology, corruption, failure to innovate, and social polarisation. This means that vulnerability is best viewed as a complex, multi-faceted phenomenon. We must understand its mechanisms and the interactions between the facets. As has been written many times, disaster is socially constructed. It is the result of decisions made by individuals or groups, for it is they who put people and their possessions in harm's way. The study of cascading disasters involves the search for escalation points, at which vulnerability becomes compound and creates new "disasters within disasters". Remember that the Japanese M9 earthquake of 11 March 2011 was not the real cause of the Tōhoku disaster: that was the resulting tsunami and its effect on the Fukushima Dai'ichi nuclear plant. This is now one of the largest examples of a cascading disaster.

Thirdly, we must investigate the 'disaster pathways', which are the directions in which impacts propagate, including the 'escalation points'. This will give us the basis for anticipating the effects of a primary agent of disaster and either reducing them a priori or intervening to limit the damage.

In the twentieth century the concept of 'disaster' was viewed very much as one based on static relationships. From 1950, empirical studies of equilibrium were fashionable, and if a system failed to achieve it, then 'homeostasis' could be invoked, or in other words the system was assumed to have a tendency to return from perturbations to its equilibrium, and thus to have a 'central tendency'.

The agenda has changed, and so should the outlook upon disasters. Physically, we have climate change; socially we have population growth and socio-economic polarisation of wealth and opportunity. We also have rapid changes in the global outlook coupled with increasing international interdependency. Seldom has vulnerability looked less stable.

The current floods in the USA and South Asia reveal the gaps and weaknesses in planning, with respect to both risk reduction and disaster response. Rather than cutting budgets and turning away from disaster risk reduction, decision makers need to devote far more resources to the problem--and, of course, to take cascading into account. This will require a shift from a 'technofix' approach that stems from hazard reduction to one based on vulnerability reduction. Many of us in the disaster studies community have been saying this for at least three and a half decades, vox clamantis in deserto. It is now, more than ever, economically and politically advantageous to listen to us.

Saturday, 5 August 2017

In Europe we're all going to die in disasters - or are we?

The top news on the BBC website this morning was that "deaths in Europe from extreme weather events could increase 50-fold by 2100". In my opinion, there are two lessons to be drawn from this.

The first is that the authors of the study (Forzieri et al. 2017) were very clever to release it at the time of maximum impact. As I write, the temperature outside my room is in the 40s Centigrade. The article was embargoed until 11.30 last night and pre-distributed to the mass media. Small wonder that today it got maximum exposure.

The second is that the research is pretty much worthless. It is misleading and highly unlikely to offer an accurate forecast. It is a hazards-driven study that effectively uses exposure as a surrogate for vulnerability, about which the authors have remarkably little to say (see my comments in Davis 2017). And yet it has been demonstrated all over the world that vulnerability defines death tolls - i.e., people can live in highly hazardous zones and not die if they are not vulnerable (Wisner 1993). Various African countries, India and Bangladesh have all had some notable successes in reducing disaster mortality in areas of high population growth (e.g. Paul et al. 2010). Moreover, one of the effects of the International Decade for Natural Disaster Reduction was to hold the line on death tolls (it would have been nicer if they had gone down, but anyway, it was an achievement of sorts).

By way of illustration, the current heat wave is probably going to be comparable to that of 2003, during which it is estimated that there were 70,000 excess and premature deaths (Lagadec 2004). The figure is highly contentious, but, leaving that aside, since then measures have been put in place to avoid a repetition (Boyson et al. 2014, Pascal et al. 2012). These are mainly early warning systems to detect and assist vulnerable people. In Tuscany, where I am writing this, they have been highly effective, and I believe they have been in France and Spain, too. In the United States, as population rose, heat-related mortality declined (Sheridan et al. 2009). In contrast, Forzieri et al. (2017, p. e206) forecast that heatwave deaths in southern Europe will go up by 7,000 per cent in a century. If that were so, perhaps our work in disaster risk reduction would be a waste of time.

People put faith in figures because they seem precise and scientific, even when the reasoning that supports the figures is a hollow shell. The good side of the article is that it draws attention to the problem - or to part of it (and what a pity it does not draw enough attention to the extreme dynamism of vulnerability!). The bad side is that policy may end up being based on projections that are largely fantasy. There may indeed be massive increases in mortality in weather disasters in Europe, but that would be a function of many other factors - whether there is conflict, the impact of cascades, the functionality of antibiotics, emerging threats and hazards, dependency on critical infrastructure, the status of emergency preparedness, exotic diseases, the wealth differential, etc...


Boyson, C., S. Taylor and L. Page 2014. The National Heatwave Plan: a brief evaluation of issues for frontline health staff. PLoS Currents Disasters 13 January 2014.

Davis, N. 2017. Extreme weather deaths in Europe 'could increase 50-fold by next century'. The Guardian 5 August 2017.

Forzieri, G., A. Cescatti, F. Batista e Silva and L. Feyen 2017. Increasing risk over time of weather-related hazards to the European population: a data-driven prognostic study. Lancet Planetary Health.

Lagadec, P. 2004. Understanding the French 2003 heat wave experience: beyond the heat, a multi-layered challenge. Journal of Contingencies and Crisis Management 12(4): 160-169.

Pascal, M., K. Laaidi, V. Wagner, A.B. Ung, S. Smaili, A. Fouillet, C. Caserio-Schönemann and P. Beaudeau 2012. How to use near real-time health indicators to support decision-making during a heatwave: the example of the French heatwave warning system. PLoS Currents Disasters 16 July 2012.

Paul, B.K., H. Rashid, M.S. Islam and L.M. Hunt 2010. Cyclone evacuation in Bangladesh: tropical cyclones Gorky (1991) vs. Sidr (2007). Environmental Hazards 9(1): 89-101.

Sheridan, S.C., A.J. Kalkstein and L.S. Kalkstein 2009. Trends in heat-related mortality in the United States, 1975-2004. Natural Hazards 50(1): 145-160.

Wisner, B. 1993. Disaster vulnerability: scale, power and daily life. GeoJournal 30(2): 127-140.

Tuesday, 1 August 2017

Seven Rules for the Application of Operations Research to Disaster Management

It is currently very fashionable to apply the methodologies of operations research to disaster mitigation, management and response. Is this a fashion or a fad? Will the algorithms be used and appreciated, or are they merely wasted effort? Do the algorithm makers understand what conditions are like in a disaster, and what the real needs of managers and responders are?

In disaster management there is a well-founded hostility towards over-sophisticated routines and equipment. Managing emergencies will always be a rough-and-ready process, in which most of what is done is a kind of approximation. Such is the nature of uncertainty and rapid change in the field that it could never be otherwise.

If operations research is to make a useful contribution to disaster management, it will have to take account of these principles:

1.    In emergencies, 'optimisation' is a very relative term. Pre-planned activities require considerable scenario modelling in order to take account of the real needs that will be generated during a future emergency.

2.    Optimisation based on an assessment of pre-disaster conditions is unlikely to be relevant to the post-disaster situation. Infrastructure will be damaged, inefficient and probably partly non-functional.

3.    Optimisation that assumes perfect knowledge of the situation is bound to fail. During major emergencies, the common operating picture is constructed slowly and with difficulty. One cannot optimise a situation that is partially unknown.

4.    Algorithms that are designed to be used in emergency situations should be capable of deployment during emergencies. This means that at the height of a crisis time cannot be expended on collecting data or running lengthy analyses.

5.    To make an algorithm credible, evidence should be provided that it is acceptable to field commanders, who would use it or act upon the results that it provides. Optimisation is not an objective held by most emergency managers and field commanders. An algorithm that does not take account of their needs and ways of thinking is highly unlikely to be appreciated or utilised by them.

6.    Decision support systems are welcomed if they really do support decision making. No sensible emergency manager would put blind faith in an algorithm unless the results clearly demonstrate that it works and visibly improves the situation.

7.    Flexibility is an essential ingredient of any algorithm. In disasters, conditions on the ground can change abruptly and without warning. Algorithm makers need to understand the difference between 'agent-generated demands' and 'response-generated demands', as described in the classical literature on the sociology of disasters.

Tuesday, 18 July 2017

The 'Should Ratio'

The word 'should' is damnable. We all should. There are probably ten or twelve things I should be doing now instead of writing this, and umpteen that I should have done but have not finished. But 'should' is a more serious business when it is applied to official documents. The 'should ratio' is the number of times the word 'should' appears per page of text. For UN Habitat's New Urban Agenda it is 0.33; for the Sendai Framework for Disaster Risk Reduction it is 0.37; for the UN's Sustainable Development Goals it is 0.4. But for the Oslo Guidelines on the Use of Foreign Military and Civil Defence Assets in Disaster Relief it is a whopping 1.93. Perhaps we can excuse the Oslo document because it is a set of guidelines, not a formal treaty.

The point about the 'should ratio' is that 'should' is a weaker word than 'will' or 'shall' or 'must'. We live in an increasingly fluid world in which, paradoxically, as the imperative to act increases, the will to do so declines, and along with it the sense of global responsibility. On the one hand, countries and their governments cannot be compelled to act, and some even resent being told that they should act: witness the Trump administration's response to the climate treaty at the G-20 meeting. However, I firmly believe that 'should' is a word to avoid. In academic papers, when the discussion and conclusion sections start "shoulding", the reader knows that they are in the process of delivering prescriptions that no one will heed.

I urge you, gentle reader, to make use of the 'should ratio'. It is very easy to compute. In a PDF document, a search function will tell you how many 'shoulds' appear and a calculator will tell you how often this is per page. Please go on to name and shame the writers who overuse the word.
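For readers who prefer to automate the search-and-calculator step, the ratio can be computed in a few lines of Python. This is a minimal sketch; the function name, sample text and page count are my own, purely for illustration:

```python
import re

def should_ratio(text: str, n_pages: int) -> float:
    """Count whole-word occurrences of 'should' (case-insensitive)
    and divide by the number of pages in the document."""
    count = len(re.findall(r"\bshould\b", text, flags=re.IGNORECASE))
    return count / n_pages

# Hypothetical one-page-per-sentence document with three 'shoulds':
sample = "States should act. Donors should give. Plans should exist."
print(should_ratio(sample, 3))  # 1.0
```

The word-boundary pattern (`\b`) avoids counting words such as 'shoulder', which a naive text search would inflate the ratio with.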

Perhaps in the future a piece of research will tell us what is an acceptable should ratio, if there is such a thing. In the meantime, this blog resolves that the word 'should' should be replaced by the word 'must' in all worthwhile initiatives to reduce disaster, curb pollution, stop poverty, diminish vulnerability, increase safety and security, etc. Should be replaced...

Monday, 10 July 2017

The Tinderbox Tower (2): Disaster in the Sky

In 2001, I was appointed Scientific Director of the training school in civil protection run by the Regional Council of Lombardy. I was based in Milan, Italy. One of my first tasks was to train 28 senior emergency managers. As some of them worked in the 32-storey Pirelli Building, and as it was not long after the World Trade Center disaster in New York, I set them the task of devising an evacuation plan for the Milanese skyscraper. Shortly after I had given them my analysis of the events in New York during the terrorist attacks known as "9-11", a light aircraft was deliberately flown into the Pirelli Building, killing one occupant and the pilot and starting a fire. I was relieved to find that the evacuation plan worked: it was based on careful investigation, clear thinking and rigorous drills for the users of the building.

Back in the UK, a couple of years ago I attended a trade fair for emergency service equipment. The organisers presented me with a copy of the 1974 disaster movie The Towering Inferno. I thought at the time that this was a rather crass gesture, but as I was making a study of disasters and popular culture, I sat down and watched the film. It is a rather silly movie: heroes, villains, bungled escapes, all the usual ingredients of the genre. At the end, there is a conversation between a fire chief and the architect who designed the building that burnt down. The latter, who evidently uses his head merely as a place to park his architect's hat, has never thought about the flammability of the building and the fire chief has to convince him that now, after the disaster, is the time to give such matters his attention.

Here we are, 115 years after New York's Flatiron Building was erected as one of the very first skyscrapers, with more than a century of accumulated knowledge about how to improve the performance of tall buildings under duress. And yet in London, 43 years on, it is bizarre to see The Towering Inferno become a sort of self-fulfilling prophecy. The Grenfell Tower, or "tinderbox tower" as one resident called it, was at one point a sheet of flame that stretched from the ground to the 24th floor. How is that possible, given all that is known about the performance of tall buildings and inhibiting the spread of fire? And what about the electrical surges of 2013 and the naked gas pipes that so frightened the residents? All too often there is a yawning gap between what we know about safety and what we do about it.

In the 1980s and 1990s the British sociologist Anthony Giddens and his German counterpart Ulrich Beck defined the "risk society", a concept that became very popular among students of modernity. Risk is a function of our preoccupation with safety, and the risk society is our way of fighting against the threats and uncertainties caused by the increasing pace of modernisation, or so they argued. For me, this view of modernity is too technocratic. I believe we live in a vulnerability society. It is not the desire to quantify risk and put it in conceptual pigeon-holes that defines modernity, but the political decision making that condemns groups and whole classes of people to be especially vulnerable to disasters and other adversities. We share the risks unequally. One thing that is particularly striking about the aftermath of Grenfell Tower is that there is little sign of redress. So far, those who were marginal and at risk before the disaster continue to be marginal after it.

Oh, and I forgot: in Britain we don't have disasters, we have 'major incidents', as the official language terms them. But, viewed without the cultural belittling, it was indeed a disaster, and one that reverberates around the nation by revealing a whole landscape of forgotten, ignored vulnerability, so much of which is morally and ethically unacceptable.

As a precaution, Transport for London closed the underground lines that run through Latimer Road station, which is almost in the shadow of the blackened wreck of Grenfell Tower. One suspects that the transport planners had in mind Exercise Unified Response, the £800,000, six-nation disaster drill which was conducted in London in February and March 2016. In this, emergency responders dealt with a simulated emergency situation in which a tall building had collapsed onto a tube train. The wreckage and the rubble were packed into a redundant power station in Dartford. The trains were gone, but how many of the residents who live closer still to the tower were evacuated? What did the housing managers have in their minds when they let the residents stay? The consequences of radically different attitudes to risk could be encountered in but a few hundred square metres.

Walking to the site shortly after the fire from the nearest tube station, Holland Park, one passed through a landscape of Porsches and Bentleys, gracious Georgian houses and well-swept streets. Suddenly, around the corner there was a gloomy football pitch squashed under Westway, rusty fencing, concrete gardens, barracks-like social housing. And then came all the signs of agony, anger, distrust. In some respects, it reminded me of Miraflores in Lima, Peru, where the houses of the rich are carefully defended against the tide of 'informal' housing of those who are not fortunate enough to live in comfort and luxury.

Emergency response systems are usually mosaics. Like the curate's egg they tend to be good in part. There are social, political and economic explanations of why this is so, but one of the main reasons is that they depend on actual people: a good leader, a good organiser, committed supporters, all of them can transform a failing system into a functional one. In October 1999, the Royal Borough of Kensington and Chelsea had the Ladbroke Grove train crash in its back garden. This 'major incident' (31 dead, 520 injured) was managed competently. On the other hand, it was not exactly an indigenous matter, as the survivors quickly moved on. Eighteen years later, it is breathtakingly awful to see the richest borough in the land make so many classic mistakes as it struggles to respond to disaster on its doorstep. London's emergency arrangements are not bad; indeed, they are excellent in some crucial respects. But a failing of the British system is the relatively loose connection that exists between planning for an emergency, on the one hand, and managing and responding to it, on the other. So much can "fall between the cracks".

My colleagues and I have developed a fascination with the 'transitional phase' of disaster aftermaths, the period that is supposed to connect the phase of intense activity in the initial response with the more measured phase of reconstruction, in which long-term solutions are devised to the problems caused by disaster. We have been studying this in a variety of settings: the Philippines after Typhoon Haiyan (2013), Mexico after landslides and floods, Japan after the tsunami and nuclear release of 2011. We find that in some cases--the worst ones--it isn't a transition at all, or rather it is a transition from nothing much to nothing else. Absence of clear strategies breeds lack of trust in authority, loss of confidence and a fear of the future that, sadly, is often well-founded. Is this what we see in north Kensington, a transition from precarious marginalisation to more of the same?

In the great practical field of endeavour that is now known as 'disaster risk reduction', there is a sort of reverence for the idea of 'community', as if it were a universal palliative to a whole catalogue of ills. In reality, communities can be therapeutic, but they can just as easily be vehicles for division, dissent, distrust and disassociation. Its borough may be a divisive place, but North Kensington in adversity is a good kind of community: I suspect it is very much the best of its kind in Kensington and Chelsea, if we also give recognition to the Chelsea pensioners. At this point, "working with the community" comes into play. Everything we know about the local scale in disasters suggests that imposing solutions onto people is a bad idea. Doing things for people is by no means as helpful as doing things with them, and for many issues it is better to support them while they solve the problems themselves. The ideal situation is one in which those whose decisions and actions caused the Grenfell Tower fire 'own' the problem, while those who survived it 'own' the solution. And may that solution be a catalyst for rectifying the entire national 'landscape of negligence' that is being slowly and steadily revealed as the weakness of fire regulations becomes apparent. If the community is to triumph, those who provide the assistance, run the enquiries, put things right, and speak for the community need to remember the motto of E.M. Forster: 'only connect'.

Sunday, 18 June 2017

The Tinder-Box Tower: Fire and the Neo-Liberal Model of Disasters

During the night of Wednesday 14th June 2017 a fire developed in Grenfell Tower, a 24-storey residential block in North Kensington, London. The building was quickly engulfed by the flames and within 24 hours it was a burnt-out wreck. At the time of writing, the death toll has not been established, but it is probably between 70 and 90 people. Many of them were trapped on upper floors by the fast rising flames. The emergency response was massive and extremely rapid. This was only the third 40-engine fire response since the 1960s. It was the first time for many years that more than 100 firemen were committed to a highly dangerous environment. Despite their professionalism and heroism, at that point no fire-fighting operation could have stopped tragedy from unfolding.

It is a principle of the construction of tall buildings that, in the event of damage to one floor, progressive collapse should not occur. This principle was consolidated after a gas explosion in 1968 at Ronan Point, a 22-storey residential block in east London, led to the domino-like collapse of an entire corner of the building. In that case, the lesson was taken on board and governed subsequent practice. Another well-known principle is that fire should not be able to leap from floor to floor and thus climb the building. The speed and ferocity with which it did so at Grenfell Tower were quite extraordinary. Here, the lesson of past events was not transformed into safer practice. The building lacked a sprinkler system, whose installation would have cost less than 2.5 per cent of the renovations that took place over the period 2014-2016. Instead, for insulation purposes, the building was clad with panels that were not fire-resistant, and the question now arises as to whether the building codes were observed, or whether they were themselves at fault.

A further element of the Grenfell Tower disaster is that it exposed the gap in living conditions between wealthy and poor residents. The location of the tower, the Royal Borough of Kensington and Chelsea, is the wealthiest residential district in the United Kingdom, with the highest property prices. The northern part of the borough presents an entirely different picture. In fact, it is one of the most deprived enclaves of London and the United Kingdom. It is evident that the resources of the borough have not been channelled into making living conditions safe for the residents. The contrast between extreme wealth and relative poverty in the same local district is a salutary reminder of socio-economic conditions in 2017 and an illustration of the consequences of almost half a century of the divergence of living standards between rich and poor.

In the days that followed the fire there was much debate about the scope and quality of the response, about the division and assumption of responsibilities, and about the fact that the Grenfell Residents' Association had spent four years warning the building's owner and operator about fire risks.

The neo-liberal model of disasters suggests that they are often used as a means of consolidating power and exploiting the poor and needy (Klein 2008, Loewenstein 2015). Here is a case which illustrates neoliberalism in its other guise, in which the leaders and arbiters of society care very little or not at all about the conditions of risk under which the poor live. The Royal Borough of Kensington and Chelsea is the richest local authority in Britain: why could it not support its own residents, and why did it condemn them to live in patently dangerous conditions? The answers to these questions lie in political priorities and how they are formulated, marketed and supported.


Klein, N. 2008. The Shock Doctrine: The Rise of Disaster Capitalism. Penguin, Harmondsworth, 576 pp.

Loewenstein, A. 2015. Disaster Capitalism: Making a Killing Out of Catastrophe. Verso Books, London, 376 pp.

Sunday, 4 June 2017

Disaster Risk Reduction: A Dose of Reality

The Times, 1922

I am now going to discuss things that I have not been invited to say, that you don't particularly want to hear, that are not part of my brief, and that will not seem relevant to your interests. I would like to share ideas that are not even half-formed, about problems that I can't prove are important, for which I have absolutely no solution.

Francis Fukuyama wrote about "the end of history", a catchy but facile idea. The kindest criticism is that he must have wanted a metaphor for something else, whatever it might have been. Human social development is certainly not ending, but where is it going? What we are actually seeing is the end of historiography, and no doubt it is a temporary end. Collectively, we are confronted with changes that are so profound they have only tenuous analogues in history, so complex that any simple projection of historical trends into the future tells us next to nothing. As a result, there is a widespread reluctance to look at history, to compare our predicament with that of people in the past and to try to extract wisdom from seeing "how it turned out". That is a pity, as history still has very much to teach us. But unfortunately history seems to change as much as the modern world does, for it has always been a matter of selective interpretation of half-known facts. Sadly, for many people, history has become the nostalgic theme park of illusion rather than a source of clear inspiration. More darkly, history is the justification of ideology, and that is a powerful stimulus to analyse it with a great deal of selectivity.

For the past two years I have had an increasing feeling that we are all on the wrong track. The journal that I founded and edit, the International Journal of Disaster Risk Reduction, will this year receive more than 1,000 manuscript submissions. I have to look at all of them, and doing so strengthens my sensation that we are missing the point. The world changes far faster than science does. In DRR we still use theory compiled in the period 1945-1970, and the latter year is the one in which increasing global economic inequality started to invalidate it.

Orthodoxy tells us that hazards act upon vulnerability to create risk and disaster. Abating hazards and reducing vulnerability makes people and their communities happier, healthier and safer. But it isn't that simple. Constructing an alternative interpretation is going to be monumentally challenging.

One ingredient is the 'manufacture of consent', as Herman and Chomsky described it (Herman and Chomsky 2008). When they wrote their critique, they only had to deal with what we now call traditional media, and with a larger, less powerful world-wide oligarchy. Since then, there has been a massive concentration of wealth into fewer hands and a constantly widening differential between the super-rich and the poor and relatively poor (Oxfam 2017). The years since the period 1970-1973 have seen a sustained transfer of wealth from the poor who produce it to the rich who command and consume it. Much of that has been achieved by 'democratic' means, in which the choices presented to voters ask them to confirm processes that act to their own disadvantage. Democracy has become a plebiscite of the unpalatable.

Big data and the apparently uncontrolled networks of social media, coupled with changes in people's predilections for how they obtain and interpret information, have opened up new vistas. In the same way that people's shopping habits can be tracked and exploited, so can their political leanings, policy knowledge and voting preferences. To do so successfully requires massive resources, and massive resources are being devoted to this process (Davies 2017, Cadwalladr 2017, Grassegger and Krogerus 2017). Conspiracy theories are almost always a waste of time, but perhaps we are arriving at a point in which they may be self-fulfilling hypotheses rather than mere fantasy.

In the industrial revolution, automation created a disenfranchised underclass and then transformed it (Sale 1995). That is happening again, but some have argued that the opportunities for positive change are much fewer this time. There have been predictions that, as a consequence, social differentiation may widen until it forms an unbridgeable abyss. The more extreme among the futurologists have invoked a Wellsian dystopia (Wells 1895, 1910). Anyone who reads not only H.G. Wells's longer novels but also his short stories will know that he was an extremely acute observer of the effects of social class on people's development (see, for example, the tragic outcome of 'A Slip Under the Microscope', 1896).

In my lifetime - so far! - world population has almost tripled. Technological developments have occurred that would have been utterly inconceivable in 1953 when I first opened my eyes. Technological advances have spurred cultural changes of momentous importance. The pace of change continues to accelerate.

The dark side of humanity continues to develop. The arms trade becomes ever more lucrative and fuels ever deadlier conflicts in more and more places (Akerman and Larsson 2014, Kassimeris and Buckley 2016). Fuelled by the major producers of materiel (the USA, Russia, China, Germany and France, aided by Britain, Italy, Sweden and Israel), the arms trade has suffered no recession and seen no barriers. If trade can globalise, so can illicit trade (in people, drugs, weapons and illegal profits), and so can terrorism. Half of world trade goes through the world's 78 tax havens (ActionAid 2013); eight men own as much as 3.6 billion people (Oxfam 2017); hidden global wealth may amount to three and a half times global GDP (Credit Suisse 2016).

Reported in this manner, these observations have the disadvantage that they are not tied to any particular system of explanation or agenda for positive change. For disaster specialists, they suggest one powerful problem: much of the progress based on the traditional models that link simple actions to positive outcomes is likely to be illusory. If it is not illusory, it is nonetheless at risk of being summarily reversed.

One problem with constructing an alternative system is that any schema for interpreting disasters must be built upon a model of society, but society is changing too fast for the basic model to be stable. The goalposts seem to be moving around the pitch faster than the ball.

The first part of the solution to this problem is to acquire a healthy dose of realism. The actual environment in which we conduct our studies is no longer that of the orthodox, traditional models, and it is capable of massive, rapid change. Secondly, in democracies, consent is increasingly managed or manufactured, by ever more subtle and insidious means. Before we analyse or attempt to influence the agenda in favour of disaster risk reduction (DRR), we should be asking who controls it. Thirdly, the roots of almost all DRR are political. Political decisions determine what will happen and what will not. Human rights violations, corruption, undue influence, the exercise of arbitrary power, internment, forced migration, violence against citizens and failure to protect them are all potential negative consequences of the modern human predicament, and they stem from politics. It follows that scientific logic will be subordinate to political logic, often known as expediency, and sometimes as opportunism. Professor Terry Cannon notes that, in climate change, expenditure on things that worsen the problem (such as subsidies for fossil fuel extraction, refinement and use) is three orders of magnitude higher than expenditure on combating it. This, he says, represents a 'cure to damage ratio' of 1:1,000. For disasters as a whole, it seems to be about 1:46 (although I am blessed if I can locate the source of that rather tenuous assertion).

In our work we need to heed a caveat emptor. Failure to recognise the world as it is, and the real constraints on progress, will diminish or annul the practical value of our work. However, once we understand the problem we can start to tackle it. In the end, even massive obstacles are no match for human ingenuity.


ActionAid 2013. How Tax Havens Plunder the Poor. ActionAid, London, 20 pp.

Akerman, A. and A. Larsson 2014. The global arms trade network 1950-2007. Journal of Comparative Economics 42(3): 535-551.

Cadwalladr, C. 2017. The great British Brexit robbery: how our democracy was hijacked. The Observer, London, 20th May 2017.

Credit Suisse 2016. Global Wealth Report 2016. Credit Suisse Research Institute, Zürich, 63 pp.

Davies, W. 2017. How statistics lost their power – and why we should fear what comes next. The Guardian, London, Thursday 19 January 2017.

Grassegger, H. and M. Krogerus 2017. The data that turned the world upside down. Motherboard, Vice Media, 28th January 2017.

Herman, E.S. and N. Chomsky 2008. Manufacturing Consent: The Political Economy of the Mass Media (revised edition). Bodley Head, London, 408 pp.

Kassimeris, G. and J. Buckley (eds) 2016. The Ashgate Research Companion to Modern Warfare. Routledge, London, 486 pp.

Oxfam 2017. An Economy for the 99%. Oxfam Briefing Paper, January 2017. Oxfam, Oxford, 48 pp.

Sale, K. 1995. Rebels Against the Future: the Luddites and Their War on the Industrial Revolution: Lessons for the Computer Age. Addison-Wesley, Boston, Mass., 320 pp.

Wells, H.G. 1895. The Time Machine. Penguin Classics (2005). Penguin, Harmondsworth, UK, 144 pp.

Wells, H.G. 1910. The Sleeper Awakes. Penguin Classics (2005). Penguin, Harmondsworth, UK, 288 pp.

Thursday, 27 April 2017

The Institute of Civil Protection and Emergency Management

What is the difference between a professional society and a learned society? On the face of it, the answer is simple: a professional society is an association of people in a profession who wish to work together in some form, to have a common identity and voice, and to promote their profession. A learned society is a custodian of knowledge that acts on behalf of those members of a profession who belong to it. It immediately becomes clear that these definitions are inadequate, and that improving them is likely to cause them to converge. Yet although some learned societies are professional associations, and some professional societies act as if they were learned societies, there is a difference.

Let us consider some examples. The UK's Royal Geographical Society (RGS) is a quintessential learned society. It exists to promote geography in all relevant guises and places. In the 1970s, the RGS worked alongside the Institute of British Geographers (IBG), which had been founded by university academics to promote interaction among those who teach and research geography (school teaching being the preserve of another professional association, the Geographical Association). In 1995 the RGS merged with the IBG. This did not preclude non-academics from becoming members and fellows of the resulting organisation (styled the 'RGS with IBG'), and both bodies were strengthened by joining forces. Each retained its own journals, which had acquired different characters (nevertheless, it is hard to say that the IBG journals are more 'academic' than the RGS's flagship Geographical Journal).

The Geological Society of London (GSL) began in 1807 as a fairly loose association of gentleman-enthusiasts for geology. It grew into a major authority on geological matters and in 1991 absorbed the Institution of Geologists, the apposite professional association. The GSL now provides the chartered geologist route to accreditation. Organisations such as the British Psychological Society (with 60,000 members) provide a focal point for knowledge, influence and identity in a given field and thus act more as professional associations.

From this we can see that these organisations seldom seem to want to maintain a distinction between learned society and professional association. A strong mission to further their discipline or profession means that they will offer any services they think are appropriate and in demand by their membership.

The Institute of Civil Protection and Emergency Management (ICPEM) was born in 1938 as the Institute of Civil Defence (ICD). This is a field that, in its modern form, was effectively born in 1937, the year before the ICD was founded, when the first concerted aerial bombardment of the modern era occurred at Guernica in northern Spain and there were semi-improvised efforts to protect the non-combatant population. In point of fact, the idea of absorbing non-combatants into safe zones had gained traction in the 1930s, but it did not produce an international civil defence body until 1958, when the International Civil Defence Organisation was founded in Geneva. The ICD was effectively an association for the nascent profession of civil defence operative. Yet when it later became the Institute of Civil Defence and Disaster Studies (ICDDS) it adopted as its motto 'excellence in disaster research' - the badge of a learned society. In 2009 the ICDDS became ICPEM, but it retained the motto.

What should a learned society do to distinguish itself from a professional association? Perhaps before answering that one should ask, why would it want to distinguish itself in that way? The premier learned association is the Royal Society [of London for Improving Natural Knowledge - to give it its full title], which was founded in 1660. There is no doubt that the Royal Society is more prestigious and therefore carries more weight than other bodies that unite scientists, academics or people from the professions. Hence, there may be something of a desire to emulate it. Other than that, there is little in the way of a reason. A learned society must care for knowledge, a professional society for people, but there is no reason why it should not also be the other way around. Societies that are keen to thrive and grow are usually also keen to cover as many bases as possible within their fields, and to provide as comprehensive a set of services as possible.

As a learned society, ICPEM can trade on the fact that it is the oldest such body in this field in the world, older even than the International Civil Defence Organisation. It has a more modern name, one that is entirely appropriate to today's emphasis on grass-roots emergency planning and management, and on collaboration rather than authoritarian chains of command. The best way that ICPEM can fulfil its mission is to provide fellowship (in the broadest sense of the term) for those whose professional or academic interest is civil protection or emergency management, and to help preserve, share and encourage the knowledge that they need to acquire or can offer. The greatest benefit of belonging to ICPEM is to be part of a common process in which knowledge in its many forms is generated, developed and shared.

Tuesday, 18 April 2017

Publish, Perish and be Damned

A journal that is good enough to eat - Launching the IJDRR with a Swiss chocolate version.

We live in an age in which there are many inverse relationships between quantity and quality. One of these is in academic publishing. The journal I edit and helped found (the International Journal of Disaster Risk Reduction) rejects about 70 per cent of papers submitted to it. Last year (2016) there were 750 submissions. In 2017 it is estimated that at least 1,040 papers will be submitted. Perhaps 700 of them will not appear in print because they are not of an acceptable quality.

I have been a major journal editor continuously since April 1985. I long ago concluded that at least three fifths of academic publishing is motivated by personnel considerations. Scholars publish in order to get a job, retain a job or obtain promotion. Relatively few are lucky enough to be able to publish for the sheer fun of it, or for reasons of pure creativity.

When I started as an editor, the motivation for publishing rested largely with the author. Now it is almost exclusively a function of institutional impetus. Researchers publish because they are forced to: it is a requirement of their employment. In many cases, the requirement is tied to the usual absurd bibliometric parameters such as impact factors, 'upper quartiles' and so on. The least important consideration seems to be what is actually in the articles - i.e. the scholarly content.

Academics may be taught to carry out research, but they are not taught to publish, or at least not properly taught. It is amazing how little authors seem to understand about the process of seeing their work into print (digital print, as things now are). The great "advantage" of digital publishing is that with very small increases in costs one can now have colossal amounts of uninteresting, uninnovative research at one's fingertips, with search engines and portals that make it a piece of cake to ignore all this work.

In much academic writing there is a great lack of professionalism. Very many authors submit articles for peer review with badly constructed, appallingly written abstracts, glaring grammatical errors in the title, and so on. Then they wonder why no one wants to do the peer reviewing! Most authors ignore a journal's style requirements, and in formatting terms few produce well-constructed manuscripts. I suppose an author might ask why this matters, given that the message and content of the article are the real matters at hand. Well, it does matter: editors and referees cannot entirely free themselves from forming opinions based on the appearance of an article.

A good article is carefully laid out, double spaced, with page numbers and non-intrusive line numbers. The references are properly cited and listed, and care is taken to make sure that they are complete and in the proper format. Tables are well laid out, figures are fully legible, and the language in which the article is written is good enough for an international audience. This means that it is free from slang, syntactical contortions and grammatical errors (see below).

Sending in a well-formatted article (or book manuscript, indeed) betokens a sensitivity to the needs of the reader. Yet we who habitually read academic manuscripts have to cope with all sorts of horrors. One of those that upsets me most is the manuscript that, thanks to advanced word-processing capabilities, is formatted as a facsimile of a published article, complete with the journal's masthead banner and two-column pages. Evidently the author thinks that this will increase the paper's chances of being accepted. Not on your life! If the paper is eventually accepted, I pity the copy editor and typesetter who have to unravel all the formatting so that they can start again with the publisher's software.

The other problem, and an increasingly common one, is that in disaster risk reduction too many manuscripts are being submitted by people who do not have enough background and learning in this field. One only has to read the first quarter or third of a paper to find this out, when it becomes apparent that (a) the authors have not considered the principal literature on their topic, (b) they are unaware that much of their study has been done before, in some cases decades ago, and (c) the literature review, which should be the basis of or starting point for their analysis, is nothing of the sort. It all contributes to the tendency to 'reinvent the wheel', which is such an 'industrial disease' of disaster studies.

Writing is a process of communicating. Like all communication, it is useless to talk to an interlocutor in a language that he or she does not understand. Comparatively few academic authors seem to think about the effects of communication from the point of view of the reader, listener or viewer. It is not a one-way, one-sided process.

Put together, all of this adds up to a 'per capita inspiration gap' - more to read, but less to learn from it. Let us do what we can to reverse the trend, bring back the creativity, instil the professionalism, and stick two fingers up at the institutional requirements that have done so much to "dull down" science.


Typical errors that occur, and recur, in academic papers written in English (or a species of English) are the following. All of these are avoidable defects.
  • absence of articles where they are required
  • errors of common English usage
  • awkward, unbalanced and contorted syntax
  • indiscriminate use of slang
  • clash of tenses
  • misuse of tenses
  • failure to use the possessive case when it is needed
  • misuse of capitalisation (common and proper nouns)
  • clash of singular and plural in the same sentence
  • failure to recognise collective nouns
  • use of past imperfect where the past definite tense is needed
  • lop-sided clauses
  • failure to observe standard conventions
  • missing hyphens in compound adjectives
  • misuse of the present participle
  • wrong wording to introduce lists
  • wrong choice of words
  • synthetic words with no real meaning
  • misuse of the possessive case
  • misuse of idiom.