Thursday, 9 April 2020

More on the Covid-19 Academic Gold Rush


A few days ago, I was accused of "putting the careers of trainees and junior faculty at risk" because a member of my editorial team was slow to complete the review process on a paper that had been submitted to the journal I manage. This reminded me that perhaps 70 per cent of academic publishing is for personnel reasons (to get a job, keep a job, obtain a salary raise, or achieve promotion). I cleave to the old-fashioned view that publishing should take place to further the sharing of good ideas. Nevertheless, I cannot ignore the breakneck speed with which papers are propelled into [digital] print nowadays.

In this context, I salute the thoughtful work of Christopher Gomez, Deirdre Hart and JC Gaillard (Gomez and Hart 2013, Gaillard and Gomez 2015) on the phenomenon of the "disaster gold rush". When a major disaster occurs there is an almost reckless desire to be first in print. This also exists outside the academic field. Indeed, someone ought to do a study of the "book of the disaster" and see who gets the award for the earliest "instant book" to commemorate the damage, destruction and casualties. Gomez and his colleagues drew attention to the worst traits of the "gold rush", namely potential abandonment of ethics and rationality in pursuit of a first-past-the-post research gain.

The Great East Japan Earthquake, Tsunami and Nuclear Release (GEJET), as it has come to be known, produced at least 2,000 papers, and a variety of books, during the first three years of its aftermath. This is probably a substantial underestimate. Thereafter, more and more continued to appear. Currently, as an editor I am dealing, nine years after the event, with two or three new submissions on this disaster. However, the GEJET publication surge is beginning to pale into insignificance next to the Covid-19 gold rush. We confront a new phenomenon: intra-disaster research publication.

Between 1st January and 3rd April 2020, 6,659 papers on Covid-19 were published. Some 83% were in peer-reviewed journals and 17% (1,135) came out as unreviewed pre-prints. According to a leading researcher (Erica Bickerton), "keeping on top of which preprints ... are relevant and have robust methodologies is one of the key challenges emerging from the scientific response to Covid-19" (Baker 2020). It is of note that many of the articles were in fields other than medicine, genetics and epidemiology, such as sociology, psychology, jurisprudence and international relations. In short, papers on Covid-19 are coming out at the rate of 67 a day. It is highly probable that the flow will amply exceed 100 a day once research really gets into gear. It is predicted that, in the short term, the proportion of pre-prints will rise.

Much of the research that appears will be repetitive, short on insight, premature and lacking in rigour and scientific testability. Hence, here are some criteria against which Covid-19 research should be judged before it is presented to a potential readership (a rough screening sketch follows these criteria).

Rigour. Does the research conform to the standard tenets of the scientific method: reproducibility, verification, completeness?

Novelty. Will the paper add anything to the debate on Covid-19, or our knowledge of the disaster, that is not already known and present in some of the many other articles that are available?

Utility. Will anyone read the paper? Will they benefit from it in any way? How can a potential readership be convinced to read the paper rather than the other 66 that came out on the same day?

Transformation. Is there any way of measuring or monitoring the take-up of ideas that come from this paper?
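
As a purely illustrative aid, the sketch below shows how an editor might record these four criteria for a single submission. The 0-10 scale, the field names and the pass threshold are my own assumptions, not an established editorial procedure.

```python
# A rough sketch of recording the four criteria for a single submission.
# The 0-10 scale and the pass threshold are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class SubmissionScreen:
    rigour: int          # reproducibility, verification, completeness
    novelty: int         # adds something not already in the literature
    utility: int         # will anyone read it and benefit from it?
    transformation: int  # can the take-up of its ideas be monitored?

    def worth_sending_to_review(self, threshold: int = 6) -> bool:
        """Crude screen: every criterion must clear the threshold."""
        return min(self.rigour, self.novelty,
                   self.utility, self.transformation) >= threshold


paper = SubmissionScreen(rigour=7, novelty=4, utility=6, transformation=5)
print(paper.worth_sending_to_review())  # False: novelty and transformation fall short
```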

There is still much value in papers that have no "pathway to impact". Moreover, it may be that the real impact of a piece of research is not being measured, because to do so is difficult or impossible. In that case, there needs to be another kind of justification for publishing the paper.

As the university world undergoes a radical metamorphosis and transfers its activity to remote working and distance learning, we are all asked to "do more" to achieve this seismic shift. One of the greatest failings of the modern university is its utter lack of appreciation that time is not an elastic commodity. If we are asked to do more, it must be at the expense of some other activity. Paradoxically, "doing more" reduces our productivity, because it forces us to do less important–but more urgent–tasks in place of those that produce a more enduring, positive legacy. The compensatory mechanism involves providing evidence of productivity by going hell-bent for the "quick fix". The greatest casualty is the time to read and bring oneself up to date with the latest developments.

Major disasters usually lead to a substantial increase in information flow. Covid-19 may be different because information may well become available in quantities an order of magnitude greater than ever before.

It is obvious that much of what is written will be read by practically no one beyond the authors and perhaps a couple of referees. What use is it then? One should bear in mind that in older neglected literature there may be nuggets of gold that escaped the rush, if we only care to go back and look for them. But beyond that, the only valid survival technique is to try, perhaps vainly, to learn how to be ultra-selective on what one does read.

References

Baker, S. 2020. Huge Covid-19 output prompting ‘sea change’ in access to research. Times Higher Education, 9 April 2020.

Gomez, C. and D.E. Hart 2013. Disaster gold rushes, sophisms and academic neocolonialism: comments on ‘Earthquake disasters and resilience in the global North’. Geographical Journal 179(3): 272-277.

Gaillard, J-C. and C. Gomez 2015. Post-disaster research: is there gold worth the rush? Jàmbá: Journal of Disaster Risk Studies 7(1): 1-6.

Friday, 27 March 2020

Covid-19 and the Disaster Research Gold Rush


As I write, the Covid-19 pandemic is ramping up in many countries. So is the response by academic authors. In 2015 Gaillard and Gomez published an interesting paper on the "disaster research gold rush". It was inspired by some thoughts expressed in 1967 by eminent sociologists of disaster on problems associated with researching transient events (Dynes et al. 1967). In the life-cycle of a disaster, when is it appropriate to do such research?

Disasters give rise to many imperatives. They also generate 'perishable' data that, if they are not collected, will disappear without trace. By common consent, disaster researchers rarely go to events in the early stage of the crisis. To interrupt vital life-saving efforts with social surveys or demands for data would be unconscionable. However, it is a different matter if the researcher can work without visiting the site, putting a foot in the door of the emergency room or stumbling across the path of rescuers.

I am the editor-in-chief of a large international academic journal. The Covid-19 gold rush has already begun. The trickle of papers threatens to turn into a raging torrent, and the disaster is not yet half way through the crisis phase. This points to a conflict. On the positive side, academics wish to throw light on the problems caused by Covid-19, suggest solutions and launch valuable new initiatives. They also wish to capture experience and preserve it as evidence on which to base future policies and plans. On the negative side, there seems to be an urge to be the first in the field with a paper, as if this were a race to be won.

Authors can write in haste and repent at their leisure: editors can rue the day. Much of what is written will need to be reconsidered in the light of the outcome of the pandemic, which is months away, and the post-event debate that follows it. I admit that this is equally true of the present blog, but my criticism is not aimed at those who express an opinion. Debate is healthy, even when there is a need for national and international solidarity. However, any analysis based on half the story is likely to be suspect.

A positive side of the urge to publish is the desire to contribute to the debate before it lapses because attention has been diverted to other issues. However, there is a prevailing question: how soon in the sequence of a disaster is it appropriate to take stock? This depends on how easily earlier conclusions can be invalidated by the progress of events. The question, then, is to what extent this progress is predictable, or likely to create exigencies that cannot be included in the present analysis.

The Covid-19 pandemic is distinguished by high levels of uncertainty in many of the tenets that anchor the scenario: infection rates, geographical spread, case-fatality rates, government policies and their impacts, public discipline or indiscipline in the face of emergency measures, and repercussions on the economy and people's livelihoods. These factors militate against an over-hasty academic response. So when you read academic papers written in the thick of Covid-19, caveat lector!

References

Dynes, R.R., J.E. Haas and E.L. Quarantelli 1967. Administrative, methodological, and theoretical problems of disaster research. Indian Sociological Bulletin 4: 215-227.

Gaillard, J-C. and C. Gomez 2015. Post-disaster research: is there gold worth the rush? Jàmbá: Journal of Disaster Risk Studies 7(1), Article no. 120, 1-6.

Thursday, 26 March 2020

Interpreting the Pandemic for Decision Making and Action



As the Covid-19 pandemic progresses, causing distributed crises in one country after another, it is like watching all I have taught about for the last four decades flash past in a sort of speeded-up film.

In disasters of this kind, three attitudes are common. Normalcy bias, the tendency to accept the most reassuring interpretation of a situation, tells people that it will not be as bad as it seems. The syndrome of personal invulnerability tells them that the worst can only happen to other people. This leads to cognitive dissonance, simultaneous belief in two incompatible ideas. All of these are misleading states of mind, perhaps dangerously so, and especially when found in politicians and decision makers. Other states of mind combine with the three syndromes to worsen the environment of reaction to the crisis.

For a century, the Hollywood model of disaster has been widely disseminated in disaster movies. The underlying precept of this is that beneath the civilised veneer of society pervasive anarchy lurks, which in cases of extreme disruption would bring out the "savage" in ordinary people, a tendency to safeguard their own interests by all conceivable means, including extreme violence. The Hollywood model relies on heroes and villains. In disaster movies the villains either have rapacious ulterior motives connected with power and greed, or they are immune to common sense. The hero (almost always a man, not a woman; she is invariably his loyal supporter) is a young, fit, active, misunderstood visionary, the only person who can see the truth and is prepared to fight for it. Against all odds, he wins. What nonsense! In reality, fighting disasters is a collective effort that places complete reliance on developing enough consensus for measures to work. It demands meticulous prior planning and the development of detailed scenarios before the phenomenon strikes. While anti-social behaviour does not disappear, especially where it is ingrained in society, there is often an upwelling of social participation, which sociologists know as the therapeutic community.

For Covid-19 we had the scenarios. I have taught them at least once a year for more than a decade. My primary message was that a pandemic is as much a socio-economic and behavioural problem as a medical and epidemiological one. There was nothing visionary about my approach. It was based on material that has been well publicised and freely available since the second half of the first decade of the century. Normalcy bias, fiscal stringency and the ideology of profit conspired to stop preparations from being effective. Dire warnings were given by emergency planners and epidemiologists. They were ignored or underrated, largely because dire warnings tend to be discounted. They do not encourage decision makers to assume responsibility.

What is now fascinating about the consequences of this is that improvisation is being used frantically to cover the yawning gaps in preparedness. Improvisation cannot be totally avoided in disaster, but failure to prepare leads to avoidable deaths and suffering. Hospital beds, medical staff, personal protective equipment, ventilators, vaccines, antivirals, palliative medicines, economic subsidies, payment for holidays, substitute wages, organised assistance, the policing of social distancing, the reorganisation of public transport and basic services, emergency communication: all of these are being improvised. It is far too much to improvise at once, and hence the task is simply overwhelming.

For many people, the Hollywood interpretation, familiar as it is, was the only model they had recourse to in order to understand the workings of this disaster. It rapidly proved unserviceable and has been abandoned by many, thank goodness. However, it has been followed by catastrophism. We might see this interpretation as one of the "orderly breakdown" of society, in contrast to the anarchic version propounded by Hollywood. Inasmuch as there is a model, it is the 1920s, a dire period that led, through mass unemployment, general strikes and failure to care for demobilised servicemen, to the Great Crash of 1929, and ultimately the Second World War. Perhaps the overriding question is "after Covid-19 has finally abated, to what extent will the world be the same as it was?" There is little sense that it will be a better, stronger, more prosperous place.

A related question concerns whether the lessons of this pandemic will be learned. The indications are, at best, mixed. There is often a tendency to assume, once a disaster has passed and recovery is well underway, that it is "all over", an episode that will not be repeated. One of the greatest lessons is the imperative need to take emergency planning and preparedness more seriously. I hope that this lesson will be heeded in full, that resources will be supplied and that major effort will be expended, but I am not going to bet on it.

In previous posts, I have suggested that disasters need to be understood with more emphasis on their context. I suggested the "egg model", in which the yolk is a set of causal factors directly related to disasters and the white is all the factors that impinge on the yolk without being direct causes of disaster. In another model, general vulnerability encapsulates specific vulnerability to disasters. We will need to look long and hard at the context of future susceptibility to disasters and chart the interaction in a world that will inevitably bear the marks of this major upheaval. Until some form of stability is reached, progress on this important task is likely to be very restricted.

Sunday, 15 March 2020

Covid-19: An Address


Once upon a time in Tuscany, when my son was very young, I lamented to a friend that all he seemed to have learned at school was how to swear and blaspheme. My friend put on a wise expression and said, very sagely, "David, you must remember that swearing and blaspheming are an ancient tradition in our countryside here." I mention this because, in AD 1244, a group of men in a bothy decided that they were swearing and blaspheming too much and each time they did it they would put a groat in a jar. At the end of the year they would open up the container and spend the proceeds on a feast. As it happened, when they came to open the jar there was so much money in it that they hadn't got the nerve to spend it on food and drink for themselves. So they founded a rescue and care society. Thus was born the Venerable Company of the Misericordia. After 776 years it is still headquartered in the same building, situated next to the cathedral in central Florence. In the small Tuscan town in which I live, the Misericordia did not arrive for a further 291 years (in 1535, precisely), but what are a few centuries between friends?

An organisation of this kind obviously has very deep roots. And there is great pride in it. Moreover, it started a process of creating associations that proliferated to the point that volunteer organisations are the backbone of the Italian civil protection system and there are 3,600 of them. They connect the world of emergencies with the general population. In the last four earthquakes there were more rescuers in the affected areas than residents, which is a remarkable achievement, especially as they are trained, organised and equipped. Most importantly, they are a full part of the system.

In times of grave crisis, we need solidarity, but where are we going to get it? The Covid-19 pandemic ought to teach us the importance of both preparedness and social participation. With regard to preparedness, emergency response consists of planned activities, standardised procedures and improvisation. Too much of the last of these is tantamount to negligence, because it shows up the lack of planning and preparedness. Social participation is about the democratisation of emergency response. Civil protection is us–all of us. It works best where people are protagonists and assume some responsibility for their own destinies. Participatory civil protection is prudent. It draws in the beneficiaries of the system as actors and protagonists, but it, too, needs to be organised in advance, with foresight.

Different countries have different attitudes to emergency preparedness and the protection of the general public. These stem from their administrative and legal systems and cultures, and from the attitude of the holders of power to direct democracy. Cyclone Nargis devastated Myanmar and killed 136,000 of its citizens, in part because they had no role in creating resilience against such an event.

In Britain, there is a culture of "leave it to the experts", or perhaps, given the anti-intellectual wind blowing through the country, disdain for experts (a paradox, indeed). Since the stand-on-your-own-two-feet privations of the Second World War, there has been a gradual and sustained retreat from personal responsibility in favour of leaving the work to officialdom–and blaming officialdom when there are shortcomings in its response. There has also been the ideological assault of Thatcherism, a retreat from social provision and a gradual abandonment of the idea of participatory democracy. "There is no such thing as society," she said, and did her best to make it a self-fulfilling prophecy. At the same time, control of the country's assets became ever more centralised.

Britain does not lack a social conscience, or sensitivity to social issues. Many people labour ceaselessly to provide what the state has increasingly turned away from. When there is a major crisis, the response is generally to revive what sociologists call the 'therapeutic community'. But it has been weakened. Britain follows, not the continental European model of solidarity, but the American model of individualism, and this is particularly true in civil protection. It is a pity, as the United States have lost the lead in this field–if they ever had it. The coup de grâce was the response to Hurricane Katrina in August 2005. It was a model of inefficiency and failure.

Commentators and officials used to say that "Britain does not have disasters". Instead, we have "major incidents". Why we should be so reticent about calling them disasters is a mystery to me, for that is what they are. Covid-19 is rather more than a "major incident". It is a gargantuan civil contingency that will leave its mark for decades to come–among those who survive it.

There was a period in the decade of the 2000s and early 2010s in which pandemics were the focus of emergency planners' attention. They were then supplanted by a renewed emphasis on counter-terrorism and that in turn was overlain by frantic preparations for the supply-chain disaster of Brexit (emergency response is very much a "flavour of the month" business, just like ice-cream shops). Looking at the current British response to Covid-19, it is difficult to see quite where all that effort on pandemic emergency planning went. Perhaps the answer is that it is too difficult to plan within a system that has been decimated by eight years of Draconian budget cuts. However, it is not exactly easy to organise an alternative system in the thick of the emergency. Voluntarism comes in three types: spontaneous, organised and organised-and-incorporated. The last of these makes a real civil protection system. The first is inefficient because it is shorn of training and equipment, it is not adequately absorbed into the emergency response machine, and it has to be organised with much effort on the spur of the moment. That is what we are currently seeing in Britain. But at least we are seeing an attempt to generate solidarity, even if it is not of the optimum kind.

Emergency planning is something that needs to be done in times of quiescence and peace, but it also goes on throughout the emergency, as the provisions that were made in advance have to be adapted to the dynamic, evolving situation. Much use is made of scenarios. A scenario is not a prediction of the future. In this context, it is a systematic exploration of a range of possible future outcomes. For pandemics, we use parametric models. These are simply sets of equations in which, on the basis of often flimsy assumptions and a minimal amount of data, coefficients are set and calculations are made. It is garbage in, garbage out if the sum total of the data and the assumptions is not adequate. And the trouble with pandemics is that there is truly massive uncertainty: disease transmission rates, infectiousness, survival rates, social behaviour, the effect of emergency measures, healthcare response–they are all infinitely mutable.
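
To make the point about parametric models concrete, here is a minimal sketch of the classic SIR (susceptible-infected-recovered) compartment model, one of the simplest members of this family. The parameter values are arbitrary assumptions chosen for illustration, not estimates for Covid-19; alter them slightly and the output changes dramatically, which is precisely the garbage-in, garbage-out problem.

```python
# A minimal SIR (susceptible-infected-recovered) sketch, for illustration only.
# The parameter values below are arbitrary assumptions, not Covid-19 estimates.

def sir_model(population, initial_infected, beta, gamma, days, dt=0.1):
    """Integrate the SIR equations with a simple Euler step.

    beta  -- transmission coefficient (effective contacts per person per day)
    gamma -- recovery rate (1 / infectious period in days)
    """
    s = float(population - initial_infected)
    i = float(initial_infected)
    r = 0.0
    daily = []
    steps_per_day = int(round(1.0 / dt))
    for step in range(int(days * steps_per_day)):
        new_infections = beta * s * i / population * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        if step % steps_per_day == 0:   # record one value per day
            daily.append(i)
    return daily

if __name__ == "__main__":
    # Assumed values: R0 = beta / gamma = 2.5, infectious period of 5 days.
    curve = sir_model(population=1_000_000, initial_infected=10,
                      beta=0.5, gamma=0.2, days=180)
    peak_day, peak = max(enumerate(curve), key=lambda pair: pair[1])
    print(f"Peak of roughly {peak:,.0f} simultaneous infections around day {peak_day}")
```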

Under such circumstances, two elements of strategy are to be recommended, neither of which has been properly practised in Britain. One is not to discuss the worst case scenario unless there is a very strong practical reason for doing so. The other is to devote very substantial resources to early preparedness, as a precautionary measure and a demonstration of prudence. "Normalcy bias" makes us assume that it won't be that bad. It might be, so we must be ready to fight it.

Besides Covid-19, the shadow of another phantom stalks us. That is the Hollywood model of emergencies. Research (including my research) shows that it is remarkably resilient, not only among the general public, but also among the emergency response community and the political classes. In essence, the Hollywood model is one in which a destabilising emergency causes the breakdown of society, revealing the essentially selfish, self-seeking egotism of its members. The antidote is martial law. There have been rumours about preparations for this in Britain, a country which tends to look upon emergencies as public order problems. If we ever arrive at such a situation it will be a sad, sad indictment of British society. It would presuppose that, by and large, people have no social conscience and that the fabric of society is completely threadbare. As that seems unlikely, I think the model should be dropped like a hot potato. It is as well to remember that outbreaks of looting tend to occur strictly where there are substantial preconditions for them. They do not happen spontaneously in a completely unpredictable manner. Nor do other forms of antisocial behaviour. In this, we reap what we sow.

Disasters need to be seen in the contexts in which they occur–I nearly said the contexts in which we generate them, for they are not natural. Identity politics, scapegoating, contraction in the welfare safety net and public services, all these impinge, negatively, on the developing disaster scenario. For many people, Covid-19 is a bolt from the blue. It either is or will become a major source of anxiety and destabilisation in their lives. Yet what is happening is predictable and was predicted, comprehensively and repeatedly. The question is: who was listening? And who was willing to act? There is now a major opportunity to rethink the tenets of modern life and refound them on the basis of prudence, solidarity and participation. And let us not forget that for the inhabitants of Syria, Yemen and all those places where grinding poverty routinely overwhelms people, existential threats have been the normal way of life. The influenza pandemic of 1918-1920 killed most of its victims in developing countries that entirely lacked health-care systems. That is a sobering lesson. Amartya Sen wrote that health care is a prerequisite of development: never more than now.

Wednesday, 11 March 2020

Covid-19: Elements of a Scenario



It is now more than ten years since there was a general push to induce countries to plan for pandemics (WHO 2005). In some quarters, it had an immediate effect (e.g., US Homeland Security Council 2005, UK Government 2008), while in others it did not. In 2020, some confusion arises from the fact that much of the planning refers to influenza, whereas the virus that causes Covid-19 is a coronavirus, not strictly a 'flu virus. However, most of the planning principles are exactly the same, so this is mainly a labelling issue.

In the wake of the WHO report, Professor Ziad Abdeen of the Palestinian Health Authority said about pandemics: "My task is to tell you things you don't want to know, and ask you to spend money you haven't got on something you don't think will happen." At about the same time, in 2007, Michael Leavitt, then the US Secretary of Health and Human Services, wrote: "We don't know when a pandemic will arrive. However, two things are certain: everything that we do before will seem alarmist; and everything that we do afterwards will seem insufficient." Nevertheless, Dr Margaret Chan, then the Director of the Communicable Diseases Section of the WHO, stated "For the first time in human history, we have the ability to prepare ourselves for a pandemic before it arrives. Therefore, the world community must take action immediately." That was at a time when an influenza pandemic with devastating consequences was greatly feared. When it came, in 2009, it was less deadly than expected, but that does not negate the possibility of a highly contagious disease with a case fatality rate equivalent to that of SARS in 2003, namely 10%.

Major epidemics and pandemics (what is the difference?) are crises which have to be managed simultaneously at several levels, from international to local. They also involve very high degrees of uncertainty. The mathematical models that are used to predict the diffusion of diseases tend to be parametric and to depend on simple but debatable assumptions (e.g. Mathews et al. 2007). If the progress of the disease follows a simple linear or non-linear course, that is acceptable. However, aggregate human behaviour and many local factors, including prevention measures, can modify the prognosis.
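
As a small illustration of how much hangs on one debatable assumption, the sketch below uses the standard final-size relation for a homogeneous-mixing epidemic model to show how the predicted attack rate swings with the assumed basic reproduction number R0. The values of R0 are illustrative assumptions, not estimates for any real disease.

```python
# Sketch of how sensitive a simple homogeneous-mixing model is to one debatable
# assumption, the basic reproduction number R0. Uses the standard final-size
# relation z = 1 - exp(-R0 * z); the R0 values are illustrative, not estimates.
import math

def final_attack_rate(r0, iterations=200):
    """Solve z = 1 - exp(-r0 * z) by fixed-point iteration (assumes r0 > 1)."""
    z = 0.5  # starting guess
    for _ in range(iterations):
        z = 1.0 - math.exp(-r0 * z)
    return z

for r0 in (1.5, 2.0, 2.5, 3.0):
    print(f"R0 = {r0:.1f}  ->  predicted final attack rate ~ {final_attack_rate(r0):.0%}")
```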

I have no magic answers to the problem of the SARS-CoV-2 virus. One thing we know about pandemics is that the socio-economic effects, not only the medical ones, can be profound. Vast mutations in society, economy and social behaviour are possible. Although we hope that they will be temporary, they may well have some lasting effects. For example, civil aviation is particularly vulnerable. Based on SARS 2003 and early infection rates, a projection for the effect of Covid-19 on the airlines shows a dip for six to nine months (Pearce 2020). However, this depends on whether companies manage to remain solvent during and after the crisis. The civil aviation landscape could change drastically in the longer term.

The figure above is an attempt to summarise the issues connected with the Covid-19 pandemic. In each of the five categories there are opposites or alternatives. Thus, the vulnerability and fragility of people and society need to be seen in the light of socio-economic changes induced by the epidemic. Here is how I would classify the impacts and challenges. This is intended as the basis for possible scenarios of the outcome of the pandemic.

Severe negative impact:-
  • transportation, especially civil aviation and trains
  • hospitality and tourism
  • catering
  • the entertainment industry and cultural attractions
  • general medicine
  • ill and handicapped people
Under severe duress:-
  • decision making systems and decision makers
  • police
  • hospitals
  • civil protection forces
  • social welfare systems, including non-governmental ones
  • insurance
  • prisons
  • civil liberties?
Under duress:-
  • food supply and supermarkets, including supply chain
  • general consumption of consumer goods and consumer durables
  • distribution systems
  • IT systems
  • manufacturing
  • basic infrastructural services
Presenting opportunities:-
  • sanitary items, pharmaceuticals
  • vaccine research, production and distribution
  • security industry
  • information services (mass media, social media)
Changing and needing to adapt:-
  • mass media
  • the financial system
Intangible effects:-
  • trust in authorities
  • medical ethics (in the apportionment of and priorities for healthcare and vaccination)
  • difficulties associated with work
  • difficulties associated with social interaction (family, recreation, etc.)
  • difficulties associated with procuring reliable, intelligible information
  • being in the wrong place at the wrong time
  • general uncertainty in the evolution of the epidemic
These, then, are the possible building blocks of scenarios that could be used for future planning. It is important to ensure that emergency planning measures are balanced among society's and people's needs. This is where scenarios can help by providing the broad overview of demand for help. It is also important to remember that the Covid-19 pandemic occurs in the context of other developments. These include post-recession recovery, polarisation and wealth distribution issues, and the pressing need for climate change adaptation. In Britain, they also include the Brexit negotiations and the shape of the post-Brexit economy that will emerge after 2020.

A very important question to be answered is how a country functions amidst a general shut-down, something that is looking increasingly necessary in a number of places. This matter has to be examined from the point of view of citizens, workers, decision makers, crisis responders, and other stakeholders. Perhaps comparisons with the general strike of 1926 might yield some insights. However, it is important not to overdo historical comparisons, as we know from looking at the effects of the 1918 'Spanish' influenza (Spinney 2018).

Finally, another aspect to take into consideration is the politicisation of disease. The 'Spanish flu' was so called because Spain was liberal in its control of information, yet the virus almost certainly did not originate there. Wherever a disease comes from, and whoever brings it, there is no justification for creating scapegoats. One has the impression that in 2020 the world needs to relearn the practice of solidarity between nations and between communities. This brings us back to context, and the potential role of identity politics, a corrosive role if ever there were such a thing.

References


Mathews, J.D., C.T. McCaw, J. McVernon, E.S. McBryde and J.M. McCaw 2007. A biological model for influenza transmission: pandemic planning implications of asymptomatic infection and immunity. PLoS ONE 2(11): e1220, 1-6.

Pearce, B. 2020. COVID-19 Updated Impact Assessment of the Novel Coronavirus. International Air Transport Association, Geneva, 14 pp.

Spinney, L. 2018. Pale Rider: The Spanish Flu of 1918 and How it Changed the World. Jonathan Cape, London, 352 pp.

US Homeland Security Council 2005. National Strategy for Pandemic Influenza. US Government, Washington DC, 12 pp.

UK Government 2008. Pandemic Flu: UK International Preparedness Strategy. United Kingdom Cabinet Office, London, 27 pp.

WHO 2005. Global Influenza Preparedness Plan: The Role of WHO and Recommendations for National Measures Before and During Pandemics. Department of Communicable Disease Surveillance and Response, Global Influenza Programme, World Health Organisation, 49 pp.

Thursday, 6 February 2020

Community Resilience or Community Dystopia in Disaster Risk Reduction?


In disaster risk reduction circles, there is an almost desperate reliance on 'community' and a strong growth in studies and plans to "involve the community" in facing up to risks and impacts (Berkes and Ross 2013). The intentions are laudable, as DRR needs to be democratised if it is to function. However, 'community' is a contentious concept (Barrios 2014).

The many places in which I have lived have had highly varied levels of expression of community. For example, I spent many years frequenting a small town in the mountains of Southern Italy. It was founded 3,770 years ago and has suffered many vicissitudes over the better part of four millennia. The current urban form was largely given to it by Norman invaders 940 years ago. It is an architectural paradise of which the inhabitants are, rightly, fiercely proud. Yet, faced with natural hazards, relative isolation, economic deprivation and cultural decline, it badly needs social solidarity, and that is something it lacks. Boredom and the accumulated malaise of centuries of oppression, envy, division and exploitation have divided the community into invisible factions. It may be an exaggeration to talk of Edward Banfield's 'amoral familism' (Banfield 1958), but in such an environment community cohesion is a highly relative phenomenon.

At the other end of the scale, I reside in the most multi-ethnic and politically homogeneous district of London. It is an area with a shifting population of migrants, students, temporary workers and those who soon move on to greener pastures. Unfortunately, some of its most defining characteristics as a community relate to drugs, homicide, homelessness and terrorism. In this, I do not exaggerate, as the record of recent events in London demonstrates the connections quite clearly. Community can just as well be defined negatively as positively.

Several arguments can be marshalled against the idea of community resilience:-
  • The concept of 'community' has no inherent geographical scale. A group of like-minded individuals spread across the globe might just as well be a community as a collection of people in a local neighbourhood.
  • If some form of social cohesion is what defines a community, communities do not necessarily have it.
  • In modern cities, neighbourhoods may well have a shifting, rootless population that lacks common ground.
  • Identity politics can split communities (considered as the population of an area) into factions.
  • In a local area, the population may have different, perhaps conflicting objectives.
  • 'Community' is not an efficient way of using people's skills.
With regard to the last of these points, it is worth considering the criticisms of the concept of social capital, which is often used in conjunction with community-based DRR (Haynes 2009, Inaba 2013). I need not repeat them here.

Moreover, in many places the 'community' is dominated by the most powerful people in the area, or indeed by a single person. The manifestation of community may simply be an expression of the will of the powerful. The weak and the marginalised may be deprived of a voice or not listened to. In the worst cases, they are effectively invisible outside their own circles. Hence, 'community' is about influence. At its most benign, it is about relative influence (Barrios 2014).

The struggle to create community resilience pits organised collective action against individualism. The latter was perhaps best articulated by Margaret Thatcher in an interview in 1987. "There is no such thing as societeigh", she said in that false plummy, slightly hectoring, distinctly overbearing voice (Tice 2010). She went on to make that pronouncement a self-fulfilling prophecy by conducting war on social institutions until they failed or were dismantled. To soften the blow, she added "It is our duty to look after ourselves and then, also, to look after our neighbours." The door had been left open for voluntarism to compensate for the retreat of the state. Altruism had been put in its place. Queen Elizabeth II said in one of her Christmas addresses to the nation that, in a lifetime of visiting and meeting, the happiest people she had encountered were those who were helping others. Does this confirm the Thatcherite view of the social order or contradict it? A more useful question for those who wish to create community resilience might be "what stands in the way of collective action?"

Sociologists have long known that disaster creates community by spontaneously knitting people together into a social grouping that they term a 'disaster subculture' (Granot 1996). This is an association of people from diverse backgrounds whose cohesion derives from an overwhelming emphasis on a common aim. For example, the survivors of the Grenfell Tower fire of June 2017 are united in their quest for justice and decent living conditions. But how does this translate into community resilience?

In any manageable geographical unit there are likely to be associations of citizens: faith groups, voluntary associations, and recreational, political and business groups. But do they add up to a community? Perhaps under the disaster subculture they do, but that does not help preparedness.

And then there is community dystopia. 'Community', a word apparently absorbed into the English language from French after the Norman invasion, is the subject of numerous definitions in the Oxford English Dictionary. Many involve the idea of a shared destiny or common ground. Some require uniform ethnicity or belief, or some form of fellowship. At least one makes a distinction between officialdom and the laity (i.e., ordinary people). Community-based DRR necessarily requires the two parties to collaborate, but what if the relationship is antagonistic rather than symbiotic? And what if the community is fragmented? How does one initiate a dialogue with the community if it has no centre, no identifiable single representation? Who leads the community in a crisis?

From this we may conclude that the community, if it exists, is heterogeneous, not homogeneous. It is likely to be polycentric (hoping that I am not taking too metropolitan a view of the phenomenon). It may be therapeutic or it may be dysfunctional.

Rioting and looting occurred in London in 2011 and in Concepción, Chile, after the 2010 earthquake and tsunami. Such behaviour is a result of the spontaneous abandonment of both property norms and the canons of reasonable behaviour (Alexander 2013). It does not stem from a failure to involve the community in DRR, but it can be fuelled by a general sense of marginalisation and lack of justice. Once again, involving people in DRR will not solve such problems, as the broader context has to be addressed. That is why rioting has continued in Chile, a decade after the disaster.

In many modern cities, the development of a collective consciousness is hampered by the problems of urban living: expensive, sub-standard housing, which is in short supply; exploitation through poor, badly remunerated working conditions; pollution, overcrowding, and so on. Social participation often decreases when people are under duress, although communal protest may revive it.

Another factor to consider in any assessment of community-based DRR is culture. Britain has long had a culture of secrecy, especially in official dealings. This is allied with the pervasive concept of "leave it to the experts" that only recently has been challenged. Despite attempts at openness, for example by passing laws about disclosure, secrecy has survived remarkably well, and it is strongly backed by Draconian libel and slander laws. The UK Official Secrets Act (1989) defines vaguely what information is protected but is quite clear about the consequences of sharing such information: "the person into whose possession the information, document or article has come is guilty of an offence if he discloses it without lawful authority knowing, or having reasonable cause to believe, that it is protected against disclosure". Cultural change is a slow, progressive process that requires much persistence and prodigious resources. Britain needs to be levered out of its obsession with secrecy towards a form of emergency management and disaster risk reduction based on disclosing information, not withholding it. That is the route to the democratisation of DRR. Secrecy has its place, but that place should be well circumscribed–by common sense, a spirit of openness, and a desire to share information for the betterment of society. Progress has been made, but much more is needed.

Another aspect of culture is its role in creating community identity. In London it is now extremely rare to hear someone speak with a London accent. In Florence, by contrast, there is a living, active concept of fiorentinesimo. Various well-patronised websites promote it through reviving dialect and other traditions; many ceremonies and cultural events promote it as well. Individuals such as the actor Roberto Benigni give it a high profile. This should be a point in favour of defining the community. However, it is as well to note that in Florence there are two quite separate communities: Florentines and others (including tourists).

In my opinion, initiatives that rely on trying to inculcate or promote community resilience are hanging their strategies on the wrong hooks. Rather than searching for the 'community', I advocate identifying, utilising and developing specific mechanisms. The DRR community has to be created, not discovered. It will be hard work, and even more hard work will have to go into making the result sustainable as society continues to change at an ever accelerating rate.
Alternatives to 'community engagement in DRR' do exist:-
  • Organised voluntarism can be incorporated into the official civil protection system.
  • The vigorous promotion of equity, and a greater degree of equality, would make society healthier and happier.
  • Working with specific groups and networks will encourage participation and ensure that it fits with DRR needs.
  • The barriers to positive action between official and unofficial forces need to be reduced.
  • The social groups involved in DRR need to be recognised by officialdom.
Returning to Thatcher, I would argue that there is such a thing as society but in many places "there is no such thing as community". Nevertheless, DRR requires social democracy and social participation. It is the antithesis of neoliberalism and rank individualism. If 'community' is to overcome the factionalism of identity politics, there must be a shared identity and a sense of shared destiny. This must be stronger than factionalism in the constituent population. The challenge of the 21st century is to involve people and organisations in managing their own risks.

[This is a publication of Sceptiques Sans Frontières (SSF), home of the Rapid Scepticism Force.]

References

Alexander, D.E. 2013. Looting. In K.B. Penuel, M. Statler and R. Hagen (eds). Encyclopedia of Crisis Management. Sage, Thousand Oaks, California (Vol 2): 575-578.

Banfield, E.C. 1958. The Moral Basis of a Backward Society. Free Press, New York, 188 pp.

Barrios, R.E. 2014. 'Here, I'm not at ease': anthropological perspectives on community resilience. Disasters 38(2): 329-350.

Berkes, F. and H. Ross 2013. Community resilience: toward an integrated approach. Society and Natural Resources 26(1): 5-20.

Granot, H. 1996. Disaster subcultures. Disaster Prevention and Management 5(4): 36-40.

Haynes, P. 2009. Before going any further with social capital: eight key criticisms to address. Ingenio Working Paper no. 2009/02. Ingenio (CSIC-UPV), Polytechnic University of Valencia, Valencia, Spain, 22 pp.

Inaba, Y. 2013. What’s wrong with social capital? Critiques from social science. In I. Kawachi, S. Takao and S.V. Subramanian (eds) Global Perspectives on Social Capital and Health. Springer, Berlin: 323-342 (Chapter 13).

Tice, A. 2010. 'No such thing as society'. Socialist (25 November 2010), p. 10.

Wednesday, 9 October 2019

A man walks into a room


A man walks into the room. He is extremely handsome and quite tall (at least for his time, about 520 years ago). He is neither fat nor thin. In fact he likes his food and drink; he is fond of convivial dinners with good food and wine. His circle of friends is small but select, much more so than his large circle of enemies and opponents. One of his friends is his neighbour, Lisa Gherardini. She comes from an ancient but penniless Florentine family and lives in rented accommodation in the same street as his father, near Santo Spirito, Oltrarno. She is married to a cloth and silk merchant called Francesco del Giocondo and has five children (her sixth child was stillborn). They live modestly on his earnings from trade and hers from the income of a small farm. Our man likes her for her shy and slightly enigmatic smile. She goes on to outlast him by more than 20 years, and also to outlast her husband, who dies young of bubonic plague. This leaves her no option but to retreat into her preferred nunnery, Saint Ursula (Sant'Orsola), where two of her daughters have already taken orders. When she dies there, she is buried in the central courtyard–until 2016, when her remains are dug up. Why is that? Curiosity... If you are curious too, you can visit Sant'Orsola, which is a massive complex that occupies a whole city block in the centre of Florence. Or rather, you can't visit it without using a pneumatic drill and a sledgehammer, as, due to a dispute over ownership, for decades all the doors and windows of the building have been bricked up. It has quite a sinister appearance. Lisa could never know that her smile would become the most famous in the world, roundly outclassing that of Marilyn Monroe.

But I digress. Our man's other friends include Giorgio Vasari, the influential painter, architect, biographer and historian, Andrea del Verrocchio, painter, sculptor and goldsmith, who was his maestro, and a fresh-faced young man with curly hair called Raffaello Sanzio. Raffaello became a well-respected painter (although he remained a thoroughly feeble poet). He could never know that even his preparatory sketches would eventually fetch more than $50 million each at auction.

Our man has long, blond hair–or is it blond? It has a reddish tinge, which leaves him open to much ridicule both in the streets and at fashionable gatherings. His eyes are blue. No, they are not piercing, but his gaze is steady and intelligent, alive with reckoning. Another reason he is often ridiculed and snubbed is that he is illegitimate. In fact, he has no surname. No qualifying appellation links him to a great and noble family. Never could he go straight to heaven, and always must he assume responsibility for a state of life that was in no way of his own making. His father is an accountant and notary, a man who does other people's sums for them, witnesses documents, draws up bills of sale. He owns a small farm in the countryside, in the Montalbano hills that cut across the marshy middle Arno valley. It is situated 12 km north of Empoli, the town that marks the half-way point between two old enemies, terrestrial Florence and maritime Pisa. The farm is a mean sort of affair, with simple, Spartan buildings (I have been there). Sir Peter, our man's father, takes a serving girl in the fields one hot day in July and in consequence she falls pregnant. Her name is Catherine Lippi (Caterina). Despite the name, she is of North African origin. Curious that her son should look so Nordic–an accident of genetics.

The child is taken from her and brought up by its paternal grandfather in the nearby town. Catherine survives the loss and goes on to marry a local farmer. The boy spends his summers exploring the countryside around the farm. Not only is he able at drawing, he can write equally well both forwards and backwards. The latter he finds comfortable, as he is left-handed. Writing in mirror image will not save him from being despised–and indeed detested–by those who feel keenly that he is more brilliant than they are. This will induce him to be careful to the point of practising elaborate secrecy. Unlike his contemporary, Alessandro Filipepi, nicknamed Botticelli, he was no follower of the violent, revolutionary monk Savonarola. Indeed, he may have heaved a sigh of relief when Girolamo Savonarola was carted into Piazza della Signoria, tied to the stake and incinerated. Bombastically, Savonarola had ridiculed his theories on painting and in the process corrupted Botticelli's technique. The punishment proved that one cannot be too careful if one wants to be a rebel. The defence of the social order is likely to be both vigorous and vicious.

So our man is a painter. In his entire lifetime there will be only 19 extant works, some of which will be painted largely by his assistants. Yet no paintings anywhere by anyone will ever be so highly prized. And the irony is that our man is a hopeless frescoist, quite unable to master the technique. But he has other ambitions. His chief objectives are to write a book to demonstrate that painting needs a scientific technique and to write another on the behaviour of water. He will achieve neither aim, but he will leave 2,500 pages of notes and drawings. He will become a scientist long before the concept of science is even devised.

In the modern age we see about 70,000 images a day (and the paper we use, whether it is tangible or 'virtual', is something upon which we place little value). At that rate we cannot possibly appreciate or understand the images. But think back to half a millennium ago, when people saw very few images, and they were highly prized and long remembered, when paper was a scarce, expensive commodity (it came from Fabriano in the Marches).

There is a place called Pisignano in Chianti, where I like to go for walks (I was there shortly before writing this). It extends northwards into the Arno valley along a ridge-top situated in the extreme northwest of the Classic Chianti (DOCG) wine-producing area. The landscape is typical of central Tuscany: an alternation of fields of silver-leaved olive trees and well-regimented vineyards, serried rows of pointy cypresses, occasional pear, peach, fig and walnut trees, four-square farmhouses with honey-coloured stucco, some campaniles and towers with battlements nestling in groves of trees. With the obvious concessions to modernity, it is much as our man would have found it. A sketch housed in the Uffizi Gallery confirms this.

In England, Queen Elizabeth the First conversed fluently with her tutors in Latin. Our man did not have the benefit of a formal education. He knew no Latin or Greek and hence could read no standard texts. It freed him from dogma and gave free rein to his great attribute, the ability to think freely and laterally. We, especially in universities, receive much of our reality directly from others. Our man had to construct it for himself, very much the active, not the passive, form of learning. He did so with unparalleled ability. Germaine Greer told a modern audience that his work is hugely overrated. It is not, and this throwaway comment diminished her. Thousands of scholars have spent their entire careers seeking to understand his ways of thinking. His mind is labyrinthine. It amply repays study and never does it reveal all its secrets.

The triumph of dogma is at once an ancient phenomenon and a perfectly contemporary one. Our man kept his thoughts to himself. He also moved house far more than was usual in his age. It preserved his life at a time when sudden raids, trials by fire, and opportunistic warfare were the rule, not the exception. But his thoughts–so original, so precious–remained unappreciated for 350 years. This is a remarkably important point in an age in which the 'pathways to impact' are expected to catapult you to your destination in an instant. This is the age of "impact as instant gratification". So his collections of notes, drawings, sketches, maps and diagrams were stuffed into out-of-the-way library shelves. One wonders how many priceless offerings were simply thrown away, burnt on the rubbish dump, unappreciated. But as Groucho Marx memorably said: "Why should I do anything for posterity? What's posterity done for me?"

We live in an age that seems determined to learn nothing at all from history. One thing we could do is to reverse the trend. Rather than courses on how to use technology, or how to get a well-paying job, perhaps we need instruction on how to look, listen and evaluate. Our man struggled to find the deeper meaning in things. He too grappled with technology, much of which he invented from scratch in extraordinary bursts of creativity. The timelessness of his achievement shows that life is about far more than a foreshortened pathway to impact.


Thursday, 22 August 2019

Is it Possible to Keep Up with the Literature?


I am the founding editor of the International Journal of Disaster Risk Reduction (IJDRR), which began publishing in August 2012 with just four papers. Seven years later, the submission rate is equivalent to about 1,500 manuscripts per year (although the rejection rate is over 80 per cent). Two years ago, the journal published its first issue to contain 100 papers.

In 2011, when I was approached by Elsevier about establishing the IJDRR, the first question was, "Is there a need for a new journal in this field?" I replied that, as there are more than 80 dedicated journals in the disasters, risks and hazards fields, and more than 500 others that occasionally publish papers on such themes, no such need existed. However, there was a need for a journal that competed well, that outclassed its competitors in terms of being a reliable, authoritative source of knowledge. On this basis, we went ahead to establish and develop the IJDRR.

The reason for mentioning this example is part of my response to an article that appeared in Times Higher Education (THE 2019). In it, eight academics explained their strategies for keeping up with the literature. The problem can be explained in the following syllogism:
  • In this globalised, digital age, academic publishing has ballooned.
  • All over the world, academics are under great pressure to teach more, do more research, advise students more, apply for more grants, cope with more bureaucracy and participate in more initiatives. Where is the time to read the literature? Where, indeed, is the incentive?
Therefore:
  • It is impossible to keep up with the literature in any given field. Don't even think about it: trying to do it could damage your health, or at least your eyesight.
Fortunately, there are some mitigating factors, for example, the lack of innovation in most published research. How many times has one been asked to review a manuscript for a journal, only to find that one learns nothing new from it? I suggest we (a) make sure we know our field thoroughly (yes, more reading!), and (b) assign an innovation quotient to each new paper with which we come into contact. This could extend from zero to ten, with one decimal place, so that a highly innovative paper is, perhaps, a 9.2 or an 8.7. The number would symbolise the answer to the question "what did I learn from reading this?" The trouble with this approach is that there might be a paper that contains only one new idea, but that idea is pricelessly good. Unfortunately, searching for such information is very much like looking for a needle in a haystack.
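
For what it is worth, here is a minimal sketch of the bookkeeping that such an innovation quotient would involve. The helper names and the example entries are hypothetical; the scheme itself is only the thought experiment described above.

```python
# A sketch of the 'innovation quotient' bookkeeping suggested above: a score
# from 0.0 to 10.0, to one decimal place, answering "what did I learn from this?"
# The storage format and the example entries are illustrative assumptions.

def innovation_quotient(score):
    """Clamp to the 0-10 range and round to one decimal place."""
    return round(min(max(score, 0.0), 10.0), 1)

reading_log = {}  # paper identifier -> innovation quotient

def log_paper(identifier, score):
    reading_log[identifier] = innovation_quotient(score)

log_paper("hypothetical-paper-A", 9.23)  # highly innovative: recorded as 9.2
log_paper("hypothetical-paper-B", 1.7)   # little learned
for identifier, iq in sorted(reading_log.items(), key=lambda kv: kv[1], reverse=True):
    print(identifier, iq)
```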

Prior to submitting a work for publication, one must make sure that one has read all the truly pertinent literature. I am amazed at how many authors submit work and do not even seem to have spent those vital two minutes putting the basic key words into Google Scholar. This makes it all the more likely that earlier work will unknowingly be repeated.

When I was a student, it was possible to keep abreast of the literature by reading the major sources, most of which were leading research journals. What one needed was a well-equipped library, perhaps a photocopy machine, and a free afternoon once a week. Even then, there was the lurking sensation that something vital might slip under the radar, but the risks were much smaller.

Nowadays, the technique has to be a very different one. The first part of it is awareness of trends in research, which comes from conducting a broad survey of what is coming out: titles, abstracts and provenance only. The second part is using experience and judgement to know what is worth reading. It can lead one astray, but this is not a frequent occurrence. The third and final part is highly targeted reading. It also helps to speed-read one's way through some of the less important detail in order to get to the fundamental message of a published work. The biggest risk is superficiality in one's understanding of the research and the new developments in one's field. However, the sheer volume of publication makes that risk inevitable, and there are no simple remedies.
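
Here is a minimal sketch of the first and third parts of that technique, with keyword matching standing in, very crudely, for experience and judgement. The record format, the keywords and the example entries are assumptions for illustration only.

```python
# A sketch of the triage described above: survey titles and abstracts, apply a
# crude relevance judgement, and shortlist a few items for targeted reading.
# The keyword matching (standing in for experience and judgement), the record
# format and the example entries are assumptions for illustration only.

def shortlist(papers, keywords, top_n=5):
    """Rank papers by how many keywords appear in their title plus abstract."""
    def relevance(paper):
        text = (paper["title"] + " " + paper["abstract"]).lower()
        return sum(1 for kw in keywords if kw.lower() in text)
    ranked = sorted(papers, key=relevance, reverse=True)
    return [p for p in ranked if relevance(p) > 0][:top_n]

papers = [
    {"title": "Community resilience after earthquakes", "abstract": "..."},
    {"title": "A note on supply chains", "abstract": "pandemic preparedness and logistics"},
]
for p in shortlist(papers, keywords=["resilience", "pandemic", "preparedness"]):
    print(p["title"])
```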

Academic publishing continues to mutate at a bewildering rate. The key to managing change is adaptability. However, it is not yet clear how one will have to adapt in order to absorb and cope with the changes that are on the horizon. In the meantime, we need to keep up the campaign to create a core curriculum in disaster studies (see my earlier posts) and not ignore the messages of the fundamental literature of the past 100 years.

Reference

THE 2019. How can academics keep up with the literature? Times Higher Education, 22 August 2019, by Verity Archer, Oliver A. H. Jones, Danielle Sands, Rivka Isaacson, Helen Sword, David Sanders, Saikat Majumdar and Christopher H. Hendon.
https://www.timeshighereducation.com/features/how-can-academics-keep-literature

Monday, 29 July 2019

Disasters: Knowledge and Information in the New Age of Anomie


Bertrand Russell once observed, "Most people would sooner die than think; in fact, they do so." Perhaps this goes some way to explaining the common failure of risk estimation and the tendency willingly to take unnecessary risks. However, an understanding of risk requires not only an ability to think things through, but also enough information with which to make informed decisions. It could be argued that people do not seek the relevant information, or that, when supplied with knowledge and expertise, they wilfully ignore it. However, there is another side to this, one which has parallels with looting.

Quarantelli and Dynes (1970) developed a three-stage model whereby property norms are progressively abandoned as a neighbourhood descends into an outbreak of uncontrolled looting. What was previously forbidden by custom, practice, sanction and retribution becomes possible, even desirable, as the emerging group of looters frees itself from the shackles of normal constraint. Something similar has happened with the Internet and social media. The former has been in widespread use since about 1993, and the latter since around 2009. In a quarter of a century and a decade, respectively, attitudes and customs have changed profoundly. Early views of social media (e.g. Bird et al. 2012) found that the negative aspects, such as the diffusion of unfounded rumour, were self-correcting. Early views of the Internet and disasters (e.g. Gruntfest and Weber 1998) were optimistic and identified its positive attributes. However, by the Haiti earthquake of 2010, a different picture had begun to emerge and establish itself (Alexander 2010).

In the last decade there has been a massive and profound change in the way that modern, technological channels of information dissemination are used. Although the process is not characterised by loss of control, there has been a change in the way that media, and the information they purvey, are controlled. The change is achieved through apomediation, the bypassing of information gatekeepers (O'Connor 2009), and control now rests in the information itself, and in how it is served up to its consumers (Alexander 2014).

By analogy with the abandonment of property norms, there has been a trend towards forsaking basic ethics and abandoning adherence to the truth. The result is a communication process which has been termed chronic contagion (Pomerantsev 2019). Pronouncements are made on the basis of whim, prejudice and predilection, rather than any solid analysis of evidence or morality. Arbitrary rule is, of course, nothing new, but what is new is the role of networked electronic communication. Vast resources are now devoted to distorting the picture, and all three superpowers are busy utilising them (Druzin and Gordon 2018, Merrin 2019, Rudick and Dannels 2019).

One could argue that this is an inevitable outcome of political ideology and international rivalry, given the new tools that are available forcefully to disseminate views that would not survive any kind of sober, rational analysis. Population growth, migration, proxy wars and environmental change have together generated a current climate of nihilism, sometimes characterised as 'identity politics' and distinguished by a retreat from globalism and a global outlook. In reality, what has evolved is a sort of 'apolitical politics' characterised by the ideology of prejudice, unbridled competition and, often, plain vindictiveness.

To understand this situation, one needs to turn back to the work of the French sociologist David Émile Durkheim (1858-1917). He popularised the term anomie, which he defined as a reduction in interaction between social groups leading to a breakdown in mutual understanding, perhaps with a clash of norms, values and ideologies. He attributed it to rapid population growth. Since the late 20th century, the concept of anomie has been reinterpreted (Allan 2005, pp. 128-130). It has also been related to the growth of individualism (Marks 1974, p. 335).

Any attempt to relate the current anomie to disaster risk reduction (DRR) must take account of the 'egg hypothesis'. In this, the factors that directly affect DRR are the yolk, which reposes in the white, namely their political, social, economic and environmental context. Another way of looking at this is that problems caused by disasters cannot be solved in isolation from more general afflictions. For example, if people are poor and their lives are generally precarious, they cannot be made resilient against disasters such as floods and earthquakes unless their vulnerability to life's exigencies in general is reduced. That is where anomie comes in, with a modern interpretation based upon the nihilism and anarchy that have crept into global electronic networked communication. The result, negative as it is, can be seen in Figure 1. Obviously, there are many exceptions to this picture. Its antidotes are thought, reasoning, action, activism, and the application of ethics and morality. People must not allow themselves to be absent from the debate, and they must be willing to learn and reason. Misinformation succeeds not only because it can be generated forcefully and copiously, but also because it is passively accepted.

Figure 1. Anomie and shortage of disaster governance.

There is also a factor associated with official mindsets. In modern disaster risk reduction, problem solvers abound. They vary from individual researchers to major pan-national organisations. One thing they have in common is a tendency to look for the solutions to disaster problems by concentrating on the problem, not its context. This is understandable because in many cases the context is highly complex, subtle and sophisticated. It may pose issues that could well be regarded as intractable. However, the solution often lies in the context, not the problem. For example, great progress has been made in seismology and seismic engineering. However, two analyses (Escaleras et al. 2007, Ambraseys and Bilham 2011) suggest that the principal cause of earthquake disasters is corruption, which weakens or prevents defensive anti-seismic measures. The question then becomes "What is the process whereby a morally neutral threat such as an earthquake is turned into one with strong moral connotations by the spread of corruption?"

The tendency in research and policy advice is to assume that everyone in power has a strong desire to reduce hazards and threats. In contrast, we are all too often confronted with a situation that varies along a spectrum which extends from utter indifference to a desire to orchestrate conditions for private gain or personal aggrandisement. To shy away from such situations, especially in times of mass manipulation of the global means of communication, is to miss out on understanding why disasters prove to be such intractable problems. Further study of this essential problem requires that we develop a thorough understanding of how real issues, real developments and real knowledge are affected by anomie and its vital relationship with chronic contagion.

References

Alexander, D.E. 2010. News reporting of the January 12, 2010, Haiti earthquake: the role of common misconceptions. Journal of Emergency Management 8(6): 15-27.

Alexander, D.E. 2014. Social media in disaster risk reduction and crisis management. Science and Engineering Ethics 20(3): 717-733.

Allan, K. 2005. Explorations in Classical Sociological Theory: Seeing the Social World. Sage, Thousand Oaks, California, 448 pp.

Ambraseys, N. and R. Bilham 2011. Corruption kills. Nature 469: 153-155.

Bird, D., M. Ling and K. Haynes 2012. Flooding Facebook: the use of social media during the Queensland and Victorian floods. Australian Journal of Emergency Management 27(1): 27-33.

Druzin, B. and G.S. Gordon 2018. Authoritarianism and the Internet. Law and Social Inquiry 43(4): 1427-1457.

Escaleras, M., N. Anbarci and C.A. Register 2007. Public sector corruption and major earthquakes: a potentially deadly interaction. Public Choice 132: 209-230.

Gruntfest, E. and M. Weber 1998. Internet and emergency management: prospects for the future. International Journal of Mass Emergencies and Disasters 16(1): 55-72.

Marks, S.R. 1974. Durkheim's theory of anomie. American Journal of Sociology 80(2): 329-363.

Merrin, W. 2019. President Troll: Trump, 4Chan and memetic warfare. In C. Happer, A. Hoskins and W. Merrin (eds), Trump’s Media War. Springer, New York: 201-226.

O'Connor, D. 2009. Apomediation and the significance of online social networking. American Journal of Bioethics 9(6-7): 25-27.

Pomerantsev, P. 2019. This is Not Propaganda: Adventures in the War Against Reality. Faber & Faber, London, 256 pp.

Quarantelli, E.L. and R.R. Dynes 1970. Property norms and looting: their patterns in community crisis. Phylon 31: 168-182.

Rudick, C.K. and D.P. Dannels 2019. “Yes, and …”: continuing the scholarly conversation about the dark side of social media. Communication Education 68(3): 393-398.

Friday, 5 July 2019

Disaster Risk Reduction is not a Paradigm


In 1973, when I was a young undergraduate, I was taught the rudiments of the philosophy, theory and methodology of science. In the library of the London School of Economics there was an entire shelf that contained 29 well-thumbed copies of Thomas Kuhn's The Structure of Scientific Revolutions (Kuhn 1970). This was standard reading on many courses in the social sciences. It was accepted orthodoxy, but was Kuhn's model right?

According to Kuhn, researchers perceive a particular approach or methodology to be promising and productive. It seems to enable them to make progress with their collective enquiries. Gradually, they exhaust its potential. Pioneer researchers discover and propose a new approach that appears to be able to solve problems that the old way could not. Bit by bit, the community of scholars is convinced and progressively abandons the old approach while taking up the new one. Thus a 'scientific revolution' is born. Kuhn's is a model of innovation diffusion, based on observations of the 'model of natural science' (Harvey 1969). It lacks the spatial dimension of the 1960s work of the geographer Torsten Hägerstrand and his colleagues (Hägerstrand 1968), but it has all the other components.

Because it is such a well-known, powerful and simple explanation of modern scientific method, the Kuhnian paradigm has elements of self-fulfilling prophecy. The model has been criticised both because it is too simple and because there are rival theories of paradigms. One of the most penetrating criticisms is that in any discipline many parallel paradigms are operating. In other words, there are different ways of seeing a problem, and hence different methods of solving it can exist together at any point in time. If there are many paradigms, there is in effect no paradigm. It would be premature to suggest that there is also no discipline, but there is currently a process of combining and recombining forms of knowledge in new ways. Science is reorganising itself to meet the demands of a very difficult century and to interpret the plethora of knowledge so far produced.

The 1970s were a period in which Kuhn's ideas were vigorously debated by people who were much more knowledgeable in the philosophy of science than I will ever be (Meiland 1974, Nickles 2003, Scheffler 1972). However, my point here is not to bring in the heavyweight models of that discipline, but to make a simple point about the misuse of terms. In fact, 'paradigm' is one of the most misused words in the scientific lexicon. It is used lazily for any common thread in scientific endeavour.

So do we have a paradigm in disaster studies? If we do, is it going to be overthrown by a scientific revolution (or a social scientific one)? Some researchers evidently think so (Ismail-Zadeh et al. 2017), but to answer the question properly, we have to ask another. Is our field (may I call it a discipline? - perhaps not!) susceptible to fads and fashions instead of true paradigms? To begin with, let us introduce some rigour: a vaguely defined way of doing things, or thinking about them, does not constitute a true paradigm. Secondly, a paradigm should involve parallel but independent investigations upon similar lines, and many of them. By contrast, a fad or fashion involves imitation rather than strictly parallel endeavour (a fad is a more pejorative version of a fashion). Imitation involves adapting other people's reasoning to a new set of data or circumstances. It does not require careful reasoning to choose the right tools for the job, or exactly the right question to ask in order to obtain the most productive answer.

Consider the notorious case of René Thom's catastrophe theory (Zeeman 1979). The name was attractive, although highly misleading, as it was not a theory about catastrophes. It was instead a mathematical interpretation of discontinuities (singularities) in state space. Only the lowest level of singularity could be used to model any physical reality. This severely limited the applicability of the theory. Its use in earth sciences (Henley 1976) produced few interpretable results and did almost nothing to shed new light on the problems to which it was applied. It was soon dropped, despite having been hailed as the start of a new paradigm. In reality it was a fad.
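For readers who have not met the formalism, a standard textbook illustration - not drawn from Zeeman's chapter itself, and offered here only as a sketch - is the cusp catastrophe, in which smooth variation of two control parameters a and b produces a sudden jump in the equilibrium value of the state variable x:

    \[
    V(x; a, b) = \tfrac{1}{4}x^{4} + \tfrac{1}{2}a x^{2} + b x,
    \qquad
    \frac{\partial V}{\partial x} = x^{3} + a x + b = 0,
    \qquad
    4a^{3} + 27b^{2} = 0 \ \text{(the bifurcation set)}.
    \]

For a < 0 the control plane contains a region, bounded by the bifurcation set, in which two stable equilibria coexist (with an unstable one between them); as (a, b) crosses that boundary one stable state vanishes and x jumps discontinuously to the other. This is the sense in which the theory concerns singularities in state space rather than catastrophes in the everyday meaning of the word.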

Such examples illustrate a rather shallow idea of what constitutes the 'cutting edge' in any discipline. Sadly, a follow-the-herd mentality all too easily develops among researchers. The residual question is how to liberate and encourage creativity. Like any field of study, disaster risk reduction needs lateral thinking. In other words, it needs diverse entities to be linked in new and productive ways. A paradigm is a concerted attack on a problem by many researchers with similar aims and parallel but diverse perspectives. DRR is now a large enough field to generate this. Indeed, will cascading disasters be a paradigm? Time will tell - or scepticism will dismiss the idea! Fads and fashions are far too imitative to be efficient, effective ways of addressing our problems. A non-paradigm approach might simply bring widely different perspectives into play, thus generating a more pluralistic approach. This is probably what is happening. It is a healthy sign in a field that draws upon more than 40 disciplines for its knowledge (Alexander 2013). My only reservation is that we may need the impact of a recognisable paradigm in order to gain the recognition that the field needs.

References

Alexander, D.E. 2013. Approaches to emergency management teaching at the Master's level. Journal of Emergency Management 13(1): 59-72.

Hägerstrand, T. 1968. Innovation Diffusion as a Spatial Process. University of Chicago Press, Chicago, 334 pp.

Harvey, D. 1969. Scientific explanation: the model of natural science. Ch. 4. Explanation in Geography. Edward Arnold, London: 30-43.

Henley, S. 1976. Catastrophe theory models in geology. Journal of the International Association for Mathematical Geology 8(6): 649-655.

Ismail-Zadeh, A.T., S.L. Cutter, K. Takeuchi and D. Paton 2017. Forging a paradigm shift in disaster science. Natural Hazards 86: 969-988.

Kuhn, T.S. 1970. The Structure of Scientific Revolutions. (first edition 1962). University of Chicago Press, Chicago, 222 pp.

Meiland, J.W. 1974. Kuhn, Scheffler, and objectivity in science. Philosophy of Science 41(2): 179-187.

Nickles, T. (ed.) 2003. Thomas Kuhn. Contemporary Philosophy in Focus. Cambridge University Press, Cambridge, 314 pp.

Scheffler, I. 1972. Vision and revolution: a postscript on Kuhn. Philosophy of Science 39(3): 366-374.

Zeeman, C. 1979. Catastrophe theory. In W. Güttinger and H. Eikemeier (eds), Structural Stability in Physics. Springer-Verlag, Heidelberg: 12-22.