Saturday, 5 January 2019

Disaster Studies Outside In

The mausoleum of Ibn Sina, or Avicenna, in Hamadan, Iran

Professor J-C. Gaillard has published one of the most original and profound meditations on disaster risk reduction to appear in recent years (Gaillard 2018). I thank J-C for his provocative and stimulating contribution. It is an honour to disagree with him.

His paper is a critique of disaster studies over the last 40 years. It raises some interesting and controversial questions. For example, is to be a scientist also to be a colonialist? Why should the search for a set of universal truths be regarded as hegemony? J-C’s thesis is that Western (i.e. European and North American) models of science have suppressed other productive ways of thinking. This is one reason why the hazards paradigm has continued to eclipse the vulnerability paradigm. Western technology powers this disparity. I am intrigued but not convinced.

As undergraduates, we were all induced to read Thomas Kuhn’s The Structure of Scientific Revolutions (Kuhn 1970). On a shelf in the LSE library there was a row of 29 dog-eared, well-thumbed copies of this book. The next step was to believe Kuhn’s thesis. But ever since I graduated, I have felt that the idea of a Kuhnian scientific revolution was a self-fulfilling prophecy rather than a concrete reality, convenient rather than true. Admittedly, scientists (including social scientists) are apt to imitate their peers when deciding what to study and how to do it, but science is more pluralistic than the paradigm hypothesis implies.

Western science is not 40 years old (pace J-C) but stems from the work of thinkers two millennia ago, strongly tempered by their counterparts in the Arab world. To mark that fact I made a personal pilgrimage to the tomb of Ibn Sina (Avicenna) in Hamadan, western Iran. He bore some responsibility for the hazards paradigm! Disaster science, born in Canada (Prince 1920) and Russia (Sorokin 1942), is only 100 years old, but it has taken the same path: a search for universal truths about the human condition.

Technology may indeed have been the means of repressing African and Asian peoples, and denying the value of indigenous knowledge, but that is far from the whole story. Asian societies have been at least as keen on technological solutions as western ones - indeed, often more so. On the other hand, Asian and African academics have fully embraced the Western model of scholarship, with its experimental method and criterion of replicability (Harvey 1969).

No one would deny that there is much to learn from local conditions. Western disaster scientists have been investigating them in many contexts, western and eastern, for a century. In this endeavour, there has been no shortage of pluralism. However, much depends on what the researcher makes of local knowledge. Local studies are all very well, but unless they have something to say of a more general nature they have limited utility.

In disaster studies at present, the most prolific country is China. There are some particularly Chinese preoccupations (the effect of drought upon crops, seismic landslides) but most Chinese disaster science is much the same as its Western counterpart.

One of the most negative aspects of the hazards paradigm has been the way that it has discouraged cultural studies of disaster risk. But rather than being forcibly exported, ‘Western culture’ is an oversimplified concept, a ragbag of many different things (and a good many of them were the product of cross-fertilisation with non-Western influences). Western countries have many different cultures. Indeed, colonialism was practised at home before it was ever exported. Readers of this may detect an inconsistency between the idea of the Western cradle of universal science and the plurality of Western cultures. In fact, science is not ‘Western’, it is the property of all humanity, but what counts is how we use it. And we use it in different ways.

Some time ago, I wrote about Baroque culture as a model for the present day, because it thrived on the tension of opposites (Alexander 2000). It is a pity no one has taken up the challenge of investigating the lessons of history given by the baroque period (Maravall 1979). Although history does not repeat itself, it has a tendency to foist the past upon us in entirely unexpected ways. The tension of opposites fuels creativity at the same time as it creates irreconcilable contrasts. That is what we face in scholarship as well as in life.

The real enemy of inclusiveness is not academic colonialism, but competitivity. This is now truly a global phenomenon. Asian universities are no less competitive than are Western ones. We know that there is a paradox: science is better advanced by collaboration than competition, but global models force us into the competitive model.

There is no greater oversimplification than the idea that people who live in North America and Europe are responsible for global phenomena. For this to be true, a mechanism would have to be demonstrated, and under scrutiny the claim collapses. Global trends are global because they are globally supported. Emasculating Western academics would not lead to a sudden flowering of disaster scholarship in the rest of the world. And in any case, is this not the Chinese century?

Professor J-C Gaillard responds:-

It is likewise an honour to respond to your rejoinder to my Inside Out commentary for the 40-year anniversary of Disasters. This piece is indeed provocative and I’d be very happy should it contribute to opening up a debate on the epistemology and practice of disaster research. Your rejoinder is a first step in this direction and I am therefore very grateful. In fact, I agree with some of your points although I obviously stand by my overall argument.

I, in fact, agree that referring to the West as a homogeneous entity can be problematic in some instances. It doesn’t reflect the complex reality of the world, indeed, and I obviously struggle with this in several instances in the paper (e.g. the use of OECD as a proxy and the positioning of China and Japan), but I also feel that this short paper wasn’t the appropriate venue to further this complex debate.

I would rather focus on the core of the argument: why, to me, disaster research has not changed as much as it claims to have over the past 40 years. For example, one of my concerns is that most of the concepts that have structured disaster studies over recent decades come from Indo-European languages, and that these concepts have been rolled out in many other contexts where they do not necessarily make sense. This may have skewed our understanding of disasters (if such events/processes actually exist as we usually understand them) and hence how disaster risk reduction is considered.

I also agree that there is no single centre of power at one global level but multiple cores at many different scales. I only mention this in passing in the paper, but the processes of marginalisation and unequal power relations between researchers that I discuss at the world scale happen at regional and national levels too, for example between researchers from the capital and researchers from peripheral/smaller cities, including within wealthy countries of what I call the West. Regardless of the scale of core-periphery relationships in disaster studies, these unequal power relations seem to involve consent (in Gramsci’s terms) on the side of those who have less power to make decisions. As such, it is definitely a complex relationship that is surely not a one-way process.

In essence, it is the processes and relationships, more than the actual location of centres of power and peripheries, that matter in my argument. This is why I am not suggesting that we ‘emasculate’ researchers from the wealthiest countries/regions/cities/universities (again, in relative terms at different scales). For disaster studies to live up to its ambition, I am rather calling for more dialogue and fair collaboration, in which local researchers lead and outsiders support, meaning that the latter definitely still have their own role to play.

Best wishes,


Prof. Ben Wisner contributes the following comments:-

I am also very grateful to JC and to David for launching this important discussion.

Whatever the epistemological arguments might be, in practical political terms the use of the term 'natural disaster' has two pernicious effects. Firstly, it naturalizes disasters in the popular imagination, drawing attention away from the social, economic and political forces that shape the way we live. Secondly, it acts as a smoke screen that obscures the responsibility governments bear for controlling the CREATION of risk through incautious investments, both public and private, corrupt building practices, etc.

It is quite likely true that for those who study disasters and natural hazards the shorthand phrase 'natural disaster' does not affect the way they understand disasters. But they are professionals. Most people do not have the advantage or luxury of years of study of these things. So my objections are far less philosophical and more pragmatic.

That said, I return to my original gratitude for the debate. One thing that struck me in JC's response just now was the phrase, "... concepts that have been structuring disaster studies over the past decades indeed come from Indo-European languages and that these concepts have been rolled out in many other contexts where they do not necessarily make sense." He agrees with David on this, and so do I. What piques my interest is the first part of the statement. Would it be a luxury to spend time searching among the world's roughly 6,000 non-Indo-European languages for terms that bear in interesting ways on the English words hazard, threat, emergency, crisis, disaster? In fact, of the 445 living Indo-European languages, most of us never consider terms and concepts in most of them in our disaster and hazard studies. For example, I find the Spanish term coming out of Bolivia, 'buen vivir', valuable as a way of pointing to desirable security and livelihood policies, both of importance for the reduction of disaster risk.


Alexander, D.E. 2000. Confronting Catastrophe: New Perspectives on Natural Disasters. Terra Publishing, Harpenden, U.K., and Oxford University Press, New York, 282 pp.

Gaillard, J.C. 2018. Disaster studies inside out. Disasters doi: 10.1111/disa.12323.

Harvey, D. 1969. Scientific explanation: the model of natural science. Ch. 4. Explanation in Geography. Edward Arnold, London: 30-43.

Kuhn, T.S. 1970. The Structure of Scientific Revolutions. (first edition 1962). University of Chicago Press, Chicago, 222 pp.

Maravall, J.A. 1979. La cultura de la crisis barroca. Historia 16: 80-90.

Prince, S. 1920. Catastrophe and Social Change: Based Upon a Sociological Study of the Halifax Disaster. Studies in History, Economics and Public Law no. 94. Columbia University Press, New York, 151 pp.

Sorokin, P.A. 1942. Man and Society in Calamity: the Effects of War, Revolution, Famine, Pestilence Upon Human Mind, Behavior, Social Organization and Cultural Life. Greenwood Press, Westport, Connecticut, 353 pp.

Sunday, 9 December 2018

"You ain't seen nothing yet!"

Brexit may indeed teach the British people what it feels like to be in a real, all-enveloping crisis, but it is not the greatest threat that we face in the UK. There are at least five other contingencies that could have a greater impact, and they are all linked to each other.

The first is a large volcanic eruption. In 2010, when the UK Risk Register was first published, there was no mention of such a thing, which is logical as Britain has no active volcanoes. Within weeks, ash emissions from the Icelandic volcano Eyjafjallajökull shut down civil aviation over 70% of the continent for almost a week. This was barely a foretaste of what could occur. In the 1820s, Eyjafjallajökull erupted for 13 months, including 25 days on full blast--and it is one of the smaller Icelandic volcanoes.

In April 2010, eight and a half million travellers were stranded by the cessation of flights. A bigger, more sustained eruption could lead to stop-start air transportation and prolonged airport closures for months on end. Besides the bankruptcy of low-cost airlines, there would be profound implications for ground and sea travel, hospitality, business, tourism and cultural activities, even for medicine, as the April 2010 crisis delayed the air-freighting of bone marrow for transplant.

Secondly, there is pandemic influenza. This tends to strike on a 30-40 year cycle. It is now a century since the so-called ‘Spanish ‘flu’ pandemic that emerged at the end of the First World War, infected 500 million people and killed 3-5 per cent of the world’s population. Subsequent pandemics have been disruptive, sometimes highly so, but not nearly as bad. This should not lull us into thinking that modern medicine has solved the problem: human beings have little or no natural immunity to pandemic ‘flu. A highly lethal strain could come in waves over a two-year period, and we would be 7-9 months into the crisis before vaccines became available to the public.

Thirdly, space weather is a phenomenon that could do considerable damage to the lifelines and networks on which we depend for normal life. A coronal mass ejection (CME, associated with sunspots) on the scale of that which occurred in 1859 could be deeply disruptive. This is known as the ‘Carrington event’ after the amateur astronomer Richard Carrington who observed and sketched the sunspots. Hours later, the CME played havoc with the telegraph, which was the only advanced means of communication at the time. Such were the geoelectrical currents that the CME generated that messages could be sent without switching the telegraph apparatus on, but the operators were liable to electric shocks from handling the equipment.

In 2012 a Carrington-scale event on the Sun sent waves of plasma into space that narrowly missed the Earth. Knowledge of what such a phenomenon could do in the modern world is sketchy and a little speculative. Very large electricity transformers may burn out, telecommunications would be disrupted, satellites would be damaged and there would be interruptions and inaccuracies in global positioning systems, which are used in everything from driverless cars to air transportation, and even to manage rubbish collection. One would not want to be landing at Heathrow if there were a vertical error in GPS of more than 50 metres.

Space weather is monitored around the clock by the Met Office, which issues a daily forecast. Currently, it is one of the great underestimated hazards that could reveal our dependency on critical infrastructure. This is the set of networks that are essential to daily life. It includes energy, water, sewerage, food distribution, healthcare, banking, government and emergency services. Besides space weather, if any other agent were to cause a widespread, prolonged loss of electricity supply, there would be consequences for all the other forms of critical infrastructure, as to a greater or lesser extent they are all driven by electricity.

Over the last decade, wide-area, prolonged power outages have occurred in various parts of the world at the rate of about one a year. The northeast USA was plunged into darkness in 2003 as a result of a grid failure, and again in October 2012 as a result of superstorm Sandy.

British infrastructure, commerce and industry endure millions of cyber attacks every month. In Ukraine, attackers brought down the electricity grid in December 2015 and severely disrupted banks, airports and railways in mid-2017. The UK power grid is well protected, but if it were to suffer a total shut-down it would take days to be restored. Meanwhile, there is considerable concern that many of the generators that would have to substitute for grid electricity are liable to breakdowns and malfunctions.

Finally, there is a very small possibility that a tsunami would be caused in the River Thames estuary by the spontaneous explosion of a large concentration of Second World War munitions that lies in a sunken ship off the Isle of Sheppey. Given the proximity to the Isle of Grain LNG and petroleum storage facilities, the tsunami might carry burning fuel.

Despite this gloomy story of doom and destruction, there are grounds for optimism. Research is underway into cascading disasters and how to prevent or mitigate them. These are events in which impacts occur in sequences or proliferations. For example, the sudden loss of electrical current would affect many activities and cause many secondary problems. They can be foreseen and prepared for. Where vulnerabilities overlap or interact, escalation points will occur, making the impacts potentially worse than the initial event that set them off. These too can be identified and mitigated in advance of the crisis.

Resilience needs a combination of foresight, adaptability and redundancy. Foresight involves building scenarios and investigating vulnerability. Adaptability requires new ways of getting around shortages, blockages and shocks. Redundancy may be about duplicating procedures and equipment but it is just as much about agile thinking that finds new routes out of a crisis.

Preparedness and crisis response need to be democratised, and that is the great challenge of modern times. We all need to take some responsibility for the risks we run. The only way to bring these under control is to treat the process as a participatory exercise, with community groups flanking government and private sector planners. We need to shake off complacency and inform people so that they can make their own choices.

One final thing that we have learned about crisis is that it is heavily influenced by context. In Britain, the hollowing out of the welfare state bodes ill for the next major impact. A society that is weakened and has lost its cohesion is ill-equipped to confront crisis, which will tend, as it always does, to pick off the most vulnerable.

Wednesday, 28 November 2018

What if...?

In its current formulation, emergency planning is geared to events of a known and manageable size. Although, by orchestrating standard operating procedures, it can cope with unexpected events, it is mostly about preparing to manage the impact of known and expected hazards. In this process, it can take advantage of scientific information on the magnitude, frequency, evolution and impacts of the hazards. This is usually accomplished by building scenarios of possible future events. Plans usually require multiple scenarios, as impacts can vary, not only with the size of the hazard, but also with time of day (reflecting aggregate patterns of human activity), season and a variety of factors that change over time in the short term.

The question of what magnitude of event to plan for remains open. Most planning scenarios refer to events with recurrence intervals of once a decade, or certainly no less than once every 25 years. As the magnitude-frequency rule states that natural hazards tend to be large in proportion to declining frequency (and lengthening recurrence interval), this excludes the larger, less frequent events. It is held that there is no practical value in preparing for the once-in-10,000-year event, or even the once-in-a-millennium event. As a result of conventions in floodplain mapping, plans that prepare to manage the one-hundred-year flood are quite common, but many other hazards that might occur once in a century are not prepared for at the relevant magnitude.

At first encounter, this convention of restricting the size of the event to be planned for seems like common sense, because it does not commit resources to events that have a very low likelihood of occurring during the lifetime of the plan. Planners, however, remain troubled by the niggling sensation that we would be seriously uncovered if the very large event were to occur. Statistically, faith in the ‘normal distribution’ model of hazard magnitude and frequency has begun to wane. Perhaps this is the result of intensifying climatic hazards, or perhaps it is because very big events have, from time to time, happened, such as the magnitude-9 Tōhoku earthquake and tsunami in Japan in 2011.
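The trade-off behind that convention can be made concrete with a small calculation. As a rough sketch (assuming, for simplicity, that events occur independently from year to year, which ignores clustering and non-stationarity), the probability of at least one occurrence of a T-year event during an n-year planning horizon is 1 - (1 - 1/T)^n:

```python
# Probability of at least one occurrence of a T-year event during an
# n-year planning horizon, under the simplifying assumption that each
# year is an independent trial with annual probability 1/T.

def exceedance_probability(return_period_years: float, horizon_years: int) -> float:
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

# The "100-year flood" is far from negligible over a 30-year plan lifetime:
print(f"{exceedance_probability(100, 30):.0%}")     # prints 26%
# whereas the once-in-10,000-year event remains very unlikely:
print(f"{exceedance_probability(10_000, 30):.2%}")  # prints 0.30%
```

Figures of this kind help to explain the convention: over a typical plan lifetime the century-scale event has a roughly one-in-four chance of occurring, while the 10,000-year event has a chance of well under one per cent, which is precisely why the latter is so easily dismissed despite its enormous potential impact.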

What if a VEI-7 or -8 volcanic eruption occurred? This would be an event that ejected between 100 and 1,000 km³ of volcanic material into the atmosphere and onto Earth’s surface. It could involve colossal blast effects and would affect a vast area, and probably the climate of the entire planet. Could one plan to respond to such an event? Indeed, what are the limits of impact beyond which emergency planning is impossible because it would be utterly ineffective?

We need to start with some basic principles. I will not discuss the principles of volcanology or natural hazards, important though they are. Here, I am referring to principles of emergency planning. One of these is that the planning should be carried out for the mass of the population and therefore should be as democratic and participatory as possible. This requires that top-down harmonisation be applied to bottom-up, “grass roots”, emergency response capability. Another is that planning should aim to reduce improvisation to a minimum, not by rigidly imposing constraints on emergency management and response, but by anticipating needs and planning to satisfy them. A further principle is that scenarios should be the basis of planning wherever and whenever they can be constructed reliably. A final principle is that the response to civilian emergencies should not be militarised - or remilitarised, as in most countries the armed forces were once the only source of disaster response. Broadly speaking, the civilianisation of emergency management (from paramilitary civil defence to collaborative, civilian civil protection) is a long-standing trend that should not be reversed. In other words, the solution to a major disaster does not lie in declaring martial law.

Almost by definition, a VEI-7 eruption would overwhelm emergency response capabilities. The same could be true of other kinds of disaster. Apart from nuclear war, these include an influenza pandemic of the scope and ferocity of that which affected at least a quarter of the world’s population (of whom 11 per cent died) in 1918-1919. The scenario starts to take on the characteristics of a Hollywood disaster movie, one, moreover, that is redolent of science fiction and an intolerable future for human beings on planet Earth.

What, then, could be done to prepare for and respond to an event of this calibre? Most emergency planners would shrug their shoulders and fatalistically argue that it is too far off the scale (of both time and impact) to be prepared for. This is not quite true, but the process would require a reorientation with respect to current policies, priorities, philosophies and procedures.

One may be tempted to use the Cold War as inspiration, as it was a time when preparations were - ostensibly - made to survive outright nuclear war. Two aspects of this are striking. The first is that most of the preparations were futile. Assumptions about support structures and the habitability of the environment had not been thought through. The second is that there was always the feeling that Cold War preparations would have privileged the rich and powerful and left the mass of the population in the lurch. Indeed, the emergence of civil protection as an alternative to the Cold War’s civil defence model was impeded by the fear among law-makers that if special powers were granted to emergency response forces, they could be used to take over a country and subvert it from democracy to dictatorship. Such fears were not entirely unfounded: in the 1980s, when all of this was under scrutiny, there were plots to mount coups d’état in both Britain and Italy.

In the event of overwhelming, massive, ubiquitous disruption, what does emergency planning have to contribute? First of all, unless a major event were imminent, it would need to be carried out in purely theoretical terms, not through the practical commitment of resources. One principle would need to be maintained, namely that emergency planning is about organising in order to make the best possible use of existing resources, and thus reducing improvisation to an unavoidable minimum. Another principle would have to be revised. Emergency planning is normally about planning to manage an emergency by bringing the situation fully under control. In a VEI-7 eruption, that would not be possible. It would be a case of doing the best that one can against overwhelmingly unpropitious circumstances. Limited aims would therefore need to prevail. Hence, the planners would need to decide what aims would be feasible (in the light of the scenario) and how to achieve them.

The larger the emergency, the more important is its ethical framework. For example, in a pandemic with slow and limited development of vaccines, medical ethics should determine who has priority for vaccination. Ethics are the top of the pyramid of response, and they should be dealt with on a priority basis as a means of orientating the framework of all other actions. Thereafter, basic human needs would indicate the priorities. The extent to which these could be satisfied would depend, not only on technical considerations of feasibility, but also on the acceptability of radical modifications to people’s social and economic realities, and above all their livelihoods. Jobs and businesses would be created and destroyed very quickly indeed, and that process would need to be managed with care.

In this process, and in the planning, relatively little can be learned from looking back at the World Wars. However, one important lesson concerns the role of expectations. Life is more complicated now than it was 75 years ago, not only because there are more people on the planet and there is more technology, which has become a more fundamental part of daily life, but also because expectations have changed radically. As most of those expectations could not be fulfilled during a massive disaster, thought needs to be devoted to the question of how to change them.

A further principle of disasters is that decision makers are seldom swayed by scientific evidence. Not only do the politicians routinely ignore it, but in some of the world’s most influential countries a new breed of leader has emerged who actively opposes the evidence. What would the reaction to an emergency plan for a VEI-7 eruption be on the part of a climate change denier? The naive belief that science convinces people because it is objective is nowadays easily discredited.

Nevertheless, if a VEI-7-type emergency were to occur, plans would suddenly become essential documents. A further challenge arises. Given the reluctance of governments even to think about such an event, let alone prepare for it, how would a plan fare that has never been tested and is not familiar to its users? The answer is that we would need to write plans that can be, as far as possible, absorbed and acted upon spontaneously. This is not an ideal approach, but it embraces realism.

In summary, planning for overwhelming disasters is not impossible, but it needs a radically different mindset. Some principles are held in common with ‘normal’ emergency planning, others must take account of totally new circumstances. As always, careful study of consequences is recommended: see my essay on the egg hypothesis.

Sunday, 14 October 2018

Nihilism and Disaster Risk Reduction

In cultural-historical terms, Sicilians divide people into two kinds: furbi and fessi - cunning and gullible. No one wants to be considered fesso. Indeed, not many people are happy to be labelled as furbo either. It is a hard distinction, but dura lex, sed lex. Such is modern democracy that, in Britain for instance, elements of the population have developed a tradition of voting against their own interests. Many who voted for the party of Margaret Thatcher in the 1970s and 1980s voted themselves out of a job and into penury: dura et fallax lex. In the Americas, in the Middle East, in Russia, democracy currently enables the electorate to vote against democracy.

This is nihilism. The Oxford English Dictionary offers the following as its second, practical definition of the term: “Total rejection of prevailing religious beliefs, moral principles, laws, etc., often from a sense of despair and the belief that life is devoid of meaning. Also more generally: negativity, destructiveness, hostility to accepted beliefs or established institutions.” It is not quite the same as anarchism, the hostility to government, but, as Alan Bennett once wrote: "We started off trying to set up a small anarchist community, but the people wouldn't obey the rules." The curious thing about anarchism is that it requires organisation to destroy organisation in this way. Nihilism is an innate phenomenon that requires nothing except hostility, grievance and an urge to destroy. It is a form of political self-harm.

In previous posts I have discussed the need to interpret disasters in terms of context. I have made the point that some of the most relevant context has nothing to do with disaster risk reduction (DRR) at all. If people are vulnerable for reasons that lie outside DRR, abating their vulnerability to disasters will not necessarily make them any less vulnerable overall. This is the ‘egg hypothesis’, in which DRR is the yolk and context the white. Sometimes it is a well-fried egg with sharp boundaries between the two parts.

Modern capitalism and neoliberalism have led to a concentration of wealth, as never before, in fewer and fewer hands. Democracy, the means by which we control our own collective future, has rarely if ever seemed more futile. Community, a much vaunted concept, has been comprehensively criticised and on occasion it has been cited as the means by which people are kept in order and their creativity and self-determination are kept in check (Titz et al. 2018). Nihilism is a reaction to the failure of politicians and political parties to act with probity and resolution in order to improve the living conditions of the electorate. It is a product of ignorance and anger. It cannot be described as rational.

My point in this short essay is a simple one. I hope it will become an agenda for research, but the challenge is a difficult one. Nihilism is extremely important in current affairs, and in the future it will probably have a more profound effect on our lives than many other preoccupations of the present moment. It severely weakens the strength of democracy and the ability of people to respond collectively to stresses and threats as these arise. Klein (2008, 2015, 2018) and Loewenstein (2017) have documented in great detail how disasters are used to impose anti-democratic measures when people are weakened and preoccupied by catastrophe. Nihilism makes this worse and can fatally undermine resistance.

As threat, risk, impact and aftermath, disasters must be dealt with democratically. Decision making in disasters must not be separated from the consensus values that democracy requires of it at other times. Yet there is a highly complex relationship between democracy and disasters (Platt 2012). Nihilism is not the way to simplify it. Let us study the effects and the prognosis for nihilism and disasters, and on the basis of our data and our analysis, let us join the fight for a rational, equitable system of values to replace the nihilism.


Klein, N. 2008. The Shock Doctrine: The Rise of Disaster Capitalism. Penguin, Harmondsworth, 576 pp.

Klein, N. 2015. This Changes Everything: Capitalism vs. the Climate. Penguin, Harmondsworth, 576 pp.

Klein, N. 2018. The Battle for Paradise: Puerto Rico Takes on the Disaster Capitalists. Haymarket Books, Chicago, 88 pp.

Loewenstein, A. 2017. Disaster Capitalism: Making a Killing Out of Catastrophe. Verso, New York, 384 pp.

Platt, R.H. 2012. Disasters and Democracy: The Politics of Extreme Natural Events. Island Press, Washington DC, 344 pp.

Titz, A., T. Cannon and F. Krüger 2018. Uncovering ‘community’: challenging an elusive concept in development and disaster related work. Societies 8(3): 1-28. doi:10.3390/soc8030071

Friday, 28 September 2018

Submit a Manuscript, Encounter a Bias

An article in Times Higher Education (Pells 2018) discusses the presence of national, racial and gender bias in academic journal publishing. It prompted me to write my thoughts on the matter, as follows.

I have been an international academic journal editor for 33 years and have seen more than 10,000 papers into print. When I deal with manuscripts I struggle to be objective, balanced and fair. However, one of my principles is that my loyalty is to the journal, not the authors, whoever they are. My aim is to build a better journal, in which authors will recognise the quality and want to publish their work. This means being strict about the quality of what I accept for publication. If there are patterns in quality, the contents of the journal will reflect them. That may seem very simple, but there are two other issues.

The first is that there is a thread of serious bias among referees. I first noticed it in 1985, when I started my editorial role. Willingness to review papers is related to gender, national origin and whether or not European or North American men are among the authors. In the 1980s and 1990s I termed it ‘academic racism’: that may be an overstatement, but it is certainly part of the ‘clannishness’ of academic life. It is by no means universal, but it is a highly consistent phenomenon. I find that papers from authors in Iran, Pakistan, Malaysia and India take much more than average effort to get reviewed. This may be because of a record of poor science and scholarship in these countries, but it is particularly hard on good and aspiring authors from such places, who must struggle harder to achieve recognition. Fairness means that even second-rate papers need to be reviewed and their shortcomings (gently) pointed out to the authors so that they can improve their work, which will benefit us all. Not many academic referees operate on that principle.

The second issue is competitiveness. Universities put pressure on their employees to compete, especially in research outcomes. The result is a failure to help struggling academics elsewhere. For instance, westerners, by and large, are reluctant to give positive, constructive help to African authors. Some good work is coming out of African universities, and it is a struggle to give it the recognition that it deserves. It is also a struggle to ensure that African authors feel included in the international research endeavour. It should not be.

Ultimately, present trends in academic publishing are unsustainable. The best research should involve collaboration in many different ways, and exclusionary policies and actions should be reduced. Fostering a community of scholarship means taking positive action to include those who would join it, and fairly recognising good work, whoever may be its author. I fear that requires a different model of university.


Pells, R. 2018. Male editors ‘more likely to accept papers from other men’. Study finds further evidence to suggest peer review process riddled with gender and racial bias. Times Higher Education, 28 September 2018.

Monday, 17 September 2018

Disasters and the Lessons of History

On 15th August 1495 in Modica (the former Arab Mudiqah) in south-eastern Sicily, Christian fanatics slaughtered 360 Jews in the name of the Virgin Mary. The entire population of the city’s ghetto was wiped out, a fact that was extraordinarily convenient to the murderers, many of whom had imprudently let themselves become heavily indebted to local Jewish money lenders. In the 21st century Modica is embarrassed by this episode in its history, and on summer evenings it organises tours of the former ghetto, with a rather apologetic commentary by a local official. History does not repeat itself, but it does repeat its lessons. Modica today is a tolerant and enlightened place, but the lesson of the massacre resonates elsewhere in Italy, Europe and the world. Fanaticism and pragmatism are a potent and dangerous combination - and a very up-to-date one. I mentioned the Arabs and, by the way, Saladin was distinctly more tolerant than were the Normans.

Strictly speaking, history is not cyclical, but there is something cyclical about it. Combinations of circumstances, policies and attitudes occur and recur. Perhaps the main reason is the failure to absorb and practise the lessons that history teaches. For this reason, the present is a particularly dangerous period. Population increase, climate change, environmental transformation and the excessive concentration of wealth are all part of the danger. What makes the present period especially risky is that, more than ever before in the recorded past, we live in an ahistorical age. Technological change has, at least partially, decoupled us from our own history. However, the feeling of freedom from the burden of the past is both dangerous and illusory.

The Baroque period of the late 1600s and first half of the 1700s had many parallels with the present day. Baroque was a culture, and a distinctly European one. I am not referring to the Enlightenment because that was not the whole story. Instead, Baroque culture was built upon the tension of opposites: Enlightenment and Inquisition, for example, or ostentation and extreme poverty. This tension stimulated creative energy, but it also imparted a massive destructive impulse: generatio and corruptio, as Aristotle’s mediaeval Latin translators rendered the Greek, but unrestrained. Pluralism is usually lauded, but extreme polarity is not a good thing, or so history tells us. Similar conjunctions of circumstances mean that we live in the New Baroque Age. That gives us the opportunity to ponder the juxtaposition of opposites, especially the tension between enlightenment and ignorance.

Many of the lessons of history are thoroughly uncomfortable. With the shallowness of the technocratic age, commentators tend to call upon them only when they are convenient. As so many lessons are far from being convenient, they tend to be manipulated to make them so. Technological change has severely blurred the margins between fantasy and reality. History, if it appears, is the stuff of revisionism. In the struggle to see the world as it is, to master the great tide of data, information and events, can one achieve vision, or is it merely a mirage? Is the world we experience a reflection of some hard reality, or merely of our own predilections and prejudices?

Since the end of the old Baroque period there has been a constant division and subdivision of knowledge, eventually using the codification, or model, furnished by 19th century pedagogy. The paradox is that, in the age of superabundant information, never has it been harder to see the overall picture. The rigidity with which the subdivisions of knowledge are maintained, and the keenness with which intellectual territory is defended, make it even harder.

Disaster studies are, or should be, the epitome of interdisciplinarity. However, that is not the issue at stake. The issue is not how to combine disciplines, or even how to transcend them, in order to understand disasters. Instead, it is how to see disasters in the context of other things. A very simple illustration of why that is important is as follows. Poor communities have high and rising vulnerability to hazards. Ignorance and lack of resources are cited as the causes. Education programmes and investment in hazard reduction schemes are implemented. In reality, hazards are not a priority because they are not the most important problems that the poor communities face. After decades of effort to mitigate hazards and reduce vulnerability, we need to ask why success has been so limited. The answer is that it is the context, not the hazard, that needs to be mitigated. Throughout the world, public austerity measures have been accompanied by a mass transfer of wealth from the poor to the rich. When people’s general resistance is weakened, no amount of targeted programmes to reduce the impact of disasters will solve the problem. Like the hydra, it will pop up again every time it is annihilated.

Disaster occurs in the context of fiscal stringency, wealth transfer, proxy-wars, insurgency, murderous alternative economies, corruption, environmental degradation and the struggle to master technology before it masters us. In many instances, the root causes of disaster need to be sought outside disaster itself. This tends to make them less tractable. One of the biggest challenges is to understand how factors that have nothing directly to do with disasters affect them. For instance, what price reduction in vulnerability to floods when the real problem is the prevalence of murder? What price safety against earthquakes when the real problems are exploitation of labour and shortage of work? Obviously, an even bigger challenge is to solve the contextual problems and cascade the solution into disaster reduction.

One reason for mentioning history is that the problem of the context of disaster consists of two parts: practical aspects of survival, and ideological aspects of the attitudes that determine people’s predilections and choices. The second of these is the more prone to the pseudo-cyclic nature of history. Hence, we need to ask what the connection is between history and the future. As we look back on what were once futures, we can see that there is a connection between the past and what is to come. Not only is it composed of the momentum imparted by decisions taken and actions completed, but it is also the sum of mind-sets and the resurgence of old ideas in new forms.

The defining issue of modern life is humanity’s relationship with the technology it has created. It is this that leads us into the New Baroque Age, characterised by the tension of opposites. This is my model of it.