Tuesday, 29 April 2008

What Makes an Extreme Natural Event a Catastrophe?


Abstract

This paper reviews a varied set of definitions and measures of disaster. These include the mathematical concept of abrupt hysteresis and bifurcation, the earth science concept of accelerated and widespread change in environmental conditions, the social science concept of radical, if transient, mutation of organizations, the engineering concept of major technological system failure, and the medical concept of mass casualty situations. In each of these schools of thought, the terms 'hazard' and 'vulnerability' are fundamental components of disaster but cannot adequately be defined independently of one another.

Apart from disciplinary orientations, there are other reasons why it has proved very difficult to obtain a consensus on the meaning of the terms 'disaster' and 'catastrophe'. Some scholars regard them as synonymous, while others consider them as descriptive of different levels of impact. Attempts to impose a numerical threshold on disaster have not proved particularly successful. Likewise, geographical extent and other measures of size have not led to a better functional definition. This paper uses examples of differences in the sizes of various important earthquakes and landslides selected from recent history to show how the magnitude and frequency of natural hazards are poorly correlated with their disaster potential. Instead, in most situations of risk, vulnerability--coupled with its opposite, resilience--tends to be a better predictor of damage, destruction and casualties.

The significance of disaster is measured in terms of both its effects and its potential for destruction. In this respect, the catastrophe potential of very large, very infrequent events is the subject of much theoretical, as well as practical, uncertainty. This paper explores some of the dilemmas associated with assigning values of low potential to risks that are extremely infrequent but potentially very damaging, such as asteroid impacts or some of the largest Quaternary volcanic eruptions.

The paper ends by considering trends and predictions for global disaster potential in the forthcoming decades. Some of these are contradictory and most present a bleak picture that may, nevertheless, underestimate human ingenuity in the face of adversity.

Introduction

A natural disaster can be defined as some rapid, instantaneous or profound impact of an extreme environmental phenomenon upon the human socio-economic system (Alexander 1993a, p. 4). Such a definition is, of course, as vague as it is broad, and beneath it is concealed a terminological minefield (Quarantelli 1998). In disaster studies, failure to agree on the meaning of terms stems from uncertainty about the significance of phenomena and the relationships between them. This paper will review what is known and unknown about the underlying causal relationships that result in natural disasters. Special attention will be paid to the question of how to characterise low-probability, high-consequence (i.e. very infrequent) geophysical events in terms of their possible impacts as disasters, not merely their rôles as extreme manifestations of Nature's power.

The basis of hazard, vulnerability and risk

According to a widely used general model,

H x V ( x E ) = R --> D (1)

a geophysical hazard, H, acts upon a situation, V, in which people or environments are vulnerable to damage and loss (conditioned by varied degrees of exposure to the hazard, E) to produce a state of risk, R (UNDRO 1982). Rather like friction, risk is a theoretical concept until it is mobilised, at which time it is transformed into the impact of the event (disaster, D) in terms of some combination of death, injury, destruction, damage, disruption or other form of loss. In another formulation,

R = fcn { H, V, E, B, Rr, Dr, ... } (2)

risk is a function of hazard (H), vulnerability (V), exposure of vulnerable elements to the hazard (E), background levels of the hazard (B), the release rate of the hazard (Rr), the dose rate of those elements or people that absorb its impact (Dr), and sundry other qualifiers (Alexander 2000, pp. 14-16).
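
A minimal computational sketch may help to fix ideas about relations (1) and (2). The functional forms, variable names and index values below are illustrative assumptions of mine; neither UNDRO (1982) nor Alexander (2000) prescribes a particular numerical scheme.

```python
# Illustrative sketch of relations (1) and (2): all quantities are expressed
# as dimensionless indices on a 0-1 scale. The functional forms and the
# example values are assumptions made for illustration only.

def risk_simple(hazard: float, vulnerability: float, exposure: float = 1.0) -> float:
    """Relation (1): H x V (x E) = R."""
    return hazard * vulnerability * exposure

def risk_general(hazard: float, vulnerability: float, exposure: float,
                 background: float, release_rate: float, dose_rate: float) -> float:
    """Relation (2): risk as a function of H, V, E, B, Rr and Dr.
    A multiplicative form is assumed here purely for illustration."""
    return (hazard + background) * vulnerability * exposure * release_rate * dose_rate

# A moderately severe hazard acting on a highly vulnerable, partly exposed
# population (all numbers invented).
print(risk_simple(hazard=0.6, vulnerability=0.8, exposure=0.5))      # ~0.24
print(risk_general(0.6, 0.8, 0.5, 0.05, 0.9, 0.7))                   # ~0.16
```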

Most of the secondary factors in the relationship described above relate to hazard, rather than vulnerability--i.e., to the causal agent, not the population at risk. However, the latter requires qualification as well, to account for its dynamism over time:

Vt = fcn { Ra - Rm +/- Rp } (3)

where Vt is total vulnerability, which is increased by risk amplification measures, Ra (i.e. risk-taking), and mollified by risk mitigation measures, Rm, both of which can be affected either positively or negatively by how risks are perceived by the various actors in the scenario, Rp (Alexander 2000, p. 15).
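
Relation (3) can be sketched in the same hedged fashion; the additive form and the numbers below are again purely illustrative and are not taken from the cited source.

```python
# Illustrative sketch of relation (3): Vt = Ra - Rm +/- Rp. The additive
# form and the numbers are assumptions for illustration; the source gives
# no numerical scheme.

def total_vulnerability(amplification: float, mitigation: float,
                        perception_effect: float) -> float:
    """Total vulnerability: risk amplification (risk-taking) minus risk
    mitigation, adjusted up or down by the signed effect of risk perception."""
    return amplification - mitigation + perception_effect

# Heavy risk-taking, modest mitigation, and an optimistic perception of the
# hazard that erodes precaution still further (values invented).
print(total_vulnerability(amplification=0.7, mitigation=0.3, perception_effect=0.1))  # ~0.5
```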

In synthesis, disaster is the outcome of risk, which is a product of physical hazard and human or environmental vulnerability. Whether or not these are truisms, it is essential to remember that in the risk relationship hazards are not hazardous unless they threaten something and people or places are not vulnerable unless they are threatened by something. Thus the concepts of hazard and vulnerability cannot be defined independently of one another.

Several other definitional questions deserve to be considered. First, the term 'natural disaster' has geological roots and stems from analysis of extreme geophysical events. Yet the crucial role of vulnerability signifies that 'natural' is very much a convenience term. According to an emerging consensus, disaster is defined rather more by its effects than by its causes (Hewitt 1997, Ch. 1). Vulnerability is seen as the key to understanding and forecasting impacts, and there is intense interest in examining the complex global, regional and local mechanisms that prevent vulnerability from being reduced sufficiently (Wisner 2001a). Thus the rather bald statistics listed in Table 1 demonstrate, not merely the difficulty of characterising earthquakes as disasters using any single measure such as magnitude, but also the highly varied nature of the impacts of large earthquakes regardless of the inherent physical energy for which magnitude is a surrogate index. [1]

Table 1. The impact of selected earthquakes in the magnitude range 7.1-7.8.

Earthquake               | Magnitude | Hypocentral depth (km) | Deaths | Cost of damage (million US$)
California, 17.10.1989   | 7.1       | 19                     | 62     | 12,000
Romania, 4.3.1977        | 7.2       | 94                     | 1,500  | 800
Chile, 3.3.1985          | 7.8       | 33                     | 200    | 1,200
Japan, 26.5.1983         | 7.7       | 33                     | 104    | 600
Turkey, 17.8.1999        | 7.4       | 15                     | 15,637 | 20,000
El Salvador, 13.1.2001   | 7.6       | 39                     | 1,159  | 1,255
Bhuj, India, 26.1.2001   | 7.7       | 45                     | 19,739 | 800
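
As a simple check on this argument, a rank correlation can be computed between magnitude and the impact measures in Table 1. The sketch below is purely illustrative: it uses only the seven events in the table and the scipy.stats routine, and with so small a sample no coefficient approaches statistical significance, which is consistent with the contention that magnitude is a poor predictor of impact.

```python
# Rank correlation between magnitude and impact for the seven earthquakes in
# Table 1 (figures as given in the table). Purely illustrative.
from scipy.stats import spearmanr

magnitudes  = [7.1, 7.2, 7.8, 7.7, 7.4, 7.6, 7.7]
deaths      = [62, 1500, 200, 104, 15637, 1159, 19739]
damage_musd = [12000, 800, 1200, 600, 20000, 1255, 800]

rho_d, p_d = spearmanr(magnitudes, deaths)
rho_c, p_c = spearmanr(magnitudes, damage_musd)
print(f"magnitude vs deaths: rho = {rho_d:.2f}, p = {p_d:.2f}")
print(f"magnitude vs damage: rho = {rho_c:.2f}, p = {p_c:.2f}")
# With only seven events, neither coefficient approaches statistical
# significance -- which is the point: magnitude alone says little about impact.
```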


Secondly, the semantic difference between 'disaster' and 'catastrophe' deserves attention. Some authors (Scanlon 1993) regard a catastrophe as a particularly large and notable disaster; others, including myself, consider the two terms to be synonymous. In any case, the threshold of seriousness at which an event becomes a disaster is not fixed and there is no abrupt change in any variable used to define disaster at which a higher order of effect, or catastrophe, naturally emerges. Attempts to characterise disaster empirically (Foster 1976, Keller et al. 1992, Keown-McMullan 1997) have generally not been widely accepted by the community of disaster researchers.

Magnitude and frequency

On 27 March 1964 the Great Alaska earthquake provoked a landslide of 29 million cubic metres of rock, which slid at an estimated 180 km/hr into the Sherman Valley in the centre of the state. No one was injured, no houses were destroyed and the event remained a geological curiosity, discovered by chance during an aerial photography overflight (Shreve 1966). On 21 October 1966 a landslide 193 times smaller in volume, travelling at a speed 25-30 times lower, overwhelmed two schools and a residential district in the South Wales town of Aberfan (Aberfan Tribunal 1967). Some 144 people, including 116 small children, were killed in a disaster that is still widely remembered and commemorated. These two rather extreme examples illustrate how physical magnitude alone can be a poor predictor of disaster potential.

There are, however, exceptions to this generalisation, and to understand them one needs to consider the magnitude-frequency relationship. Originally derived for physical, not human, relationships, this empirical regularity draws a balance between the cumulative effects of many small events and the occasional impacts of uncommon, large ones (Wolman and Miller 1960). Students of natural disaster have tended to discount the combined effects of small emergencies and seek to define a threshold beyond which a single event becomes large enough to have significant effects in its own right. However, not only is the threshold infinitely variable, but the cumulative effect of smaller events is also apt to be misestimated (Alexander 2000, pp. 231-234).
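
The point about misestimating the cumulative effect of small events can be illustrated with a toy calculation. The sketch below assumes a Gutenberg-Richter-style frequency law and a power-law relation between magnitude and loss; both exponents are invented for the purpose of illustration and are not taken from Wolman and Miller (1960) or from the disaster literature.

```python
# Toy comparison of the cumulative contribution of frequent moderate events
# versus rare large ones. Both the frequency law (log10 N = a - b*M) and the
# loss exponent are invented for illustration.
import numpy as np

a, b = 5.0, 1.0            # Gutenberg-Richter-style parameters (assumed)
loss_exponent = 1.5        # loss per event assumed to scale as 10**(loss_exponent * M)

magnitudes = np.arange(5.0, 9.05, 0.1)                # M 5.0 to 9.0 in steps of 0.1
annual_counts = 10 ** (a - b * magnitudes)            # expected events per year
loss_per_event = 10 ** (loss_exponent * magnitudes)   # relative loss per event
expected_loss = annual_counts * loss_per_event        # expected annual loss per bin

share_below_7 = expected_loss[magnitudes < 7.0].sum() / expected_loss.sum()
print(f"Share of long-run loss from events below M 7: {share_below_7:.0%}")
# With these exponents the rare large events dominate the long-run total;
# if the loss exponent were smaller than b, the balance would reverse.
```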

The characteristic magnitude-frequency curve that applies to almost all recurrent natural phenomena is in all probability discontinuous in its upper parts (i.e. very high magnitudes and very low frequencies). No measure of central tendency can be gained from the currently available data, which are incomplete in their upper ranges. Moreover, most estimates of the human impact of huge events are highly speculative and based on questionable assumptions. Thus the prediction that a major earthquake in Tokyo, of the maximum expected physical dimensions, would cost not merely hundreds of thousands of lives but also half of Japanese GNP for the year of occurrence (Shah 1995) has been criticised as wide of the mark (Wiggins 1996). It is certainly open to question: in another context, terrorism scenarios for anthrax infection in the United States predicted up to 1.5 million casualties, yet when attacks did occur the number of deaths was eight (Ali 2000). [2] There will be more discussion of the magnitude-frequency relationship later in this paper.

Other definitions of disaster

A new vogue is beginning to develop for holistic approaches to disaster (McEntire 2001). It stems from the realisation that human well-being depends, not only on geophysical forecasting and engineering structural mitigation, but also on intangible factors of social and cultural origin (Oliver-Smith 1986, Alexander 2000). The applied dimension of holistic recovery strategies is not yet underpinned by sufficient fusion among the disciplinary approaches to basic disaster studies. These have tended to follow seven different paths and their protagonists have not infrequently been at loggerheads with one another (Alexander 1993a, pp. 12-14):-

(a) Catastrophe theory refers to the mathematics of hysteresis and bifurcation in the trajectories of differentially-derived variables through dimensional spaces (Thom 1975). In reality, only the most elementary of Thom's theorems can be applied to direct causal relationships in real physical environments, and in many cases only by analogy (Kennedy 1980). Higher order 'catastrophes' are nothing of the sort: they are abrupt changes in controlling parameters and cannot be used to describe the losses and disruption caused by disaster.

(b) The earth science definition of disaster largely refers to accelerated and widespread change in environmental conditions (Bell 1999). There is a tendency among earth scientists to refer to hazard independently of what the supposedly hazardous phenomenon threatens. This is inadvisable, as it ignores the crucial roles of vulnerability and feedback in creating disaster. Thus the classic model of causality

H --> V --> D (4)

in which hazard acts upon vulnerability to produce disaster, ignores the role of human action in modifying exposure to the hazard, if not the hazard itself (Hewitt 1983).

(c) The engineering approach to disaster concentrates on major technological system failures. Horlick-Jones (1995) has very rightly conceptualised this as a form of perceived betrayal of society by its leaders, planners and providers. Others have looked with suspicion at the technocratic approach to disaster in terms of its perceived use as a tool of domination (Hewitt 1983, Miller 1985). The principal problems with some engineering views of disaster are firstly the lack of appreciation of the social, economic, cultural and strategic constraints that drive vulnerability up rather than down and secondly the common lack of consideration of the wide variance in human impacts associated with engineering failures (see Zebrowski 1997).

(d) Sociologists and social psychologists have created a taxonomy of disaster based on interpersonal and organisational dynamics (Drabek 1986). They have given much attention to the radical, if transient, mutation of organisations, peer groups, family behaviour and so on during periods of crisis. Work has emphasised the resilience of human societies under stress and their ability to recover and regain stability (Nigg 1995). However, social science studies of disaster tend to be vague about the physical and technological underpinnings of events.

(e) Medical approaches to disaster have concentrated on mass casualty situations, though the definition of 'mass casualty' has not always proved straightforward (Keown-McMullan 1997). They have also developed a sophisticated epidemiology of disaster, though relatively few exponents of this have delved deeply into the social, technological and physical causes of mortality and morbidity (those who have include Noji 1997 and Auf der Heide 1989).

(f) Students of humanitarian aid have developed a portrait of the modern complex emergency, a phenomenon characterised by a mixture of military, social, economic, political and environmental instability aggravated by recurrent natural disasters and underpinned by regional or global political strategies (Prehospital and Disaster Medicine 1995). Proponents of the idea argue that the complex emergency is the fruit of globalisation, the shifting global power balance, decolonisation and the world arms trade (COPAT 1981, Duffield 1996). Opponents argue that all disasters are more or less complex and that the roots of the so-called 'complex emergency' are a matter of sustainable development and political stability. However, neither group would dispute the fact that people caught up in complex emergencies evolve patterns of coping and survival, sometimes spontaneously (Kirkby et al. 1997).

(g) Over the last 80 years geographers and anthropologists have developed the human ecology approach to disaster with emphasis on the adaptation of people and communities to natural environmental extremes (Burton et al. 1993, Oliver-Smith 1998). This has been very effective in influencing hazard management policy towards the adoption of a wider range of non-structural solutions to the disaster problem, but many of the characterisations of culture and its role in perceiving hazards have been crude and mechanistic (Palm 1998).

This brief review of schools of thought should be concluded with the observation that there are many more opportunities for interdisciplinary research than there are examples of it. In a previous work, I delved into the causes and effects of 'academic tribalism' and how it has hindered the development of a holistic approach to disasters (Alexander 2000, pp. 30-36). The solution is firstly to base studies on an impartial assessment of what the most pressing issues are and what sort of solution they demand, and secondly to look for the themes that might successfully link up the sectoral approaches. Two of the most important of these are the contextual and cultural underpinnings of disaster.

Context and culture

Disaster has several forms of significance for human communities. First of all, it is--obviously--a source of death, injury, destruction, damage and disruption. Ideas on what is a significant level of these vary considerably, often in relation to mass media 'constructions', or choice of elements to emphasize, of what is significant (Goltz 1984, Ploughman 1995). Secondly, disaster is a marker point in history and a milestone in the lives of survivors (Lifton 1980). Thirdly, it is an indicator of future catastrophe potential.

The four fundamental dimensions of disaster are magnitude (of the causal phenomena), intensity (of the effects of these phenomena), time (duration and frequency) and space (territorial extent and geographical variations in intensity). As most disasters are recurrent, the pattern of magnitudes and intensities distributed in space and time is cumulative (though one must allow for the lapse of individual events with the passage of time--see Alexander 1995). For example, the centralised bureaucracies of China registered some 1092 floods over a period of 2155 years and, despite the vastness of the lands in which these events occurred, natural disaster is undeniably a part of the history and culture of China (Jones 1987). Thus, as cumulative impacts and markers in people's lives, disasters are absorbed into the stream of history, which is transformed into human culture (Alexander 1991).

Culture is a phenomenon, or really a process, that conjoins ancient cultural survivals with modern cultural metamorphosis. Among other things, it interprets the technological developments, which form the impetus for so much modern change, in terms of standards established centuries before and progressively modified ever since. Technology is therefore both an integral aspect of modern culture (Meyer-Abich 1997) and an instrument of its change. The relentless evolution of technology provides sources of both vulnerability and its mitigation: it is a double-edged sword (Alexander 1995).

These considerations lead one to conclude that theories of disaster developed in the 1960s and 1980s are no longer adequate to describe the phenomenon and its dynamics in the 2000s. The 'linear model' used by most students of catastrophe treats hazard as the cause and prime mover of risk and impact, through its action upon vulnerability (Burton et al. 1993). The first attempt to introduce feedback into the situation merely reversed the order of importance so that vulnerability took precedence over hazard (Hewitt 1983). New models (Mitchell et al. 1989, Alexander 2000, Wisner 2001b) offer a more complex view of feedback relationships based upon the interactions of culture, context and the physical and social aspects of disaster (Figure 3).

One interesting aspect of this is that over history there has been remarkably little 'Darwinism' in the evolution of strategies to combat the risk of disaster. Comparatively rarely has there been a 'survival of the fittest' building, community, infrastructure, administration or emergency service: on the contrary, there has been an endless recreation of vulnerability, matched by geographical inertia in the use of hazardous locations (Alexander 1993a, p. 12). One might surmise that over the span of recorded history disaster has done relatively little to put human ingenuity to the test. But could it all be changed by a Holocene-scale major event? The next two sections will examine the possible significance of the largest possible disasters.

What event to plan for: large, medium or small?

Mount Vesuvius in Southern Italy last erupted in 1944. It remains active and some vulcanologists believe that the longer the interval of time before it erupts again, the stronger will be the eruption that occurs (Dobran et al. 1994). Vesuvius is a complex volcano. In historical times it has given forth a wide variety of eruption styles, from the Plinian column of A.D. 79 to the pyroclastic flows of 1631 and the outpourings of a'a lava during the 17th, 18th and 19th centuries. A large eruption could produce large volumes of poisonous gases, deep ashfalls, rapid lahars and intense pyroclastic flows. The population of the 17 circum-Vesuvian municipalities exceeds 550,000 and somewhere between 300,000 and three million more people would be affected by a major eruption. Ash falls alone could cause the roofs of some 16,000 buildings to collapse (Barberi et al. 1990). In 1631, some 4000 people died when a pyroclastic flow ran through Portici: the current urban density of this town exceeds 18,000 people per square kilometre and is one of the highest in Europe (Alexander 2000, pp. 116-128).

As Vesuvius is one of the most widely-studied and intensively-monitored volcanoes in the world, there will be warning of an impending eruption and evacuation plans will be put into effect. People will be evacuated by road, rail and sea to all 20 of Italy's regions, and some probably abroad. But the situation is far from comforting: the problems of managing it are legion and it is only a matter of time before disaster strikes. The magnitude of the ensuing catastrophe will be determined by the exact conjunction of what happens volcanically and what is done to mitigate the worst consequences (Alexander 1999).

The size of the next eruption of Vesuvius is open to speculation, but it should be remembered that what is popularly supposed to be large is not necessarily so: the eruption of Mount St Helens in May 1980 emitted some 0.25 km³ of magma equivalent, that of Santorini in 1600 BC emitted 11 km³, and Krakatau in 1883 emitted 17 km³. But the prehistoric eruption of Taupo in New Zealand may have emitted more than 100 km³ of magma equivalent (Sigurdsson 1991). Bearing in mind that the Vesuvian case represents just about the maximum event that could successfully be coped with (and that with a fair degree of muddling through), the large event is a thorn in the flesh for the emergency planner. By judicious application of modern technology and means of organisation it might be possible to push up the threshold of coping, but not by much. At any rate, it seems that we cannot begin to prepare for very large events until we have adequately coped with the more run-of-the-mill smaller ones, and in this respect there remains much to do.
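
For proportion, a trivial calculation using the volumes quoted above expresses each eruption as a multiple of the Mount St Helens event of May 1980.

```python
# Erupted magma volumes quoted in the text, expressed as multiples of the
# Mount St Helens eruption of May 1980.
volumes_km3 = {
    "Mount St Helens 1980": 0.25,
    "Santorini c. 1600 BC": 11,
    "Krakatau 1883": 17,
    "Taupo (prehistoric)": 100,
}
baseline = volumes_km3["Mount St Helens 1980"]
for name, volume in volumes_km3.items():
    print(f"{name}: {volume} km^3 ({volume / baseline:.0f} x St Helens 1980)")
```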

What is the significance of the very large event in terms of catastrophe?

According to one set of considerations (IFRCRCS 2001), on average each year 220 natural disasters, 70 technological disasters and three new armed conflicts occur. The second half of the 20th century saw increases of 250 per cent in the number of disasters recorded, 1500 per cent in their economic impact and 500 per cent in both the number of people affected and the number of mass-casualty disasters (Munich Re Group 1999). Even allowing for the probable inflation of estimates in recent years, if the curve of these trends were extrapolated much beyond the year 2020 it would become almost vertical. Hence, something must give. Human adaptability will be challenged and stimulated into action by the cumulative weight of small, medium and large disaster impacts. There is no indication, however, that the balance must necessarily shift in favour of the world's poorer countries, which currently suffer about 80 per cent of disasters and 90-95 per cent of deaths in disaster (Wisner 2001a).
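
As a hedged illustration of why extrapolation beyond 2020 looks so steep, the percentage increases cited above can be converted into compound annual growth rates and projected forward. The assumption of smooth exponential growth is mine and is used only to make the point about extrapolation.

```python
# Converting the increases cited for the second half of the 20th century
# (taken here as 1950-2000) into compound annual growth rates and projecting
# them to 2020. The smooth exponential-growth assumption is illustrative only.
increases_pct = {
    "number of disasters": 250,
    "economic impact": 1500,
    "people affected": 500,
}
for name, pct in increases_pct.items():
    factor_50yr = 1 + pct / 100                   # e.g. +250% over 50 years -> x3.5
    annual_rate = factor_50yr ** (1 / 50) - 1     # compound annual growth rate
    factor_by_2020 = (1 + annual_rate) ** 70      # 1950 to 2020
    print(f"{name}: {annual_rate:.1%} per year, x{factor_by_2020:.1f} by 2020")
```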

The unknown factor is the very large, very low frequency and low probability event--the disaster with a Holocene-scale recurrence interval. Asteroid impacts and some of the largest prehistoric volcanic eruptions are potentially the main natural sources of such events. [3] Few guidelines exist as to the amount of attention that emergency planners should give such hazards. In another sphere, the events of 11 September 2001 in the United States have demonstrated that disaster planning has been based too exclusively on the run-of-the-mill event without developing an adequate capacity to plan for truly exceptional scenarios (Alexander 2002a).

Yet on the other hand some very real unmet needs are associated with mitigating the consequences of rather banal, small-scale events. Never has the gap between scientific knowledge and implementation been greater. It is a well-known principle that resources cannot be tied up in waiting for events that have only a tiny probability of occurring in any human lifetime (Alexander 2002b). But does this mean that they should be ignored on an operational level?

Few very large, individual extreme events have left their imprint on community consciousness or the progress of human history. No real evidence exists that disasters have ever 'destroyed civilisation' (cf. Antonopoulos 1992, Alexander 1993b). However, in the event of a major asteroid strike, or equivalent impact, we have no means of predicting casualty patterns, losses or capacity to respond. We have no prior knowledge of the social mutations that might possibly occur. Emergency planning relies on a mixture of theory and experience, and both are in short supply regarding gigantic disasters. Despite the many experiences of smaller events, it is doubtful whether one could 'gear up' plans to cope with much larger scenarios. Moreover, there is little evidence either way as to whether the 'disaster movie' situation would prevail and society would break down under the strain. Nevertheless, giant events would require draconian measures, and the current drift of disaster management is--quite rightly--to democratise the whole process of civil protection in order to make individual citizens more responsible for their own safety (Alexander 2002b).

Perhaps it is the millennium that drives thinkers into the arms of the neocatastrophists, but the future role of any very large geophysical event as a catastrophe is not sufficiently understood for speculation to be avoided, and speculation tends to invite catastrophism (Dury 1980). [4] It is quite possible that such an event could override most of the current provisions and mores for coping with disaster. It is equally likely that additional preparation is neither feasible nor desirable, given the probable cost and the low likelihood that it will be utilised during the next century or so. Terrorism scenario-building and strategic warfare modelling have given some basis for estimating the impact of huge events, and no doubt future years will see much refinement of the ability to predict major natural events scientifically. But none of this adds up to a blueprint for action.

In conclusion, the presence of a statistically small risk of cataclysm should not cause us to change our focus on preparing for the impact of smaller events, but perhaps it should induce us to widen our horizon with regard to the dynamics and implications of future natural catastrophes.

References

Aberfan Tribunal 1967. Report of the Tribunal Appointed to Inquire into the Disaster at Aberfan on October 21st, 1966. HMSO, London.

Alexander, D.E. 1991. Applied geomorphology and the impact of natural hazards on the built environment. Natural Hazards 4(1): 57-80.

Alexander, D.E. 1993a. Natural Disasters. UCL Press, London; Kluwer Academic Press, Boston, 632 pp.

Alexander, D.E. 1993b. Aetiology of the Thera eruption and its effects. Natural Hazards 7(2): 187-189.

Alexander, D.E. 1995. A survey of the field of natural hazards and disaster studies. In A. Carrara and F. Guzzetti (eds) Geographical Information Systems in Assessing Natural Hazards. Kluwer Academic Publishers, Dordrecht: 1-19.

Alexander, D.E. 1999. Earthquakes and vulcanism. In M. Pacione (ed.) Applied Geography: Principles and Practice. Routledge, London, 66-82.

Alexander, D.E. 2000. Confronting Catastrophe: New Perspectives on Natural Disaster. Terra Publishing, Harpenden, UK, and Oxford University Press, New York, 282 pp.

Alexander, D.E. 2002a. Nature's impartiality, man's inhumanity: reflections on terrorism and world crisis in a context of historical disaster. Disasters 26(1): 1-9.

Alexander, D.E. 2002b. Principles of Emergency Planning and Management. Terra Publishing, Harpenden, UK and Oxford University Press, New York.

Ali, J. 2000. Lessons learned from tabletop exercises. RAND Bioterrorism Conference, Santa Monica CA, February 8, 2000, Section III: Trans‑Attack Panel: Emergency Response, 5 pp. (http://www.rand.org/nsrd/bioterr/javedali.htm)

Antonopoulos, J. 1992. The great Minoan eruption of Thera volcano and the ensuing tsunami in the Greek archipelago. Natural Hazards 5(2): 153-168.

Auf der Heide, E. 1989. Disaster Response: Principles of Preparation and Co-ordination. Mosby, St Louis, Missouri, 363 pp.

Barberi, F., G. Macedonio, M.T. Pareschi and R. Santacroce 1990. Mapping the tephra fallout risk: an example from Vesuvius, Italy. Nature 344: 142-144.

Bell, F.G. 1999. Geological Hazards: Their Assessment, Avoidance and Mitigation. Routledge, London, 648 pp.

Burton, I., R.W. Kates and G.F. White 1993. The Environment as Hazard (2nd edn). Guilford Press, New York, 304 pp.

COPAT 1981. Bombs for Breakfast. Committee on Poverty and the Arms Trade, London.

Dobran, F., A. Neri and M. Todesco 1994. Assessing the pyroclastic flow hazard at Vesuvius. Nature 367: 551-554.

Drabek, T.E. 1986. Human System Response to Disaster: An Inventory of Sociological Findings. Springer-Verlag, New York, 509 pp.

Duffield, M. 1996. The symphony of the damned: racial discourse, complex political emergencies and humanitarian aid. Disasters 20(3): 173-193.

Dury, G.H. 1980. Neocatastrophism? A further look. Progress in Physical Geography 4(3): 391-413.

Foster, H.D. 1976. Assessing disaster magnitude: a social science approach. Professional Geographer 28(3): 241-247.

Goltz, J.D. 1984. Are the news media responsible for the disaster myths? A content analysis of emergency response imagery. International Journal of Mass Emergencies and Disasters 2(3): 345-368.

Gutenberg, B. and C.F. Richter 1942. Earthquake magnitude, intensity, energy and acceleration. Bulletin of the Seismological Society of America 32: 163-192.

Hewitt, K. 1983. The idea of calamity in a technocratic age. In K. Hewitt (ed.) Interpretations of Calamity. Unwin-Hyman, London: 3-32.

Hewitt, K. 1997. Regions of Risk: a Geographical Introduction to Disasters. Addison Wesley Longman, Reading, Mass., 311 pp.

Horlick-Jones, T. 1995. Modern disasters as outrage and betrayal. International Journal of Mass Emergencies and Disasters 13(3): 305-315.

IFRCRCS 2001. World Disasters Report 2001: Focus on Recovery. International Federation of Red Cross and Red Crescent Societies, Geneva, 248 pp.

Jones, E.L. 1987. The European Miracle: Environments, Economies and Geopolitics in the History of Europe and Asia (2nd edn). Cambridge University Press, Cambridge.

Keller, A.Z., H.C. Wilson and A. Al‑Madhari 1992. Proposed disaster scale and associated model for calculating return periods for disasters of given magnitude. Disaster Prevention and Management 1(1): 7-19.

Kennedy, B.A. 1980. A naughty world. Institute of British Geographers, Transactions (New Series) 4: 550-558.

Keown-McMullan, C. 1997. Crisis: when does a molehill become a mountain? Disaster Prevention and Management 6(1): 4-10.

Kirkby, J., P. O'Keefe, I. Convery and D. Howell 1997. On the emergence of complex disasters. Disasters 21(2): 177-180.

Lifton R.J. 1980. The concept of the survivor. In Dimsdale, J. (ed.) Survivors, Victims and Perpetrators: Essays on the Nazi Holocaust. Hemisphere, New York.

McEntire, D.A. 2001. Triggering agents, vulnerabilities and disaster reduction: towards a holistic paradigm. Disaster Prevention and Management 10(3): 189-196.

Meyer-Abich, K.M. 1997. Humans in nature: toward a physiocentric philosophy. In J.H. Ausubel and H.D. Langford (eds) Technological Trajectories and the Human Environment. National Academy of Engineering, National Academy Press, Washington, D.C.: 168-184.

Miller, A. 1985. Technological thinking: its impact on environmental management. Environmental Management 9(2): 179-190.

Mitchell, J.K., N. Devine and K. Jagger 1989. A contextual model of natural hazard. Geographical Review 79(4): 391-409.

Munich Re Group 1999. Topics 2000: Natural Catastrophes, the Current Position. Munich Re Group, Munich, 126 pp.

Nigg, J.M. 1995. Disaster recovery as a social process. In Wellington After the Quake: The Challenge of Rebuilding. The Earthquake Commission, Wellington, New Zealand: 81‑92.

Noji, E.K. (ed.) 1997. The Public Health Consequences of Disasters. Oxford University Press, New York, 468 pp.

Oliver-Smith, A. 1986. Disaster context and causation: an overview of changing perspectives in disaster research. In A. Oliver-Smith (ed.) Natural Disasters and Cultural Responses. Studies In Third World Societies 36. College of William and Mary, Williamsburg: 1-34.

Oliver‑Smith, A. 1998. Disasters, social change, and adaptive systems. In E.L. Quarantelli (ed.) What is a Disaster? Perspectives on the Question. Routledge, London: 231-233.

Olson, K.B. 1999. Aum Shinrikyo: once and future threat? Emerging Infectious Diseases 5: 213-216.

Palm, R. 1998. Urban earthquake hazards: the impact of culture on perceived risk and response in the USA and Japan. Applied Geography 18(1): 35-46.

Ploughman, P. 1995. The American print news media 'construction' of five natural disasters. Disasters 19(4): 308-326.

Prehospital and Disaster Medicine 1995. Complex, humanitarian emergencies: I. Concept and participants. Prehospital and Disaster Medicine 10: 36-47.

Quarantelli, E.L. (ed.) 1998. What is a Disaster? Perspectives on the Question. Routledge, London, 312 pp.

Scanlon, T.J. 1993. Accident, disaster, catastrophe. Stop Disasters 14: 14.

Shah, H.C. 1995. Scientific profiles of the "Big One." Disaster Research 179 (Internet bulletin) and Natural Hazards Observer (November 1995), Natural Hazards Center, University of Colorado, Boulder, Colorado.

Shreve, R.L. 1966. Sherman landslide, Alaska. Science 154: 1639-1643.

Sigurdsson, H. 1991. The intensities and magnitudes of volcanic eruptions. Earthquakes and Volcanoes 22(3): 142-146.

Thom, R. 1975. Structural Stability and Morphogenesis: An Outline of a General Theory of Models (trans. D.H. Fowler). Addison-Wesley, Reading, Mass.

Tiv, M. 2000. Implications of the duration of strong ground motion: (observations from the U.S. Loma Prieta earthquake of Oct. 17, 1989). In S. Balassanian, A. Cisternas and M. Melkumyan (eds) Earthquake Hazard and Seismic Risk Reduction. Advances in Natural and Technological Hazards Research no. 12. Kluwer, Dordrecht.

UNDRO, 1982. Natural Disasters and Vulnerability Analysis. Office of the United Nations Disaster Relief Co-ordinator (UNDRO), Geneva.

Wiggins, J.H. 1996. A reply to "Scientific profiles of the 'Big One.'" Disaster Research 181 (Internet bulletin) Natural Hazards Center, University of Colorado, Boulder, Colorado.

Wisner, B. 2001a. Risk and the neoliberal state: why post-Mitch lessons didn't reduce El Salvador's earthquake losses. Disasters 25(3): 251-268.

Wisner, B. 2001b. 'Vulnerability' in disaster theory and practice: from soup to taxonomy, then to analysis and finally tool. International Work-Conference, Disaster Studies of Wageningen University and Research Centre, June, 2001, 24 pp.

Wolman, M.G. and J.P. Miller 1960. Magnitude and frequency of forces in geomorphic processes. Journal of Geology 68: 54-74.

Zebrowski, E., Jr. 1997. Perils of a Restless Planet: Scientific Perspectives on Natural Disasters. Cambridge University Press, Cambridge, 306 pp.

Notes

1. In reality, bracketed duration and maximum acceleration are also essential determinants of seismic impact (Gutenberg and Richter 1942, Tiv 2000).

2. On the other hand, of 5000 people who sought treatment in the 1995 Aum Shinrikyo Sarin gas attack in the Tokyo metro, 4000 were suffering from psychosomatic ailments or the so-called multiple idiopathic physical symptoms (MIPS) (Olson 1999).

3. At the time of writing the discovery of the Silverpit crater in the North Sea basin has given a fillip to neocatastrophism, but it should be remembered that it was created at least 60 million years ago and there might not be any recurrence during the next 60 million years. The average lifespan of a disaster plan is less than ten years.

4. A catastrophist viewpoint translates the magnitude-frequency rule into a curve of increasing energy or impact level and thus apportions more weight to high-magnitude, low-frequency events than to the combined effect of events of moderate magnitude and comparatively high frequency (Alexander 2000, p. 230).