Monday 21 November 2011

Learning Lessons from Crises and Disasters

Lessons Learned?

The Oxford English Dictionary defines learning as the acquisition of "knowledge or skill ... through study or experience or by being taught". In defining 'lesson' it distinguishes between "a thing learned" and "a thing that serves as a warning or encouragement". The concept of 'lessons learned' is widely used in disaster risk reduction, a field that offers many opportunities to learn from practical experience and theoretical study. The term has been used in a variety of contexts, which can be given the following summary classification:-

  • General lessons from major events, particularly large disasters of international importance. Hurricane Katrina, which struck the US Gulf Coast states in August 2005, led to a significant number of studies that collected observations on how to improve resilience in the affected area (e.g. White 2007).
  • Specific lessons from major events, usually derived by concentrating on particular sectors or disciplines, such as the engineering response to building failure, or psychologists' response to disaster (e.g. Schumacher et al. 2006).
  • Lessons obtained as a result of monitoring the practice and outcomes of drills and exercises, particularly those designed to test multi-agency response to incidents and disasters (e.g. Beedasy and Ramloll 2010; Fitzgerald et al. 2003).
  • Lessons derived over time from cumulative experience of particular phenomena, practices or problems, such as hospital response to repeated mass-casualty events, or organising services to deal with the recurrent threat of pandemic influenza (e.g. Clancy et al. 2009).
  • Lessons that arise from particular situations, especially those in which actions taken could have been improved, and those in which innovations were tried for the first time, such as interventions in the Bam (Iran) earthquake of 2003 or the Indian Ocean tsunami of 2004, or the development of new scenarios for earthquake disaster response (e.g. Plafker and Galloway 1989).
  • In the operation of technological systems, especially those denoted 'high reliability systems' (such as avionics), the occurrence of technical faults and human error has been the focus of attempts to learn lessons and ensure that the faults or errors do not recur. Because technology and its operation change continually, the context of faults and errors evolves with them, and there are frequent opportunities to repeat the exercise (Krausmann et al. 2011).

Despite the widespread use of the term "lessons learned", considerable doubt remains about the extent to which the lessons truly are learned. As Figure 1 illustrates, it is perfectly possible to recognise that particular phenomena, events or situations contain information that could contribute to better practice in the future, but it is an entirely different matter to do something about it. In the worst cases, the lessons go unrecognised. Hardly better is the situation in which they are recorded, archived and forgotten, without any attempt to incorporate them into practice and thus benefit from them.

Figure 1. Disaster risk reduction and "lessons learned".

As a result of these considerations, the test of a 'lesson learned' is that it should contribute in some way to the solution of problems, which in this field means disaster risk reduction and the improvement of resilience (Figure 2). There must therefore be a direct connection between the existence of a lesson, its recognition by practitioners, decision makers or policy makers, and tangible improvements in practice, decisions or policies.

Figure 2. The "lessons learned chain".

As Figure 2 shows, the process of learning lessons ought to be fairly linear and straightforward. Events and circumstances reveal opportunities to learn, and bona fide observers profit from them as part of the common endeavour to improve both decision making and working practices. That is the case in many situations, but it is far from a universal modus operandi. There are, as noted above, many opportunities to learn lessons, and learning should be a constant process which contributes to the development of individual people and the organisations to which they belong. However, there are serious impediments. For example, the United Kingdom has had an Inspectorate of Railways (HMRI) since 1840. It has had a reputation for penetrating, impartial investigation, for the public conduct of inquiries and for the publication of their results. However, most of the findings of HMRI have been given in the form of non-binding recommendations for greater safety, and many of these have taken years, or decades, to be absorbed into legislation. Moreover, the UK railway industry has had an equally long history of resisting costly innovation, even when it would undoubtedly save lives.

Accidental and wilful ignorance are only two of the many reasons why lessons are not learned. Many lessons are identified without a context of risk analysis and benefit-cost assessment. While the costs of innovation are the easier part to assess, risks and benefits are often elusive quantities, especially as they tend to depend on perception as much as on reality. Hence, the lesson may be that "lives can be saved by adopting a particular practice", but this statement does not in itself indicate whether it is expedient to do so in terms of money spent per life saved, given competition for funds from other priorities. In other situations the innovation may be prohibitively expensive, as is often the case for retrofitting buildings in areas of high seismicity.
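As a purely illustrative sketch of the arithmetic involved, the following Python fragment computes money spent per life saved and a simple benefit-cost ratio for a hypothetical mitigation measure. All of the figures and helper names are invented for the example and are not drawn from any real appraisal.

```python
# Illustrative only: the numbers below are hypothetical and the helper names
# (cost_per_life_saved, benefit_cost_ratio) are invented for this sketch.

def cost_per_life_saved(total_cost: float, lives_saved: float) -> float:
    """Money spent on the measure divided by the expected number of lives it saves."""
    return total_cost / lives_saved

def benefit_cost_ratio(expected_benefit: float, expected_cost: float) -> float:
    """Expected benefits (losses avoided) divided by expected costs; > 1 favours adoption."""
    return expected_benefit / expected_cost

if __name__ == "__main__":
    retrofit_cost = 50_000_000       # hypothetical cost of a seismic retrofit programme
    lives_saved = 12                 # hypothetical expected lives saved over its lifetime
    losses_avoided = 120_000_000     # hypothetical damage and casualty losses avoided

    print(f"Cost per life saved: {cost_per_life_saved(retrofit_cost, lives_saved):,.0f}")
    print(f"Benefit-cost ratio:  {benefit_cost_ratio(losses_avoided, retrofit_cost):.2f}")
```

Even where such quantities can be estimated, the calculation says nothing about whether the expenditure is politically or culturally acceptable, which is the subject of the following paragraphs.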

Another reason for lack of adoption is political expediency. An innovation may make technical sense, but be politically unappetising or unacceptable, perhaps because it is unlikely to garner votes. The negative profile of civil protection, which is fundamentally about emergencies and disasters, is one reason why it rarely enjoys priority in policy making. This could, of course, be turned into a positive bid for more security rather than the negative image of yet more disasters, but politicians have commonly been reluctant to follow that road. The result is the "no votes in sewage syndrome": wastewater treatment is essential to public health, but not an attractive part of a policy platform.

An extension of this problem is cultural rejection of disaster risk reduction. Where human cultures are fatalistic, politicians are unresponsive to the need for greater public safety and there is little public debate of the issues, the terrain is not fertile for learning the lessons of adverse events. If the collective memory of disaster is short, there is even less scope for making the enduring changes needed in order to create resilience, and the result is the perpetuation of vulnerability. This was very evident in, for example, the flash floods and debris flows that killed 19 people in Liguria, northwest Italy, in October and November 2011. In effect, nothing happened that was not well forecast and that had not happened before. Poor official and public response to the events when they occurred compounded the problem, which stemmed from unwise land use and failure to organise adequately against the threats of floods and mass movements.

These considerations indicate that the process of learning lessons about crises and disasters requires a much broader approach than simply accumulating observations on errors, faults and poor quality responses on the one hand, and good and best practice on the other. Moreover, what is 'best' practice under one set of conditions may be less than optimal in another setting. There is thus a need for work that evaluates what is a "lesson to be learned" in the light of its potential to be transplanted from previous conditions to future ones, and, of course, its ability to contribute to better practice, greater resilience and reduced disaster risk. This requires evaluation of cultural and political factors that inhibit or encourage innovation. It necessitates judgement on whether there is universal or merely local value in adopting a new practice or making a modification in existing rules, norms, plans or procedures. In the final analysis it may also require benefit-cost analysis of any changes that are contemplated.

It is often said that we tend to prepare for the last disaster rather than the next one. Although there is much value in learning the lessons of history, in order not to be condemned to repeat its mistakes, any assessment of past or current practice should take account of how it can contribute in a future characterised by constant change in circumstances and the need to adapt to new realities.

One of the central issues in the process of learning lessons is the relationship between individual learning and the acquisition of knowledge by the organisations in which individuals function. This will be examined in the next section.

Organisational Learning

In analysing communication processes it is opportune to use the hierarchical classification provided by IFRCRCS (2005), which is sometimes attributed in origin to the geographer Yi-Fu Tuan. At the lowest level of the DIKW pyramid, data are basic facts and statistics with little ontological relationship between them. Information involves the description of physical and social situations by combining and interpreting quantities of data. Knowledge refers to the understanding of how things function (or should function). Finally, wisdom is the ability to make decisions on the basis of principles, experience and knowledge (Figure 3).

Figure 3. The DIKW pyramid (IFRCRCS 2005 and other sources).
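To make the four levels concrete, the following minimal Python sketch traces a hypothetical rain-gauge example from data to a decision. The readings and the flood threshold are invented for illustration and are not taken from IFRCRCS (2005).

```python
# A minimal illustration of the DIKW hierarchy using invented numbers.

readings_mm = [12, 30, 55, 80, 95]      # Data: raw rain-gauge readings with little context

total_rain = sum(readings_mm)            # Information: data combined and described
description = f"{total_rain} mm of rain fell in the last five hours"

FLOOD_THRESHOLD_MM = 200                 # Knowledge: understanding of how the catchment responds
flood_likely = total_rain > FLOOD_THRESHOLD_MM

# Wisdom: a decision taken on the basis of principles, experience and knowledge
decision = "issue a flood warning" if flood_likely else "continue monitoring"

print(description)
print(f"Flood likely: {flood_likely} -> {decision}")
```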

Some of the processes inherent in this classification occur in isolation as individuals work alone, but many take place in collective situations of social interaction. As Elkjaer (2003) observed, the individual and the organisation in which he or she works are bound together by power relations, such that there is no clear distinction between solitary and collective knowledge. Nonetheless, over the two decades 1991-2011 considerable progress has been made in advancing the field of organisational knowledge. Occupational psychologists, management specialists, operations researchers and economists have all been involved in this multi-disciplinary effort to understand how organisations and their members acquire, utilise and retain information.

In an earlier work, Polanyi (1966) classified human knowledge into two categories. "Explicit" or codified knowledge refers to knowledge that is transmittable in formal, systematic language. On the other hand, "tacit" knowledge has a personal quality, which makes it hard to formalise and communicate. Nonaka (1994) considered the processes of interchange between the two sources of knowledge and formulated an epistemological-ontological model to characterise them. Spender (1996) noted that, following Wittgenstein, knowledge is composed of theoretical statements whose meanings and practical implications depend on how they are used and in what context that takes place.

In a further development, Lam (2000) broadened the explicit-tacit dichotomy to four categories of knowledge: embrained refers to knowledge that is dependent on an individual's conceptual skills and cognitive abilities; embodied knowledge is derived from practical action and experience; encoded knowledge is conveyed by signs and symbols; and embedded denotes the collective form of tacit knowledge found in organisational routines and shared norms (Figure 4).

Figure 4. A classification of organisational learning (Lam 2000).

Huber (1991) considered the acquisition, distribution and interpretation of knowledge in the light of an organisation's collective memory. He identified the sources of knowledge as follows:-

  • remembering and codifying experience
  • research-based learning and searching
  • vicarious learning ('second-hand' acquisition of knowledge)
  • storing and retrieving information
  • scanning
  • performance monitoring and evaluation
  • organised self-appraisal
  • experimentation.

In this context we can distinguish between enduring and perishable knowledge. The former includes fundamental concepts and procedures, consensus knowledge and information that reinforces, sustains and maintains existing relationships and practices. The latter comprises poorly collected and conserved 'transient' data and observations, and may be the fruit of a surge in demand for knowledge during periods in which an organisation must adapt to rapid and profound change.

Not all knowledge is beneficial. In studying information overload, March (1991) found that too much information can inhibit learning processes. Kane and Alavi (2007) discovered that although information technology can be beneficial to fast learners, it can retard the progress of people who absorb information slowly. Moreover, Simon (1991) applied his concept of 'bounded rationality' to an analysis of the limitations of organisational knowledge gathering.

Besides these limitations, the field of organisational knowledge is rich in dichotomies. The primary one is between individual and social knowledge, but others are between traditional and affective knowledge (Weber); facts and values (Simon); optimising and satisficing (Simon again); the objective knowledge of bureaucracies and the cultural knowledge of clans (Ouchi 1979); explicit and tacit knowledge (Polanyi 1966); and incremental and radical learning (March 1991). The process of knowledge acquisition in disasters forces the distinction between enduring and perishable information, as the latter includes knowledge that may disappear if it is not collected at certain key times.

A pattern is thus emerging in how organisations learn from their experiences and how they gather information. However, there is a serious lack of research on how this relates to emergencies, crises, disasters and other extreme situations. According to Lampel et al. (2009), if the impact on the organisation is low, reinterpretive learning tends to occur. Lant and Mezias (1992) looked at the roles of both leadership and organisational adaptation in relation to the learning process. Yet none of this adds up to a clear picture of exactly how organisations and their members acquire, store, share and utilise information in crises, how they do or do not transform it into knowledge, and what forces act to preserve or delete that knowledge.

In order truly to learn lessons about crises and disasters, theories of organisational knowledge need to be adapted to the special case of learning in and about crisis situations. Organisations need to be studied by developing a learning taxonomy that includes their type (e.g. 'blue-light' services, public administrations, civil society organisations, citizens' groups), size, competencies, experience and orientation. Organisational culture needs to be studied using models developed by anthropologists (e.g. Brislin 1980) and adapted for use in crisis situations (e.g. Alexander 2000).
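As a hypothetical illustration of what such a taxonomy might look like when recorded systematically, the following Python sketch captures an organisation's type, size, competencies, experience and orientation. The attribute names follow those listed above; the category values and the example profile are invented.

```python
# A hypothetical sketch of the learning taxonomy described above. The attribute
# names follow the text; the category values and example profile are invented.

from dataclasses import dataclass, field
from enum import Enum
from typing import List


class OrgType(Enum):
    BLUE_LIGHT_SERVICE = "blue-light service"
    PUBLIC_ADMINISTRATION = "public administration"
    CIVIL_SOCIETY_ORGANISATION = "civil society organisation"
    CITIZENS_GROUP = "citizens' group"


@dataclass
class OrganisationProfile:
    name: str
    org_type: OrgType
    size: str                                            # e.g. "small", "medium", "large" (illustrative bands)
    competencies: List[str] = field(default_factory=list)
    experience: List[str] = field(default_factory=list)  # e.g. incidents previously responded to
    orientation: str = ""                                 # e.g. "response", "recovery", "mitigation"


# Example usage with an invented organisation
profile = OrganisationProfile(
    name="Regional Fire and Rescue Service",
    org_type=OrgType.BLUE_LIGHT_SERVICE,
    size="large",
    competencies=["urban search and rescue", "flood rescue"],
    experience=["2009 floods", "2011 debris flows"],
    orientation="response",
)
print(profile.org_type.value, "-", profile.orientation)
```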

References

Alexander, D.E. 2000. Confronting Catastrophe: New Perspectives on Natural Disasters. Terra Publishing, Harpenden, U.K., and Oxford University Press, New York, 282 pp.

Beedasy, J. and R. Ramloll 2010. Lessons learned from a pandemic influenza triage exercise in a 3D interactive multiuser virtual learning environment—Play2Train. Journal of Emergency Management 8(4): 53-61.

Brislin, R.W. 1980. Cross-cultural research methods: strategies, problems, applications. In I. Altman, A. Rapoport and J.F. Wohlwill (eds) Human Behavior and Environment, Vol. 4, Environment and Culture. Plenum Press, New York: 47-82.

Clancy, T., C. Neuwirth and G. Bukowski 2009. Lessons learned in implementing a 24/7 public health call center in response to H1N1 in the state of New Jersey. American Journal of Disaster Medicine 4(5): 253-260.

Elkjaer, B. 2003. Organizational learning: 'the third way'. Organizational Learning and Knowledge: 5th International Conference, 30 May-2 June 2003, Lancaster, UK, 18 pp.

Fitzgerald, D.J., M.D. Sztajnkrycer and T.J. Crocco 2003. Chemical weapon functional exercise – Cincinnati: observations and lessons learned from a “typical medium-sized” city’s response to simulated terrorism utilizing weapons of mass destruction. Public Health Reports 118: 205-214.

Huber, G.P. 1991. Organizational learning: the contributing processes and the literatures. Organization Science 2(1): 88-115.

IFRCRCS 2005. World Disasters Report: Focus on Information in Disasters. International Federation of Red Cross and Red Crescent Societies, Geneva, 251 pp.

Kane, G.C. and M. Alavi 2007. Information technology and organizational learning: an investigation of exploration and exploitation processes. Organization Science 18(5): 796-812.

Krausmann, E., E. Renni, M. Campedel and V. Cozzani 2011. Industrial accidents triggered by earthquakes, floods and lightning: lessons learned from a database analysis. Natural Hazards 59(1): 285-300.

Lam, A. 2000. Tacit knowledge, organizational learning and societal institutions: an integrated framework. Organization Studies 21: 487-513.

Lampel, J., J. Shamsie and Z. Shapira 2009. Experiencing the improbable: rare events and organizational learning. Organization Science 20(5): 835-845.

Lant, T.K. and S.J. Mezias 1992. An organizational learning model of convergence and reorientation. Organization Science 3(1): 47-71.

March, J.G. 1991. Exploration and exploitation in organizational learning. Organization Science 2(1): 71-87.

Nonaka, I. 1994. A dynamic theory of organizational knowledge creation. Organization Science 5(1): 14-37.

Ouchi, W.G. 1979. A conceptual framework for the design of organizational control mechanisms. Management Science 25: 833-847.

Plafker, G. and J.P. Galloway (eds) 1989. Lessons learned from the Loma Prieta, California, earthquake of October 17, 1989. U.S. Geological Survey Circular 1045, 48 pp.

Polanyi, M. 1966. The Tacit Dimension. Routledge, London, 128 pp.

Schumacher, J.A., S.F. Coffey, D.T. Elkin and G. Norquist 2006. Post-Katrina mental health care in Mississippi: lessons learned. The Behavior Therapist 29(6): 124-127.

Simon, H.A. 1991. Bounded rationality and organizational learning. Organization Science 2(1): 125-134.

Spender, J-C. 1996. Organizational knowledge, learning and memory: three concepts in search of a theory. Journal of Organizational Change Management 9(1): 63-78.

White, G.W. 2007. Katrina and other disasters: lessons learned and lessons to teach: introduction to the special series. Journal of Disability Policy Studies 17(4): 194-195.