Lessons Learned the Hard Way


It seems every time we turn on the news, a disaster has occurred. With all our knowledge, skill and technology, why can’t we do something to prevent them, or at least keep them from causing such devastation?


Several years ago I was asked to develop a course on risk management at Vanderbilt based on case studies of actual historical events. The course has since evolved into a popular offering on campus. Each case is researched, debated and reconciled: Could the incident have been prevented? Preventable or not, what could have been done to manage the emergency response more effectively? What actions have we taken since the event to make the world a safer place? Could it happen again?

While man-made accidents, intentional acts and natural disasters may seem dissimilar, a closer look at how these events unfold reveals a remarkable similarity. Collectively, they provide a wealth of information about how disaster situations evolve, what can go wrong, the aftermath of these events, and whether we remain vulnerable to the recurrence of a similar event.

My Vanderbilt students, colleagues and I have drawn 12 key lessons from a number of actual disasters that can be applied to improve the way we manage risks as individuals, communities, businesses and public servants. It is not necessary for history to repeat itself.


Lesson 1:

Risk factors work together to generate an event with disastrous consequences.

Most systems and processes are designed with a built-in margin of safety. If a single risk factor goes awry, such as a certain procedure not being followed, usually a system or process is in place that will protect us from an adverse outcome. When disaster occurs, multiple risk factors are present, working collectively to erode that margin of safety and cause the situation to spiral out of control.
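The margin-of-safety argument can be made concrete with a little arithmetic: if disaster requires several independent safeguards to fail at once, the chance of the combined failure is the product of the individual chances, which is why eroding several layers simultaneously is so dangerous. Here is a minimal sketch in Python, with invented probabilities used purely for illustration:

```python
from math import prod

# Toy "layers of protection" calculation with invented probabilities.
# Disaster requires every independent safeguard to fail at once.
layer_failure_probs = [0.05, 0.02, 0.10]   # procedure, equipment, alarm

print("all layers intact:", prod(layer_failure_probs))   # 0.0001, ~1 in 10,000

# If risk factors erode two layers (their failure becomes certain),
# the remaining margin of safety is just the last layer.
eroded = [1.0, 1.0, 0.10]
print("two layers eroded:", prod(eroded))                 # 0.10, 1 in 10
```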


Lesson 2:

Communication failure is a risk factor in every disaster, whether the event is caused by accident, intentional act or nature.

The inability to share timely, accurate information is a common denominator in every case we reviewed. In each instance this risk factor either caused the event to occur or contributed to the severity of the outcome.

Communication failure is a complex problem because it involves both man and machine. In some cases failure can be attributed solely to an equipment problem such as system overload, poor reception, a lack of interoperability among different communication devices, or the absence of needed technology.

In other situations failure can occur because certain individuals neglect to pass along vital information or do not think it is important to do so. Failure can occur within an organization, between organizations, or between authorities and the general public.


Lesson 3:

Never short-change planning and preparedness.

Along with communication failure, by far the most common risk factor is a lack of planning and preparedness. While managing this risk factor is vital to preventing man-made accidents and intentional acts from occurring, it is perhaps even more important in controlling the consequences of events, including those due to natural causes.

Effective planning and preparedness is based on the consideration of what might go wrong, the likelihood of its occurrence, and the potential consequences. Risk-mitigation strategies then can be devised to address these scenarios.
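To make that idea concrete, here is a minimal sketch, in Python, of the kind of reasoning the lesson describes: enumerate what might go wrong, estimate likelihood and consequence, and rank the scenarios to see where mitigation effort belongs. The scenario names, probabilities and dollar figures are invented for illustration only.

```python
# A minimal sketch of scenario-based risk assessment.
# Scenario names and numbers are hypothetical; real planning
# relies on data and expert judgment.

scenarios = [
    # (name, annual likelihood, consequence in $ millions)
    ("levee breach during major storm", 0.02, 5000),
    ("chemical release at local plant", 0.005, 800),
    ("communication blackout during response", 0.10, 150),
]

def expected_loss(likelihood, consequence):
    """Expected annual loss: a simple way to compare dissimilar risks."""
    return likelihood * consequence

# Rank scenarios by expected loss so mitigation effort goes
# to the risks of greatest concern first.
for name, p, c in sorted(scenarios, key=lambda s: -expected_loss(s[1], s[2])):
    print(f"{name}: expected loss ${expected_loss(p, c):.1f}M/year")
```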

U.S. Coast Guardsman Shawn Beaty searches for Hurricane Katrina survivors from a helicopter over New Orleans. [U.S. Coast Guard]

August 2005—Hurricane Katrina

Hurricane Katrina caused nearly 2,000 fatalities and an estimated economic loss of $125 billion, in addition to displacing hundreds of thousands of people. The destruction and loss of life cannot be attributed entirely to the storm itself. Numerous failures of the city’s flood-protection system, due to poor design and construction, deferred maintenance, and a lack of funding, left New Orleans vulnerable. As the city filled with water, insufficient emergency planning and preparedness, and the inability of responders to communicate, compounded the hurricane’s effects. Moreover, drilling for fossil fuels and engineering of the Mississippi River had destroyed wetlands that could have buffered the storm’s surge.


The Edmund Fitzgerald was the largest carrier on the Great Lakes when it launched in 1958.

Nov. 10, 1975—Wreck of the Edmund Fitzgerald

Under the command of 37-year veteran captain Ernest McSorley, the SS Edmund Fitzgerald sank during a freak early-winter storm in Lake Superior. The ship was loaded with iron-ore pellets for a trip from Superior, Wis., to Detroit, Mich. All 29 crew members aboard perished; none of the bodies was ever recovered.

A variety of risk factors have been cited as contributing to the Fitzgerald's loss, including the captain's pride. The sinking helped lead to numerous changes in maritime regulations, industry practice and technology.

Two hours after the trip began Nov. 9, the ship encountered the SS Arthur M. Anderson, a cargo vessel traveling along a similar route with Capt. Jesse Cooper at the helm. By early evening the National Weather Service (NWS) had issued gale warnings for Lake Superior. Early the next day, the NWS upgraded its forecast to a storm warning. Both captains changed course and headed northward. That afternoon changing winds left the vessels exposed to large waves, and Cooper radioed that his ship would alter its course again. McSorley responded that he would stay on his current course although the Fitzgerald was “rolling some.”

An hour later Cooper radioed to McSorley that he thought the Fitzgerald might be too close to an area of shallow water known as the “Six Fathom Shoal.” As the Fitzgerald was taking on heavy seas over the ship’s deck, Capt. McSorley did not call for help even after it had been clear for several hours that his ship was in serious trouble.


A boat rests on top of a house in the distance as a woman surveys the rubble of Banda Aceh, Indonesia, after a 2004 tsunami virtually leveled the city. [Lynsey Addario/Corbis]

Dec. 26, 2004—Sumatra-Andaman Tsunami

The second-largest earthquake ever recorded spawned a massive tsunami that struck the coasts of Indonesia, Thailand, Sri Lanka, India and several African nations. (The largest ever recorded occurred in Chile in 1960.)

The Sumatra-Andaman tsunami left more than 300,000 people dead or missing and more than 1 million homeless. The many countries exposed to the tsunami had neither the knowledge nor the means to institute an effective warning system. A reasonable disaster-preparedness plan and early-warning system might have averted most of these consequences.

Tsunamis have been fairly unusual throughout recent history in the Indian Ocean, with the last major occurrence more than 120 years ago. Because giant earthquakes often occur in groups—seven of the 10 that occurred in the 20th century happened within a 15-year span, and five of those were clustered in one geographical area—it is reasonable to expect other major quakes in southern Asia in the near future. Countries bordering the Indian Ocean are working with the United Nations on early-detection and public-education systems to avert future disasters.

Lesson 4:

Economic pressure is a risk factor contributing to most man-made accidents and some intentional acts, and can play a role in ineffective preparedness for natural disasters.

One of the most important repercussions of economic pressure is a decision to forgo investment in planning and preparedness due to a lack of available resources. While resource limitations are a common management challenge, assigning available resources to the right priorities is an entirely different matter.

Often economic pressure and schedule constraints go hand in hand. A lack of resources can create pressure to hasten a project, while a time-sensitive deadline can result in limited quality control.


Lesson 5:

Not following procedure is a common catalyst for man-made accidents and a reason for ineffective response to many natural and intentional disasters.

This risk factor can stem either from ignoring known procedures or from a lack of proper training. Development and implementation of standard operating procedures is the foundation on which successful organizations are built. Imposing structure and discipline on the performance of repetitive tasks ensures that they are done properly every time. When these procedures are not followed or errors in judgment are made, the consequences can be serious.

Not following procedures can be the catalyst for a tragic event, and the same risk factor can plague those people attempting to respond to an incident in progress.

In our haste to get people on the job or to fill in where help is needed, formal training often is deferred or not offered at all. In other circumstances, retraining is not provided at a time when personnel need to be exposed to new methods and practices. These oversights, while part of a general problem of not following procedures, can be attributed to an unawareness of what procedures to follow rather than a failure to apply procedures that had been taught. The outcome, however, remains the same.


Children run through the streets of Bhopal, India, after the Union Carbide Corp. plant leaked a poisonous gas. [Pablo Bartholomew/Getty Images]

Dec. 2, 1984—Nightmare in Bhopal

A chemical plant in Bhopal, India, owned and operated by a subsidiary of Union Carbide Corp., accidentally released 40 tons of methyl isocyanate gas (MIC). Plant workers had allowed water to seep into the MIC tanks, causing a reaction that led to the release. Poorly maintained safety systems failed to contain its movement.

A toxic cloud drifted over residents of Bhopal while they were asleep and eventually covered an area of more than eight square miles, affecting a population of nearly 900,000 people. As many as 4,000 men, women and children died that night while in bed or trying to escape the fumes. Estimates of those injured or disabled are as high as 400,000. Within three days, estimated fatalities had risen to between 7,000 and 10,000 people. As many as 15,000 more have since reportedly died from residual exposure.

The Union Carbide Corp. was one of the earliest U.S. companies to establish a subsidiary in India, beginning in 1934. India was seeking to attract foreign investors to strengthen its economy and often did so, like many other developing countries, by relaxing safety standards or ignoring violations. Union Carbide, without any objection from the Indian government, applied different safety standards than those used in its West Virginia plant that manufactured similar products.

The Bhopal disaster involved such a large number of risk factors—including lack of planning and preparedness, poor communications, hands-off management, understaffing, and a culture and company that placed economic priorities over safety—that the occurrence of a catastrophe was not so much a matter of “whether” but “when.” It led to the worst disaster in the history of the chemical manufacturing industry and served as a bellwether event for the industry and a catalyst for safety reform.

Lesson 6:

Design and construction flaws are the bane of man-made accidents.

Every man-made case we reviewed suffered from a problem that was related either to design or construction. Some of these flaws were readily apparent and widely known. In other situations the protection system in place was thought to be sufficient, until it was demonstrated to be unreliable.


Lesson 7:

Do not underestimate the significance of political agendas.

Without question, this is the key message associated with intentional disasters. In every case studied, a strong political motivation existed for creating events of mass destruction. Al Qaeda—the international terrorist organization allegedly behind the attacks on the U.S. Navy guided-missile destroyer USS Cole in 2000, the World Trade Center in 2001, and the London transit system in 2005—had openly declared its contempt for U.S. and U.K. foreign policy.

The assailants in the 1995 sarin gas attack on the Tokyo subway system and the truck bombing of the Alfred P. Murrah Federal Building in Oklahoma City that same year were similarly politically motivated, albeit for different reasons.

Perhaps more surprising, however, is the extent to which political agendas also appear as risk factors in man-made accidents. Emerging economies’ disregard for safety conditions and governmental posturing can also create high-risk conditions.


Lesson 8:

Arrogance among individuals is perhaps a far more significant risk factor than previously imagined.

Individuals in a position of authority and organizations with a mandate to perform a certain operation are particularly susceptible to becoming arrogant over time. While a certain amount of arrogance can be healthy when channeled into strong team leadership, it can be just as easily abused.

In the cases reviewed, we witnessed several instances of individual and organizational arrogance that likely contributed to adverse outcomes. Did the Hyatt Regency contractors believe that attention to detail was a waste of their precious time? Did the Russian government and NASA diminish the value of human lives to preserve their status?

Cranes lift the tail section of the downed United DC-10 onto a flatbed truck at the Sioux City airport. [James Finley, AP/Wide World Photos]

July 19, 1989—United Flight 232

Flying debris severed all three hydraulic systems on United Airlines Flight 232 en route from Denver to Chicago, leaving the crew without conventional flight controls on the DC-10 aircraft. Through the integrated effort of a well-trained cockpit crew and a highly coordinated emergency response, the plane was able to make a crash landing at the Sioux City, Iowa, airport. Exemplary risk-management practices both in the air and on the ground meant that of the 296 passengers and crew on board, 184 survived.

Survival was aided, too, by the absence of thunderstorm activity at a time of year when it is typically frequent, the occurrence of the incident on the one day of the month when the Iowa Air National Guard was on duty, and the presence of an off-duty instructor pilot among the passengers.


Lesson 9:

Lack of uniform safety standards across different nations creates an uneven risk-management playing field and conditions ripe for exploitation.

While attempts are being made to promote uniform human-health and environmental-quality standards throughout the world, there remains a wide disparity in how countries value public safety. As a result, in places where safety is treated as a second-class citizen, more frequent incidents with more severe consequences are likely.

This problem typically is due to a strong desire on the part of developing countries to promote economic activity, creating incentives to attract foreign investment that often lack safety considerations. In other instances the problem may lie in a more casual regard for what constitutes a reasonable level of safety. A country's or region's ignorance or lack of resources can make safety a less prominent concern.

The eruption of Mount St. Helens was the worst volcanic disaster in U.S. history. [U.S. Geological Survey]

May 18, 1980—Eruption of Mount St. Helens

After two months of increasing seismic activity, Mount St. Helens in Washington State erupted in full fury, leaving a path of destruction. The blast and ensuing landslides, mudflows and eruption cloud killed 57 people, destroyed 27 bridges, ruined 200 homes, and toppled about 4 billion board-feet of timber. Nearly all wildlife within a 15-mile radius was wiped out.

Before 1980, concerns about Mount St. Helens and other peaks within the Cascade Range had caused the U.S. Geological Survey to request additional funding for volcano monitoring and hazards studies, but when these requests went unfulfilled, the agency focused its limited resources on Hawaiian volcanoes, which were thought to present a greater threat. Available monitoring technology was not able to predict the type, magnitude or affected areas of an eruption, leaving geologists unaware that a massive explosion was about to take place.

After the eruption of Mount St. Helens, the federal government dramatically increased funding for volcano monitoring and research.

Lesson 10:

Regardless of how well risks are being addressed, “luck” can change your fortunes one way or another.

Circumstances beyond human control, often called luck, always influence the extent to which a potentially catastrophic situation becomes a reality. Sometimes, due to poor risk management, bad luck allows a vulnerable situation to unravel. In other instances, as with United Flight 232, good luck enables a well-managed situation to prevail against seemingly long odds.

Some people believe that you make your own luck through effective planning and preparedness. If you consider a variety of disaster scenarios and devise strategies to limit their likelihood and severity, then—when faced with bad luck—there is a better chance your contingency plan can offset an unfortunate roll of the dice.
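The "roll of the dice" intuition can be expressed as a small simulation. The sketch below, with numbers invented purely for demonstration and not drawn from any of the cases, runs many random trials and compares outcomes with and without a contingency plan that blunts the severity of a bad event; luck still varies from trial to trial, but preparedness shifts the odds.

```python
import random

# Illustrative Monte Carlo sketch: "luck" is the random draw,
# preparedness is a factor that limits severity when bad luck strikes.
# All numbers are hypothetical.

def simulate(trials, p_event=0.05, base_loss=100.0, mitigation=1.0):
    """Average loss over many trials; mitigation < 1 models a
    contingency plan that blunts the consequences of an event."""
    total = 0.0
    for _ in range(trials):
        if random.random() < p_event:          # bad luck strikes
            total += base_loss * mitigation    # plan limits the damage
    return total / trials

random.seed(1)
print("unprepared:", simulate(100_000))
random.seed(1)                                           # replay the same luck
print("prepared:  ", simulate(100_000, mitigation=0.3))  # same draws, smaller loss
```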


Two skywalks at a new Kansas City hotel collapsed onto a crowded dance floor in 1981, killing 114 people and injuring 216 others. [Pete Leabo, AP/Wide World Photos]

July 17, 1981—Hyatt Regency Walkway Collapse

A tea dance in the atrium of the Hyatt Regency Hotel in Kansas City, Mo., ended in tragedy when the second- and fourth-floor skywalks collapsed onto a crowded dance floor, leaving 114 people dead and another 216 injured.

The hotel had opened its doors just a year earlier. Flaws in a simple design change made to a support mechanism went unnoticed, allowing the skywalk to buckle at the worst possible moment. The fourth-floor walkway fell 30 feet to the floor below, but not before landing on the second-floor sky bridge, causing it to collapse as well. More than 70 tons of debris fell, crushing or trapping hundreds of partygoers, some of whom could not be reached for more than seven hours.

The engineering contractor had failed to follow the formal design-review process, allowing a flaw in the hanger-rod configuration to go uncorrected. The design of the connections in the walkways was never even checked, despite the project engineer's written assurance to the contrary. Seven weeks before scheduled completion, a worker noticed deformation of the walkway and reported it to the architect's on-site representative, but the report never received attention. The following February, two more observations of deformation in the walkways were made, but both were discounted.

Lesson 11:

It usually takes a disastrous event to convince people that something must be done.

We are so engrossed in our daily lives that an important problem often is ignored until an event of disastrous proportions wakes us up and makes us take notice. Only then are public officials, industry leaders and community activists tuned in to the need for reform and prepared to take appropriate action.

This is a consistent theme in all the cases we reviewed. Consider the creation of the U.S. Department of Homeland Security in the aftermath of the Sept. 11 attacks, enactment of the Oil Pollution Act in response to the Exxon Valdez spill in Alaskan waters, and the adoption of the American Chemistry Council’s “Responsible Care” guidelines in the wake of the Bhopal gas leak. It is an unfortunate truth that we must suffer to a certain degree before help is on the way.

What is remarkable about the lessons learned from these cases is that they are easy to understand and make practical sense. Moreover, they can be put into daily use by individuals and organizations. Simply put, the foundation of successful risk management is planning, preparedness and communication. That forms the basis for establishing sound daily practices and creating opportunities for learning and knowledge building.

In managing your daily activities, do not impose unreasonable economic and schedule pressures on what you are trying to accomplish. Pay attention to the details of how things are designed, built and maintained. Recognize that certain individuals and organizations may be politically motivated or arrogant in ways that could be detrimental to your safety. And recognize that risk factors often work together to create a crisis situation, so be on the lookout for circumstances where these factors can become intertwined.

This prescription, if followed, will take you a long way toward a safer tomorrow, whether “luck” is on your side or not. However, adopting this approach is not a guarantee that one will be safe everywhere, all the time—which leads us to the final lesson.

The U.S. space shuttle Challenger lifts off from Kennedy Space Center, 72 seconds before an explosion kills its crew of seven. [NASA, AP/Wide World Photos]

Jan. 28, 1986, and Feb. 1, 2003—The Challenger and Columbia Disasters

The U.S. space shuttle program suffered a serious setback in 1986 after the shuttle Challenger disintegrated shortly after takeoff, killing the entire crew. Although the official cause of the disaster was mechanical malfunction, it was discovered that NASA and its contractor knew about the O-ring design flaw but allowed the flight to proceed.

NASA subsequently took corrective actions to ensure that such institutional failures would not allow for another shuttle disaster.

But in 2003 the shuttle Columbia tore apart during re-entry into Earth’s atmosphere. The post-accident Columbia investigation revealed risk factors eerily similar to those of its Challenger predecessor. Once again, design flaws—this time associated with the insulating foam on the external tank—were known ahead of time, yet the launch was not stopped.

A Ferris wheel near Chernobyl was scheduled to be unveiled for May Day festivities in 1986. Instead, it has grown rusty in the still-contaminated ghost town of Pripyat, built in the 1970s as a home for Chernobyl workers. [Sergey Dolzhenko/Corbis]

April 25–26, 1986—Meltdown at Chernobyl

A planned experiment that went badly wrong at the Chernobyl nuclear power plant in the former Soviet Union triggered a reactor core explosion that sent a huge radioactive cloud into the atmosphere. Thirty-one people died from immediate radiation poisoning, 130,000 residents were evacuated, and radiation effects were felt across most of Europe and beyond.

For 12 days after the accident, an immense plume of radioactive material more than 400 times the magnitude released at Hiroshima spewed from the explosion site into the upper atmosphere, where weather patterns carried it north over Russia, then northeast over Poland and Scandinavia. Within days, Danish and Swedish nuclear monitors began detecting elevated levels of radiation, but the first media coverage did not occur until April 29, when a German newscast reported that there had been a major nuclear explosion at Chernobyl. Though the Soviet government initially denied the allegations, increasing international pressure finally caused Soviet leaders to acknowledge what had taken place.

The Chernobyl accident caused staggering economic losses for the USSR and the former Soviet nations. Long-term human health effects of radiation exposure are now being realized, and a large area around the plant site remains off limits to human habitation.

In the mid-1980s, as the Soviet Union was locked in a decades-old power struggle with Western Europe and the United States, participants on both sides attempted to gain political, economic and military advantage, developing new technologies to strengthen their cause. One of these was the nuclear reactor for generating electricity. Many believe the Chernobyl meltdown was the catalyst for ending the Cold War.

A fireman calls for more rescue workers to make their way into the rubble of the World Trade Center. [U.S. Navy photo by Preston Keres]

Sept. 11, 2001—The World Trade Center Attacks

Terrorists affiliated with the international al Qaeda organization hijacked two commercial airliners and crashed them, minutes apart, into the twin towers of the World Trade Center in New York City. The impacts led to the eventual collapse of both buildings and the destruction of other infrastructure in the vicinity.

Nearly 3,000 people died in the towers and on the ground, including more than 400 firefighters and police officers. The terrorists exploited weaknesses in U.S. aviation security and communications gaps in the U.S. intelligence system.

The federal government implemented a number of changes as a direct result of the attacks. Little more than a month later, the Patriot Act was enacted, aimed at bolstering counterterrorism resources, improving border security, and undermining terrorist funding sources. The Homeland Security Act of 2002 was passed one year later, establishing the U.S. Department of Homeland Security within the executive branch of the federal government.

Since September 2001, terrorists have made numerous attempts to carry out large-scale attacks against the United States, all of which have been disrupted by U.S. and allied efforts.

Lesson 12:

Risk cannot be entirely avoided. Nothing can be designed or built to perfection, nor to last forever.

Every minute of every day, somewhere in the world, people are hurt, property is damaged, and the ecology is harmed. Sometimes the impact is felt by a few people at a specific location, while in other circumstances the impact can involve mass casualties over a large expanse.

No matter how hard we try to create a safe environment, it is not humanly possible to make life entirely risk-free.

Even if we had unlimited resources to invest in safety, we could not guarantee that nothing bad would happen. Consequently, we must recognize that life involves inherent choices among alternative risks. The key to managing these risks successfully is being able to identify them and establish priorities among them. Then we can direct our attention to reducing, not eliminating, those risks of greatest concern.
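As a hedged illustration of "reducing, not eliminating," the sketch below spends a fixed mitigation budget greedily on whichever identified risk currently carries the largest expected loss. Nothing ever reaches zero, but the worst exposures shrink first. The risk names, figures and the halving rule are all invented for demonstration.

```python
# Hypothetical illustration: allocate a limited mitigation budget to the
# risks of greatest concern. Each unit of budget halves the expected
# loss of the risk it is spent on -- a toy model, not a real method.

risks = {"flood": 40.0, "fire": 25.0, "data breach": 10.0}  # expected losses
budget = 5

for _ in range(budget):
    worst = max(risks, key=risks.get)   # identify the top-priority risk
    risks[worst] /= 2                   # reduce it; it never reaches zero
    print(f"spent on {worst}; remaining exposures: {risks}")
```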

Simply put, we need to become more tolerant of certain risks and recognize that sometimes bad things will happen even when we put our best foot forward … and that is just a fact of life.


Mark D. Abkowitz is a professor of civil and environmental engineering at Vanderbilt University. This article has been adapted from his book Operational Risk Management: A Case Study Approach to Effective Planning and Response with permission of the publisher, John Wiley & Sons Inc. © 2008 by Mark D. Abkowitz. All rights reserved. Derek Bryant provided research that forms the basis of the case-study narratives.

Storm King Mountain bears the scars of a Colorado wildfire that began in South Canyon. [Raymond Gehman/CORBIS]

July 1994—South Canyon Fire

What began as a relatively small Colorado wildfire on July 3 grew into a dangerous blaze during the ensuing days while firefighting resources were allocated to other fires in the district. Once the fragmented resources began to arrive, the fire could not be easily contained, and firefighters found themselves with no escape routes should the blaze suddenly reverse direction.

The afternoon of July 6, a cold front created a wind shift and subsequent “blowup” that trapped and killed 14 firefighters. Failures of management, leadership and communication within the firefighting community contributed to the tragedy.