
From BP’s Deepwater Horizon disaster, to deadly component failures at Toyota and GM, to technological meltdowns at major stock market participants like Knight Capital (now KCG Holdings), Goldman Sachs, and NASDAQ, catastrophic failures have devastating effects on the environment, businesses, and customers. And the potential for failures of this kind is growing as both the complexity of our systems and the probability of extreme “trigger” events increase.

But thanks to decades of research, a large body of knowledge exists today about the prevention and management of catastrophic failure in complex, uncertain environments. Research on normal accidents, predictable surprises, and dangerous cognitive biases in decision making has taught us how complex systems can unexpectedly unravel and cascade uncontrollably toward disaster. There is also a great deal of useful research on potential interventions and countermeasures, such as building high-reliability organizations, adding more slack to our systems, overcoming cognitive biases through disciplined decision-making processes, and rethinking how we regulate complex systems, like electronic trading.

However, as is often the case, relevant knowledge exists but is not always used in practice. Do practitioners actually learn and implement insights from this rich and readily actionable research literature? An examination of the catastrophic incidents above suggests that they do not.

What about future business leaders? Can existing insights from research be passed on to current MBAs, people who will one day manage large, complex organizations—organizations in which the next major catastrophe might already be brewing?

Some are skeptical. In The Dismal Science, Harvard economist Stephen Marglin writes: “Generations of students have paid, and are still paying good money to the leading business schools of the world to learn how to apply probabilities consistently. I do not wish even to hint that students don’t get their money’s worth, but I am very sceptical that their ability to deal with uncertainty is much enhanced.” And, of course, overconfidence and a lack of sensitivity to uncertainty create fertile ground for catastrophic failures. Yet, perhaps because the cost of disaster is rising, there is increasing recognition that teaching a systemic and cognition-based perspective on catastrophes to future managers is vital.

I recently visited the Rotman School of Management at the University of Toronto to speak to MBA students about Knight Capital’s trading meltdown and NASDAQ’s handling of the Facebook IPO in a new course called Catastrophic Failure in Organizations. The course, taught by my friend and collaborator András Tilcsik, looks at several high-profile cases of catastrophic failure—from the Tenerife airport disaster and the Three Mile Island nuclear accident to the fall of Enron and BP’s Deepwater Horizon oil spill—and leverages the rich experiences of guest speakers like the former chief of major accident investigations at the National Transportation Safety Board as well as the school’s new dean, Tiff Macklem, who previously served as Senior Deputy Governor of the Bank of Canada during the financial crisis. Far from offering “anecdote-driven” instruction, these speakers contribute to a rigorous, systematic analysis of past accidents and meltdowns and present interventions that can reduce the risk of catastrophic failure.

Similar important initiatives exist at several other schools. Since 2007, the former astronaut Jay Apt has been teaching MBA students at Carnegie Mellon’s Tepper School a course on Catastrophic Risk Analysis. The goal is to help students understand, analyze, and manage low-probability but high-impact events in the presence of irreducible uncertainty. At Harvard, Herman (Dutch) Leonard, who holds appointments at both Harvard Business School and the Harvard Kennedy School, offers courses on enterprise risk management and crisis leadership. One MBA course, titled Enterprise Risk Management: Strategy and Leadership in the Face of Large-Scale Uncertainties, is specifically designed for students who, over the course of their careers, will lead organizations that face large-scale risks, such as earthquakes, storms, terror incidents, or brand crises. On the research side, the Wharton School houses the Risk Management and Decision Processes Center, which brings together scholars working to understand the management of potentially catastrophic events, particularly those that involve health, safety, or the environment.

These initiatives represent an important first step. Yet at this time, the courses offered on these critical issues are typically second-year MBA electives, rather than part of the core MBA curriculum. Still, at the very least, such courses can highlight the wicked problem of catastrophic risks, suggest some solutions, and legitimize this crucial topic as an area of study for future managers.

I believe that reducing catastrophic potential is in the hands of the managers who work on these wicked problems, at organizations ranging from those that drill for oil to those that mine bits for insight. The complexity of these systems means that regulators struggle to make an impact (and may make things worse), and the increasing impact of failure means that firms can no longer be glib about the risk landscape in which they operate every day.

My hope is that these initiatives can do more than just break novel instructional ground. As I developed my presentation for the Rotman MBAs, many of whom have substantial professional insights that they bring to the classroom, András wrote me about the ultimate goal for the course: “to instil some degree of meta-knowledge about the risk of catastrophic failures—knowledge about the limits of what we can predict and understand—and a sense of humility that goes along with that knowledge.” I couldn’t agree more with this aspiration. As Nassim Taleb and his coauthors remind us, the risk of catastrophic failure lies not simply in our physical environment or in our technologies: “The biggest risk lies within us: We overestimate our abilities and underestimate what can go wrong.” Perhaps we can start to change that.

Chris Clearfield is a principal at System Logic, an independent consulting firm that helps organizations manage issues of risk and complexity. Follow him on Twitter, and check out his other writings.

As originally published in Forbes.