System Logic’s mission is to help organizations reduce their exposure to the potentially catastrophic risks that stem from the interaction of human, organizational, and systemic vulnerabilities.

We do this in two independent ways. We develop customized training for leaders and teams on decision making, biases, and risk. These sessions provide practical techniques for both managing risk and making better decisions. Research shows that these techniques not only reduce risk but can also foster agile organizations and encourage innovation.

In addition, we use a proprietary, research-based methodology to help managers identify hidden risks within their organizations. We then leverage the organization’s own risk-management expertise to formulate actionable mitigation strategies.

This reading list introduces leaders to practical advice for overcoming common human, organizational, and systemic challenges. Though grounded in rigorous research, most of the articles below are written specifically for practitioners. Wherever possible, we have chosen content that is openly accessible. Finally, we have included a selection of works that delve into the root causes of catastrophic failures, so that interested readers may engage with the ideas that have inspired our own practice.

Should you wish to discuss training or implementation strategies for any of the ideas covered below, we would love to help. Get in touch by email ([email protected]) or phone (646-543-4250).

I. Managing to Avoid Biases

Cognitive biases, fundamental limitations in human thinking, can result in decisions that are systematically and predictably flawed. These readings help leaders recognize and overcome the biases that teams and individuals bring to the table.

A 12-question checklist that can reveal the cognitive biases in teams

Kahneman, D., D. Lovallo, and O. Sibony. 2011. Before You Make That Big Decision. Harvard Business Review, 89(6): 50-60.

The premortem: A technique to unearth hidden risks

Klein, G. 2007. Performing a Project Premortem. Harvard Business Review, 85(9): 18-19.

Six common mistakes to avoid when thinking about risk in organizations

Taleb, N. N., D. G. Goldstein, and M. W. Spitznagel. 2009. The Six Mistakes Executives Make in Risk Management. Harvard Business Review, 87(10): 78-81.

II. Building Resilient Systems and Organizations

The way a system is built and the way an organization is structured can profoundly affect how resilient each is to unexpected shocks. From the role of redundant technology to the benefits of communication and learning from failure, these articles and books explore how deliberate choices about system design and organizational structure can increase robustness.

Build a better mousetrap: How to design a resilient technology platform for your organization

Black Swan in the Server Room: Avoiding Disaster in Disaster Planning. System Logic White Paper, 2014.

Overcome overregulation in finance

Clearfield, C., A. Tilcsik, and B. Berman. 2015. Preventing Crashes: Lessons for the SEC from the Airline Industry. Harvard Kennedy School Review.

Predict the future: How to foresee the next catastrophe

Watkins, M. D., and M. H. Bazerman. 2003. Predictable Surprises: The Disasters You Should Have Seen Coming. Harvard Business Review, 81(3): 72-85.

How paying attention to failure can pay dividends

Tinsley, C. H., R. L. Dillon, and P. M. Madsen. 2011. How to Avoid Catastrophe. Harvard Business Review, 89(4): 90-97.

Building resilient organizations that effectively manage unexpected crises

Weick, K. E., and K. M. Sutcliffe. 2007. Managing the Unexpected: Resilient Performance in an Age of Uncertainty. John Wiley & Sons.

III. The Big Picture: The Root Causes of Catastrophic Failure

The root causes of catastrophic failure are human, systemic, and organizational. These works go beyond practitioner-oriented advice: they provide in-depth case studies and lay the theoretical foundation for many of the insights on systemic failure that form the basis of our work.

A systemic view: The role of complex interactions and tight coupling

Perrow, C. 2011. Normal Accidents: Living with High-Risk Technologies. Princeton University Press.

What can mountain climbing teach us about air crashes and nuclear meltdowns?

Roberto, M. A. 2002. Lessons from Everest: The Interaction of Cognitive Bias, Psychological Safety, and System Complexity. California Management Review, 45(1): 136-158.

Ideas in Practice: A tour of the nuclear weapons “sausage factory”

Schlosser, E. 2013. Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. Penguin Press.

How small errors combine into major catastrophes

Weick, K. E. 1990. The Vulnerable System: An Analysis of the Tenerife Air Disaster. Journal of Management, 16(3): 571-593.

© 2017 System Logic LLC.