Cascade effects in business continuity planning
- Published: Thursday, 10 March 2016 09:31
Cascade effects occur at all levels of disruption and crisis, from tactical to operational to strategic, and have implications for business impact analysis and risk assessments. Geary Sikich explores the subject.
The definition below was presented in a recent questionnaire conducted by Project Snowball in the European Union. As used in the questionnaire the term ‘cascading effects’ is defined as:
“Cascading effects are the dynamics present in disasters, in which the impact of a physical event or the development of an initial technological or human failure generates a sequence of events, linked or dependent from each other, that result in physical, social or economic disruption.
“They are associated more with the magnitude of vulnerability than with that of hazards. Low-level hazards can generate broad chain effects if vulnerabilities are widespread in the system or not addressed properly in sub-systems. (Adapted from Pescaroli & Alexander, 2015 [EC FP7 FORTRESS project].)”
This definition seems to propose two variants of cascading effects: the first described in the first paragraph, and the second differentiating ‘low level’ vulnerabilities/hazards. I am not sure that the Project Snowball definition accurately reflects the work of Pescaroli and Alexander; it gives the impression of linear thinking with regard to cascade effects. This article seeks to bring a different perspective on cascade effects.
The linear cascade fallacy
Cascade effects occur at all levels of disruption and crisis, from tactical to operational to strategic. I would disagree with the definition above: it describes low level events as having ‘chain effects’ and focuses on vulnerability and hazard to the exclusion of risk, threat and readily identifiable consequences, which are no more than cascade effects at a tactical level.
Additionally, there is a general tendency to treat cascades as linear events (i.e., dominoes falling); this is a false premise to build upon. Events are not linear, and our response/reaction to events further alters the cascade effect, rendering it nonlinear. An excellent example of nonlinearity is Per Bak’s sand pile experiments. Bak found that, as grains of sand were added to a sand pile, the pile exhibited self-organized criticality (SOC): each grain affecting the others, creating the pile. At a certain point this self-organized critical state was disrupted by the addition of one more grain of sand. However, one could not predict which grain would cause the pile to collapse, nor could one predict the magnitude or direction of the collapse. This is more representative of the cascade effects during a disruptive event/crisis: things build up and reach a critical state, resulting in cascading effects as the disruptive event/crisis unfolds.
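The sand pile dynamic described above can be illustrated with a minimal sketch of the Bak–Tang–Wiesenfeld (abelian) sandpile model — a simplified simulation in the spirit of Bak's work, not his original experiment. Identical grains are dropped one at a time on a small grid; any cell reaching four grains topples, shedding one grain to each neighbour and possibly triggering further topplings. All function names and parameters here are illustrative choices, not anything from the article or from Bak's book:

```python
import random

def topple(grid, size):
    """Relax the grid: any cell holding 4+ grains sheds one grain to each
    neighbour (grains falling off the edge are lost). Returns the number
    of topplings, a rough proxy for avalanche size."""
    avalanche = 0
    unstable = [(r, c) for r in range(size) for c in range(size)
                if grid[r][c] >= 4]
    while unstable:
        r, c = unstable.pop()
        if grid[r][c] < 4:          # may have been relaxed already
            continue
        grid[r][c] -= 4
        avalanche += 1
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < size and 0 <= nc < size:
                grid[nr][nc] += 1
                if grid[nr][nc] >= 4:
                    unstable.append((nr, nc))
    return avalanche

def drop_grains(n_grains, size=11, seed=42):
    """Drop grains one at a time at random sites, recording the avalanche
    each identical grain triggers."""
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(n_grains):
        r, c = random.randrange(size), random.randrange(size)
        grid[r][c] += 1
        sizes.append(topple(grid, size))
    return sizes

sizes = drop_grains(5000)
# Most grains trigger little or nothing; occasionally one sets off a
# system-wide avalanche -- and nothing distinguishes that grain in advance.
print("largest avalanche:", max(sizes),
      "median:", sorted(sizes)[len(sizes) // 2])
```

The point of the sketch is the nonlinearity: every input is the same single grain, yet the outcomes range from no effect to a grid-wide collapse, which is the cascade behaviour the article argues a ‘falling dominoes’ model cannot capture.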
When thinking of event cascades, it is perhaps helpful to structure the cascade in a multi-dimensional manner using spheres, boxes and/or some other representation of dimensionality. Since events are multi-dimensional, we need to take a holistic view of our analysis process as the event unfolds. In this respect one can begin to link event touchpoints that could be affected by the actions taken in response to the initiating event. You may have to simplify the process to prevent your analysis diagram from beginning to look like a bowl of spaghetti. However, your analysis should not result in a one-time representation; rather, it should evolve as things change. Mapping analysis in this way can facilitate internal and external discussion, coordination and cooperation. Analysis elements could include, but are not limited to: impact of the event and depth of penetration (within the organization, value chain, public sector, etc.), touchpoints affected, nonlinearity, opacity, reactivity and velocity.
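One lightweight way to keep such a touchpoint map from becoming a ‘bowl of spaghetti’ is to hold it as a directed graph and walk it from the initiating event. The sketch below is an illustration only — the touchpoint names are hypothetical and the ‘depth of penetration’ metric is one possible reading of that term, not a method the article prescribes:

```python
from collections import deque

# Hypothetical touchpoint map: each key may propagate disruption to the
# touchpoints it points to. A real map would be rebuilt as the event evolves.
touchpoints = {
    "initiating event": ["plant outage", "IT systems"],
    "plant outage": ["key supplier", "customer orders"],
    "IT systems": ["customer orders", "payroll"],
    "key supplier": ["customer orders"],
    "customer orders": ["reputation"],
    "payroll": [],
    "reputation": [],
}

def depth_of_penetration(graph, start):
    """Breadth-first walk from the initiating event, recording how many
    links deep the disruption reaches each touchpoint (shortest path)."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in depths:       # keep the first (shortest) depth
                depths[nxt] = depths[node] + 1
                queue.append(nxt)
    return depths

print(depth_of_penetration(touchpoints, "initiating event"))
```

Because the map is plain data, it can be re-run each time the picture changes, supporting the article's point that the analysis should evolve rather than stand as a one-time representation.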
Let’s return to Per Bak and the sand pile effect for a moment. When assessing the potential effects of a cascade event, we first have to recognize that we will never have all the information required to make a perfect decision. All decisions are therefore flawed, for two reasons: we can never have all the necessary information, and as we make a decision and implement the action, the result is a reaction by independent and/or semi-independent entities (i.e., within the organization, value chain, public sector, etc.) — essentially all touchpoints affected by the action implemented.
Sounds complex? It is. Events are not simple linear steps over time. Events are complex, evolving situations that need to be shaped by the desired response in order to achieve mitigation and eventual closure.
Combining the analysis of the initiating event with continual analysis during the decision implementation process can facilitate more comprehensive and timely decision making and coordination between and among all entities involved in and affected by the event.
The effect of opacity
Opacity, for the purposes of this article, is defined as: ‘the quality of being difficult to understand or explain; obscurity of meaning.’ Nassim Taleb, in his book ‘The Black Swan’, describes a trio of opacity:
- The illusion of understanding
- Retrospective distortion
- Overvaluing facts and authorities.
Taleb further states that: “in a complex world, the notion of ‘cause’ is itself suspect; it is either nearly impossible to detect or cannot really be defined.”
Due to complexity, opacity and the speed of unfolding events, we often delude ourselves into thinking that we have a handle on the situation and that things are under control. When we look back on the timeline of a disruptive event, our biases trick us into retrospective distortion: not seeing clearly how events actually unfolded. This leads to second-guessing decisions and actions rather than moving forward and adjusting based on the current situation. You should also note that:
- Not all variables are observable;
- Reaction alters the cascade effect;
- There are unlimited possibilities.
No matter how detailed your plans, procedures, etc., the initial shock of an event will cause disruption. Overcoming this disruption early is key to minimizing cascade effects. One way of reducing reaction time and dealing with uncertainty is to develop and implement exercises that are designed to identify issues, touchpoints, etc., rather than just provide a demonstration of linear problem solving. Get the team talking – vertically and horizontally within the organization. Then branch out to include customers, vendors, other value chain entities, government and local community groups.
Begin to look at and assess the nonlinear risks that have high consequence to your organization. Rethink the way you conduct your business impact analysis, risk assessments, etc.
Geary Sikich is a seasoned risk management professional who advises private and public sector executives on developing risk buffering strategies to protect their asset base. With a M.Ed. in Counseling and Guidance, Geary's focus is human capital: what people think, who they are, what they need and how they communicate. With over 28 years in management consulting as a trusted advisor, crisis manager, senior executive and educator, Geary brings unprecedented value to clients worldwide.
Geary is well-versed in contingency planning, risk management, human resource development, ‘war gaming,’ as well as competitive intelligence, issues analysis, global strategy and identification of transparent vulnerabilities. As a thought leader, Geary leverages his skills in client attraction and the tools of LinkedIn, social media and publishing to help executives in decision analysis, strategy development and risk buffering.
Geary has a passion for helping executives, risk managers, and contingency planning professionals leverage their brand and leadership skills by enhancing decision making skills, changing behaviors, communication styles and risk management efforts. A well-known author, his books and articles are readily available on Amazon, Barnes & Noble and the Internet.
Apgar, David, “Risk Intelligence – Learning to Manage What We Don’t Know,” 2006, Harvard Business School Press.
Bak, Per, “Bak's Sand Pile: Strategies for a Catastrophic World”; Agile Research and Technology, Inc (February 28, 2011), ISBN-10: 098307450X, ISBN-13: 978-0983074502
Gardner, Dan, “Risk: The Science and Politics of Fear,” 2009, Virgin Publishing, ISBN 978-0753515532
Heuer Jr., Richards J., “Psychology of Intelligence Analysis,” 2007, Pherson Associates; 2nd edition
Jones, Milo and Silberzahn, Philippe, “Constructing Cassandra: Reframing Intelligence Failure at the CIA, 1947–2001,” Stanford Security Studies (August 21, 2013), ISBN-10: 0804785805, ISBN-13: 978-0804785808
Kahneman, Daniel, “Thinking, Fast and Slow,” 2011, Farrar, Straus and Giroux, ISBN 978-0-374-27563-1
Kami, Michael J., “Trigger Points: how to make decisions three times faster,” 1988, McGraw-Hill, ISBN 0-07-033219-3
Posner, Kenneth A., “Stalking the Black Swan,” 2010, Columbia Press, ISBN 978-0-231-15048-4
Sikich, Geary W., “What is there to know about a crisis,” 2001, John Liner Review, Vol. 14, No. 4.
Sikich, Geary W., "Integrated Business Continuity: Maintaining Resilience in Times of Uncertainty," 2003, PennWell Publishing
Sikich, Geary W., “It Can’t Happen Here: All Hazards Crisis Management Planning”, 1993, PennWell Publishing
Sikich, Geary W., "The Emergency Management Planning Handbook", 1995, McGraw Hill
Sikich, Geary W., "Risk and the Limitations of Knowledge” 2014
Tainter, Joseph, “The Collapse of Complex Societies,” Cambridge University Press (March 30, 1990), ISBN-10: 052138673X, ISBN-13: 978-0521386739
Taleb, Nassim Nicholas, “The Black Swan: The Impact of the Highly Improbable,” 2007, Random House – ISBN 978-1-4000-6351-2, 2nd Edition 2010, Random House – ISBN 978-0-8129-7381-5
Taleb, Nassim Nicholas, “Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets,” 2005, Updated edition (October 14, 2008), Random House – ISBN-13: 978-1400067930
Taleb, N.N., “Common Errors in Interpreting the Ideas of The Black Swan and Associated Papers;” NYU Poly Institute October 18, 2009
Taleb, Nassim Nicholas, “Antifragile: Things that gain from disorder,” 2012, Random House – ISBN 978-1-4000-6782-4
Vail, Jeff, “The Logic of Collapse,” 2006, www.karavans.com/collapse2.html