Is your organization’s planning brittle?
Geary Sikich provides five questions that you can work through to assess whether your organization’s planning is resulting in brittleness.
Good intentions alone do not create a robust, resilient organization. Business continuity plans, disaster recovery plans, emergency response plans, etc., that exist as standalone documents generally reflect good intentions – whether to meet regulatory requirements or to address governance. This should be a wake-up call for executives in all industries. We live in a complex and interdependent world. Complex systems are full of interdependencies that are hard to detect, and the result is nonlinearity in responses to events, especially random events and shocks.
The odds of rare events are simply not computable. Model error swells when it comes to small probabilities. The rarer the event, the less tractable it is and the less we know about how frequently it occurs.
Here are five questions that you can use to assess whether your organization’s planning is resulting in brittleness:

1. Do the organization’s plans stand in silos of excellence?
2. Are activation and implementation of plans independent and uncoordinated?
3. Does the organization face critical junctures of survival every time an event or certain shocks affect it?
4. Does analysis of ‘worst case’ scenarios underlie the basis for planning?
5. Do the plans reflect the strategy, goals and objectives of the organization?
We cannot calculate the risks and probabilities of events, shocks and especially rare events, no matter how sophisticated we get with our models. Brittle and inflexible plans reflect an organization’s failure to “think beyond what is readily seen.” Planning that is brittle generally defaults to thinking that what is not seen is not there and what is not understood does not exist. The organization and its planners often mistake ‘noise’ for information, especially in the era of big data and information overload. Noise reflects random information that, until it can be assembled with other information to create something meaningful, is essentially meaningless.
Accountability is a critical element in assuring that plans are effectively developed; that they are well managed and executed when an event occurs; and that recommendations, findings, suggestions for improvement, etc. get implemented. This requires that we develop a ‘FutureForward’ thinking process. In a complex system, interdependencies abound and the notion of cause becomes suspect: causes are nearly impossible to detect, or are not really defined at all. Isolating direct, linear causal relationships is not feasible due to opacity and nonlinearity.
For example, my first question, “Do the organization’s plans stand in silos of excellence?” raises issues of accountability, threat identification, business impact analysis and much more. Isolated plans that are not linked to a single accountable entity will result in fragmented response, confusion and missed opportunities.
In looking at the second question, “Are activation and implementation of plans independent and uncoordinated?” the effect of nonlinearity comes into play, as the unintended consequences of fragmented implementation can further exacerbate the impact of an event. Here again, the effects of poor communication and a failure to understand ‘intent’ come into play as complexity is underestimated and interdependencies that have gone unseen reveal themselves.
Brittle plans, and the organizations that built them, do not often survive catastrophic events. Take a look at the third question, “Does the organization face critical junctures of survival every time an event or certain shocks affect it?” While seemingly straightforward, this question is not easy to answer, in part because of transparent vulnerabilities and unseen, unforeseen risks. The enterprise may not have the capacity to withstand the extra stresses of an event. You may begin to realize your planning is brittle if your organization has employed linear cause-and-effect models; if mistakes are rare but largely irreversible; or if volatility disrupts stability, creating shockwaves of negative reaction (including terminating the planners).
There are unlimited ways to buffer and alter an organization’s risk exposure, but all involve decision making and an understanding of the consequences of acting on a risk issue. Key decision makers must maintain risk within acceptable limits. However, all decision making is flawed: the result of bias and of poor information acquisition, processing, management and application. We are trapped in a complexity of our own making, created by the explosion of connectivity, speed, unvetted sources, information overload, lack of information and ‘flow.’ Flow relates to perception, intangibles and bias; combined with the factors above, and many more, it leads us to flawed decisions. Flawed decisions are not necessarily ‘bad,’ nor are they ‘good.’ They flow in all directions, creating a need to embed a culture of risk and business continuity management: a responsive and resilient framework that prevents collateral damage and buffers us against unilateral damage. We fail to incorporate lessons learned because of failure bias and the speed at which events change the operating horizon.
My fourth question touches on a sensitive area for most planners: “Does analysis of ‘worst case’ scenarios underlie the basis for planning?” Business impact analysis, SWOT analysis, risk matrices, risk heat maps, etc. all fall into the trap of historical analysis. The occurrence of extreme events cannot be predicted from a review of past history. Worst case events, when they happen, exceed the worst case that was known at the time and that may have been the basis of planning. The Japanese are very familiar with earthquakes, tsunamis, industrial accidents, etc. They are good planners, yet no one could have anticipated the sequence of events that led to Fukushima. The worst case event had to be a surprise, as it had no precedent. Selection bias comes into play when we develop worst case scenario based plans: we simply do not see the unseen, and therefore assume it does not exist.
My fifth question, “Do the plans reflect the strategy, goals and objectives of the organization?” is the one I find most intriguing. Most planners, whether in business continuity, disaster recovery, emergency planning, etc., fail to consider the goals and objectives of the organization. The result: brittle plans fall apart during implementation and fail to adequately address the post-event environment (reentry, recovery, restoration, altered business operations). The errors and their consequences are almost always fatal for the planners, the plans and, in many instances, the organization.
Will your organization be faced with ‘post-traumatic stress’ or will it experience ‘post-traumatic growth’? Plans that are brittle fail because they lack the ability to overcompensate for anticipated stresses and cannot see the unanticipated stresses that will emerge, creating cascading side effects.
Errors and their consequences are information that needs to be made actionable for the organization. The problems with assimilating information after experiencing an event, either directly or indirectly, are threefold. The first is accountability. I mentioned accountability earlier in this piece and will repeat it: accountability is a critical element in assuring that plans are effectively developed, managed and executed when an event occurs. Secondly, we tend to mislabel: what we cite as ‘Lessons Learned’ we should perhaps label ‘Information Identified.’ In most post-event analysis, lessons tend not to be actually learned but rather identified and then buried in a post-action report. Once things have calmed down and the trauma of the event has subsided, the post-event report is filed appropriately and the organization returns to operational opacity, where it is free to repeat the same mistakes over and over. Lessons learned usually do not receive any funding unless a regulatory initiative requires compliance. The third point is that most organizations do not have a culture of FutureForward thinking; instead we have opacity.
Many years ago, I asked a group of business planners (business continuity, disaster recovery, emergency response, etc.) whether business continuity was a way of doing business for their organizations or an adjunct to it. The answer was not a surprise: it was an adjunct, and many said they were happy to be ‘flying under the radar,’ as it gave some assurance of job security (no one noticed them, therefore they did not exist). There is a tendency to mistake the unknown for the nonexistent. We cannot calculate the odds of an event occurring, because model error swells when it comes to small probabilities. The rarer the event, the less tractable it is and the less we know about how frequently it occurs.
Brittle plans tend to reflect brittle thinking: that what is not seen is not there and what is not understood does not exist. Ask yourself, “Is what I see all there is?”
About the author
Geary Sikich is a seasoned risk management professional who advises private and public sector executives on developing risk buffering strategies to protect their asset base. With an M.Ed. in Counseling and Guidance, Geary's focus is human capital: what people think, who they are, what they need and how they communicate. With over 25 years in management consulting as a trusted advisor, crisis manager, senior executive and educator, Geary brings considerable value to clients worldwide.
Geary is well-versed in contingency planning, risk management, human resource development, “war gaming,” as well as competitive intelligence, issues analysis, global strategy and identification of transparent vulnerabilities. Geary began his career as an officer in the US Army after completing his BS in Criminology.
A well-known author, Geary’s books and articles are readily available on Amazon, Barnes & Noble and the Internet.
Date: 16th January 2013 • Region: US/World • Type: Article • Topic: BC general