On 6th November 2018 the Business Continuity Institute (BCI) published a new survey-based report entitled ‘The Continuity & Resilience Report: Raising the impact of Business Continuity’. The report has received significant criticism from business continuity consultant and author David Lindstedt. In this article, Continuity Central’s editor, David Honour, examines David Lindstedt’s reaction to the report and offers an assessment of some of the points made.
The BCI’s Continuity & Resilience Report: Raising the impact of Business Continuity was launched at BCI World and is based on 853 survey responses: 43 percent of respondents were based in Europe, 22 percent in North America, and 10 percent in Australasia.
The BCI summarises the Continuity and Resilience Report as follows:
“This study aims at highlighting the role of business continuity and its relationship with other management functions within the organization, such as information security, risk management or physical security. The findings address a range of issues to measure the impact of business continuity, such as its levels of investment, top management buy-in and its role during a crisis.
“It emerges that business continuity plays a central role across different scenarios, such as adverse weather, cyber attacks or the outbreak of a new pandemic. Furthermore, organizations tend to increasingly appreciate its value over time, as they can see return on investment.”
The report states that the results of the study show that employing business continuity over time supports:
- The reduction of the cost of the response;
- The improvement of employee morale;
- Customer retention.
It also claims that the top three benefits of business continuity are:
- Faster recovery: claimed by 87 percent of respondents;
- Safety and accountability of staff: 80 percent of respondents;
- A reduction of the costs of disruptions: 77 percent of respondents.
In his article ‘The BCI Report: Echo Chambers, Disturbing Graphics, and Status Quo’ written in response to the report, David Lindstedt initially focusses on the latter finding concerning the top three benefits of business continuity, stating:
"The claim is based on a single opinion question asked of BC, ERM, and other preparedness practitioners. It was a simple survey question where preparedness planners were asked to select any number of perceived benefits. This is not a finding. This is an echo chamber, filtering on a narrow subset of people to reemphasize specific beliefs in support of preexisting positions. It may be of passing interest to learn what preparedness planners believe are the most beneficial aspects of their work, but this is not evidence upon which to invest resources, develop public policy, define regulatory requirements, or justify the costs of a BC program. Just because preparedness practitioners believe their efforts provide specific benefits does not make it so."
David Lindstedt’s claim that these survey results are an echo chamber has some validity: 52 percent of the respondents cited business continuity as their functional role, and it is probably fair to say that business continuity professionals have a vested interest in highlighting the benefits that business continuity offers. However, it is probably also fair to say that those closest to the day-to-day management of business continuity are best placed to observe the benefits it brings. A helpful future exercise would be to present the same list of possible benefits to a group of business continuity professionals and to a group of business leaders who are not involved day-to-day in business continuity, and to observe the gaps in perception between the two groups. On balance, though, I don’t think it is justifiable to write the survey findings off as simply an echo chamber: business continuity professionals are a valid group to report on the results of business continuity programs, but the results need to be caveated based on the potential bias of the survey group. In fact, the BCI attempts to address this to a limited extent in the report, with figure 5 (below) showing how the risk managers; crisis and emergency planners; and cyber, InfoSec and IT disaster recovery professionals who responded to the survey view the top three benefits of business continuity:
The business impact analysis
On page 13 of The Continuity & Resilience Report the authors state that “Business continuity measures can help understand vulnerabilities, through a Business Impact Analysis (BIA) or a risk and threat assessment.”
In his response, David Lindstedt states that:
“This is an entirely unwarranted claim. Nowhere in the question or the data is there any mention of, or accounting for, the BIA or a risk and threat assessment. This is a poor, unreliable, and misleading inference."
David Lindstedt’s comment seems to be a fair one. Given the discussion that has taken place in the past couple of years in the business continuity profession about the role of the BIA and the claims by Adaptive BCP (an organization that David Lindstedt is involved with) that the BIA is an unnecessary process, this statement in the Continuity & Resilience Report is essentially a political one which doesn’t seem to have any basis in the survey results. This is the one and only reference to the BIA in the report and the statement isn’t linked to any finding from the survey.
Return on investment
On page 11 of the report the authors write:
“Most of those organizations that employ business continuity arrangements have done so for more than five years. Also, as figure 3 reveals, those organizations that have adopted business continuity for the longest time are those that tend to invest the most in it. This could be due to the return on investment brought by business continuity, such as facilitating the procurement process and improving the overall efficiency of an organization.”
Figure 3 is reproduced below:
In response to the claim that continued investment in business continuity could be due to identified return on investment, David Lindstedt writes:
"Less damning, but equally disingenuous is the Report’s claim that increasing investments in BC programs ‘could be due to the return on investment brought by business continuity, such as facilitating the procurement process and improving the overall efficiency of an organization’. The data show that BC programs obtain additional funding over time – and that is the entirety of what the data can possibly show. There is no additional information to even begin to examine why investments increase. There is absolutely no basis to infer that return on investment (ROI) could be a significant cause of funding. Increasing investments in BC programs could be the results of Illuminati investment bankers for all anyone can tell from the data."
To be fair to the authors of the report, they do make clear that the statement about ROI is an inference and they do attempt to justify this with a footnote reference to a ‘BCAW ROI Paper’: however, this reference is too vague to help in tracking down the actual source of the inference. David Lindstedt is correct in stating that the data doesn’t justify the claim.
Continuing the ROI theme, figure 9 from the report shows that, of the respondents who were able to quantify the cost of a significant disruption, 51 percent said it was less than $50,000. David Lindstedt picks up on this, stating:
“51% of the survey respondents said that average cost of a “significant disruption” in the last five years was less than $50,000. If the average cost of a BC program is roughly $170,000 (Ibid, p.11), it may be more cost effective for half of these organizations to simply eliminate their BC programs if they can anticipate experiencing fewer than four significant disruptions per year!”
“According to the data, it could be more cost effective for organizations to simply eliminate their BC programs.
“Here again is a missed opportunity for informative research. The more important question to ask would have been: What is the difference in the average cost of a significant disruption between organizations that do, and do not, have a BC program in place? Alternatively, the research could also have focused on the question: Which existing BC practices provide the most benefit for the cost? The answers to these empirical research questions are vital to the profession. We have yet to show, in any reliable research study, to what degree BC programs actually reduce the cost of disruptions."
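The break-even arithmetic behind Lindstedt’s point can be sketched as follows. This is a minimal illustration only: the function name is mine, the dollar figures are the report’s rough averages, and it makes the simplifying assumption that a BC program fully offsets the cost of each disruption it addresses.

```python
# Break-even sketch: how many significant disruptions per year would a
# BC program need to fully offset before its cost is justified?
# Simplifying assumption: the program eliminates the entire cost of
# each disruption it covers.

def break_even_disruptions(program_cost: float, avg_disruption_cost: float) -> float:
    """Disruptions per year at which program cost equals avoided losses."""
    return program_cost / avg_disruption_cost

# Using the report's rough figures: ~$170,000 average program cost and
# disruptions averaging under $50,000.
threshold = break_even_disruptions(170_000, 50_000)
print(threshold)  # 3.4 - i.e. "fewer than four" per year, as Lindstedt notes
```

On these figures, an organization expecting fewer than roughly 3.4 significant disruptions a year would, on a purely financial view, spend more on the program than it avoids in losses; this is the comparison underlying the quoted passage, and it of course ignores the non-financial benefits discussed later in the article.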
What I found most surprising in this section of the report was the finding that 44 percent of respondents ‘were not aware of the average cost of disruptions to their organizations in the past five years’. When one of the most commonly reported difficulties that business continuity managers face is obtaining support from the C-suite, surely some idea of the cost of disruptions, and of the benefits that business continuity may have brought in reducing those costs over time, is an essential tool for obtaining that support? Having said that, it is probably unhelpful to link business continuity return on investment purely with the financially quantifiable aspects of incidents. The survey found that ‘safety and accountability of staff’ was seen as the second highest benefit of business continuity (see above): this non-financial benefit is a very strong reason in its own right for investing in business continuity.
The report itself shows an interesting trend in the change in non-financial impacts of incidents compared to the longevity of business continuity programs. Figure 8 (below) indicates that the need to reprioritize strategic investments; difficulties maintaining employee morale; and challenges with customer retention all appeared to decrease the longer an organization had business continuity in place. However, this could be a simple case of confusing cause and effect. The decrease in those three factors could be down to the increasing organizational maturity of the organizations that had long-standing business continuity programs; this maturity may make the organizations inherently more resilient. No conclusion can be drawn either way from the data presented in the report.
Potential evidence of the complete ineffectuality of traditional BC practices?
David Lindstedt reserves his strongest criticism for a section of the report based on a graph shown in figure 10, page 15, of the report.
Commenting on figure 10 the report’s authors say that “Looking at the relationship between business continuity longevity and financial losses, there is a slight but appreciable downward trend that shows the longer organizations have business continuity for, the lower the losses.”
David Lindstedt has a different take, stating:
“What are we looking at? Potential evidence of the complete ineffectuality of traditional BC practices.
“The above graph from page 15 of the Report presents the correlation between how long a BC program has been in place at an organization and the number of (reported) disruptions costing over $100,000 per year in percentage. What the graph seems to indicate is that BC programs have almost no effect on reducing the number of costly disruptions. In other words, despite the fact that 77% of preparedness practitioners report that their efforts provide “a reduction of the costs of disruptions” (Ibid, p. 6 and 12), this chart suggests otherwise."
Again, David Lindstedt seems to be making a fair point. From the results it would appear that business continuity longevity had less than a 1 percent impact on the number of disruptions costing more than $100,000. But perhaps this is not surprising, given that many business continuity strategies are aimed at responding to incidents rather than preventing them.
Make your own mind up…
The above does not address all the points made by David Lindstedt and does not cover all the findings from the BCI Continuity and Resilience Report. To read both items for yourself go to:
- The Continuity & Resilience Report: Raising the impact of Business Continuity
- The BCI Report: Echo Chambers, Disturbing Graphics, and Status Quo
I have to agree with David Lindstedt's characterization of the BCI report as an echo chamber of opinion. It should not take a survey to determine that Business Continuity professionals believe that Business Continuity programs are effective. Diversifying the opinions across other related professions can reduce bias; however, they remain opinions. As both David Lindstedt and David Honour point out, we should be able to produce some data to back those opinions.
David Honour points out that the authors of the report “…make clear that the statement about ROI is an inference…” My reaction is that as an inference, it is weak support of the contention at best, and wishful thinking at worst. I could just as easily infer that fear alone (fear of loss, fear of liability, fear of competitive disadvantage, etc.) is the primary motivator for increasing investments in Business Continuity and its related fields of Disaster Recovery, Emergency Management, and Cyber-Security.
The pitfalls of applying ROI to Business Continuity are many. The poster children for Business Continuity effectiveness are predictable, cyclic weather-related events such as hurricanes and cyclones. These events lend themselves to organized preparation efforts, with detailed action plans, backup systems and provisions, and their frequency in certain locations makes ROI easier to measure in those cases. However, the Business Continuity program ROI for the mitigation of less predictable, less frequent business disruptions cannot be measured with any accuracy in the short term.
In any case, the primary focus of Business Continuity is rapid mitigation and recovery from disruptions. Except in the most catastrophic events, mitigation and recovery will occur, making it even more difficult to measure the degree to which Business Continuity planning reduces the time and cost of recovery. Clearly we have a sense that it does, and can provide anecdotal evidence, but to what degree over what time periods and in relation to program costs? Difficult to measure.
In the end, I have to agree with David Lindstedt that the opinions and circumstantial evidence presented in these survey results are best characterized as an echo chamber of self-congratulatory, feel-good sentiment.
It’s true that the BC profession could do more to quantify data. However, it’s ironic that David Lindstedt criticizes the profession for having too little data (i.e. in his response to the Report) and also for having too much data (i.e. for conducting BIAs and risk assessments). Yet his own paper on the ROI of BC used completely random figures (numbers of staff, iterations of program designs, iterations of documents with senior managers, and more) to claim an 11x productivity increase with Adaptive over traditional. Personally, I didn’t really follow the flow of the “traditional” model he attacked: it certainly bore little resemblance to the programs that I have run.
I’d like to raise a point about the $50K cost of disruption. What David is missing is the discussion around “did we keep the cost below $50K BECAUSE we had a good BC program”, as well as “did we have fewer incidents - of any cost - BECAUSE we had a good BC program”. I don’t know to what extent the Report collected data on that (or could collect it in the future), but it could be an interesting exploration.
No-one disputes that BC can - and should - continue to improve. No-one disputes that there are some very poor practitioners, and some very poor programs. However, the approach of Continuity2.0 and now Adaptive is akin to saying “The standard of some drivers on our roads is poor. Therefore, rather than improve training and legislation, and make vehicles safer, we should simply tear up all existing road practices. In the future, under a scheme called Driving2.0 (Adaptive driving), anybody of any age can drive whatever vehicle they want, with no license or insurance, as fast as they want. We’ll take down all the traffic signs and stop signals and you can use any lane, any direction, as long as you think it will get you from A to B”. Oh, but wait! Adaptive Driving doesn’t have a route map, so we don’t even know where “B” is. Oh well, let’s just go for a fast drive.
Chris Green FBCI
I read the article on David Lindstedt's comments on the BCI Continuity and Resilience Report with interest.
Whilst I don't agree with all his points, I think there is a wider point here about the proliferation of so-called ‘insight’ based on a number of survey questions, something that has irritated me for several years.
The BCI, along with many other organisations, regularly publishes such reports based on the responses to online surveys. These are then packaged up into reports from which we are expected to derive knowledge, whereas they are really a group of people working in loosely related disciplines giving their opinions on a subjective list of questions. It's no surprise that a question asking BC professionals whether they thought enough money was being invested in their area would find that over 50 percent say "no".
I would like to see fewer of these reports in future and more reports where actual experience, thought-leadership and genuine new ideas are proposed.
Rob Osborn MBCI
I agree with Rob Osborn. I get rather annoyed with 'insights reports' which quote lots of statistics that come from poorly worded, crowd-sourced questions, relying on data from whoever can be bothered to reply. I think they lack the research rigour you would expect from any academic piece of work, and so I take most of these reports with a 'pinch of salt'. As far as I can see, many of the reports are gathered by this method.
I have to admire David for bothering to read them in detail, and even more for making the effort to write up his views. Most of us, I suspect, go ‘yeah, yeah, whatever’, have a skim read to see if there is anything interesting or new, and move on. Most of them don’t seem to tell us anything new.
I think if organizations like the BCI want to do polls and surveys then the existing methodology is fine. But if they want to publish research and look at detailed or complex figures, some more scientific rigour is required!
Charlie Maclean-Bristol, FBCI