The art of the disaster simulation

By Mark Armour

Art - definition: skill acquired by experience, study, or observation (e.g. the art of making friends)

No simulation is perfect. There are technical details that get missed. There are uninvolved and uninterested parties. There are complete no-shows. There are overzealous leaders who threaten to hijack the event. There are challenges to the scenario. There are rabbit-holes that waste time and activities that are poorly executed. There's equipment that doesn't work. There's food that fails to arrive. Schedules get thrown off and individuals go missing-in-action. Coordinators get second-guessed and late arrivals demand to be brought up-to-speed.

No two simulations are alike. The types of tests we conduct and how we go about them can vary greatly. They can involve stepping through detailed procedures. They can be about big-picture decision-making during a crisis. They can be loose discussions or tightly controlled activities. They can include small teams or a multitude of individuals across a wide spectrum of job functions, disciplines, locations and management levels. They can be carefully guided or freewheeling.

Each and every simulation provides opportunities and challenges, not only for participants but also for seasoned business continuity practitioners. All tests can be improved upon and everyone involved can learn something different as a result of their contribution. What may be an exceptional experience for one may be a major disappointment for another. Simulations are not a science. In fact, simulations are, for better or for worse, an art.

Value - definition: relative worth, utility, or importance

Great art challenges the observer. Lasting works leave an impression or elicit strong feelings from those they touch. Most artworks carry a value, one which is greatly subjective and can vary drastically from one seemingly similar piece to another. It is true that one person's work of beauty can end up on another's trash heap. So it goes with simulations.

The value of a properly planned and executed simulation goes far beyond simply working through completed plans. An exercise is the only opportunity, outside of an actual event, to put plans into action. The exercise process provides the chance to incorporate a wealth of additional activities to further enhance the program. With that in mind, we need to devote as much attention to our follow-through as to our preparation.

It is also important to be mindful of the value others place on the activity by their level of participation. Rank-and-file engineers and developers may be required to attend, but your exercise is usually just one more step in seeing their job to completion. Line-level managers may have other things riding on their attendance (requirements and incentives), while senior managers may actually see the benefit and, consequently, have the highest expectations.

Simulation - definition: examination of a problem often not subject to direct experimentation

Without going into the details of each, there are a number of different types of simulations one can conduct, from table-top and component recovery to functional and even full-scale exercises. Each provides its own unique set of challenges and opportunities. Preparing for and conducting any one of these involves largely the same steps, though some require much more time and effort than others. Remember, too, that the more time you devote to preparation, the more attention will be required following the exercise.
Putting on a proper simulation takes time to master and, like any skill, you will improve your abilities with each one. The first step is managing the simulation process. Like many components of business continuity, testing is largely a matter of proper project management. There are, of course, peripheral activities that require their own specific skill-sets. You want management to be involved and you require buy-in from various support areas, whether internal or external. You need to properly sell the vision of the exercise, explaining to participants the benefits of their time and effort. Properly motivated participants make all the difference in an exercise.

Part of the process also involves a bit of criticism. Remember that the primary point of any test is to identify problems, and a good simulation coordinator keeps this in mind at all times. Every test is not only an opportunity to evaluate an organization's preparedness, but also to improve the next exercise. Bottom line: never be satisfied.

Opportunity - definition: a good chance for advancement or progress

Every test provides a forum for identifying multiple opportunities to improve your overall program. The first step in preparing your exercise should be to identify your objectives:

Test the plan

Re-prioritize

Training and awareness
Certainly, there is nothing like an exercise for acquainting managers with the plans that support them. Building a plan is one thing; trying to put it into action is another, and this is frequently the time to ensure management can actually implement what is in place. In many cases this may be upper management's first exposure to such materials.

Maintenance

Participation

Include audit
This demonstrates transparency and fosters a positive working relationship. It is an opportunity to work with auditors outside of a formal audit setting. By proactively soliciting input and feedback you ensure that their concerns can be incorporated into the program, thereby improving your chances of positive findings when audit day does come around.

External agencies
There may also be support organizations outside of emergency response. You may feel confident that they have been thoroughly vetted to perform their duties in the event of a crisis, but has that capability ever been tested? Exercises represent one more chance to bring those critical service providers into the fold and have them work with you on a comprehensive response.

Prove value

Expectation - definition: basis for considering something probable or certain; assurance

Failing to properly prepare your participants, or neglecting to show them what success looks like, is a surefire way to hobble your exercise from the start. Be realistic about the time and effort necessary from those involved. This means making them aware that a good test should result in problems and issues.

Remember that your participants are system owners and business unit managers. They expect positive results and they will attend your simulation with the same belief that success is measured by speed and accuracy. If those involved expect your simulation to result in a smooth recovery, then there is every chance that is exactly what you will get. Most people would rather feign success than risk the perception of failure. If that's the case, then there's little need to conduct the test in the first place. Take the time to emphasize the 'positives' of a 'negative' finding. Remind everyone that their job is to find problems. If necessary, provide incentives for identifying issues.
Old school

Budget your time

Conduct - definition: to direct the performance of (e.g. to conduct an orchestra)

Any orchestra conductor will tell you that a great performance only comes about through regular rehearsal and preparation. Only proper planning and practice can get the simulation coordinator through all the nuances and potential stumbling blocks a simulation presents. Active participation by those invited will also go a long way to ensuring the results are top-notch.

Presentation
There is a common misconception that good public speaking is an innate talent rather than an acquired skill. Nothing could be further from the truth. Great presenters are not born. Know your weaknesses, then prepare accordingly. You will be surprised at the results.

Resources
Having your own set of tools is important, but make sure those involved have everything they need as well. To the extent possible, ensure the materials being used are consistent and specific to the task at hand. Pen and paper are a good start; custom printed materials work far better. Tailored forms allow you to direct your participants to consider specific components of their plan.

Be prepared
If, despite your best efforts, things do go wrong, remember that those eventualities are also part and parcel of an actual event. Respond as best you can and remind your participants that it is all part of what they can expect following an actual disaster.

Support roles
Delegate someone to act as the exercise scribe. This person should record the overall activities while also evaluating the events and the coordinator's abilities. During smaller scenarios, this may be the same as your support person. If not, make a point of ensuring that those acting in other support roles keep your scribe informed as to the issues being experienced and the questions being asked. Be open to criticism and listen to what your support folks are telling you. Opportunities always abound for the business continuity professional to take corrective action and apply lessons learned to future simulations.

Result - definition: something obtained by calculation or investigation; also: beneficial or tangible effect; fruit

All of this effort, and all of the resources devoted to it, are for naught if you do not dedicate commensurate time and effort to capturing your results. This includes the good and the bad.

The benefit of recording the good is obvious: demonstrate to management, audit and your participants that your program is working. The positives attributable to you and your team are also the elements you want to keep for future activities. Keeping track of what you did well is a means of ensuring you do the same thing again (and again and again). The successes can also be used as a benchmark. Look to improve upon them in future sessions and be ready to sound alarms in the event you fall short the next time around.

If recording positive results is important, tracking your negatives is a necessity. Identifying issues and shortfalls is the primary purpose of any test. At a minimum, you want to identify where plans and strategies are not adequate to recover in the time necessary. More broadly, you also want to find where the weaknesses are within the organization and where the opportunities for improvement lie. If you already have a sense of where problems exist and have prepared adequately, then your exercise should bear them out in the results.
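Because findings arrive from many participants and in many forms, it can help to capture them in one consistent structure from the moment they are raised. The short Python sketch below is just one illustrative way to do that; the field names, severity labels and example findings are assumptions made for the example, not anything prescribed by this article.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Finding:
    description: str   # what happened, good or bad
    plan_area: str     # which plan, team or system it relates to
    severity: str      # e.g. "positive", "minor", "major", "critical"
    raised_by: str     # participant who identified it


@dataclass
class ExerciseLog:
    exercise_name: str
    findings: List[Finding] = field(default_factory=list)

    def record(self, finding: Finding) -> None:
        self.findings.append(finding)

    def summary(self) -> str:
        """Group recorded findings by severity for the after-action report."""
        lines = [f"After-action summary: {self.exercise_name}"]
        for sev in ("positive", "minor", "major", "critical"):
            matches = [item for item in self.findings if item.severity == sev]
            lines.append(f"{sev}: {len(matches)}")
            lines.extend(f"  - {item.plan_area}: {item.description} ({item.raised_by})"
                         for item in matches)
        return "\n".join(lines)


# Example usage with invented findings.
log = ExerciseLog("Data centre failover tabletop")
log.record(Finding("Call tree reached all managers within 30 minutes",
                   "Notification", "positive", "J. Smith"))
log.record(Finding("Backup restore exceeded the stated recovery time objective",
                   "IT recovery", "major", "A. Lee"))
print(log.summary())

Even a simple structure like this makes it easier to show management the positives while keeping the negatives visible until they are addressed.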
Lastly, you'll want to evaluate the exercise itself. Just as you want to repeat the things you do well, you want to identify those things that went poorly so you can make appropriate modifications the next time.

Capturing lessons learned

The live session

Conversely, a second session held a short time after the exercise allows individuals to think of things that had not occurred to them during the simulation. Sometimes the extra time allows everyone involved to more clearly formulate and communicate the difficulties they experienced. You may even find that, given some time, participants will develop and propose solutions.

In addition to group sessions, it is also possible to conduct one-on-one sessions with each of the people involved. While this can be resource-intensive, it can sometimes yield the most informative results. Members may be willing to speak up more in a private session than in the middle of a group. This also allows you or your team to devote the time necessary to discussing the input.

The survey

Combined approach

Evaluate - definition: to determine the significance, worth, or condition of, usually by careful appraisal and study

By now you've established goals, honed your scenario, prepared your presentation, set expectations, conducted the session and gathered lessons learned. Now is the time to turn that feedback into meaningful steps to take.

Program improvement
Admittedly, turning varied input from multiple sources and media into concrete action steps can be a daunting task. Where necessary, get clarification from the individuals who provided the feedback.

Involvement and interest
Remember that this starts with setting the proper expectations. This means going in with the proper mindset yourself. A mountain of issues should not be looked at as a failure. Quite the contrary: it is indicative of active participation and points to a willingness to take on problems. Each difficulty is an opportunity to improve the program, and such challenges should be readily acknowledged and forthrightly addressed. Embrace those pointed questions!

Implement - definition: to give practical effect to and ensure of actual fulfillment by concrete measures

Now is the phase where you put it all into action. This can be simultaneously the easiest and the most difficult part of the entire process. It is the easiest because many of the activities can be handed off to your participants, your support partners and your team members to be carried out. It is the most difficult precisely because you do not always have direct control. Activities can sometimes languish following an exercise. Participation in the test may be mandatory, but follow-through is not the first priority once individuals get back to their desks.

Follow-up
Regular reminders, along with a checklist of accomplishments, can help reinforce expectations across the groups.

Escalation
As the practitioner, it is incumbent upon you to communicate the importance of the activity. This means making leadership aware of the strategy and plans in place as well as (and sometimes most importantly) the time, effort and cost involved in implementing that strategy. Why would senior management put forth such expense when the result falls short of expectations? Making the necessary corrections ensures that the expenditure is justified and the results are maximized.
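One lightweight way to keep that follow-up and escalation from relying on memory alone is to track each action item with an owner and a due date, and to flag anything that has been open too long. The Python sketch below is only an illustration; the items, owners, dates and escalation threshold are invented for the example.

from dataclasses import dataclass
from datetime import date
from typing import List, Optional


@dataclass
class ActionItem:
    description: str
    owner: str
    due: date
    completed: Optional[date] = None

    def is_overdue(self, today: date) -> bool:
        return self.completed is None and today > self.due


def follow_up_report(items: List[ActionItem], today: date,
                     escalate_after_days: int = 14) -> None:
    """Print open action items, flagging those overdue long enough to escalate."""
    for item in items:
        if item.completed:
            continue  # already done; nothing to chase
        status = "open"
        if item.is_overdue(today):
            days_late = (today - item.due).days
            status = ("ESCALATE to leadership" if days_late > escalate_after_days
                      else f"overdue by {days_late} days")
        print(f"{item.owner}: {item.description} [{status}]")


# Example usage with made-up items and dates.
items = [
    ActionItem("Update call tree with new on-call numbers", "Facilities",
               due=date(2012, 7, 1)),
    ActionItem("Re-test backup restore against the recovery time objective",
               "IT recovery", due=date(2012, 6, 25), completed=date(2012, 6, 28)),
]
follow_up_report(items, today=date(2012, 7, 20))

A regularly circulated report of this kind doubles as the reminder checklist mentioned above and gives leadership a clear trigger for stepping in.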
Future exercises
Again, it is important to recognize that not everyone is alike and not all people will see the same benefits of the exercise you've just been through. This is an art, after all. Making the effort to educate your participants and spending time understanding their perspective will go a long way in helping you make your exercises meaningful to the greatest number of people.

In summary

- Identify opportunities to strengthen your program and highlight deficiencies.
- Set expectations among your participants. This includes yourself and your team.
- Conduct your session by being as prepared as possible and through regular practice.
- Gather results from your participants by as many means as realistically possible.
- Evaluate input for improvement opportunities and action items.

It sounds simple but, like art itself, there is always more to it than meets the eye. Good luck and happy testing!

Date: 21st June 2012 • Type: Article • Topic: BC testing & exercising