I tackle the notion of risk appetite in this month’s column using some metaphors with which you might be familiar. You don’t get to pick your auto insurance coverage by expressing the number of accidents you are willing to accept, yet that’s how a lot of organizations think about cyber risk. Fortunately, the cyber insurance industry is going to force us all into thinking about risk in dollars, the same as everyone else, because that is the lowest common risk denominator.
You can read more here.
I was reading up on cyber deterrence today and ran across this little gem in relation to nuclear deterrence:
Because of the value that comes from the ambiguity of what the US may do to an adversary if the acts we seek to deter are carried out, it hurts to portray ourselves as too fully rational and cool-headed. The fact that some elements may appear to be potentially “out of control” can be beneficial to creating and reinforcing fears and doubts within the minds of an adversary’s decision makers. This essential sense of fear is the working force of deterrence. That the US may become irrational and vindictive if its vital interests are attacked should be a part of the national persona we project to all adversaries.
–Essentials of Post Cold War Deterrence (1995)
My latest column for @ISACA was published today. In it, I talk about the benefits of using verbal risk labels (things like high, medium, and low) and give some examples of where this is helpful in the treatment of Type 1 diabetes. This is an important concept for those, like myself, who are dedicated to quantitative risk. It's important to translate the quantitative values into buckets that allow for easy decision making.
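The translation from quantitative values to verbal buckets can be sketched in a few lines. This is a minimal illustration, not the column's method; the dollar thresholds below are hypothetical assumptions chosen only to show the idea.

```python
# Hypothetical sketch: translating a quantitative annualized loss exposure
# (in dollars) into a verbal risk label that supports easy decision making.
# The threshold values are illustrative assumptions, not from the column.

def risk_label(annual_loss_exposure: float) -> str:
    """Map a dollar loss-exposure figure into a decision-friendly bucket."""
    if annual_loss_exposure < 100_000:
        return "low"
    elif annual_loss_exposure < 1_000_000:
        return "medium"
    else:
        return "high"

print(risk_label(50_000))     # low
print(risk_label(250_000))    # medium
print(risk_label(5_000_000))  # high
```

The point is that the labels are derived from the numbers, not substituted for them: the quantitative analysis comes first, and the bucket is just a presentation layer for decision makers.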
You can read more here.
Sometimes, the organization you work for will need to make budget cuts. And sometimes that means cuts to the security budget. How that should be handled is the subject of my latest @ISACA column.
I really enjoyed Bruce Schneier’s recent post on Code Yellow. It inspired me to write about it in the context of personal self-defense (and its parallels to the Japanese term zanshin).
I disagree with Bruce’s opinion that being in Code Yellow is generally a bad thing (at least, that’s the impression I got from his piece). Like much in life, there is a balance between seeing danger in every shadow and being alert and aware in our daily lives. For instance, how many people are not living in the moment because of the smartphones in their pockets? What are they missing out on, and what danger are they placing themselves in?
Cybersecurity can have a similar problem: jumping at those shadows can be dangerous, but failing to acknowledge that there could be danger in a shadow can be just as bad, since many attacks depend on catching the victim unawares. It takes practice, however, to strike the balance between paranoia and alertness, and it’s a balance that must be worked at. Organizations with a mature risk management function can successfully negotiate the trade-off between conducting their business and not drowning in losses. Being “in Yellow” really is the job of the risk function of an organization; it’s the equivalent of that voice in your head reminding you of the bad things that could happen so that you can make a well-informed decision.
My @ISACA column was published today. Read it here.
I realized they edited the full submission I made (I could tell because it sounded a little off from what I recalled). Below is the full post.
Depending on your point of view, risk management is either a very easy or a terrifically difficult job. If you approach IT risk management from a controls perspective (as in, “This asset doesn’t have all the controls listed here. That’s a risk.”), then risk management is very easy for you. Simply add the missing control and everything’s back to normal. If anyone objects to your solution, it’s very easy to show them the worst that could happen, and paint them as an irresponsible steward of your organization in order to get the funding you need.
If, however, you feel that the control deficiency calls for some analysis, then risk management is much more difficult. In order to analyze the risk, you need to conduct research to understand which assets reside on that system, how often it is attacked from various threat communities, and the cumulative strength of the remaining controls. This approach involves building a model of attack sequences with associated probabilities and losses and considering the risk scenario in the greater context of the organization’s goals, objectives, and overall risk posture. In other words, this approach is risk analysis in support of well-informed risk management.
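The analytical approach described above can be sketched with a tiny simulation. This is an illustrative sketch under stated assumptions, not the analysis method the column prescribes: the attack frequency, success probability, and loss figures below are hypothetical, and a real model would consider multiple threat communities and scenarios.

```python
import random

# Illustrative sketch (all numbers are assumptions): a minimal Monte Carlo
# model of one attack sequence -- how often the asset is attacked per year,
# the probability an attempt defeats the remaining controls, and the loss
# incurred when it does.

def simulate_annual_loss(attempts_per_year: int,
                         p_success: float,
                         loss_per_event: float,
                         trials: int = 10_000,
                         seed: int = 42) -> float:
    """Return the mean simulated annual loss across all trials."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Sum the losses from the attempts that succeed this simulated year.
        annual = sum(loss_per_event
                     for _ in range(attempts_per_year)
                     if rng.random() < p_success)
        total += annual
    return total / trials

# Analytically: 12 attempts * 0.05 success rate * $50,000 loss = $30,000/year.
mean_loss = simulate_annual_loss(attempts_per_year=12,
                                 p_success=0.05,
                                 loss_per_event=50_000)
print(f"Mean annual loss exposure: ${mean_loss:,.0f}")
```

Even a toy model like this forces the analyst to state assumptions explicitly, which is exactly what the "missing control equals risk" shortcut never does.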
It is certainly easier to respond emotionally with phrases such as “I feel like this is a high,” “I think our customers would be upset,” or even, “Our CEO could end up in jail!” It’s a very rare scenario in which we hear, “The analysis has shown…” Imagine buying insurance from an agent who tells you they “feel” you are high risk but cannot tell you why. At best, emotional responses like these lead to misallocating company resources on unnecessary controls. At worst, they may make it difficult for your company to compete effectively in an evolving marketplace. Practicing risk professionally means eschewing an emotional response in favor of risk analysis. An emotional response to risk is not a valid substitute for critical thinking.