I wrote a piece for RiskLens* recently about how to use FAIR to build and justify an information security budget and strategic initiatives. It's an interesting problem space: you need the appropriate level of abstraction (program level versus technology level), but adding risk-reduction justification to these annual budgetary exercises is a very solvable problem.
Fun story: when I did this exercise years ago, I actually rated one initiative as *increasing* risk. That sparked an interesting discussion, but the lesson is that not everything will result in less risk to your organization. Budgeting is a complicated amalgam of math, politics, and priorities; be sure to bolster your budgeting process with some risk arguments.
Click here for the RiskLens article: How CISOs Use FAIR to Set Strategic Priorities for Spending
*I am a professional advisor for RiskLens
I tackle the notion of risk appetite in this month’s column using some metaphors with which you might be familiar. You don’t get to pick your auto insurance coverage by expressing the number of accidents you are willing to accept, yet that’s how a lot of organizations think about cyber risk. Fortunately, the cyber insurance industry is going to force us all into thinking about risk in dollars, the same as everyone else, because that is the lowest common risk denominator.
You can read more here.
I was reading up on cyber deterrence today and ran across this little gem in relation to nuclear deterrence:
Because of the value that comes from the ambiguity of what the US may do to an adversary if the acts we seek to deter are carried out, it hurts to portray ourselves as too fully rational and cool-headed. The fact that some elements may appear to be potentially “out of control” can be beneficial to creating and reinforcing fears and doubts within the minds of an adversary’s decision makers. This essential sense of fear is the working force of deterrence. That the US may become irrational and vindictive if its vital interests are attacked should be a part of the national persona we project to all adversaries.
–Essentials of Post Cold War Deterrence (1995)
My latest column for @ISACA was published today. In it I discuss the benefits of using verbal risk labels (things like high, medium, and low) and give some examples of where this is helpful in the treatment of Type 1 diabetes. This is an important concept for those of us who are dedicated to quantitative risk: it's important to translate the quantitative values into buckets that allow for easy decision making.
You can read more here.
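The translation from quantitative estimates to verbal labels can be as simple as a threshold table. Here is a minimal sketch in Python; the dollar thresholds are entirely hypothetical (the column doesn't prescribe any), and every organization would need to set its own.

```python
# Hypothetical bucketing of a quantitative risk estimate into verbal labels.
# Thresholds below are illustrative only, not taken from the column.
RISK_BUCKETS = [
    (1_000_000, "High"),    # annualized loss exposure of $1M or more
    (100_000, "Medium"),    # $100K or more
    (0, "Low"),             # everything below
]

def risk_label(annualized_loss_dollars: float) -> str:
    """Translate a quantitative loss estimate into a verbal risk label."""
    for threshold, label in RISK_BUCKETS:
        if annualized_loss_dollars >= threshold:
            return label
    return "Low"

print(risk_label(2_500_000))  # High
print(risk_label(250_000))    # Medium
print(risk_label(5_000))      # Low
```

The point of the buckets isn't to hide the numbers; it's to give decision makers a small, stable vocabulary layered on top of the quantitative analysis.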
Sometimes, the organization you work for will need to make budget cuts. And sometimes that means cuts to the security budget. How that should be handled is the subject of my latest @ISACA column.
I really enjoyed Bruce Schneier's recent post on Code Yellow. It inspired me to write about it in the context of personal self-defense (and its parallels to the Japanese term zanshin).
I disagree with Bruce's opinion that being in Code Yellow is generally a bad thing (at least, that's the impression I got from his piece). Like much in life, there is a balance between seeing danger in every shadow and being alert and aware in our daily lives. For instance, how many people are not living in the moment because of the smartphones in their pockets, and what are they missing out on? What danger are they placing themselves in?
Cybersecurity has a similar problem: jumping at those shadows can be dangerous, but failing to acknowledge that there could be danger in a shadow can be just as bad, since many attacks depend on catching the victim unawares. Striking the balance between paranoia and alertness takes practice, but it's a balance that must be worked at. Organizations with a mature risk management function can successfully negotiate the trade-off between conducting their business and not drowning in losses. Being "In Yellow" really is the job of an organization's risk function; it's the equivalent of that voice in your head reminding you of the bad things that could happen so that you can make a well-informed decision.
My @ISACA column was published today. You can read it here.