Today an article I wrote with help from Kevin Chalk was published in the ISSA Journal. When I am able, I will post the text here for review. It should be in your inbox if you prefer to read e-versions of articles; I'm not sure when the print copies get mailed out.
It’s a great piece on how to apply some soft skills (and some decidedly not-so-soft skills) to conducting a supplier review. You will never know as much about a supplier as they know about themselves, so this is a good approach for uncovering where certain responses may be less than truthful.
We wrote about using the Reid Technique, a standard in law enforcement interviewing and interrogation. It’s worth exploring to see if it fits into your own security and risk program.
Information technology audit is a relatively recent addition to the professional world of auditing. A review of the history of IT audit leads one back to the Electronic Data Processing Auditors Association (EDPAA), the forerunner of what would eventually become the Information Systems Audit and Control Association (ISACA)1. Although EDPAA published control objectives in the 1970s, what would eventually become ISACA’s flagship publication (Control Objectives for Information and Related Technology; COBIT) was published in 19962. In large part, this publication defines controls for IT systems, but it is grounded in the definitions of controls codified by the Committee of Sponsoring Organizations of the Treadway Commission (COSO) in its Internal Control-Integrated Framework3. Clearly, IT auditing was happening before these organizations codified the practice, as reliance upon IT systems was identified as critical to organizational success. Indeed, the authors of the original COBIT document identify the impetus for its creation as follows:
“In recent years, it has become increasingly evident to regulators, lawmakers, users, and service providers that there is a need for a reference framework for security and control in information technology (IT).”2
Continue reading A Cooperative Model for Security, Audit, and Risk: A collaborative approach to risk-based audits
I participated in my second risk management podcast for the Open Group, which was published today. I like this one better than my previous one; I tried to talk slower in this one anyway ;-)
I was happy with the topics that we discussed, most notably that, as regulators become more aware of the capabilities of quantitative risk assessment techniques, they will begin demanding them from those they review. Of course, Jack and Jim were great as well, and the conversation was expertly moderated by Dana.
My @ISACA column was published today. Read it here.
I realized they edited my full submission (I could tell because it sounded a little off from what I recalled). Below is the full post:
Depending on your point of view, risk management is either a very easy or a terrifically difficult job. If you approach IT risk management from a controls perspective (as in, “This asset doesn’t have all the controls listed here. That’s a risk.”), then risk management is very easy for you. Simply add the missing control and everything’s back to normal. If anyone objects to your solution, it’s very easy to show them the worst that could happen and paint them as irresponsible stewards of the organization in order to get the funding you need.
If, however, you feel that the control deficiency calls for some analysis, then risk management is much more difficult. In order to analyze the risk, you need to conduct research to understand which assets reside on that system, how often it is attacked from various threat communities, and the cumulative strength of the remaining controls. This approach involves building a model of attack sequences with associated probabilities and losses and considering the risk scenario in the greater context of the organization’s goals, objectives, and overall risk posture. In other words, this approach is risk analysis in support of well-informed risk management.
It is certainly easier to respond emotionally with phrases such as “I feel like this is a high,” or “I think our customers would be upset,” or even, “Our CEO could end up in jail!” It’s a very rare scenario in which we hear, “The analysis has shown…” Imagine buying insurance from an agent who tells you they “feel” like you are high risk but is unable to tell you why. At best, emotional responses like these support misallocating company resources on unnecessary controls. At worst, they may make it difficult for your company to compete effectively in an evolving marketplace. Practicing risk professionally means eschewing an emotional response in favor of risk analysis. An emotional response to risk is not a valid substitute for critical thinking.
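For what it’s worth, the kind of analysis I’m describing doesn’t have to be heavyweight. Here is a minimal Monte Carlo sketch in Python; the function name, the probabilities, and the loss range are all made-up illustrative assumptions, not figures from any real assessment:

```python
import random

def simulate_ale(n=100_000, p_attack=0.30, p_success=0.10,
                 loss_low=50_000, loss_high=250_000, seed=42):
    """Monte Carlo estimate of annualized loss exposure (ALE) for one scenario.

    Each trial represents one year: an attack occurs with probability
    p_attack, succeeds against the remaining controls with probability
    p_success, and a successful attack produces a loss drawn uniformly
    from [loss_low, loss_high]. Returns the mean loss per year.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    total = 0.0
    for _ in range(n):
        if rng.random() < p_attack and rng.random() < p_success:
            total += rng.uniform(loss_low, loss_high)  # single-event loss
    return total / n

if __name__ == "__main__":
    print(f"Estimated ALE: ${simulate_ale():,.0f}")
```

With these toy inputs the expected value works out to roughly 0.30 × 0.10 × $150,000, or about $4,500 a year, which you can then weigh against the cost of the proposed control. In practice you would replace the point probabilities and the uniform loss with calibrated estimates and distributions fitted to your own data.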
When consulting on a security issue, one of the questions that makes me grind my teeth more than any other is some variation of, “What’re our competitors doing?” My initial reaction is always, “Who cares?” It’s really just a useless way to think about security and risk.
In my experience, no one asks this question because they are looking for a way to spend more on security, layer in additional controls to reduce fraud, or simply reduce risk. No, this question is almost always asked as an offensive move against the perceived unreasonableness of information security. It’s a political tool or a negotiating tactic meant to make you back down. That alone should be reason enough to dismiss it outright, but there is more nuance to why it is distasteful.
Your IT risk decision-making is not a commodity market. Sure, there are security commodities, but the decision-making cannot be outsourced to other organizations. Think about it: what if you dutifully came back with an answer indicating that your competitors are doing not just what you are recommending but significantly more, and that their budget is five times what you were planning to spend?
Would they then immediately write a check for the difference? Offer you an apology and shuffle out the door, defeated? No, of course not. Nor should they. Your competitor’s risk tolerance, assets, lines of credit, cash flow, customers, budget, product mix, public profile, threat agent actions, and loss scenario probabilities are not yours. Simply put, your competitor’s risk tolerance and appetite are not yours. As a result, you need to make the best decisions you can with the best (quantitative) data you have at your disposal. Of course you should seek inspiration from various sources, if you can get it. I love the notion that security folks are a chatty sort who dish endlessly about the goings-on in their companies. Security professionals should be fired for such behavior; you don’t want chatty security people working for you. Information sharing regimes, processes, and protocols exist, but data shared at that level tends to be categorical, which is rarely specific enough to answer the question being posed. There is one exception to my rant, however, and that is legal. They probably are the ones who would advocate that budgets and controls be increased to match the posture of other organizations. Except legal won’t fund anything, so you have to go to the business anyway.
Compliance is out of control. It’s pervasive in our society now, and there is no going back. Allow me to explain.
My kid attends pre-school. The kids go outside daily to play, so we were asked to provide some sunblock. Makes sense; our family is pale, so we are used to that routine. We brought it in, signed a legal release (sigh), and we were good to go.
Or so we thought.
We received an email later in the day saying that they cannot use an aerosol can and that we need to provide sunblock in cream form. This wasn’t communicated to us previously, which is disappointing, but the real issue is the promulgation of the phrase, “It’s our policy…” The use of this phrase is quickly becoming a death of a thousand cuts.
How far is this to be taken? Would they have compelled my kid to go outside in the sun to burn, while the unopened sunblock sat idly by, not protecting them from an inappropriate amount of UVA/UVB? Would they have sat self-satisfied that policy boxes were checked while children roasted in the midday sun?
“It’s our policy that we don’t use aerosol cans to apply sunblock. It might get in their eyes.”
Well, it’s not pepper spray; it’s not meant to be sprayed in the eyes. Everyone knows the trick of spraying it into your hand and then applying it to your face. I’m about ready to build my own set of personal policies (“That’s unfortunate, but it’s my policy that children not burn in the sun when sunblock is within arm’s reach”), effectively pitting policy against policy in a byzantine Mexican standoff of bureaucracy and drudgery.
Since I see the world through a risk lens, I see this as a failure in risk management. Which would have exposed this organization to greater risk: the remote possibility of spray in the eyes, or the near certitude that skin will burn? In this case, the robotic adherence to policy actually INCREASED risk in the organization by promoting what is effectively negligence.
Thankfully, the outside activity that day took the kids through a shady grove, so no sunburn ensued. But this is a great example of how a compliance regime that exceeds an organization’s risk tolerance actually increases risk.
I participated in a panel discussion podcast for the Open Group during their recent conference in Newport Beach.