My latest @ISACA column was posted recently. This time I tackled a hard issue in the human factors space: awareness training. Specifically, I explored the notion that having a good security team may actually impede the effectiveness of a security awareness program. I did this through the application of some concepts from the bystander effect.
You can check it out here: Security Awareness and the Bystander Effect.
I wrote a piece for RiskLens* recently that talks about how to use FAIR to build and justify an information security budget and strategic initiatives. It’s an interesting problem space: you need the appropriate level of abstraction (program level versus technology level), but it’s also a very solvable problem to add risk reduction justification to these annual budgetary exercises.
Fun story: one time I did this exercise years ago, I actually rated one initiative as *increasing* risk. It started an interesting discussion but the lesson is that not everything will result in less risk to your organization. Budgeting is a complicated amalgam of math, politics, and priorities; be sure to bolster your budgeting process with some risk arguments.
Click here for the RiskLens article: How CISOs Use FAIR to Set Strategic Priorities for Spending
*I am a professional advisor for RiskLens
I was very fortunate to have the opportunity to share my thoughts on KRIs last week on The FAIR Institute’s website. I used the metaphor of Sentinel Species (think canaries in coal mines) to serve as an indicator of risk, but not of risk itself. That is an important distinction I strongly feel we aren’t making in our identification and use of KRIs.
You can read the full article here.
I was very honored to have had the chance to share my quantitative cyber risk journey with the broader security community last week at the RSA Conference. My session had over 100 people in attendance (quite a feat at 8AM on a Wednesday!) and the questions and followups were so good they lasted until we were kicked out of the room. The book signing afterwards caused the bookstore to sell out of copies of Measuring and Managing Information Risk.
I shared some more thoughts on the conference with the FAIR Institute here (where you can also read thoughts from other FAIR practitioners). Lastly, my session slides are available here. Be sure to reach out if you are interested in learning more; I’ve already had one follow-on session with someone who was unable to attend.
I’m very pleased to announce that I have been awarded an ISACA Global Achievement Award, specifically the John W. Lainhart IV Common Body of Knowledge Award. Full citation below:
ISACA John W. Lainhart IV Common Body of Knowledge Award
Scope: Recognizes an individual for major contributions to the development and enhancement of the common body of knowledge used by the ISACA community.
Jack Freund, Ph.D., CISA, CISM, CRISC
“For contributions in developing the CRISC Certification and for ensuring the integrity and quality of the CRISC Certification exam content.”
I’ll be granted this award on 28 May at the ISACA EuroCACS conference in Edinburgh, Scotland.
The full list of this year’s award winners is here.
I’m very proud of the time I’ve spent working on the CRISC certification (almost 8 years now) and its astronomical growth since its launch. I truly believe it is a very high quality IT risk certification that employers can rely on to ensure that their staff has core IT risk knowledge. I’m very humbled to have my small contributions acknowledged in this way.
I recently accepted a position with RiskLens as a professional advisor. I’m looking forward to working with Jack Jones again, as well as the great team they have assembled there. My immediate project will be advising on the product roadmap and assisting them with taking their amazing quantitative risk platform to the next level.
Official announcement here.
I had a great time at Risk.Net’s Cyber Risk NA conference this week. I moderated a panel on Modeling Cyber Risk with Jack Jones (EVP, RiskLens), Ashish Dev (Principal Economist at the Federal Reserve), Manan Rawal (Head of US Model Risk Mgmt, HSBC USA), and Sidhartha Dash (Research Director, Chartis Research).
We only had 45 minutes and ran out of time before we could get to all the topics I had on my list, so I wanted to include some notes here on things we covered:
- I opened with a scenario: I asked the panelists whether, if they were presenting to the board, it would be more honest to disclose the following top risks: 1) IoT, GDPR, and Spectre/Meltdown, or 2) “Our top risk is that we aren’t modeling cyber risk well enough.” Most everyone chose option 2 :-)
- We talked about whether there was a right way to model
- Poisson, Negative Binomial, Log Normal
- Frequentist vs Bayesian
- Which model for scenarios makes more sense: BASEL II categories or CIA Triad?
- Level of abstraction required for modeling
- Event funnel: Event of interest vs incident vs loss event
- Top Down vs. Bottom Up
- What are key variables necessary to model cyber risk (everyone agreed that some measure of frequency of loss and impact/magnitude are necessary)
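The panel’s consensus point above, that any cyber risk model needs some measure of loss event frequency and loss magnitude, can be sketched with a minimal Monte Carlo simulation. This is purely illustrative, not any panelist’s model: it assumes a Poisson distribution for annual event frequency and a lognormal for per-event severity (two of the distributions we discussed), with hypothetical parameter values.

```python
import numpy as np

def simulate_annual_loss(lam=2.0, mu=10.0, sigma=1.0, trials=100_000, seed=0):
    """Simulate annualized loss exposure.

    lam       -- mean number of loss events per year (Poisson), hypothetical
    mu, sigma -- log-scale parameters of per-event loss (lognormal), hypothetical
    """
    rng = np.random.default_rng(seed)
    # Draw the number of loss events for each simulated year
    counts = rng.poisson(lam, size=trials)
    # Sum that many lognormal severity draws per simulated year
    losses = np.array([rng.lognormal(mu, sigma, n).sum() for n in counts])
    return losses

losses = simulate_annual_loss()
print(f"mean annual loss: {losses.mean():,.0f}")
print(f"95th percentile:  {np.percentile(losses, 95):,.0f}")
```

Even this toy version surfaces the questions from the panel: the choice of frequency distribution (Poisson vs. negative binomial), how frequency and severity might correlate, and what level of abstraction the parameters are estimated at.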
Things we wanted to get to but ran out of time:
- What is necessary to get modeling approved and validated by Model Risk Management
- Should you purchase an external model or build your own?
- Can we use our Cyber Models for stress testing/ CTE calculations?
- Do we combine cyber scenarios with other operational risk scenarios?
- One audience question that we ran out of time for was “How was the FAIR approach different than LDA & AMA and how does it address their weaknesses (Frequency and severity correlation)”
- This was a good question but to be fair, FAIR wasn’t designed to be a stress testing model. However, many of the inputs used for FAIR are also used for LDA and AMA.
- There were lots of other audience questions about the use of FAIR which is always encouraging!