Welcome to 2020!
I kept busy last month, even with the holidays. Here are some updates:
I wrote a piece for ISACA about how much is being spent on cyber security in aggregate and why we need to rationalize the controls we are spending on.
The FAIR Institute called this my manifesto here :-)
I’m also really excited that my article on Cyber Risk Prospectuses was published over in ThreatPost. I’ve been talking about this topic for about a year now. I’m not a fan of pretending that we work for companies that won’t get hacked. It’s not if, it’s when, and being clear about how long we expect before that loss occurs is important. The FAIR Institute summarized my point succinctly: “Admit you will probably get breached.”
Time for another cyber risk roundup!
As a part of my new role with RiskLens, I’ve been publishing several articles. Included here is a recap of my work over the past month:
- The ZombieLoad speculative execution bug raised the specter of a possible 40% hit in performance. I laid out a plan for evaluating this new bug in the context of risk trade-offs here and here.
- For the @ISACA newsletter, I wrote about the importance of understanding business processes when conducting risk analyses. The specific list of business concepts I thought were important is included in the article here, and RiskLens promoted it here (where they called me a security nerd LOL).
- I presented on integrating FAIR into the HITRUST CSF model along with Jason Martin from Highmark Health. The slides from this presentation are here.
- The new DHS Binding Operational Directive requires accelerated patching for critical and high security vulnerabilities. My thoughts on this are here for Homeland Security Today.
For this month’s @ISACA Tips column, I wrote about the conundrum of defining and assessing emerging risk. It’s an interesting space to assess: technologies and trends so cutting edge that they seem to defy precise assessment, yet so important as to require it.
You can check it out here.
“There is a certain uselessness in saying an organization does not want to accept high risk.”
My latest @ISACA article was published, and as I re-read this line it resonated with me even more. You need more fidelity in how you define risk appetite for it to be useful. More tips on how to do that are in the full article here.
I really enjoy reading Duncan Watts’s work, and I was blown away by how he assailed the concept of common sense that we all rely upon so readily:
What we don’t realize, however, is that common sense often works just like mythology. By providing ready explanations for whatever particular circumstances the world throws at us, common sense explanations give us the confidence to navigate from day to day and relieve us of the burden of worrying about whether what we think we know is really true, or is just something we happen to believe.
Questioning our perception of reality is pretty heavy, and you could spend a lot of time working through that. But in my article I use this idea to break us out of the crutch of using common sense to manage risk.
You can read the full article on the @ISACA Newsletter site here.
My latest @ISACA column was posted recently. This time I tackled a hard issue in the human factors space: awareness training. Specifically, I explored the notion that having a good security team may actually impede the effectiveness of a security awareness program. I did this by applying some concepts from the bystander effect.
You can check it out here: Security Awareness and the Bystander Effect.