HBR has a great article up from a few days ago: “The Behavioral Economics of Why Executives Underinvest in Cybersecurity.” It’s an interesting read. If you’re too lazy or swamped to go read it, the gist is that natural human biases push the default level of investment in security below what real-world risks actually require.  For example, since we don’t always get attacked, natural forces lead to a reduction in funding and resource allocation over time.  It’s the truth – anybody who has worked in or around security will recognize the dynamic immediately.

What got me a little fired up are the suggestions that they have about how to deal with the situation:

  • Appeal to emotionality – leverage “affect bias” to make issues emotionally impactful rather than drawing on dry “facts and figures”.  From the article: “…cybersecurity professionals should take into account people’s tendency to overweight information that portrays consequences vividly and tugs at their emotions.”
  • Reframe mental model – “Some CEOs may think that security investments are for building an infrastructure, that creating a fortified castle is all that’s needed to keep a company safe. … CISOs should work with boards and financial decision makers to reframe metrics for success in terms of the number of vulnerabilities that are found and fixed.”  So basically, reframe the discussion around positive outcomes, and try your best to curb their native lack of understanding.
  • Survey peers – leverage peer pressure and “social proof” to curb overconfidence.
  • Highlight the “weakest link” – Loudly and (semi)publicly highlight issues to help thwart inattention to the problem space.

What irritates me about this isn’t that they’re wrong.  In fact, I’m sure they’re a) right and b) that these methods work (probably pretty well).  And, as such, the savvy practitioner would do well to leverage them accordingly.  No, what irritates me is that this is essentially how you’d treat a child.  Like, is it me, or couldn’t this list basically serve as a map for how you’d get your second-grader to do their homework?

Fundamentally, I expect more – and better – from senior leaders.  I expect a degree of maturity that lets them make an objective determination about risks and issues without an appeal to their emotions (this, by the way, is why FUD works) and without having to draw on what “the other guy” is doing to see why they’re remiss in not doing something similar.  Rather than leaning into executives’ natural inclination to be “babies in a suit,” isn’t there a way to cultivate maturity, rationality, and objectivity?  Maybe I’m asking too much… or maybe these behaviors aren’t really as infantile as they appear to me on the surface.  But really, I’m not a huge fan of “dumbing it down” for people unwilling or unable to step up.