Bruce Schneier, a security expert, posted a characteristically stimulating think-piece on his blog yesterday. Ostensibly, its purpose was to promote an acceptance of "security engineering" as a specialist discipline. Its effect, however -- on me, at least -- was to underline how much we remain slaves to our gut instincts, both when it comes to IT security and on other pressing technological issues.
It's high time we checked our hunches at the door.
Schneier kicks off by demonstrating -- I'd say convincingly -- that our intuitions about security screening at airports don't stand up to serious examination. He goes on to explain that feeling secure, while it's a real enough psychological state, is not the same as being secure. Knowing how secure we are -- as individuals or enterprises -- is achievable. Risk assessment is a data-based science.
This goes against deep-seated instincts, of course. Crossing the street is much more dangerous than flying, but few are panicked by the former, while many are by the latter. It turns out to be remarkably difficult to put our faith in the data, and not just when it comes to security.
What we're dealing with, yet again, is the general problem of "system 1" versus "system 2" thinking, as described by the Nobel Prize-winning psychologist Daniel Kahneman. "System 1" thinking evolved because it's fast, easy, and -- in primitive environments -- effective. The complexity of the modern world, not to mention the modern security environment, demands "system 2" thinking: slow, logical, and grounded in data.
Unfortunately, it's easy to reel off important technology-related questions where "system 1" thinking (which has strong political appeal, too -- think "sound bites") is retarding progress.
"System 1" thinking says: If I invented it, it's mine; you need to pay me to reproduce it or use it. "System 2" thinking introduces all kinds of complications. If it's mine now, how come my ownership expires after a certain period? Is it truly detrimental to me -- or actually a benefit -- to have my intellectual property distributed widely? If we're so wired about copyright, how come we're all using tools that make copying and sharing other people's work easier than ever?
"System 1" thinking says: You're an experienced manager, and you have been in this situation before. Do what always works. "System 2" thinking should persuade us that situations evolve rapidly in real-time, that the data desribes the situation more accurately than our gut feeling, and that predictive analytics are a better guide to decision making than what happened to work in the past.
Back to Bruce Schneier. A week ago, he posted a short note on his blog under the heading "Exaggerating Cybercrime," linking to a lengthy critique of the $1 trillion figure often bandied about as the annual cost of cybercrime. Schneier is correct, of course, to applaud the application of "system 2" review to what is essentially yet another sound bite. It would be way too neat if $1 trillion were the right figure.
The only problem is that the critique doesn't tell us what the correct figure is -- and it's possible nobody knows. That needn't stop the enterprise from applying data-based, "system 2" thinking to the security environment.
Understand the value of the information on your networks. Apply a cost-benefit analysis when estimating the risk of loss. Monitor systems to identify suspicious activity. Analyze and understand breaches when they take place.
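To make that cost-benefit step concrete, here's a minimal sketch, in Python, of one common way to frame it: an annualized loss expectancy (ALE) calculation weighed against the cost of a control. The asset, the dollar figures, the incident rates, and the `evaluate_control` helper are all illustrative assumptions for the sake of the example, not anything drawn from Schneier's post or from real breach data.

```python
# Illustrative only: a toy annualized-loss-expectancy (ALE) calculation.
# All values, loss fractions, and incident rates are invented placeholders.

from dataclasses import dataclass


@dataclass
class Asset:
    name: str
    value: float             # estimated value of the information, in dollars
    exposure_factor: float   # fraction of value lost in a typical incident (0..1)
    annual_rate: float       # expected incidents per year

    def annualized_loss(self) -> float:
        """Single-loss expectancy times annualized rate of occurrence."""
        single_loss = self.value * self.exposure_factor
        return single_loss * self.annual_rate


def evaluate_control(asset: Asset, annual_cost: float, risk_reduction: float) -> float:
    """Net annual benefit of a control that cuts the incident rate by
    `risk_reduction` (0..1). Positive means the control pays for itself."""
    loss_before = asset.annualized_loss()
    loss_after = loss_before * (1.0 - risk_reduction)
    return (loss_before - loss_after) - annual_cost


if __name__ == "__main__":
    customer_db = Asset("customer database", value=2_000_000,
                        exposure_factor=0.3, annual_rate=0.2)
    print(f"Expected annual loss: ${customer_db.annualized_loss():,.0f}")
    # A hypothetical monitoring service costing $50,000/year that halves the rate:
    print(f"Net benefit of control: ${evaluate_control(customer_db, 50_000, 0.5):,.0f}")
```

The point isn't the particular numbers, which are guesses by construction; it's that writing the estimate down forces the "system 2" questions -- what is the information worth, how often do incidents happen, what does a control actually buy you -- out into the open where they can be argued about and refined.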
"Security engineering" is a challenging alternative to "feeling" secure, or indeed "feeling" insecure, but it's the rational option.
— Kim Davis, Community Editor, Internet Evolution