Risk homeostasis, also referred to as risk compensation, is the human tendency to push behavior toward the maximum level of risk we find acceptable in a given environment. Most often studied in the context of automobile safety, risk homeostasis is an important factor in understanding how people manage risk in complex, variable situations. Human ethology is a never-ending chasm of confusing behaviors, compounded by an ever-increasing issuance of Darwin Awards.
The best illustration of risk homeostasis is the river delta fallacy, with risk represented by the water. The following is reproduced from the excellent book Target Risk 2: A new psychology of safety and health:
A river empties into the sea through a delta.
The delta has three channels, all of equal size.
Therefore, damming two of the channels will reduce the flow of water to the sea by two-thirds.
Reproduced from Target Risk 2 – A new psychology of safety and health ISBN 978-0969912439
As the river meets the sea, the flow subdivides into the classic delta shape. If we were to dam up two of the streams, would it reduce the overall flow by 2/3? Of course the answer is “no”, but we often think that in eliminating some of the paths, the water will be slowed down. Instead, the water will find a new path, or the existing paths will grow larger. Risk moves in the same way.
This is the same bad logic your customers and users will apply. “If X hasn’t caused me any trouble the last 100 times I have done it, then it shouldn’t cause me trouble now.” When we become comfortable with risky behavior, we also become “numb” to it. Familiarity, even in dangerous situations, breeds risk blindness.
Likewise, when a security feature has been added (such as anti-virus software or a supposedly “virus immune” operating system like OS X), we have a higher risk appetite based upon the added “protection” we now have. This compensation tends to cause more problems because we are now less careful. Humans tend to take risk to its fullest potential, right up to the edge of consequences. As the authors of “Target Risk 2” phrase it, you will “risk your life to the fullest”.
Being aware of this behavior is important when talking to people about information security. Just because a security technology is available or installed does not mean there is less overall risk. I hear many people talk about good security practice in terms of being “paranoid”, but that isn’t the answer, either. Paranoia is a delusional condition, a state of derealization – good information security practice shouldn’t distort reality in the least; it should always make sense.
I think the most important thing to understand (as an information security person) is that your people (employees, etc. in your organization) want to do the best job they can in the least work-inducing way possible. If this includes using someone else’s credentials (with permission) because that person has higher (faster, “better”, or whatever) access, then so be it. If this means writing down a complex password and sticking it in the “secret” spot under the keyboard, you had better expect it. People are busy and people are stressed – they just want to get the job done right and then go home. Information security is an abstraction for most folks, anyway.
The challenge will always be your people. Despite our best efforts and complex, fail-safe systems, our people will always find a way around the security provisions. When they do it enough, it will no longer feel “wrong”; it will just become “normal”. This is the real danger: ignoring the warning bell so often it just no longer rings. Training users in security practice is something that has to happen routinely and often. Defeating risk homeostasis takes repetition and effective, direct communication about information security practices.
I often think of that scene in Jurassic Park when the laconic Dr. Ian Malcolm (played by Jeff Goldblum), in reference to the “security” of producing only same-sex dinosaurs on the island to prevent unauthorized breeding, states “No, I’m…I’m simply saying that life, uh…finds a way.” You had better bet your users (or worse – the hackers) will find a way around that new $200K layer-7 firewall environment. The only constant in information security is information insecurity.
These are my favorite shoes. I have exactly seven pairs of dress shoes, but these are the ones. They are super nice, and I can wear them for like 16 hours with no complaints from the ol’ foot department. They are made from the best materials and they are usually the nicest shoes in the room (if one pays attention to these types of things). I feel like anything is possible when I wear these shoes:
But none of that really matters to you- and here are three reasons why:
So here is the real question with regards to custom shoes:
Why would you want something that was specifically designed for someone else?
I’m guessing you don’t. Especially something like a shoe- it has to fit you, or it’s worthless. I remember the last time I tried to wear a pair of shoes that didn’t fit, and by noontime, my feet felt like they were about to catch on fire. And these were really nice shoes, like hand-made-by-artisan-cordwainer nice. But it didn’t matter- the shoe didn’t fit. (BTW, I sent those shoes back to the artisan cordwainers and slapped on my favorite shoes. You just can’t beat ‘em.)
So, besides establishing my unhealthy relationship with nice (albeit admittedly ephemeral and totally indefensible) shoes, I have also illustrated an important point: certain things just have to fit or they are worthless. Information security solutions are also like shoes. Any security product vendor will sell you a “solution” that will simply fit into your technology environment and just go to work. But this approach never really seems to pan out.
To some degree, every IT shop is a one-off gig. We use “off the shelf” tech and then we tweak it a little to make it all work with the rest of our systems. There is no standard business environment, despite what that sales rep just told you. “Best Practices” are just that – the best situation possible. Unfortunately, we don’t always get the best practice – sometimes (hopefully rarely) it’s pretty darn crappy and just barely works. But, hey, it still works. I think most IT folks are serious pragmatists at heart. We don’t care about what is behind the curtain, so long as it just keeps working.
The challenge is to fit all these pieces together in a way that yields a good return. In information security the stakes are sometimes higher, but it’s the same process. Find a good solution, modify it to fit and make it work. But this won’t happen naturally; you will need a good team that understands this process, too. None of our fancy firewalls or complex log analysis systems work without skilled people.
Technology systems sometimes are a reflection of their designers: people are complex systems as well. Rarely do you find a person that is the exact composition of skills and temperament that you need. We all have strengths and weaknesses that work together to make a stronger whole. Consider Chobham armor used on tanks- while the exact composition is a secret, we do know it is a blend of very hard and very elastic materials. Used alone, each component would defeat some attacks, but not others. But in unison, the components work together to deliver a formidable defense against a range of attack vectors. Blend your security team in the same way.
In considering security technology and the security team, you have two defined parts that must fit together. These parts are not interchangeable with other environments. We can use parts of each - but as a whole, the fit will be poor. Don’t be lured into the trap of blind “best practice”. Adapt security to your environment and the parts that work inside of it.
Understanding that information security is an intelligent blend of people and technology is basic, foundational infosec knowledge. Understanding that real, effective information security is a relationship between reality, people, technology and the information environment is what defines a truly skilled information security professional.
Now put on your good shoes, and get to work.
It often helps me to better understand a concept if I think about the words I use to express or describe that concept – especially dense or involved subjects. Likewise, I also (occasionally) experience frustration when I listen to people talk about information security. As I parse their words, I am left confused or doubtful about the subject matter. I think we subscribe to popular tropes or misconceptions about information security before we take the time to actually (or fully) understand the technology and its subsequent impact.
Technology is an interesting thing: it develops so fast we lose value in our conceptual investments. So we use our “old” words to describe new technologies and concepts, and sometimes we use our “new” words to describe existing (or “legacy”– see, I do it too) concepts. The problem with all of this descriptive effort is that the technology and methodologies are still changing, even as we are trying to learn the “current” words in their correct context. It even extends to relatively static concepts, like job titles or roles.
In thinking about this problem, I quickly assembled examples of these words and how we use them. I separated the words into distinct epistemological categories for accuracy’s sake.
When we think about our information being stored “off-site”, we refer to it as being “in the cloud” – which is a particularly bad word (phrase) choice, I think. The supposed word root of “cloud” is the same Middle English word for “rock”, like the heavy stuff mountains are made of. The reference is to the similar shape cumulus clouds have with mountain ranges, but still – rock? The word “cloud” inspires visions of ethereal collections of bits patiently waiting to be returned to their terrestrial owners. How lovely.
Of course we know that it’s just our digital stuff being delivered to us via the network or the public Internet. But we also attach other meanings to the word cloud. We imply a lack of trust when speaking about the cloud – well, sometimes.
If it’s our data, we don’t trust the cloud – unless we just lost all our data, then we pray the cloud still has it. Contrasting this relationship, we prefer that our customers use the cloud for their stuff because we don’t want to deal with it – unless it’s Protected (PII) data, then we don’t trust the cloud (or the user using the cloud) again. This word causes me much despair.
Information is a word used to describe the human-relevant data we try to protect. Which is accurate – “information” means “the act of informing”. But we also use information interchangeably with “data”, which is troublesome. Data, by itself, is meaningless. If I were to type “2 6 3 7 1”, it would have no meaning to you. But if I told you those numbers are the combination to my vault, they would become information. We protect information, not data. To be even more precise (and relevant), we protect the ascription of meaning to data – the process of information creation.
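The data-versus-information distinction can be sketched in a few lines of Python. Everything here is illustrative – the labels and the `is_sensitive` helper are invented for the example, not part of any real classification scheme:

```python
# A bare sequence of digits: data with no meaning attached.
data = (2, 6, 3, 7, 1)

# The same digits become information only once context ascribes
# meaning to them -- here, the (hypothetical) label "vault_combination".
information = {"vault_combination": data}

# What we actually protect is the ascription, not the raw digits.
def is_sensitive(record: dict) -> bool:
    """Toy rule: data is sensitive once it carries a meaningful label."""
    sensitive_labels = {"vault_combination", "pii", "credentials"}
    return any(label in sensitive_labels for label in record)

print(is_sensitive(information))        # the labeled record needs protecting
print(is_sensitive({"noise": data}))    # the same digits, unlabeled, do not
```

The point of the sketch is that `data` never changes – only the label around it does, and that label is what turns it into something worth protecting.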
My personal favorite: “virus” - the infosec catchall. A customer calls into the help desk and reports a "virus" - this could mean multiple things, all of which probably include some undesirable computer operation. Infosec pros will sometimes attribute a problem to “viral” activity – unexplained network traffic, strange log entries, etc. All we need to say is “it’s a virus” and everyone understands that we are saying, “this is a bad thing that is hard to fix”.
A computer “virus” is executable code that can replicate and distribute itself. But we also tend to place all other types of malicious (or indeterminate) code into this category. I really dislike all the nonce words used to describe malicious code, e.g., malware, adware, spyware, worm, etc., but we don’t have the proper vocab to describe these problems or the taxonomical subtleties, ergo, “virus”.
We usually say “firewall” when we are talking about a device that acts on our network connection, usually in an effort to make it more "secure". Customers complain that the “firewall” is blocking their access, and we blame the “firewall” when a disallowed packet sneaks into our networks. But “firewalls” are so much more than a rack-mounted box o’ security.
We all know about the various generations of firewalls and the difference between application and SPI firewalls. We just aren’t comfortable with more descriptive terms – like what the “firewall” is actually doing. I suppose that falls under the “sources and methods” category we are all hesitant to discuss.
Infosec is shorthand for “information security”. We use the term “infosec” to describe information security professionals or the act of securing information. A good application of “infosec” will make everything more secure – or so we think.
Infosec also implies a superhuman ability to somehow defy human nature and the ability to prevent humans from doing exactly what they want to do - which eventually happens anyway. We want “infosec” to deal with it so we don’t have to worry about it. Centrally managing human behavior rarely works in the long term.
Information security is a process that each person must observe – it cannot effectively be done by a third party alone. We don’t blame the National Highway Traffic Safety Administration when someone is killed because they failed to wear a seatbelt. Why do we still expect infosec pros to prevent information accidents?
Infosecurity Magazine Webinar Week 2013 – Maximize Your Employability as an Information Security Professional
I participated in this webinar/panel discussion Thursday 11 April. It was a lot of fun and the audience posed quite a few challenging questions to the panelists. The audience count was somewhere over 200 at some point – not bad for an international webinar that was happening between 6 and 8pm in UK/Europe, depending on locale.
Here is a photo of me sandwiched between some really smart guys:
This web panel was a great experience, and I would definitely be honored to do it again.
EDUCAUSE Security Professionals Conference in St Louis, Missouri
I presented with a three-man team of security pros at the EDUCAUSE Security Professionals conference on Monday, 15 April. The seminar was entitled “Hack In The Rack”, and we spoke about the information security community, engaging new members and using sandboxed hacking environments to better achieve that end.
Albert Stadler opened the session with an intro to community engagement and got the group thinking collaboratively from the start. We answered several good questions and came home with much more to consider.
Marshal Graham did a wonderful job of guiding our group of 20 attendees through multi-OS capture-the-flag scenarios. Of course, each CTF ended with Marshal getting console (or root) and administering a severe pwnage to each of the victim machines.
Overall, this was a great conference with top-notch presenter support. EDUCAUSE always provides excellent value for the money, and we hope to present again soon.
Honestly, we have ruined the internet.
I hear the sentiment “I’m glad the internet wasn’t a thing when I was growing up” echoed a lot, especially in my peer group (30-40 year olds). Which is an interesting statement, given my peer group consists of well-educated, middle Americans who work in technology and rely upon the industry for our primary incomes. My job didn’t exist 30 years ago, and is largely made valid by internetworked information systems.
The internet has added a tremendous amount of value to our lives, but the consequences have made things a lot more complicated. All information has become more (often instantaneously) available to anyone with an internet connection. And as my peers have become parents, we’ve started to perceive the world in new ways. With the understanding of what technology can offer, the memories of our own formative experiences and the fears about our children’s potential formative experiences, the whole thing gets really complicated and icky.
People are well acquainted with the capital-I “Internet” we all claim to use, but the “real” internet experience is somewhere in between. If you have spent any serious amount of time on the net, you have probably bumped into some pretty eye-opening things. While that sentence sounds really prudish and insular, I think it holds up – if you haven’t learned anything by accessing information on the internet, you are probably doing it wrong. Categorically speaking, this includes innocuous places like Wikipedia, not just the prurient, traditional “I would never go there”-type sites. The internet is a reasonably complete cross-section of human nature.
Despite our reliance on the internet to deliver all the stuff that keeps us from being continually bored out of our collective skulls, we are also profoundly uncomfortable with the internet. Which is even more telling, I think. We complain about the destinations we visit that don’t have wireless internet access, but at the same time, we don’t want anyone else knowing what we are actually doing online. If you back away and think about all the stupid crap you type into Google (or Bing, or whatever), it becomes somewhat more clear- the only place we are totally honest about ourselves is online.
Which is funny in the sense that we often proffer the aphorism “don’t believe everything you read online.” We don’t trust anyone on the internet, but we expect the internet to be a complete source of relevant information for that weird pain we’ve been having. We also get righteously angry when people lie about themselves online- they even made a movie about it.
We tend to use a clever “handle” online to anonymise ourselves (and to show how individualistic we are – everyone is jealous of a particularly good screen name), because we don’t want anyone to really know who we are, let alone what we are looking up or posting online. So the whole experience becomes a really weird exercise in paranoia. If you doubt any of this, search for “NSA eavesdropping” online and read the results – it’s clear we are all pretty concerned that someone else might be watching us online.
All of this (I think) is rooted in the base honesty people share online. Think about how many real people honestly know you, like family, spouse, etc. I mean the really dark stuff- addictions, mistakes and regrets- the parts that you work hard to keep invisible in day-to-day life.
One, maybe two? Probably “none” is a more accurate answer. We don’t want people to really know us. We each know the truth about ourselves, and it isn’t pretty.
Yet, if you can conceptualize some thing, you can find it online with a video or photo included – regardless of how horrifying it may be. Honest people being honest about what they want. Which, honestly, is typically extremely disturbing, like you-won’t-sleep-tonight-grade disturbing. I think a good word to describe it is “vulgar”, for etymological reasons. The Latin root of “vulgar” is vulgus, which simply means “the common people”. As in “us” – we are the internet, and we aren’t comfortable with it.
I think I understand why we are glad we didn’t grow up with the internet: nobody wants to imagine a childhood with a huge network of instantly available, super paranoid, truly honest people that are accessible at any minute of any day. We aren’t comfortable with ourselves in that context, and we certainly aren’t comfortable with our kids accessing that level of honesty.
We never voice that thought – it’s more of a tacit understanding. The internet generally sucks, and we all know it. I’d even posit the internet is bad for us. Like booze and cigarettes – so good, but so bad. We can’t seem to pull ourselves away.
Maybe instead of infosec talking so much about the security, we should start talking about the information. We spend so much time designing, installing and maintaining systems to protect all of this information, so much of which is totally useless crap.
The internet is going to change dramatically in the next ten years. The internet is dangerous and people in positions of power understand that fact. The fun, post-endless-pictures-of-lolcats days of the internet won’t last forever. And it’s our fault, like it or not: my generation ruined the internet for everyone, forever.
As soon as Google, Facebook and the like correlate our data with the real-life versions of us, it’s going to get really real, really fast. The fun will be over. We will all know the truth about everyone else, and honestly, I’m not looking forward to it.
I don’t have a fix, and I’m afraid it’s already too late. In the interest of “convenience” and “usability” the internet will become just like real life- except we just spent the last 15 years being honest about ourselves in one version of our lives, and not the other. Are we ready to balance the equation?
All wisdom is predicated on pain. Not that it should be that way: we live in a world full of bromides and trite aphorisms. You can barely get on the Internet without running into a reassuring quote on an encouraging-photo-background. Our parents told us “not to”, and we tell our kids “not to”. And yet, we must always find out the hard way. Every. Single. Time.
So it goes with everything in life. Technology is little different. We tend to spend money on problems after they cause us pain.
Predicting the future is usually futile and often embarrassing, especially in technology - but there is some value in talking about what the future might hold, for better or worse. IT managers should always be thinking about how their teams should address and reflect upcoming changes in technology and the associated skills to make it all work together.
Each IT department is unique and every IT environment is customized to some degree, incorporating a blend of off-the-shelf and in-house solutions. The blend of custom and packaged solutions make ...