Security Awareness Blog

Can't Patch Stupidity? Look in the Mirror

A theme I sometimes hear from people in the security community is that you can't patch stupid, that "End Users" are too dumb or ignorant to be secured. Wow, I can't think of a more unfounded, prejudiced statement. First, "End Users" are people like you and me, so I suggest we start calling them that. Second, many of the people I see organizations trying to secure are very intelligent. These organizations include people such as engineers, accountants, scientists, lawyers, researchers, doctors and a myriad of other smart people. In one extreme example, I know a security awareness officer whose organization is so highly educated that the average employee has 2.5 PhDs. Finally, most people I talk to are motivated; they want to do the right thing and be secure. So if we are working with people who are both smart and motivated, what is the problem?

I think we in the security community need to take a long look in the mirror. You will quickly see that we are the problem. Think of people as another operating system, the HumanOS. Now ask yourself: what have we done to secure this operating system? Very little. We've spent the past twenty years investing in and focusing on just technology. Now we need to take a step back and start focusing on the HumanOS. We also need to understand that we simply cannot dictate to people what to do. We need to understand who our audience is and how to effectively engage them, which requires a set of skills most security professionals lack. Ultimately it is our responsibility to help our employees, not make fun of them. Until we recognize this fact, people will continue to be the weakest link, and it will be our fault.


7 Comments

Posted April 7, 2015 at 1:58 PM | Permalink | Reply

HJohn

Reminds me of the IT Support Chorus from years back:
"I'm not allowed to run the train,
the whistle I can't blow.
No one listens when I say
we're going too fast or slow.
I'm not allowed to touch the brakes
or even ring the bell.
But when the train jumps the tracks,
see who catches hell."
I will say that even if we are faced with a user who doesn't get it, that doesn't mean we're not better off if the user is made more aware. Falling short of an ideal does not mean the ideal isn't worth striving towards.
On the more negative side, it's easier to hold a user accountable if you can prove they were educated, and easier to protect IT from undue scrutiny and, by extension, its all-important credibility. I know this sounds like passing the buck, but we all know (as is the case in the humorous chorus above) that holding the wrong party accountable is counterproductive.

Posted April 7, 2015 at 6:10 PM | Permalink | Reply

sahil

While I totally agree with you on the point that little has been invested to understand the so-called weakest link, I believe a majority of end users are not motivated to behave securely. Some of the reasons could be: security gets in the way of their work, they don't know the right thing to do, doing the right thing seems too burdensome, they don't understand the risks, etc. Just think of the young college graduates who form a majority of the workforce for a lot of software and professional services companies.
I believe it is important for awareness teams to identify their user sets and what motivates them to do the right thing. This should be coupled with identifying ways to reduce the additional effort that security usually brings. That might help us move in the right direction.

Posted April 15, 2015 at 11:07 AM | Permalink | Reply

lspitzner

Great post, and I could not agree more: motivation is key. One of the greatest successes organizations are having with motivation is focusing on how people personally benefit from the training. Think about it: people use the same technologies and face the same risks at both home and work. If we focus on how they personally benefit, they are not only more engaged, but they also practice secure behaviors both at home and at work. You are so correct, motivation is a key part of behavior change.

Posted April 7, 2015 at 10:17 PM | Permalink | Reply

pretext

It's we humans who create the technologies that need patching, so it seems likely that we'll continue to be the weakest link in the end.
Relatively non-technical folks will operate technology after being encouraged, often misguidedly, to place high levels of trust in it, thus making things worse.
Using pictures of green padlocks to imprint false feelings of safety is a nice example of something that's our collective fault. It assumes users are stupid and can't make trust decisions, although that might be preferable to assuming nothing at all while still allowing technological advancement of sorts.
Arguably it's our trust in technology to help sustain human life that needs patching, but it seems complicated to implement proper version and revision control. Besides, technology is entertaining and often works to some degree.
I've so far been unable to patch my own stupidity, and have only managed to detect and prevent future vulnerabilities on occasion.

Posted April 16, 2015 at 10:55 AM | Permalink | Reply

sahil

lspitzner: Thanks! I lead the security awareness initiatives at a 50,000-person organization, and I would like to share more about motivation and information security awareness in general. Is there a way I can do that on this forum?

Posted April 17, 2015 at 2:19 PM | Permalink | Reply

lspitzner

Absolutely! We love to get the community involved and help others share ideas. Shoot me an email at lspitzner@sans.org and let me know what you are thinking.

Posted April 16, 2015 at 2:55 PM | Permalink | Reply

DonJ

OK, I will not comment on intelligence or stupidity in the workplace, because I think you know what's being said in the comment about patching and stupidity, but why not talk about this instead: as a security professional who works very closely with other administrators, we can all agree on whether something needs patching or not. If that's true, then why do administrators and management find every excuse NOT to patch it, even when the evidence is out there that it could be dangerous not to? Talk about stupid.
Second thing: it does not matter what your credentials or education level is, you don't know everything, because those very same employees with the 2.5 PhDs will be the first to screw up something they know nothing about and then complain because their machines are down... it all goes both ways... just saying.