Opinion: Don't chastise people for not doing a bot's work

I spend a lot of time consulting across a variety of companies. Often I'm there to do what we call "mentoring", which means I'm on site on and off over longer periods. Because of that, I often have to do the same compliance "training" that their own employees do.

The first thing I'd note is that, unfortunately, this sort of compliance course ends up being counted against the company's training budget. Let's be clear:

That's not training

Most of the staff in these organizations see it as falling somewhere between an annoyance and a joke. The company makes staff do these "courses" to keep itself out of trouble, not so that staff actually learn anything. Worse, it's often so the company can blame staff when something goes wrong. The company doesn't really believe its staff fail to understand conflicts of interest or email policies; it just wants to stop staff from later claiming they didn't know they were doing the wrong thing.

One set of annoying courses tells staff to follow security policies like frequent password resets, etc. "to keep the company secure". Yet time and again, cyber security research has shown that forcing frequent password resets actually reduces security. (See the current NIST guidelines for details on passwords.) So the company is actually forcing people to take actions that reduce the company's security.

But the ones that annoy me the most are those where staff are asked to do things that the company's systems should be doing for them. Here's a hint:

If you need to run a course telling people not to follow links in emails where the link address doesn't match the displayed URL, why not get an email system that does that instead? If it's easy to teach people to do it, then teach a machine to do it instead.
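To make the point concrete, here's a minimal sketch of the kind of check a machine could run automatically. It flags anchor tags whose displayed text is itself a URL but whose host differs from the actual link target — exactly the comparison the course asks humans to perform by eye. This is illustrative only; the function name and the host-only comparison are my choices, and a real mail gateway would do far more (redirect chasing, reputation lookups, and so on).

```python
# Sketch: flag links whose visible text looks like a URL but whose
# actual target points at a different host. Uses only the standard library.
from html.parser import HTMLParser
from urllib.parse import urlparse


class LinkMismatchChecker(HTMLParser):
    """Collects <a> tags whose displayed text is a URL with a
    different host than the href actually points to."""

    def __init__(self):
        super().__init__()
        self._href = None      # href of the <a> tag currently open, if any
        self._text = []        # text fragments seen inside that tag
        self.mismatches = []   # list of (displayed_text, real_href) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            shown = "".join(self._text).strip()
            # Only compare when the visible text itself looks like a URL;
            # "Click here" style links need different heuristics entirely.
            if shown.lower().startswith(("http://", "https://")):
                if urlparse(shown).netloc != urlparse(self._href).netloc:
                    self.mismatches.append((shown, self._href))
            self._href = None


def find_mismatched_links(html_body):
    """Return (displayed_url, actual_href) pairs that disagree on host."""
    checker = LinkMismatchChecker()
    checker.feed(html_body)
    return checker.mismatches
```

For example, `find_mismatched_links('<a href="http://evil.example/x">https://mybank.com/login</a>')` would flag the link, while an anchor whose text and href agree would pass. If a twenty-line script can spot this, an email system certainly can.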

Don't blame people for doing something wrong that a system or a bot should be doing in the first place!
