
Published on July 28, 2022

Have you ever been on the popular social media platform Reddit? If you have, chances are you’ve come across the community (or subreddit as they’re called on the platform) r/explainlikeimfive.

The idea behind this community is simple: People wishing to learn something they have difficulty understanding, or have never read about, make a post asking the community to ELI5, or explain [it to them] “like I’m five.” Those who know, or think they know, the answers reply with explanations simple enough for a five-year-old to understand.

Usually. Some people seem to have met kids who know more about quantum physics and computing than most adults you’ll meet.

Jokes aside, the simple language and explanations shared in that community help laypeople gain a basic understanding of anything under the sun—quantum computing, neuroscience, why seafood is popular in East Asia, and more.

Pop-science authors and speakers also do something similar in their books and shows. They simplify scientific concepts (to varying degrees) and put them in a language most people and even kids can understand.

How does this tie into the discussion about people and security, you ask?

You say “MITRE ATT&CK,” they say “???”

As discussed in part one of this series, security best practices and policies serve no purpose if your people aren’t aware of them—or don’t understand them.

In many enterprises, security awareness training is often seen as a check box to be ticked before the next compliance audit. This isn’t to say that the people conducting the training aren’t serious about educating their colleagues. They are. However, their colleagues may not always understand the importance of the training or how the security policies apply to their job roles.

Many times, users may not understand which of their actions may directly or indirectly be putting the organization at risk. Plus, the generic nature of most training leaves them with a vague understanding of the importance of security practices but no clear direction on what exactly to be careful of or avoid.

Another problem is that security team members may not be the best at communicating security practices in a language the average employee will understand. As a result, when you sit down for a session on detecting phishing attacks, for example, you’ll often find it… a bit heavy, to say the least.

Some steps are easy to remember and follow. Others may just leave non-tech folks scratching their heads or be too tedious to practice regularly. Take the seemingly common advice of checking the sender’s email address to see if it’s from a legitimate domain. When you’ve got over a hundred emails in your inbox, there’s no way you’re taking the time to do that.

Security teams can’t expect their colleagues to have the time to check whether the lowercase “L” in an email address is actually a capital “I.” And even if they do check, they may not be able to spot the swap. For example, “FirewaIl” here uses a capital “I” in place of the second lowercase “l,” but you wouldn’t be able to tell unless it shows up in a serif font or you were really looking for it.
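This kind of lookalike-character check is exactly the sort of tedious task machines handle better than people. As a minimal sketch (the trusted-domain list, the confusable map, and all names here are illustrative assumptions, not a production detector), a mail filter could normalize visually similar characters and compare the sender’s domain against known-good ones:

```python
# Minimal sketch of lookalike-domain detection.
# The confusable map and trusted-domain list below are illustrative only;
# real systems use far larger mappings (e.g., Unicode confusables data).

CONFUSABLES = str.maketrans({
    "I": "l",  # capital I vs. lowercase l
    "0": "o",  # digit zero vs. letter o
    "1": "l",  # digit one vs. lowercase l
})

TRUSTED_DOMAINS = {"firewall.example.com"}  # hypothetical allowlist

def skeleton(domain: str) -> str:
    """Normalize a domain so visually similar characters compare equal."""
    return domain.translate(CONFUSABLES).lower().replace("rn", "m")

def looks_suspicious(sender_domain: str) -> bool:
    """Flag domains that are NOT trusted but look like a trusted one."""
    if sender_domain in TRUSTED_DOMAINS:
        return False
    return skeleton(sender_domain) in {skeleton(d) for d in TRUSTED_DOMAINS}

print(looks_suspicious("firewaIl.example.com"))  # capital I spoof -> True
print(looks_suspicious("firewall.example.com"))  # the real domain -> False
```

A filter like this catches in milliseconds what a human skimming a hundred emails never will, which is precisely why the burden shouldn’t rest on employees’ eyesight.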

What’s more, your non-technical colleagues may also be unable to remember any of the other steps for more than a few weeks after training. If they don’t understand how to follow a best practice (because it’s too technical) or don’t know why it’s important, they’re going to skip it or forget it.

Then there’s the fact that this training is typically an annual activity at best.

It’s hard for anyone to remember things that aren’t relevant to their daily life, especially if that knowledge isn’t refreshed regularly. This is especially true for seemingly complex topics like security.

As Joshua Crumbaugh, CTO of PhishFirewall, put it in his videocast with ManageEngine, “If we want our people to not be intimidated by cyber[security] and take it seriously, we gotta make it less intimidating, and that means simplified education that’s continuous and entertaining.”

Research shows that humor can help people retain facts and information better. Whether it’s news items or “dry” courses like statistics, content-related humor aids recall. And if statistics can be made funny and easier to remember, why not security?


Perhaps it’s time that security and compliance teams spiced up their training sessions with some (security-related) humor. For this, they can rope in people from internal communications or learning and development.

They should also look at ways of helping people refresh their knowledge regularly. Multiple training sessions in a year may be too tedious. Games, quizzes, or other activities could offer fun ways of keeping security tips fresh in people’s minds.

As the saying goes, teamwork makes the dream work. In this case, it could also help reduce at least a little work for security folks in the long run.

The hope is that if your people retain security tips better, they’ll be more aware and less likely to fall for common cyberattacks or cause security incidents.

Cybersecurity training: Now personalized for you

Another thing to look into is customizing the training or tips to various people or job roles. As per an NBC article, some neuroscientists believe that connecting things to parts of your daily life, or to other things you already know, makes them easier to remember.

Thus, role-customized training can help employees understand security tips better. This can be tedious to put together at first. However, it could help change security policies from an abstract set of rules to a set of concrete dos and don’ts that are easier to follow.

Another way to personalize or customize training is to leverage AI to socially engineer your people into making the organization more secure. As Joshua puts it, “If social engineering is the biggest problem, then social engineering is also the answer.”

Organizations can target their own employees with “phishing campaigns” tailored to their actions and interests. The idea behind this is to condition people into not clicking risky emails and links. The concept is simple: If you keep making someone fall for the same trick again and again, over time they may start avoiding it.

Of course, doing this manually is a gargantuan task, and the goal is to reduce the security team’s work. That’s where AI comes in. AI-based solutions can process large amounts of data with ease, allowing them to easily customize phishing campaigns to each employee’s interests.

Hackers are already using AI for similar purposes and getting results. As per the MIT Sloan report quoted in the article above, phishing emails generated by AI using personal information got open rates as high as 60%. So, it only makes sense that the good guys pick up AI-based tools too.

But what about the walk-ins?

Of course, this still leaves the problem of physical infiltration. Despite what the news would have you believe, people are still mostly willing to help others—at least in daily, low-risk situations such as holding the door for that “colleague” whose hands are full with coffees or files.

The only recourse here is to ensure your awareness sessions cover these threats as well. Of course, people may not like seeming rude by refusing to help a colleague. So, show them a safer way of helping—such as offering to hold the coffee or files so the person can swipe their own card. And if they “forgot” their card, that’s a problem for reception or the admin to handle, isn’t it? It takes a tad longer, but it keeps smart social engineering pros from conning their way into your premises.

With any luck, the steps outlined in this series could turn your people, the weakest link in your perimeter, into your first line of defense, hopefully allowing your security team to take a breather—until the next critical zero-day vulnerability comes to light.

Want to get more insights on the human element in cybersecurity? Check out our videocast, Social engineering and Zero Trust – the human element in cybersecurity.

Akash


Enterprise Analyst, ManageEngine

Akash Kapur is an enterprise analyst at ManageEngine. He has an interest in all things science and technology and tries to keep abreast of the latest happenings in those fields, especially cybersecurity, from a layperson’s perspective.

With prior experience in a FinTech firm and an MBA in marketing, he’s learned about the fields of finance, technology, and marketing.
