Phish Scales: Weighing Your Risk

With phishing campaigns now the #1 cause of successful breaches, it’s no wonder more and more businesses are investing in phish simulations and cybersecurity awareness programs. These programs are designed to strengthen the biggest vulnerability every business has, and the one that can’t be fixed through technological means: the human factor. One common misconception many employers have, however, is that these programs should result in a systematic reduction of phish clicks over time. After all, what is the point of investing in phish simulations if your employees aren’t clicking on fewer phish? Well, a recent report from the National Institute of Standards and Technology (NIST) actually makes the opposite argument. Phish come in all shapes and sizes; some are easy to catch while others are far more cunning. So, if your awareness program only focuses on phish that are easy to spot or contextually irrelevant to the business, then a low phish click rate can create a false sense of security, leaving employees unprepared for craftier phishing campaigns. It’s therefore important that phish simulations present a range of difficulty, and that’s where the phish scale comes in.

Weighing Your Phish

If phish simulations vary the difficulty of their phish, then employers should expect their phish click rates to vary as well. The problem is that this makes it hard to measure the effectiveness of the training. NIST therefore introduced the phish scale as a way to rate the difficulty of any given phish and weigh that difficulty when reporting the results of phish simulations. The scale focuses on two main factors:

#1 Cues

The first factor included in the phish scale is the number of “cues” contained in a phish. A cue is anything within the email that one can look for to determine whether it is real or not. Cues range from technical indicators, such as suspicious attachments or an email address that doesn’t match the sender’s display name, to the type of content the email uses, such as an overly urgent tone or spelling and grammar mistakes. The idea is that the fewer cues a phish contains, the more difficult it will be to spot.

#2 Premise Alignment

The second factor in the phish scale is also the one with the stronger influence on a phish’s difficulty. Premise alignment describes how closely the content of the email matches what an employee expects, or is used to seeing, in their inbox. If a phish containing a fake unpaid invoice is sent to an employee who does data entry, for example, that employee is more likely to spot it than someone in accounting. Alternatively, a phish targeting the education sector is not going to be very successful if it is sent to a marketing firm. In general, the more a phish fits the context of a business and the employee’s role, the harder it will be to detect.
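The two factors above can be combined into a rough difficulty rating. The sketch below is only illustrative: the cue thresholds, the low/medium/high alignment labels, and the final rating names are assumptions for the example, not NIST’s actual worksheet values.

```python
def rate_difficulty(cue_count: int, premise_alignment: str) -> str:
    """Illustrative phish-scale-style rating: fewer cues and stronger
    premise alignment make a phish harder to detect.

    premise_alignment is one of "low", "medium", "high" (assumed labels).
    """
    alignment_weight = {"low": 0, "medium": 1, "high": 2}[premise_alignment]
    # Fewer visible cues => harder to spot (thresholds are assumptions).
    cue_weight = 2 if cue_count <= 8 else (1 if cue_count <= 14 else 0)
    score = alignment_weight + cue_weight
    if score >= 3:
        return "very difficult"
    if score == 2:
        return "moderately difficult"
    return "least difficult"

# A highly aligned phish with few cues rates as hard to catch.
print(rate_difficulty(5, "high"))   # "very difficult"
print(rate_difficulty(20, "low"))   # "least difficult"
```

The point of a rating like this is not the exact numbers but the weighting: a low click rate on “least difficult” phish says much less about your risk than the same rate on “very difficult” ones.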

Managing Risk and Preparing for the Future

The importance of the phish scale goes beyond helping businesses understand why phish click rates will vary. Understanding how the difficulty of a phish affects factors such as response times and report rates will deepen the reporting of phish simulations, and ultimately give organizations a more accurate view of their phish risk. In turn, this will also inform an organization’s broader security risk profile and strengthen its ability to respond to those risks.

The phish scale can also play an important role in the evolving landscape of social engineering attacks. As email filtering systems become more advanced, email-based phishing may decline over time, but that will only push attackers toward new forms of social engineering across different platforms. NIST therefore hopes that the work done with the phish scale can also help manage responses to these threats as they emerge.

If You Want Risk Management To Stick, You Have To Stay Positive

Remember the sales contest from the movie Glengarry Glen Ross?

“First prize is a Cadillac Eldorado….Third prize is you’re fired.”

We seem to think that, in order to motivate people, we need both a carrot and a stick: reward or punishment. And yet, if we want people to change behaviors on a sustained basis, there’s only one method that works: the carrot.

One core concept I learned while applying behavior-design practices to cyber security awareness programming was that, if you want sustained behavior change (such as reducing phish susceptibility), you need to design behaviors that make people feel positive about themselves.

The importance of positive reinforcement is one of the main components of the model developed by BJ Fogg, the founder and director of Stanford’s Behavior Design Lab. Fogg discovered that behavior happens when three elements – motivation, ability, and a prompt – come together at the same moment. If any element is missing, behavior won’t occur.
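Fogg’s model is often summarized as B = MAP: a behavior (B) occurs when motivation (M), ability (A), and a prompt (P) converge. A toy sketch makes the convergence idea concrete; the 0–1 scales and the activation threshold below are assumptions for illustration, not values from Fogg’s work.

```python
def behavior_occurs(motivation: float, ability: float, prompt: bool) -> bool:
    """Toy B = MAP check: behavior happens only when a prompt arrives
    while motivation and ability (each on an assumed 0-1 scale) are
    jointly above an illustrative activation threshold.
    """
    # No prompt, no behavior -- regardless of motivation and ability.
    if not prompt:
        return False
    # Motivation and ability trade off: if the behavior is easy enough,
    # even modest motivation clears the (assumed) threshold.
    return motivation * ability > 0.25

print(behavior_occurs(0.9, 0.8, True))   # True: all three elements converge
print(behavior_occurs(0.9, 0.8, False))  # False: no prompt
```

Notice the design lever this exposes: since you can’t control an employee’s motivation, you raise ability (make the behavior easy) and supply the prompt.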

I worked in collaboration with one of Fogg’s behavior-design consulting groups to bring these principles to cyber security awareness. We found that, in order to change digital behaviors and enhance a healthy cyber security posture, you need to help people feel successful. And you need the behavior to be easy to do, because you cannot assume the employee’s motivation is high.

Our program is therefore based on positive reinforcement when a user correctly reports a phish and is combined with daily exposure to cyber security awareness concepts through interactive lessons that only take 4 minutes a day.

To learn more about our work, you can read Stanford’s Peace Innovation Lab article about the project.

The upshot is that behavior-design concepts like these will not only help drive change for better cyber security awareness; they can drive change for all of your other risk management programs too.

There are many facets to the behavior design process, but if you focus on these two things (BJ Fogg’s Maxims), your risk management program will be in a much better position to drive the kind of change you’re looking for:

1) help people feel good about themselves and their work

2) promote behaviors that they’ll actually want to do

After all, I want you to feel successful, too.

Coronavirus and Cybersecurity: The Human Factors

Historically, cybersecurity threats have tended to increase in times of crisis, and bad actors are already using the coronavirus pandemic to their advantage. Employers are beginning to ask employees to work from home, and there are already numerous articles on security concerns about remote access. And while it is certainly important to ensure remote access systems are properly secured, it is equally important to understand the human factors that create certain security vulnerabilities. Mass confusion and panic often lead to faulty or rash decision making, which is precisely what scammers are banking on now. A study by Check Point, for instance, revealed that coronavirus-related web domains are 50% more likely to be malicious than other domains.

When considering the coronavirus and cybersecurity, it is important for employers to use cyber awareness training to ensure employees continue to think critically and use proper judgment online. Here are four key areas to help employees limit their risk of exposure:

Use Multi-Factor Authentication

Perhaps the most important measure you can put in place is to make sure that all remote users are required to use multi-factor authentication (MFA) when accessing your system.
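For context on what the “second factor” actually is, most authenticator apps generate time-based one-time passwords (TOTP) per RFC 6238. The sketch below shows the mechanism with only the standard library; it is a minimal illustration, and a real deployment should use a vetted library or MFA service rather than hand-rolled code.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Compute a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of `step`-second intervals since epoch.
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes based on the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T=59s, 8 digits.
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, for_time=59, digits=8))  # "94287082" per RFC 6238
```

Because the code depends on a shared secret plus the current time window, a stolen password alone is no longer enough to log in, which is exactly why MFA blunts credential-phishing attacks.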

Device Security

Businesses need to ensure all employees who are working from home are taking appropriate steps to keep sensitive information safe. Anyone using remote access needs to be trained in the use of essential endpoint protections. VPNs, for example, are extremely important because they encrypt traffic so it can’t be sniffed by others on the same network.

Employees should also be reminded of basic measures to take with personal devices. Screen and application time-outs should be set up to limit the risk that unwanted eyes around the house can view sensitive information and communications.

To limit the impact of stolen or lost devices, all sensitive information should be fully encrypted.

Online communication

Employees should be updated about current phishing campaigns that are taking advantage of the confusion and panic surrounding the coronavirus. The World Health Organization recently released a statement warning of fake emails posing as the WHO to steal money, information, and credentials. According to The Wall Street Journal, the WHO is receiving daily reports of coronavirus-related phishing schemes.

Working remotely will also require expanded use of online communications such as email, video services, and phones. It is therefore important that all business-related communications take place only through company-approved services. Personal and social media messaging services are difficult to monitor for security and should not be used for any business-related communications.

Reporting and Incident Response

Being aware of increased cyber threats is only half the battle. Employees also need to understand how and when to report any suspected incidents. Keep help desks up and running, and encourage employees to be overly cautious in reporting any suspicious emails or activity. Employees need to know that someone can help if they think any information is at risk. 

Incident response teams should also be briefed on and prepared for any threats related to remote access work. Not only should response teams understand the current threats, everyone involved should have a clear understanding of how communication and responses will be carried out remotely. Because previous response simulations were probably conducted in-office, it is helpful to run a test response using only remote communication.

Communicate and Connect

Companies are ecosystems, and healthy corporate ecosystems are a function of purpose, recognition, connection, and intentional urgency, all of which feed into employee actions, whether those involve cybersecurity, marketing, administration, or service issues. Companies that do a better job of communicating what is going on in the organization, connecting with their remote staff, and acknowledging their respective situations create a caring environment that helps everyone pay attention to the little things, like not clicking on that strange link, or not hiding the fact that they accidentally sent confidential information to the wrong person.

Conclusion

Given the severity of the ongoing coronavirus crisis, bad actors are counting on an increase in confusion, panic, and fear to profit and cause further disruption. Coronavirus-related cybersecurity concerns need to be taken seriously, and, above all else, employers need to do their part to ensure workers stay well-informed and secure. Working at home might mean we can dress a little more casually, but it doesn’t mean we should be any less serious about threats online.

Creating a Vaccine for Phishing Attacks

Another day, another phishing story. According to reports, a scammer recently sent emails to a Texas school district posing as one of the district’s vendors and requested a series of payments. One month later, the district realized it had been conned out of $2.3 million.

Unfortunately, stories like these are increasingly common.

Not unlike propaganda, social engineering and phishing campaigns are forms of attack that rely primarily on deception and disinformation. Defending against these attacks therefore requires more than technical defenses. Instead, it’s necessary to look at strategies used to combat disinformation in general.  

A Vaccine for Social Influence

Inoculation theory is one such strategy and has been gaining steam recently. The main premise of the theory is that the best way to defend against manipulation and unwanted influence is through exposure to the influence in a smaller, weaker form. Exactly like a vaccination.  

In general, the application of inoculation theory involves three basic elements: 

Threat 

The first step is so obvious that it can be easy to overlook. If you want to defend against a threat, you first need to be aware that the threat exists. For instance, if your employees don’t know what a phish is, they are far more likely to get tricked by one. One study found that simple awareness that a threat exists increased people’s ability to combat it, even when they weren’t given the tools to fight it.

Refutational Preemption

Refutational preemption is a fancy phrase, but, in the metaphor of the vaccine, it simply stands for the weak strain of a virus or threat. The idea is to introduce someone to faulty messaging that stands in opposition to what they usually hold to be true. By being exposed to a weaker version of the messaging, the person receiving the message will be able to learn how to argue against it and strengthen their own beliefs. Then, when they encounter a similar but stronger message in real life, they will have already developed the tools needed to combat it.

Within the context of phishing schemes, this would involve presenting someone with examples of phishing emails and asking them to identify the methods used to make the email seem real. Another method is to have participants create their own phishing emails so they learn what goes into crafting a deceptive message.

Involvement

The final element of the theory simply states that the more someone cares about an issue, the easier it will be for them to defend against a threat to that issue. So, when it comes to phishing, if your employees understand and care about the stakes involved with a phishing attack, they will be in a better position to spot them. Essentially, the more vested interest someone has in defending against an attack, the easier it will be for them to do so successfully.  

Putting Inoculation Theory into Practice

With the rise of socially engineered threats, inoculation theory has seen a bit of a resurgence lately. For instance, researchers at Cambridge University created the simulation Get Bad News, a game that uses inoculation theory to combat false or misleading news articles.  

And it doesn’t take a big leap to see how inoculation theory can be useful for cyber security threats, such as phishing campaigns. By combining education with simulated phishing attacks, businesses can use inoculation theory to: 

  1. Use educational tools to raise employees’ awareness of the threat phishing attacks pose. 
  2. Expose employees to simulated phishing attacks and have them respond proactively by reporting potential phish. You can even have employees create their own phish. Like Get Bad News, this will further inform participants of common tactics used in social engineering schemes.  
  3. Create a program that keeps employees engaged in the process. Focusing on positive reinforcement over punishing mistakes, for example, will help encourage participants to take the process seriously. 

Inoculation Theory At Work

Our digital awareness program, The Phishmarket™, uses inoculation theory in various phases throughout the program. Our phish simulations use a reporting feature that empowers participants to be actively involved in combating phishing attacks and rewards progress instead of punishing mistakes.

The Phishmarket™ also includes an online training program that uses daily micro-lessons to teach participants about common and emerging methods used in social engineering schemes. Some of the micro-lessons even ask users to try creating their own phish.

Want to try it out for yourself? Simply follow this link to test out a preview of the training program and create your very own (fake) phishing campaign.  

Cyber Awareness, 4 Minutes at a Time

Last week we announced our new Behavior-Designed Cyber Awareness Program. One part of that program is a structured phish simulation campaign; another part is a series of courses on a broad range of topics related to digital awareness, appropriate security practices, and the behavioral biases that affect susceptibility to phishing emails and other forms of social engineering. Each course contains a number of micro-lessons designed to take only a few minutes, typically around four, to complete. The intent of each course, together with the phish simulations that run concurrently, is to give participants the tools they need to recognize and modify their online behavior in order to maintain a safer and healthier digital presence.

Soon we will be rolling out the entire program, but for now we want to offer a sneak peek of what’s to come. Right now we are offering a free preview of a course on phishing attacks and how to spot them. If you want to try it out, click here and enroll now for free.

And, if you haven’t already, you can check out a review of our new program published as part of the Stanford Peace Tech Lab.

Behavior-Designed Cyber Awareness — A New Program

For the past year, Designed Privacy has been working to integrate behavior design into the cyber awareness process. Through a series of tests, we have created a Cyber Awareness Program, which we are launching this fall. The Program not only shows strong results in reducing phish susceptibility; the behaviors it’s designed to create also show the potential both to mitigate digital disinformation efforts and to get people to collaborate on reinforcing secure behaviors, whether in the office, at home, or with clients and vendors.

In addition, we are extremely pleased to have our process and results published by the Peace Innovation Lab at Stanford.

After a year of testing, three things are clear:
1) Cyber awareness without behavior change is a waste of time, money, and energy.
2) Behavior change occurs through a combination of ease, prompting, and positive reinforcement. People are more apt to change behaviors when they see a positive WIIFM (“what’s in it for me”).
3) Behavior-designed cyber awareness not only leads to reduced phish susceptibility, but it also has the potential to lead to better organizational decision making, especially as we rely more and more on digital information to make those decisions.

In a world of phishing, online scams, deepfake video and content, and the weaponization of social media, we all need to develop behaviors that help us determine what is real and what is not if we want to be secure, make sound decisions, and feel that our choices are still our own.

Please read the Stanford Peace Innovation Lab article here.