Focusing on the Wins

One of the main tenets of behavioral science is something called “operant conditioning.” It’s a fancy phrase for a concept that’s actually pretty simple: a behavior followed by a reward is more likely to be repeated than a behavior followed by a punishment. While this is a pretty common-sense idea, when it comes to our own goals, we don’t often think this way. Instead, we’ve grown up with a myth that true success comes only with struggle and that our biggest opponent is ourselves. Instead of focusing on our wins, we focus on our losses and think that to get anything accomplished we have to be hard on ourselves. And how well does that usually work out?

In order to create new behaviors that you can actually sustain, you need positive reinforcement. In other words, if you set yourself a goal that is too difficult or takes too long to achieve, your focus will shift to what you’re doing wrong and lead you to give up. Instead, it’s important to build on goals that you can actually achieve and feel positive about. This isn’t to say you shouldn’t set big goals for yourself, but that, in order to get there, you first have to focus on the wins: the small, achievable goals that you can then build upon to make the changes you want for yourself.

This is a lesson that most cybersecurity training programs have yet to understand. Phish simulation programs often focus on the losses: when you click on a phish or don’t report it to your IT department. Accountability with compassion, however, is far more effective for driving long-term behavior change, and training programs that reward positive behaviors rather than punish bad ones are more likely to help users achieve their goals.

Using positive reinforcement and focusing on the wins helps us build the skills and abilities that enable us to do great things. And, perhaps after we have accomplished the large goal we were after, we’ll realize that the actual goal was to just feel better about ourselves.

Phish Scales: Weighing Your Risk

With phishing campaigns now the #1 cause of successful breaches, it’s no wonder more and more businesses are investing in phish simulations and cybersecurity awareness programs. These programs are designed to strengthen the biggest vulnerability every business has and that can’t be fixed through technological means: the human factor. One common misconception that many employers have, however, is that these programs should result in a systematic reduction of phish clicks over time. After all, what is the point of investing in phish simulations if your employees aren’t clicking on fewer phish? Well, a recent report from the National Institute of Standards and Technology (NIST) actually makes the opposite argument. Phish come in all shapes and sizes; some are easy to catch while others are far more cunning. So, if your awareness program only focuses on phish that are easy to spot or are contextually irrelevant to the business, then a low phish click rate could create a false sense of security, leaving employees unprepared for more crafty phishing campaigns. It’s therefore important that phish simulations present a range of difficulty, and that’s where the phish scale comes in.

Weighing Your Phish

If phish simulations vary the difficulty of their phish, then employers should expect their phish click rates to vary as well. The problem is that this makes it hard to measure the effectiveness of the training. NIST therefore introduced the phish scale as a way to rate the difficulty of any given phish and weigh that difficulty when reporting the results of phish simulations. The scale focuses on two main factors:

#1 Cues

The first factor included in the phish scale is the number of “cues” contained in a phish. A cue is anything within the email that one can look for to determine if it is real or not. Cues include anything from technical indicators, such as suspicious attachments or an email address that is different from the sender display name, to the type of content the email uses, such as an overly urgent tone or spelling and grammar mistakes. The idea is that the fewer cues a phish contains, the more difficult it will be to spot.

#2 Premise Alignment

The second factor in the phish scale is also the one that has the stronger influence on the difficulty of a phish. Essentially, premise alignment has to do with how closely the content of the email matches what an employee expects or is used to seeing in their inbox. If a phish containing a fake unpaid invoice is sent to an employee who does data entry, for example, that employee is more likely to spot it than someone in accounting. Similarly, a phish targeting the education sector is not going to be very successful if it is sent to a marketing firm. In general, the more a phish fits the context of a business and the employee’s role, the harder it will be to detect.
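To make these two factors concrete, here is a minimal sketch in Python of how a cue count and a premise-alignment rating might be combined into a detection-difficulty rating. The cue-count threshold and the exact category boundaries are illustrative assumptions, not NIST’s published scoring bands.

```python
# Illustrative sketch only: the threshold and category boundaries below are
# simplified assumptions, not NIST's official Phish Scale values.

def rate_difficulty(cue_count: int, premise_alignment: str) -> str:
    """Roughly estimate how hard a simulated phish is to detect.

    cue_count: number of observable cues (typos, mismatched sender, urgency, ...)
    premise_alignment: "high", "medium", or "low" fit with the recipient's
        normal work context
    """
    few_cues = cue_count <= 8  # assumed band: fewer cues -> harder to spot
    if premise_alignment == "high":
        return "very difficult" if few_cues else "moderately difficult"
    if premise_alignment == "medium":
        return "moderately difficult" if few_cues else "least difficult"
    return "least difficult"  # low alignment: easiest to catch


if __name__ == "__main__":
    print(rate_difficulty(cue_count=5, premise_alignment="high"))   # very difficult
    print(rate_difficulty(cue_count=14, premise_alignment="low"))   # least difficult
```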

Managing Risk and Preparing for the Future

The importance of the phish scale goes beyond helping businesses understand why phish click rates will vary. Understanding how the difficulty of a phish affects factors such as response times and report rates will deepen the reporting of phish simulations and ultimately give organizations a more accurate view of their phish risk. In turn, this will also inform an organization’s broader security risk profile and strengthen its ability to respond to those risks.
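As a rough illustration of what difficulty-aware reporting could look like, the sketch below groups simulated click results by difficulty rating instead of collapsing everything into a single click rate. The sample data is made up for demonstration purposes.

```python
# Illustrative sketch: report click rates per difficulty rating rather than one
# overall number. The sample results below are invented for demonstration.
from collections import defaultdict

results = [
    {"difficulty": "least difficult", "clicked": False},
    {"difficulty": "least difficult", "clicked": False},
    {"difficulty": "moderately difficult", "clicked": True},
    {"difficulty": "very difficult", "clicked": True},
    {"difficulty": "very difficult", "clicked": False},
]

totals = defaultdict(lambda: {"sent": 0, "clicked": 0})
for r in results:
    totals[r["difficulty"]]["sent"] += 1
    totals[r["difficulty"]]["clicked"] += int(r["clicked"])

for rating, t in totals.items():
    rate = t["clicked"] / t["sent"]
    print(f"{rating}: {rate:.0%} click rate ({t['clicked']}/{t['sent']})")
```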

The phish scale can also play an important role in the evolving landscape of social engineering attacks. As email filtering systems become more advanced, phishing attacks may lessen over time. But that will only lead to new forms of social engineering across different platforms. NIST therefore hopes that the work done with the phish scale can also help manage responses to these threats as they emerge.

How Notifications are Re-Wiring Our Brains

“How prone to doubt, how cautious are the wise!”
― Homer

We’ve written before about how hackers and online scammers rely on human factors just as much as technological factors. They attempt to manipulate our emotions in order to trick us into handing over information or even money. However, the problem of social engineering goes beyond the tactics used by scammers. We’ve all experienced the anxious rush to check our notifications as soon as they come in. But these aren’t just simple habits we’ve developed: our phones, and especially notifications, are literally re-wiring how our brains work and even dulling our critical thinking skills.

Ever heard of Pavlov’s dog? It was an experiment conducted by the physiologist Ivan Pavlov in which he rang a bell when presenting food to a dog. Upon seeing the food, the dog naturally began to salivate. After a while, however, Pavlov rang the bell without giving the dog any food and found that the dog began to salivate at the sound of the bell alone, effectively re-wiring how the dog’s brain responds to certain sounds. Well, this type of conditioned response is exactly what our phone notifications are doing to us. The ping we hear when a text or email pops up on our phone acts as a trigger for our brain to release pleasure-seeking chemicals such as dopamine. According to behavioral psychologist Susan Weinschenk, this sets us on an endless dopamine loop: “Dopamine starts you seeking, then you get rewarded for the seeking, which makes you seek more. It becomes harder and harder to stop looking at email, stop texting, or stop checking your cell phone to see if you have a message or a new text.”

However, the way that notifications re-wire our brains goes beyond the endless search for more and more messages. The pleasure-seeking response that dopamine triggers can actually lower our ability to think critically, making us more susceptible to online scams. According to research conducted by the University of Florida and Google, the cognitive effects notifications have on us can lower our decision-making ability. The research found that we are more likely to detect a scam when we are stressed and on high alert. Pleasure-based chemicals like dopamine, however, lower our level of alertness and make us less likely to detect potential scams. This is especially troublesome when it comes to phishing emails. Email notifications release these “feel good” chemicals, which in turn makes it harder for us to discern whether what we’re looking at is a fake.

There are, however, some steps we can take to combat this. If notifications are re-wiring our brains to be less alert, one step we can take is to simply turn off all notifications. This limits the dopamine release that notifications trigger. Taking a few breaths before opening an email also helps: pausing before responding to a notification can help break the “dopamine loop” by delaying the gratification cycle. Whichever method works best for you, the important thing is to be aware of how you respond to things like notifications. Taking the extra few seconds to think about what you’re doing and why might just save you from falling for a phish or other online scams.

COVID-19 Scams Total over $13M in Losses

Since the beginning of the COVID-19 pandemic, we’ve seen a lot of scammers using the crisis to their advantage. From attacks on the health care industry to phishing campaigns impersonating the CARES Act small business loan program, online scammers are out in full force to exploit our fear and confusion. So it’s not surprising to see the Federal Trade Commission confirm that COVID-19 scams are on the rise.

But what is surprising is that those scams have already resulted in $13.44 million in fraud losses since January. This morning, the FTC released updated data on COVID-19 scams reported to the agency. Here are a few key points from the new data:

  • Since January, there have been over 18,000 reports made to the FTC about COVID-19 related scams.
  • 46% of scams reported resulted in the victim losing money.
  • The most common form of fraud involves scammers impersonating travel and vacation companies such as airlines and hotels. Online shopping companies are also a large source of fraud. Many people report that fake businesses are selling high-demand cleaning and medical products that simply never arrive after you pay for them.
  • A lot of scammers are also pretending to be the government. In many cases, this involves asking the victim to report personal and financial information to receive their stimulus check.
  • Robocalls are back on the rise. Last year, we finally started to see a decline in the number of robocalls. However, those numbers are starting to rise again as scammers use the COVID-19 crisis to commit fraud or illegally gain personal information.

What You Should Do

While we are certainly going through an unprecedented and confusing time, it’s important that you stay alert online for COVID-19 scams. If a person or business calls, texts, or emails you asking for money or personal details, make sure you know exactly who you are talking to. Here are a few tips to stay safe online:

  • If you ever receive a random call claiming to be from the government asking for payment, hang up. The government will never call out of the blue to ask for financial or other personal information.
  • When shopping online, google reviews of the company first to see if people are actually receiving their products.
  • If you get an email from a known company or friend asking for money, look carefully at the email address and any URLs in the email to make sure they are legitimate.
  • If you aren’t sure if something is a scam or not, try googling it or even looking on Twitter. In many cases, scammers will send the same message out to a lot of people, so you may find some helpful stranger warning you not to fall for it.

Above all else, it’s important to practice good digital awareness everywhere online. Be skeptical of what you are seeing and reading. Follow up. Look for others online who can confirm that what you’re seeing is real. Scammers rely on us making split-second decisions, so just taking an extra minute to confirm something is real could end up saving you money.

Coronavirus and Cybersecurity: The Human Factors

Cybersecurity threats have historically increased in times of crisis, and bad actors are already using the coronavirus pandemic to their advantage. Employers are beginning to ask employees to work from home, and there are already numerous articles on security concerns about remote access. And while it is certainly important to ensure remote access systems are properly secured, it is equally important to understand the human factors that create certain security vulnerabilities. Mass confusion and panic often lead to faulty or rash decision making, which is precisely what scammers are banking on now. A study by Check Point, for instance, revealed that coronavirus-related web domains are 50% more likely to be malicious than other domains.

When considering the coronavirus and cybersecurity, it is important for employers to use cyber awareness training to ensure employees continue to think critically and use proper judgment online. Here are five key areas to help employees limit their risk of exposure:

Use Multi-Factor Authentication

Perhaps the most important measure you can put in place is to make sure that all remote users are required to use multi-factor authentication (MFA) when accessing your system.

Device Security

Businesses need to ensure all employees who are working from home are taking appropriate steps to keep sensitive information safe. Anyone using remote access needs to be trained in the use of essential endpoint protections. VPNs, for example, are extremely important for making sure logins and traffic can’t be sniffed by anyone else on the network.

Employees should also be reminded of basic measures to take with personal devices. Screen and application time-outs should be set up to limit the risk that unwanted eyes around the house can view sensitive information and communications.

To limit the impact of stolen or lost devices, all sensitive information should be fully encrypted.

Online Communication

Employees should be updated about current phishing campaigns that are taking advantage of the confusion and panic surrounding the coronavirus. The World Health Organization recently released a statement warning of fake emails posing as the WHO to steal money, information, and credentials. According to The Wall Street Journal, the WHO is receiving daily reports of coronavirus-related phishing schemes.

Working remotely will also require expanded use of online communications such as email, video services, and phones. It is therefore important that all business-related communications take place only through company-approved communication services. Personal and social media messaging services are difficult to monitor for security and should not be used for any business-related communications.

Reporting and Incident Response

Being aware of increased cyber threats is only half the battle. Employees also need to understand how and when to report any suspected incidents. Keep help desks up and running, and encourage employees to be overly cautious in reporting any suspicious emails or activity. Employees need to know that someone can help if they think any information is at risk. 

Incident response teams should also be briefed on and prepared for any threats related to remote work. Not only should response teams understand the current threats, but everyone involved should also have a clear understanding of how communication and responses will be carried out remotely. Because previous response simulations were probably conducted in-office, it is helpful to run a test response using only remote communication.

Communicate and Connect

Companies are ecosystems, and healthy corporate ecosystems are a function of purpose, recognition, connection, and intentional urgency. All of this feeds into employee actions, whether they involve cybersecurity, marketing, administration, or service issues. Companies that do a better job of communicating what is going on in their organization, connecting with their remote staff, and acknowledging their respective situations create a caring environment that helps everyone pay attention to the little things, like not clicking on that strange link, or not hiding the fact that they accidentally sent confidential information to the wrong person.

Conclusion

Given the severity of the ongoing coronavirus crisis, bad actors are counting on an increase in confusion, panic, and fear to profit and cause further disruption. Coronavirus and cybersecurity concerns need to be considered together, and above all else, employers need to do their part to ensure workers stay well-informed and secure. Working from home might mean we can dress a little more casually, but it doesn’t mean we should be any less serious about threats online.

Creating a Vaccine for Phishing Attacks

Another day, another phishing story. According to reports, a scammer recently sent emails to a Texas school district posing as one of the district’s vendors and requested a series of payments. One month later, the district realized it had been conned out of $2.3 million.

Unfortunately, stories like these are increasingly common.

Not unlike propaganda, social engineering and phishing campaigns are forms of attack that rely primarily on deception and disinformation. Defending against these attacks therefore requires more than technical defenses. Instead, it’s necessary to look at strategies used to combat disinformation in general.  

A Vaccine for Social Influence

Inoculation theory is one such strategy and has been gaining steam recently. The main premise of the theory is that the best way to defend against manipulation and unwanted influence is through exposure to the influence in a smaller, weaker form. Exactly like a vaccination.  

In general, the application of inoculation theory involves three basic elements: 

Threat 

The first step is so obvious that it can be easy to overlook. If you want to defend against a threat, you first need to be aware that the threat exists. For instance, if your employees don’t know what a phish is, they are far more likely to get tricked by one. One study found that the simple awareness that a threat exists increases people’s ability to combat it, even when they aren’t given the tools to fight it.

Refutational Preemption

Refutational preemption is a fancy phrase, but, in the metaphor of the vaccine, it simply stands for the weak strain of a virus or threat. The idea is to introduce someone to faulty messaging that stands in opposition to what they usually hold to be true. By being exposed to a weaker version of the messaging, the person receiving it learns how to argue against it and strengthens their own beliefs. Then, when they encounter a similar but stronger message in real life, they will have already developed the tools needed to combat it.

Within the context of phishing schemes, this would involve presenting someone with examples of phishing emails and asking them to identify the methods used to make the email seem real. Another approach is to have participants create their own phishing emails so they learn firsthand what goes into crafting a deceptive message.

Involvement

The final element of the theory simply states that the more someone cares about an issue, the easier it will be for them to defend against a threat to that issue. So, when it comes to phishing, if your employees understand and care about the stakes involved with a phishing attack, they will be in a better position to spot them. Essentially, the more vested interest someone has in defending against an attack, the easier it will be for them to do so successfully.  

Putting Inoculation Theory into Practice

With the rise of socially engineered threats, inoculation theory has seen a bit of a resurgence lately. For instance, researchers at Cambridge University created the simulation Get Bad News, a game that uses inoculation theory to combat false or misleading news articles.  

And it doesn’t take a big leap to see how inoculation theory can be useful for cybersecurity threats, such as phishing campaigns. By combining education with simulated phishing attacks, businesses can use inoculation theory to:

  1. Raise employees’ awareness of the threat phishing attacks pose through education tools.
  2. Expose employees to simulated phishing attacks and have them respond proactively by reporting potential phish. You can even have employees create their own phish. Like Get Bad News, this will further inform participants of common tactics used in social engineering schemes.
  3. Create a program that keeps employees engaged in the process. Focusing on positive reinforcement over punishing mistakes, for example, will help encourage participants to take the process seriously. 

Inoculation Theory At Work

Our digital awareness program, The Phishmarket™, uses inoculation theory in various phases throughout the program. Our phish simulations use a reporting feature that empowers participants to be actively involved in combating phishing attacks and rewards progress instead of punishing mistakes.

The Phishmarket™ also includes an online training program that uses daily micro-lessons to teach participants about common and emerging methods used in social engineering schemes. Some of the micro-lessons even ask users to try creating their own phish.

Want to try it out for yourself? Simply follow this link to test out a preview of the training program and create your very own (fake) phishing campaign.