Coronavirus and Cybersecurity: The Human Factors

Historically, cybersecurity threats tend to increase in times of crisis, and bad actors are already using the coronavirus pandemic to their advantage. Employers are beginning to ask employees to work from home, and there are already numerous articles on security concerns about remote access. While it is certainly important to ensure remote access systems are properly secured, it is equally important to understand the human factors that create certain security vulnerabilities. Mass confusion and panic often lead to faulty or rash decision making, which is precisely what scammers are banking on now. A study by Check Point, for instance, found that coronavirus-related web domains are 50% more likely to be malicious than other domains.

When considering the coronavirus and cybersecurity, it is important for employers to use cyber awareness training to ensure employees continue to think critically and use proper judgment online. Here are four key areas to help employees limit their risk of exposure:

Use Multi-Factor Authentication

Perhaps the most important measure you can put in place is to make sure that all remote users are required to use multi-factor authentication (MFA) when accessing your system.
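As a concrete illustration, time-based one-time passwords (TOTP, RFC 6238) are one common second factor. The sketch below, in Python using only the standard library, shows roughly how a server-side code check works. The function names, the base32 secret handling, and the clock-drift window are illustrative assumptions, not any particular vendor's API.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password from a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((at if at is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def verify_totp(secret_b32, submitted, window=1):
    """Accept codes from the current 30-second step, plus/minus `window`
    steps, to tolerate small clock drift between client and server."""
    now = int(time.time())
    return any(
        hmac.compare_digest(totp(secret_b32, at=now + drift * 30), submitted)
        for drift in range(-window, window + 1)
    )
```

The point for employers is less the math than the policy: the remote-access gateway should refuse a login until a check like `verify_totp` (or a push/hardware-key equivalent) succeeds, so a phished password alone is not enough.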

Device Security

Businesses need to ensure all employees who are working from home are taking appropriate steps to keep sensitive information safe. Anyone using remote access needs to be trained in the use of essential endpoint protections. VPNs, for example, encrypt traffic so that credentials and other sensitive data can't be sniffed by others on the same network.

Employees should also be reminded of basic measures to take with personal devices. Screen and application time-outs should be set up to limit the risk that unwanted eyes around the house can view sensitive information and communications.

To limit the impact of stolen or lost devices, all sensitive information should be fully encrypted.

Online Communication

Employees should be updated about current phishing campaigns that are taking advantage of the confusion and panic surrounding the coronavirus. The World Health Organization recently released a statement warning of fake emails posing as the WHO to steal money, information, and credentials. According to The Wall Street Journal, the WHO is receiving daily reports of coronavirus-related phishing schemes.

Working remotely will also require expanded use of online communications such as email, video services, and phones. It is therefore important that all business-related communications take place only through company-approved communication services. The security of personal and social media messaging services is difficult to monitor, so they should not be used for any business-related communications.

Reporting and Incident Response

Being aware of increased cyber threats is only half the battle. Employees also need to understand how and when to report any suspected incidents. Keep help desks up and running, and encourage employees to be overly cautious in reporting any suspicious emails or activity. Employees need to know that someone can help if they think any information is at risk. 

Incident response teams should also be briefed on and prepared for any threats related to remote access work. Not only should response teams understand the current threats, everyone involved should have a clear understanding of how communication and responses will be carried out remotely. Because previous response simulations were probably conducted in-office, it is helpful to run a test response using only remote communication.

Communicate and Connect

Companies are ecosystems, and healthy corporate ecosystems are a function of purpose, recognition, connection, and intentional urgency. All of these feed into employee actions, whether they involve cybersecurity, marketing, administration, or service issues. Companies that do a better job of communicating what is going on in the organization, connecting with remote staff, and acknowledging their respective situations create a caring environment. That environment helps everyone pay attention to the little things, like not clicking on that strange link, or not hiding the fact that they accidentally sent confidential information to the wrong person.

Conclusion

Given the severity of the ongoing coronavirus crisis, bad actors are counting on an increase in confusion, panic, and fear to profit and cause further disruption. Above all else, employers need to do their part to ensure workers stay well-informed and secure. Working at home might mean we can dress a little more casually, but it doesn't mean we should be any less serious about threats online.

Creating a Vaccine for Phishing Attacks

Another day, another phishing story. According to reports, a scammer recently sent emails to a Texas school district posing as one of the district's vendors and requested a series of payments. One month later, the district realized it had been conned out of $2.3 million.

Unfortunately, stories like these are increasingly common.

Not unlike propaganda, social engineering and phishing campaigns are forms of attack that rely primarily on deception and disinformation. Defending against these attacks therefore requires more than technical defenses. Instead, it’s necessary to look at strategies used to combat disinformation in general.  

A Vaccine for Social Influence

Inoculation theory is one such strategy, and it has been gaining traction recently. The main premise of the theory is that the best way to defend against manipulation and unwanted influence is exposure to that influence in a smaller, weaker form. Exactly like a vaccination.

In general, the application of inoculation theory involves three basic elements: 

Threat 

The first step is so obvious that it can be easy to overlook: if you want to defend against a threat, you first need to be aware that the threat exists. For instance, if your employees don't know what a phish is, they are far more likely to be tricked by one. One study found that simple awareness that a threat exists increased participants' ability to combat it, even when they weren't given the tools to fight it.

Refutational Preemption

Refutational preemption is a fancy phrase, but, in the vaccine metaphor, it simply stands for the weak strain of a virus or threat. The idea is to introduce someone to faulty messaging that stands in opposition to what they usually hold to be true. By being exposed to a weaker version of the messaging, the recipient learns how to argue against it and strengthens their own beliefs. Then, when they encounter a similar but stronger message in real life, they will have already developed the tools needed to combat it.

Within the context of phishing schemes, this would involve presenting someone with examples of phishing emails and asking them to identify the methods that make the emails seem real. Another method is to have participants create their own phishing emails so they learn what goes into crafting a deceptive message.

Involvement

The final element of the theory simply states that the more someone cares about an issue, the easier it will be for them to defend against a threat to that issue. So, when it comes to phishing, if your employees understand and care about the stakes involved with a phishing attack, they will be in a better position to spot them. Essentially, the more vested interest someone has in defending against an attack, the easier it will be for them to do so successfully.  

Putting Inoculation Theory into Practice

With the rise of socially engineered threats, inoculation theory has seen a bit of a resurgence lately. For instance, researchers at Cambridge University created the simulation Get Bad News, a game that uses inoculation theory to combat false or misleading news articles.  

And it doesn't take a big leap to see how inoculation theory can be useful for cybersecurity threats, such as phishing campaigns. By combining education with simulated phishing attacks, businesses can use inoculation theory to:

  1. Use education tools to raise employees' awareness of the threat phishing attacks pose. 
  2. Expose employees to simulated phishing attacks and have them proactively respond by reporting potential phish. You can even have employees create their own phish; like Get Bad News, this will further inform participants of common tactics used in social engineering schemes.  
  3. Create a program that keeps employees engaged in the process. Focusing on positive reinforcement over punishing mistakes, for example, will help encourage participants to take the process seriously. 
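To make the three steps concrete, here is a minimal Python sketch of a quiz-style "micro-drill": show labeled examples (awareness), have participants call each one phish or not (refutational preemption), and report a score with reinforcing feedback rather than punishment (involvement). The sample messages, labels, and scoring are invented for illustration; a real program such as Get Bad News is far richer.

```python
# Hypothetical labeled samples: (message, is_phish)
SAMPLES = [
    ("Your account is locked! Click http://paypa1-secure.example to verify NOW.", True),
    ("Reminder: the all-hands meeting moved to 3 pm. Agenda attached.", False),
    ("IT here. Reply with your password so we can migrate your mailbox.", True),
]

# Cues the drill reinforces after each round
RED_FLAGS = ["urgency or threats", "look-alike domains", "requests for credentials"]

def run_drill(answers):
    """Score a participant's True/False calls against the labeled samples
    and return positively framed feedback instead of a penalty."""
    score = sum(1 for (_, label), guess in zip(SAMPLES, answers) if guess == label)
    feedback = (
        f"{score}/{len(SAMPLES)} correct. Red flags to watch: " + ", ".join(RED_FLAGS)
    )
    return score, feedback
```

A fuller version would rotate in fresh samples weekly and let participants draft their own phish, mirroring the "create your own" exercise described above.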

Inoculation Theory At Work

Our digital awareness program, The Phishmarket™, uses inoculation theory in various phases throughout the program. Our phish simulations use a reporting feature that empowers participants to be actively involved in combating phishing attacks and rewards progress instead of punishing mistakes.

The Phishmarket™ also includes an online training program that uses daily micro-lessons to teach participants about common and emerging methods used in social engineering schemes. Some of the micro-lessons even ask users to try creating their own phish.

Want to try it out for yourself? Simply follow this link to test out a preview of the training program and create your very own (fake) phishing campaign.  

Cyber Awareness, 4 Minutes at a Time

Last week we announced our new Behavior-Designed Cyber Awareness Program. One part of that program is a structured phish simulation campaign; another part is a series of courses on a broad range of topics related to digital awareness, appropriate security practices, and the behavioral biases that affect susceptibility to phishing emails and other forms of social engineering. Each course contains a number of micro-lessons designed to take only a few minutes, typically around four, to complete. The intent of each course, in addition to the phish simulations that will run concurrently, is to give participants the tools they need to recognize and modify their online behavior in order to maintain a safer and healthier digital presence.

Soon we will be rolling out the entire program, but for now we want to offer a sneak peek of what's to come. Right now we are offering a free preview of a course on phishing attacks and how to spot them. If you want to try it out, click here and enroll now for free.

And, if you haven't already, you can check out a review of our new program published by the Stanford Peace Innovation Lab.

Behavior-Designed Cyber Awareness — A New Program

For the past year, Designed Privacy has been working to integrate behavior design into the cyber awareness process. Through a series of tests, we have created a Cyber Awareness Program, which we are launching this fall. The Program not only shows strong results in reducing phish susceptibility; the behaviors it's designed to create also show the potential both to mitigate digital disinformation efforts and to get people to collaborate on reinforcing secure behaviors, whether in the office, at home, or with clients and vendors.

In addition, we are extremely pleased to have our process and results published by the Peace Innovation Lab at Stanford.

After a year of testing, three things are clear:
1) Cyber awareness without behavior change is a waste of time, money, and energy.
2) Behavior change occurs through a combination of ease, prompting, and positive reinforcement. People are more apt to change behaviors when they see a positive WIIFM ("what's in it for me").
3) Behavior-designed cyber awareness not only leads to reduced phish susceptibility, but it also has the potential to lead to better organizational decision making, especially as we rely more and more on digital information to make those decisions.

In a world of phishing, online scams, deepfake video and content, and the weaponization of social media, we all need to develop behaviors that help us determine what is real and what is not if we want to be secure, make sound decisions, and feel that we still have a space where our choices are our own.

Please read the Stanford Peace Innovation Lab article here.