Coronavirus and the Right to Privacy

The coronavirus has unquestionably changed the way we live. It has also forced us into strange and, until just a few weeks ago, unthinkable ethical dilemmas. Even whether to visit loved ones is now a question worth genuine ethical reflection. Modern nations, especially in the West, are built on an ethics of individual freedoms and the right to privacy. However, the current global health crisis is forcing us to rethink just how fundamental those ethics should be. While we already feel this with regard to the freedom of movement, we are just beginning to contemplate how the coronavirus can and should affect our right to privacy.

Contact Tracing and Enforced Quarantine

In order to limit the spread of the coronavirus, experts emphasize the importance of tracking every contact infected patients have had with others. Countries such as China, Singapore, South Korea, and Taiwan have all taken aggressive measures to trace every potential contact infected people have had. These measures are widely considered to be a major reason why these countries have been successful in lowering the rate of transmission. However, those aggressive measures have come at the cost of individual privacy.

Taiwan and Singapore, for example, regularly post detailed information about everyone who tests positive, including where they live and work, what train stations they have used, and what bars and restaurants they frequent. South Korea now has an app that allows users to track the exact movements of those infected.

Countries are also using location data to enforce quarantine for those infected. Israel, for example, is now using data collection techniques previously used for counterterrorism efforts to identify anyone potentially exposed to the virus. The government uses this information to send text messages to those exposed ordering them to quarantine.

The European and U.S. Response

As the coronavirus spreads to Europe and the U.S., lawmakers are exploring the use of similar techniques. Italy now uses location data to monitor whether people are obeying quarantine orders. In the U.S., the White House is reportedly in conversations with tech companies to use anonymized location data to track the spread of the virus. HIPAA regulations are being waived to allow doctors and mental health providers to more freely use telecommunication to speak with patients. Companies in Italy, Austria, and Germany have also announced that they will provide location data to governments.

However, with privacy regulations such as the GDPR, it is unclear how aggressively European countries will be able to use personal information. The European Data Protection Board (EDPB) released a statement urging governments to continue to abide by privacy regulations in place. At the same time, however, the EDPB conceded that countries may suspend such regulations “when processing is necessary for reasons of substantial public interest in the area of public health.”

Consequences

Relaxing the right to privacy has garnered mixed responses from government officials and security experts. Many have pointed out that while the measures taken are extreme, personal information such as location data is highly effective in limiting the spread of the coronavirus. “We are stretched very thin in most states,” said the director of the Center for Global Health at Oregon State University, “so this kind of technology can help every state to prioritize, given their limited resources, which communities, which areas, need more aggressive tracking and testing.”

Others are concerned about how this could endanger those whose information is made public. In South Korea, some have used information released by the government to identify infected individuals and attack them online. This has led officials to question how the government uses this information, worried it will discourage others from getting tested for fear of being publicly exposed.

While nearly all countries have explained that suspending the right to privacy is a temporary measure for the benefit of public health, many worry it will have a permanent effect on how governments and countries view privacy concerns. After 9/11, for example, the U.S. used highly invasive surveillance measures that have since become commonplace among law enforcement agencies. According to the New York Times, privacy experts worry something similar could happen after the current crisis.

What restrictions we, as a society, can tolerate, and what effect this will have after the current crisis, remains an open question. However, it may also involve a false choice. There are technologies that can both assist contact tracing and preserve anonymity. Privacy by Design does not have to be put on pause as we develop these tools. In fact, if we want to encourage wide adoption, it might be required.
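
To make that point concrete, here is a minimal sketch of the kind of decentralized, anonymized approach a contact-tracing tool can take, loosely in the spirit of the decentralized proximity-tracing proposals circulating among researchers. Everything in it, including the identifier scheme, the rotation interval, and the function names, is an illustrative assumption rather than a description of any deployed system:

    // Minimal sketch (TypeScript): proximity tracing with rotating ephemeral IDs.
    // Phones broadcast short-lived random identifiers instead of names or locations.
    // Only people who test positive publish their daily seeds, and exposure matching
    // happens locally on each device, so no central contact database is needed.
    import { randomBytes, createHmac } from "crypto";

    const EPOCHS_PER_DAY = 96; // assume the broadcast ID rotates every 15 minutes

    // Each day the device picks a fresh random seed that never leaves the phone
    // unless its owner tests positive and chooses to share it.
    function newDailySeed(): Buffer {
      return randomBytes(32);
    }

    // Derive the ephemeral ID broadcast during a given epoch of the day.
    function ephemeralId(dailySeed: Buffer, epoch: number): string {
      return createHmac("sha256", dailySeed).update(`epoch-${epoch}`).digest("hex").slice(0, 32);
    }

    // After a diagnosis, health authorities publish the patient's daily seeds.
    // Every phone re-derives the corresponding IDs and checks them against the
    // IDs it has overheard nearby.
    function wasExposed(observedIds: Set<string>, publishedSeeds: Buffer[]): boolean {
      return publishedSeeds.some((seed) =>
        Array.from({ length: EPOCHS_PER_DAY }, (_, epoch) => ephemeralId(seed, epoch))
          .some((id) => observedIds.has(id))
      );
    }

Because the matching step runs on the user's own device, no government or company ever learns the full list of whom a person has met, which is precisely the Privacy by Design property argued for above.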

Subscribe to our blog here:  https://mailchi.mp/90772cbff4db/dpblog

Writing a Privacy Policy You’ll Actually Want To Read

A privacy policy is necessary for any business collecting or processing personal information; it is essentially a legal agreement between you and the people visiting your website. And more often than not, privacy policies are thought of as just that: a legal buffer. But with more users mistrusting the services they use, these policies should instead be seen as an opportunity to build trust with customers, establish a level of transparency, and show that you respect their privacy.

Here is a short primer on what should be included in a privacy policy, and how to write it in a way that is accessible to users.  

The What

What information you collect 

It’s important to be upfront about all types of information you may collect about your users. This not only includes personal information (name, email, phone number, etc.), but also things like usage and analytics data, as well as first- and third-party cookies.

How you collect information 

Listing the methods used to collect data is another important aspect of a privacy policy. Is it information that users freely provide? Is it automatically collected through the browser? Is it collected through scripts or plug-ins on your website? Providing this information will help users make informed decisions on how to navigate your site in a way that fits their privacy needs.

How you use information 

It’s essential that you inform users not only of what you’re collecting, but how you’re using that information. In many cases, it can help explain why it’s important that you collect this information in the first place. Examples include customer service, payment processing, and improving the site experience. On top of these, you’ll also need to state if you’re using data for marketing and joint marketing purposes.

What information you share and why 

You’ll also want to state any information that you share with others. This might be for something like third-party advertising but can also include other companies related by common ownership, non-affiliates that market to you, or even non-profits using the data for research studies. Today, users are concerned about understanding who has access to their data, so this information is especially important.   

How that information is secured  

This is something you’ll definitely want your users to know about. Listing what security systems and practices you have in place will go a long way toward showing users that you care about their privacy and are taking the necessary steps to keep their data secure.

What privacy options users have

It’s become more common for websites to give users some choice with regard to their privacy. This includes whether they can access the data that has been collected, the ability to change what information they want to share, whether they can delete data previously collected, as well as the ability to decide how long you hold on to their information. If you allow users these options, you want to explicitly state that they have those abilities.
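
If you do offer these options, it also helps to build them in a predictable way, so the policy can simply describe what the product already does. The following is a small, hypothetical sketch of such options in code; the names and fields are illustrative assumptions, not a standard or legally required interface:

    // Hypothetical sketch (TypeScript) of the privacy options a policy might describe.
    // All names and fields are illustrative, not a standard or required API.
    interface PrivacyChoices {
      shareUsageData: boolean;   // usage and analytics data
      marketingEmails: boolean;  // marketing and joint-marketing use
      retentionDays: number;     // how long the user allows their data to be kept
    }

    // In-memory stand-in for wherever user data actually lives.
    const userData = new Map<string, { profile: Record<string, string>; choices: PrivacyChoices }>();

    // "Can I see what you have collected about me?"
    function exportUserData(userId: string) {
      return userData.get(userId) ?? null;
    }

    // "Can I change what I share with you?"
    function updateChoices(userId: string, choices: PrivacyChoices): void {
      const record = userData.get(userId);
      if (record) record.choices = choices;
    }

    // "Can I delete data you previously collected?"
    function deleteUserData(userId: string): void {
      userData.delete(userId);
    }

Each function maps directly onto a sentence in the policy: what users can see, what they can change, what they can delete, and for how long their data is kept.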

Who users can contact about privacy concerns 

Your privacy policy should also name a contact person users can reach out to when they have questions or concerns regarding the policy or any other privacy-related issues. It’s important that users know there is someone they can turn to when they have concerns.

Regulation Compliance 

Lastly, depending on where you operate and even where your servers are located, you may be subject to certain privacy regulations that require you both to include certain components in your policy and to explicitly state your compliance with these regulations. Two big regulations that could affect your privacy policy are the California Consumer Privacy Act (CCPA), effective in 2020, and the EU’s General Data Protection Regulation (GDPR). Another important regulation is the Children’s Online Privacy Protection Act (COPPA), which requires certain privacy controls and parental consent before collecting data on children under 13.

The How

Above all, when it comes to writing your privacy policy, it should be readable. 

Your users shouldn’t need a law degree to understand what’s in the policy. Write in plain English. While there is a lot of information to include, stay as concise as possible. If need be, you can layer the policy, meaning you provide basic language that gives a general overview and link elsewhere for details about different sections. Lastly, you want to ensure that the policy itself is easily accessible to users. It shouldn’t be tucked away in tiny font. Place it somewhere prominent that users can find whenever they’d like to refer back to it.

This is especially important if you need to comply with the GDPR. Not only does the regulation require you to include certain information in your privacy policy, it also includes requirements to ensure your policy is sufficiently clear. The GDPR’s official website provides some guidance on privacy policy best practices.

Even if you’re not subject to the GDPR, it’s probably a good idea to follow these guidelines as well. Again, your privacy policy isn’t just a legal safeguard. It should be understood as a way to communicate with your users about their privacy and assure them you’re being transparent about your data collection.

Privacy Sells

There is no doubt that technology and digital tools have helped businesses grow. From more effective lead generation to highly targeted marketing campaigns, there is a lot that organizations can gain from using such tools. And there is a lot that consumers gain in terms of ease, cost, and convenience.

Following the adage that “there is no free lunch,” consumers do pay a number of costs for that access to their data: costs to their ability to learn, costs to their ability to expand beyond the narrow world of their past decisions, choices, and interactions, costs to their ability to feel and act independently, and costs to their privacy, that is, their ability to choose how and with whom they share information about themselves.

Regulations such as the European GDPR and the California CCPA are upping the ante for businesses to put more privacy mechanisms in place. And typically, when a business hears regulation, it hears disruption (in the bad way, not the sexy, positive way the word is mostly used today).

But it doesn’t have to be that way.  Set aside the regulation and focus on your brand.  Focus on your relationship with your customer. Then ask yourself the following questions:

  1. Am I willing to be transparent about what I do with my customers’ data?
  2. Am I willing to tell my customers with whom their data may be shared (and hold those parties to the same standards I commit to with regard to the customers’ data)?
  3. Am I willing to ask my customers if it is OK to use their data for specific purposes?
  4. Am I willing to assist my customers if they wish to change or delete their data from our systems?
  5. Am I focused on asking for or tracking only the data I absolutely need in order to delight them and enhance our combined experience?
  6. Am I prepared to put in place the necessary safeguards to protect their data while it is on our systems?

If you can say ‘yes’ to each of these questions, not only will you have an opportunity to comply with privacy regulations, but you will also put yourself in the position of respecting your customers and enhancing your brand.

Perhaps privacy does sell.

The GDPR’s Got Teeth

This week, the UK’s Information Commissioner’s Office (ICO) proposed two massive fines against companies found in violation of the EU’s newly enacted General Data Protection Regulation (GDPR).  

The first came on Monday when the ICO announced the proposed £183.39m fine against British Airways for a data breach in September 2018. The breach began in June 2018 after users attempting to access British Airways’ website were diverted to a fraudulent site. The attackers used this site to harvest customer information, resulting in the personal data of approximately 500,000 customers being stolen. 

British Airways first notified the ICO of the cyber-attack in September 2018. According to the ICO’s statement, its investigation found that customer information was compromised due to “poor security arrangements at the company, including log in, payment card, and travel booking details as well as name and address information.”

Then on Tuesday the ICO put out another statement, this time proposing a £99.2m fine against Marriott International for a data breach that was discovered in November 2018. The breach was the result of a compromise in the Starwood Hotels’ systems dating back to 2014. Marriott acquired Starwood in 2016 but did not discover the vulnerability until 2018. It is believed that roughly 339 million guest records were exposed between the initial breach and the time it was discovered.  

With regard to the Marriott investigation, ICO Information Commissioner Elizabeth Denham stated, “The GDPR makes it clear that organizations must be accountable for the personal data they hold. This can include carrying out proper due diligence when making a corporate acquisition, and putting in place proper accountability measures to assess not only what personal data has been acquired, but also how it is protected.”

The GDPR is the EU’s wide-ranging privacy regulation, requiring companies to “implement appropriate technical and organizational measures… in an effective way… in order to meet the requirements of [the] Regulation and protect the rights of data subjects.” In addition, the regulation establishes broad privacy rights for consumers, including stricter conditions of consent for companies to process personal information, the right of users to obtain information on how their data is being used, and even the right to request that companies delete their information.

Under the GDPR, organizations can be fined up to €20 million or 4% of annual global turnover, whichever is greater.
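
To make the arithmetic concrete, the ceiling is whichever of the two figures is greater, so for large companies the percentage is what matters. A quick sketch, using a hypothetical turnover figure:

    // GDPR administrative fine ceiling (TypeScript sketch): the greater of
    // €20 million or 4% of annual worldwide turnover. The turnover figure below is hypothetical.
    function maxGdprFine(annualGlobalTurnoverEur: number): number {
      return Math.max(20_000_000, 0.04 * annualGlobalTurnoverEur);
    }

    // A company with €10 billion in annual turnover faces a ceiling of €400 million,
    // far above the flat €20 million floor.
    console.log(maxGdprFine(10_000_000_000)); // 400000000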

Both incidents make clear that regulators enforcing the GDPR are taking consumer privacy extremely seriously, and they’re sending a message that companies need to as well. From the perspective of the GDPR, businesses are not passive victims of cyber-attacks, but directly responsible for securing consumers’ information.

Every organization should take this news to heart, no matter where it does business. Lawmakers in the U.S. are beginning to pass regulations, such as the California Consumer Privacy Act, that are modelled after the GDPR. Fines such as those proposed against British Airways and Marriott could be devastating to a company. So it’s essential that all businesses take steps to ensure they are doing the utmost to protect the data they hold. Now.

Google Fined 50 Million Euros for Violations of EU’s New GDPR

The Commission nationale de l’informatique et des libertés (CNIL), France’s national data protection authority, has just levied a 50 million euro fine on Google for violations of the EU’s General Data Protection Regulation. The GDPR, which took effect in May of last year and is widely considered the strictest data protection regulation in force, notably gives much of the control over personal data back to the consumer, including through opt-in consent for the use of private information. Google is appealing the decision.

The CNIL found Google in violation of two aspects of the GDPR:

First, Google failed to make information regarding its use of consumer data properly transparent. According to the report, “essential information, such as the data processing purposes, the data storage periods or the categories of personal data used for the ads personalization, are excessively disseminated across several documents, with buttons and links on which it is required to click to access complementary information.”

Second, Google failed to obtain valid consent to process data for ad personalization. The key word here is valid. Google does in fact obtain consent from users, but the CNIL found this consent was not sufficiently informed: “The information on processing operations for the ads personalization is diluted in several documents and does not enable the user to be aware of their extent.” Moreover, the consent obtained was not considered to be “specific or unambiguous”. Google does allow users to access the ads personalization settings; however, “the user not only has to click on the button ‘More options’ to access the configuration, but the display of the ads personalization is moreover pre-ticked.” The consent is therefore not obtained with the “clear affirmative action from the user” required for it to be considered valid.

While Google is not the first company to be fined for violating the GDPR, this is by far the largest fine issued under the new regulation. However, the damage could have been a lot worse for Google. Organizations can be fined up to 4% of their annual global revenue, and with $33.7 billion in revenue last quarter alone, Google might consider itself lucky.

Google’s Appeal May Help Clarify the Scope of the CNIL’s Ruling

In a statement to Politico, however, Google confirmed it will be appealing the CNIL’s decision: “We’ve worked hard to create a GDPR consent process for personalized ads that is as transparent and straightforward as possible, based on regulatory guidance and user experience testing. We’re also concerned about the impact of this ruling on publishers, original content creators and tech companies in Europe and beyond. For all these reasons, we’ve now decided to appeal.”

Google’s claim that its consent process is based on regulatory guidance and user experience testing may point to its argument: that it followed regulatory guidance (whether specific, targeted guidance or public guidance) and its user testing in good faith. The regulators may say the consent is not informed, and Google might try to refute that through an analysis of its user testing.

Google is also appealing to the concerns of companies in other industries about how the GDPR may affect them. Echoing these concerns, Jon Slade, CCO of the Financial Times, told Digiday, “the interpretation of GDPR has been inconsistent at best, and in some cases has willfully chosen to ignore both the letter and the spirit of the regulation. The industry now can’t say it hasn’t been warned.” While Google is likely overstating its concern for other industries, the appeal process may at the very least help clarify the definition and scope of certain aspects of the GDPR.

The CNIL’s decision is therefore an essential reminder for any business that transparency and consent are increasingly becoming the name of the game. As the example of Google makes clear, simply having information available to consumers is not enough; that information needs to be “intelligible and easily accessible.” While in the United States there are no federal data protection laws with the same scope as the GDPR, states such as California are beginning to pass regulations similar to those in the EU. Companies not currently affected by such regulations should therefore still prioritize data protection and put in place a plan that would allow quick and easy compliance with any new regulations that may be implemented. Or, as Jon Slade puts it, “anyone handling data would be crazy not to look at this strong enforcement of GDPR and double-check themselves.”

Privacy is coming out of the shadows. Should businesses be scared?

Just a few months after Facebook’s highly publicized data breach, California passed the strongest regulations on the collection and sale of personal information that the U.S. has ever seen. Around the same time, the EU’s General Data Protection Regulation (GDPR), which goes even further than California’s new rules, went into effect. Then, late last month, Google admitted to a breach of information on its Google+ platform that potentially affected over 500,000 users.

What businesses now need to realize is that such high-profile scandals will likely have direct impacts not simply in Silicon Valley, but on a national and even global scale.

In fact, on October 22, Google, Facebook, Apple, and Microsoft endorsed a federal privacy law based upon a framework developed by the Information Technology Industry Council.

To help businesses better understand the impact privacy regulation may have for them, we have put together the top three implications these new regulations could have on businesses in the coming months.

Consumers will play an active role in how companies collect and use personal information

Perhaps the strictest aspect of California’s new regulations is the central role consumers will now play in deciding how (or if at all) their information is used. Consumers now have the right to request from companies not only what information is being collected (even allowing the consumer to request an accessible copy of that data), but also for what purpose. Moreover, the law allows consumers to request that companies delete their personal information, and even to opt out of the sale of such information.

A broader definition of protected private data

The California Consumer Privacy Act substantially broadens what is considered ‘personal information’ and therefore extends the scope of regulation well beyond what we generally consider tech companies. Under the new regulations, ‘personal information’ now includes a consumer’s internet activity, biometric data, education and employment information, as well as information on the consumer’s purchases and personal property. Broadening the definition of personal information therefore implicates far more businesses than the likes of Facebook and Google. Now, any company that collects or uses such consumer data will be subject to regulation.

Targeted advertising will become less effective

The effectiveness of targeted online advertising campaigns relies on the extreme specificity enabled by access to consumer data. As Dipayan Ghosh points out in the Harvard Business Review, these regulations will have an impact on any business that makes use of online advertising. Targeted campaigns will become less precise and may therefore “significantly cut into the profits [ ] firms currently enjoy, or force adjustments to [ ] revenue-growth strategies.”

Any business that has customers in California needs to be seriously considering how it will comply with these new regulations. What’s more, discussions of putting federal regulations in place are well underway, and it is possible that California’s new personal information laws could form the basis of such regulations. It is therefore in the best interest of any business that makes use of consumer data to seriously consider what impact such regulations could have in the coming months and years.

What should businesses be doing now, even if they don’t fall under California or GDPR privacy regulations?

  1. Know what data you are capturing and where it is stored. Review the data flows in your customer, accounting, employee, and other databases so you know what you are capturing, why you are capturing it, and where you are storing it. Keeping an accurate data inventory is critical, and it makes good sense.
  2. Be transparent with your users about what you are doing with their data. Review your privacy policies. Make sure they are free of legalese and clearly explain what you will be doing with the data, who (if anyone) you will share the data with, and what rights the user has if they want to have the data changed or removed. Try not to think of this as a compliance exercise; think of it as customer engagement. By doing so, you can create a better relationship with your customers because you show that you respect them and their information.
  3. Ask before you capture. Where possible, get the user’s consent prior to capturing the data (a minimal sketch of this pattern follows below). You will have better customers if they opt in to the relationship rather than finding themselves in one.
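
As an illustration of the “ask before you capture” pattern, here is a minimal, hypothetical sketch of gating analytics behind an explicit opt-in. The function names, storage key, and endpoint are assumptions for illustration, not any particular consent-management product:

    // Minimal sketch (TypeScript, browser): no tracking runs until the user opts in.
    // The storage key and endpoint below are illustrative assumptions.
    const CONSENT_KEY = "analytics-consent";

    function hasConsent(): boolean {
      return localStorage.getItem(CONSENT_KEY) === "granted";
    }

    // Called from a consent banner's "Accept" / "Decline" buttons.
    function recordConsent(granted: boolean): void {
      localStorage.setItem(CONSENT_KEY, granted ? "granted" : "denied");
    }

    // Analytics only ever fires behind the consent check.
    function trackPageView(path: string): void {
      if (!hasConsent()) return; // capture nothing without an explicit opt-in
      void fetch("/analytics/pageview", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ path, at: Date.now() }),
      });
    }

    trackPageView(window.location.pathname);

Pre-ticked, opted-in-by-default boxes are exactly what regulators have rejected, so the sketch starts from “no consent” and records a choice only when the user actively makes one.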

Privacy does not have to be viewed as compliance or even a restriction on doing business. In fact, successful businesses going forward will use privacy as a tool for increased customer engagement.