Changes to the California Consumer Privacy Act (CCPA) have been finalized – Goes into effect January 1

As of September 13th, the California Legislature has finished passing amendments to the California Consumer Privacy Act (CCPA), meaning no more changes will be made to the law before it goes into effect this January.  

Originally passed in 2018, the CCPA is widely considered to be the most comprehensive privacy law in the U.S. to date. Taking its cue from the E.U.’s GDPR, the CCPA gives California consumers the right to know what data companies collect on them and even to opt out of the collection and sale of their personal information. However, as we wrote about in July, a number of amendments were introduced that privacy experts fear could greatly reduce the impact of the new law.  

In the months since then, some of those amendments successfully passed, while others were reworked or scrapped altogether. Of the amendments the legislature did pass, most of the highly contested changes were bundled into bill 1355, Personal Information. 

Here is an overview of some of the changes that made it through: 

Non-discrimination 

While the CCPA prohibits any discrimination against consumers who opt out of the sale of personal information, the new amendment creates an exemption if “differential treatment is reasonably related to value provided to the business by the consumer’s data.”  

This is potentially a big deal. While some of this language will likely be challenged and clarified after the Act goes into effect, it opens the door for businesses to offer different services and/or prices if a user exercises their right to opt out of the sale of their personal information.  

Definition of Personal Information 

The amendment also makes a very small change to the definition of personal information, but one that could have large implications. In defining what counts as personal information, the bill simply adds the word “reasonably” to the phrase “capable of being associated with” a particular consumer or household. This small change creates some wiggle room for businesses when it comes to arguing what information is protected under the CCPA.  

This also reinforces the clarification in the amendment that de-identified and aggregate consumer information does not fall within the scope of the CCPA. And with efforts already underway to weaken the definition of de-identified information, this could potentially further limit what personal information is protected.  

Employee Information is Exempt 

The other big change to the CCPA concerns employee information. The new amendments now exclude employees from the right to know, opt out of, or delete any personal information their employer collects and sells. However, this exemption sunsets in 2021 and will therefore have to be re-introduced after that. This will likely be the site of a large battle between unions and privacy advocates on one side and industry groups on the other.  

 

While these changes certainly reduce the scope and impact of the CCPA, the central tenets of the law remain largely intact. Overall, consumers will still be able to exercise their rights to know what personal information businesses are collecting, to opt out of the sale of this information to third parties, and even to request that a business delete their information. It’s therefore important that all impacted businesses continue working toward compliance by the beginning of next year. 

How Social Loneliness Could Affect Privacy Practices

Social media was designed to connect people. At least, that’s what those behind these sites never stop telling us. They’re meant to create, as Mark Zuckerberg says, “a digital town square.” Yet, as it turns out, the effect social media has on us seems to be going in the opposite direction. Social media is making us less social. 

Last year, a study by the University of Pittsburgh and West Virginia University was published showing links between social media use and depression. And now the same team has released a new study that takes things a step further. The study found that social media use not only contributes to depression but also increases the likelihood of social isolation. According to the study’s findings, for every 10% rise in negative experiences on social media, there was a 13% increase in loneliness. What’s more, they found that positive experiences online showed no link to an increase in feelings of social connection.  

These two studies make clear what we may already feel: the form in which social media connects us ends up leaving us more isolated. And, as strange as it may sound, this could have a profound impact on how we view our privacy. At root, privacy involves the maintenance of a healthy self-identity. And this identity doesn’t form in a vacuum. Instead, it is shaped through our relationship to a community of people. 

So, to the extent social media is isolating, it also desensitizes our notions of ourselves and of the world that surrounds us. When we lose a sense of boundaries in relation to community, then anything, including the value of privacy, can go out the window.  

And this can turn into a vicious cycle: the lonelier you feel, the more you’re likely to seek validation on social media. Yet, the more you seek that validation, the more that sense of loneliness rears its head. And often seeking this type of social validation leads to privacy taking a back seat. Earlier we wrote about an increase in the success of romance scams, which is just one example of how a sense of loneliness can have the effect of corroding privacy practices.  

While these studies don’t exactly mean we should go off the grid, it’s clear that to understand and value ourselves, we need at times to detach from technology. And, from a business perspective, there are lessons to be learned here too. While technology can make communication more convenient, that shouldn’t translate to having every conversation through a digital platform. Pick up the phone. Have lunch with a customer. Talk to them instead of selling to them. Having more personalized conversations will not only translate to stronger business relationships but may even have an effect on the value placed on privacy as well.  

Preparing for the CCPA

Time is running out. The California Consumer Privacy Act (CCPA) goes into effect January 1st, 2020, and businesses need to be taking the steps necessary to comply. The new law is widely considered to be the most comprehensive privacy regulation in the U.S. to date and won’t just affect businesses operating within the state of California. Instead, any organization that collects the personal information of California residents might be subject to the new regulation. It’s important that every business review the regulation to understand whether it will be required to comply.  

And while the CCPA has many similarities to the E.U.’s General Data Protection Regulation (GDPR), organizations should not assume that compliance with one automatically means compliance with the other. It’s therefore essential that any business potentially affected by California’s new law understand what compliance entails and take steps to put any necessary new systems in place.  

Compliance: The Essentials

Inventory California Data

Really, it’s always a good idea to conduct an inventory of the data collected and processed, but it’s going to be especially important for compliance with the CCPA. Because the regulation gives consumers the right to request information about how their data is used, the first step will be to conduct and maintain a comprehensive inventory of your data. This should include not only what data you’re collecting, but also how it’s collected, where it’s stored, and who it’s shared with.  

It’s important to note that “personal information” covers more than just names and addresses. It also includes, among other things, biometric data, geolocation data, and internet activity. Really, any information that can be linked back to an individual will fall under the scope of the CCPA.  
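
To make this concrete, here is a minimal sketch of how such an inventory might be structured. It’s written in Python purely as an illustration; the record fields and example categories are hypothetical, not anything the CCPA prescribes.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class InventoryRecord:
    """One entry in a (hypothetical) personal-information inventory."""
    category: str                     # e.g. "email address", "geolocation"
    collected_how: str                # e.g. "signup form", "analytics script"
    stored_where: str                 # e.g. "CRM database", "analytics vendor"
    shared_with: List[str] = field(default_factory=list)
    sold: bool = False

inventory = [
    InventoryRecord("email address", "signup form", "CRM database",
                    shared_with=["email marketing vendor"]),
    InventoryRecord("geolocation", "mobile app SDK", "analytics vendor",
                    shared_with=["ad network"], sold=True),
]

# Categories that are sold are the ones "Do Not Sell" requests will touch.
print([r.category for r in inventory if r.sold])  # ['geolocation']
```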

Develop Systems to Process Consumer Requests

After conducting a thorough inventory of this data, organizations will need to put in place procedures to quickly and accurately process consumer requests to access this information. Under the CCPA, consumers have the right to request information on what data is being collected and who that information is being shared with. 

The regulation requires organizations to provide at least two methods for requesting this information, including at minimum a toll-free number and a webpage designated for requests. Once a request is made, businesses need to be able to process and fulfill it quickly. The CCPA requires all requested information to be delivered to the consumer within 45 days of the request.  

For most businesses, this will be the toughest aspect of the regulation to put in place. To help, there are a number of automated tools that can assist with processing. We also recommend having someone on staff certified in privacy through the IAPP, or having someone on retainer who can assist with the process.  
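
As a small illustration of what such tooling keeps track of, the sketch below (Python, with hypothetical names; not any particular vendor’s product) records when a request was received and computes the 45-day response deadline.

```python
from dataclasses import dataclass
from datetime import date, timedelta

RESPONSE_WINDOW_DAYS = 45  # the CCPA's delivery window for requested information

@dataclass
class AccessRequest:
    """A consumer's request to know what data is collected (hypothetical tracker)."""
    consumer_id: str
    received: date

    @property
    def due(self) -> date:
        return self.received + timedelta(days=RESPONSE_WINDOW_DAYS)

    def is_overdue(self, today: date) -> bool:
        return today > self.due

req = AccessRequest(consumer_id="c-1024", received=date(2020, 1, 15))
print(req.due)                            # 2020-02-29
print(req.is_overdue(date(2020, 2, 1)))   # False
```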

Introduce an Opt Out Link on the Homepage

Under the CCPA, businesses will need to include a link on their homepage allowing users to opt out of the sale of any personal information. The regulation requires that this link be “clear and conspicuous” and titled “Do Not Sell My Personal Information.” Consumers also need to be able to complete the opt-out request without having to create an account.  
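
For a rough sense of what this looks like in practice, here is a minimal Flask sketch. The route names, page wording, and the record_opt_out() helper are all hypothetical; the only points drawn from the CCPA are the conspicuous homepage link, the “Do Not Sell My Personal Information” title, and the fact that no account is needed to opt out.

```python
# Minimal sketch of a "Do Not Sell My Personal Information" flow (hypothetical).
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def homepage():
    # The link must be clear and conspicuous on the homepage.
    return '<a href="/do-not-sell">Do Not Sell My Personal Information</a>'

@app.route("/do-not-sell", methods=["GET", "POST"])
def do_not_sell():
    if request.method == "POST":
        # No account or login is required to submit the request.
        record_opt_out(request.form.get("email", ""))
        return "Your opt-out request has been received."
    return (
        '<form method="post">'
        '<input name="email" placeholder="Email (so we can locate your records)">'
        '<button type="submit">Opt out of the sale of my information</button>'
        '</form>'
    )

def record_opt_out(email: str) -> None:
    print(f"Opt-out recorded for {email}")  # stand-in for a real data store
```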

Update Privacy Policy

The CCPA will require businesses to update their privacy policies. According to the regulation, privacy policies will now need to include a description of consumer rights under the CCPA as well as a list of the types of personal information the company collects, shares with, and sells to other entities. The privacy policy should also include the link to the “Do Not Sell My Personal Information” page. 

Review Overall Cybersecurity Policies and Practices

On a more general level, businesses should also take the time to ensure their cybersecurity policies and procedures are up to snuff. According to the CCPA, if an organization experiences a data breach, it will be considered responsible and subject to fines if the state deems the organization to have “failed to implement and maintain reasonable security procedures and practices.” There will likely be more clarification on what “reasonable security procedures and practices” entails once the regulation goes into effect, but organizations should play it safe and ensure they have a strong cybersecurity system in place to safeguard against potential liability. 

New York Isn’t Sleeping on Consumer Privacy

Two years later, the impact of Equifax’s massive data breach continues to be felt. As we reported last week, the FTC announced a $700 million settlement with Equifax. Then on Thursday, reportedly in response to the settlement, New York governor Andrew Cuomo signed two new data privacy bills into law.  

Here is a quick rundown of the two privacy laws New York passed last week and how they could impact your business:

Senate Bill S3582

The first bill passed into law last week concerns consumer credit reporting agencies. Under the new law, all credit reporting agencies that have experienced a data breach are required to offer affected consumers free identity theft prevention and mitigation services for up to five years. The law additionally gives affected customers the right to freeze their credit at no cost.  

SHIELD Act

The second, and by far the most impactful, of the laws passed is the Stop Hacks and Improve Electronic Data Security Act (SHIELD Act). While the focus is simply on breach notification procedures, the law is notable for the expanded scope of the regulation and the broadened definitions it introduces. 

In short, the new law requires businesses to report any breach of the personal information an organization holds. What is notable, however, is that the SHIELD Act doesn’t just apply to businesses operating within New York. Instead, any organization that owns the personal information of New York residents must now comply with the reporting requirements.  

What’s more, the law expands the definition of what counts as a data breach. Traditionally, data breaches are understood as instances where someone actually takes an organization’s data. The SHIELD Act, however, expands this definition to also include instances where the data has simply been accessed by an outside entity. The definition of personal information has also been expanded by the new law to include biometric data and usernames or email addresses in combination with a password or security question.  

In addition to notification requirements, the law requires businesses with the personal information of New York residents to implement “reasonable” security requirements. These include compliance with regulations such as HIPAA and GLBA as well as “reasonable administrative, technical and physical safeguards.” 

Lastly, the law lays out a new penalty framework for organizations that fail to properly report data breaches. Under the SHIELD Act, action against businesses will be pursued by the State Attorney General rather than through individual or class action civil suits. The law also increases the maximum penalty for organizations from $150,000 to $250,000.  

Signs of Regs to Come

These two new laws solidify the impression that New York is working hard to strengthen its stance on cybersecurity and data privacy. Just last month, state senator Kevin Thomas introduced the New York Privacy Act, considered by some to surpass even the GDPR in the privacy rights it gives consumers. Perhaps the most distinctive feature the bill proposes is the concept of Information Fiduciaries. 

While the Privacy Act has a long way to go before passing into law, the ease with which these two laws were enacted may be a sign of things to come.  

Writing a Privacy Policy You’ll Actually Want To Read

Creating a privacy policy is necessary for any business collecting or processing personal information; it is essentially a legal agreement between you and the people visiting your website. And more often than not, privacy policies are thought of as just that: a legal buffer. But with more users mistrusting the services they use, these policies should instead be seen as an opportunity to build trust with customers, establish a level of transparency, and show that you respect their privacy.  

Here is a short primer on what should be included in a privacy policy, and how to write it in a way that is accessible to users.  

The What

What information you collect 

It’s important to be upfront about all types of information you may collect about your users. This not only includes personal information (name, email, phone number, etc.), but also things like usage and analytics data, as well as first- and third-party cookies.  

How you collect information 

Listing the methods used to collect data is another important aspect of a privacy policy. Is it information that users freely provide? Is it automatically collected through their browser? Is it collected through scripts or plug-ins on your website? Providing this information will help users make informed decisions on how to navigate your site in a way that fits their privacy needs.  

How you use information 

It’s essential that you inform users not only of what you’re collecting, but of how you’re using that information. In many cases, it can help explain why it’s important that you collect this information in the first place. Examples include customer service, payment processing, and improving the site experience. On top of these, you’ll also need to state if you’re using data for marketing and joint marketing purposes. 

What information you share and why 

You’ll also want to state any information that you share with others. This might be for something like third-party advertising but can also include other companies related by common ownership, non-affiliates that market to you, or even non-profits using the data for research studies. Today, users are concerned about understanding who has access to their data, so this information is especially important.   

How that information is secured  

This is something you’ll definitely want your users to know about. Listing what security systems and practices you have in place will go a long way toward showing users that you care about their privacy and are taking the necessary steps to keep their data secure. 

What privacy options users have 

It’s become more common for websites to give users some choice with regard to their privacy. This includes whether they can access the data that has been collected, change what information they want to share, delete data previously collected, and decide how long you hold on to their information. If you give users these options, you want to explicitly state that they have them.  

Who users can contact about privacy concerns 

Another component of your privacy policy should be a contact person users can reach when they have questions or concerns regarding the policy or any other privacy-related issues. It’s important that users have someone they can reach out to when concerns come up.  

Regulation Compliance 

Lastly, depending on where you operate and even where your servers are located, you may be subject to certain privacy regulations that require you both to include certain components in your policy and to explicitly state your compliance with these regulations. Two big regulations that could affect your privacy policy are the California Consumer Privacy Act (CCPA) (effective in 2020) and the EU’s General Data Protection Regulation (GDPR). Another important regulation is the Children’s Online Privacy Protection Act (COPPA), which requires certain privacy controls and parental consent before collecting data on children under 13. 

The How

Above all, when it comes to writing your privacy policy, it should be readable. 

Your users shouldn’t need a law degree to understand what’s in the policy. Write in plain English. Keep it as short as possible. While there is a lot of information to include, you should stay as concise as possible. If need be, you can layer the policy, meaning have basic language that provides a general overview and link elsewhere for details about different sections. Lastly, you want to ensure that the policy itself is easily accessible to users. It shouldn’t be tucked away in tiny font. Place it somewhere prominent that users can find whenever they’d like to refer back to it. 

This is especially important if you need to comply with the GDPR. Not only does the regulation require you to include certain information in your privacy policy, it also includes requirements to ensure your policy is sufficiently clear. The GDPR’s website provides some guidance on privacy policy best practices that you can find here. 

Even if you’re not subject to the GDPR, it’s probably a good idea to try to follow their guidelines as well. Again, your privacy policy isn’t just a legal safeguard. It should be understood as a way to communicate with your users about their privacy and assure them you’re being transparent about your data collection.  

Whose Identity is it Anyway?

Our identity is something we often take for granted. Traditionally understood, identity is a simple one-to-one relation. It’s what links a single person to a single identity.  

However, the digital landscape has changed the very nature of our identity. Now, it’s more accurate to say that a single person contains a whole multiplicity of identities, many of which we don’t have a lot of say in. At bottom, digital identities are constructed far less by what we think and say about ourselves, and far more through a complex network of information that moves and interacts with other elements to construct who we are. 

The Digital Footprint 

When we go online, we leave a trail of our interactions. From browsing history, to shopping preferences, to movie and music tastes, to ‘likes’ on social media, everything we do is logged and collected. And the emerging landscape of artificial intelligence and the ‘internet of things’ (IoT) greatly expands the traces we leave.  

In some cases, this is done to make our experience online more efficient and convenient. Much of the time, however, our digital footprint is being used to build a detailed profile of who we are. The issue here isn’t so much that we have to wade through a bunch of highly targeted ads. Instead, it raises essential questions over who has control over who we are. 

In a post we wrote last month (and really, it bears repeating), we quoted an article by Shoshana Zuboff, who argues that the aim of data collection “is not only to know our behaviour but also to shape it in ways that can turn predictions into guarantees. It is no longer enough to automate information flows about us; the goal now is to automate us.” 

No Privacy without Control of Identity  

In the direction we’re headed, our identities are constructed for us instead of by us. This is largely because our informational society is driven far more by the interests of the organizations collecting personal information than by the interests of consumers. 

The question then becomes: how can we retain our privacy when our identity is known largely through a digital footprint that is, by its nature, programmed by a third party? Defining our relationship to these identities is essential so that we can define how to protect them.