Privacy Sells

There is no doubt that technology and digital tools have helped businesses grow. From more effective lead generation to highly targeted marketing campaigns, there is a lot that organizations can gain from using such tools. And there is a lot that consumers gain in terms of ease, cost, and convenience.

Following the adage that “there is no free lunch,” consumers do pay a number of costs for granting access to their data: costs to their ability to learn, costs to their ability to expand beyond the narrow world of their past decisions, choices, and interactions, costs to their ability to feel and act independently, and costs to their privacy, that is, their ability to choose how and with whom they share information about themselves.

Regulations such as the European GDPR and the California CCPA are upping the ante for businesses to put more privacy mechanisms in place. And typically, when business hears regulation, it hears disruption (in the bad way, not the sexy, positive way disruption is used most times today).

But it doesn’t have to be that way.  Set aside the regulation and focus on your brand.  Focus on your relationship with your customer. Then ask yourself the following questions:

  1. Am I willing to be transparent about what I do with my customers’ data?
  2. Am I willing to tell my customers with whom their data may be shared (and hold those parties to the standards I am committing to with regard to my customers’ data)?
  3. Am I willing to ask my customers if it is ok to use their data for specific purposes?
  4. Am I willing to assist my customers if they wish to change or delete their data from our systems?
  5. Am I focused on only asking for or tracking data that I absolutely need in order to delight them and enhance our combined experience?
  6. Am I prepared to put in necessary safeguards to protect their data while it is on our systems?

If you can say ‘yes’ to each of these questions, not only will you have an opportunity to comply with privacy regulations, but you will also put yourself in the position of respecting your customers and enhancing your brand.

Perhaps privacy does sell.

Data Matters

Today, even small businesses collect and store an overwhelming amount of information. And with data breaches occurring all the time, it’s more important than ever that this information is properly secured. But with different databases and systems in place, it becomes easy to lose track of exactly what information you have.

Having a complete picture of your data will not only leave your business in a far better position to respond to a breach, but can help you grow in a number of ways. Properly organized consumer data is incredibly helpful for market research and data analysis, giving you a better sense of who your customers are and why they are working with you. Classifying your data can even help you save money on storage if, for instance, you are holding on to too much or redundant data. 

Here are a few steps you can take to get a better handle on your data and make sure it’s protected.

Take an Inventory 

The first step to securing your data is to know exactly what you have. Most companies collect and store a wide variety of information. Proprietary and financial data, employee records, personally identifiable information (PII), and personal health information (PHI) are all examples of the different types of data you may be storing. Consumer data is often covered by a variety of privacy regulations, so tracking which states and countries your customers reside in is also important. Taking the time to complete a comprehensive inventory of what types of information you have, where that data is stored, and how it is transferred will go a long way toward making sure your systems are secured.

Rank by Sensitivity

Not all data is created equal. You might want to share some of your data with the world, while other data will be regarded as highly sensitive. Ranking your data by sensitivity will help you keep track of what level of security you need for each type of data. A commonly used ranking system is: public, internal only, confidential, and restricted. Of course, all companies and information systems are unique, so be sure to take the time to create a sensitivity ranking that makes sense for you.

Define Controls

Once you’ve classified your data by sensitivity, you need to create security controls and procedures for each category. More sensitive data requires more advanced protections, while low-risk information may only need lower-level protections.

Access restrictions are also essential to securing your data. Not all of your employees will need access to all of your information. Define access based on the sensitivity level of the information and on which employees need to utilize it. You can also create time-sensitive access that restricts the availability of data after a certain amount of time.
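To make this concrete, here is a minimal sketch of what such a policy might look like in code. The level names follow the public, internal only, confidential, and restricted ranking above; the specific controls and role names are made up for illustration.

```python
# Sketch of a data-control policy keyed by sensitivity level.
# The levels follow the common public / internal only / confidential /
# restricted ranking; the controls and role names are illustrative.

CONTROLS = {
    "public":       {"encrypt_at_rest": False, "roles": {"everyone"}},
    "internal":     {"encrypt_at_rest": False, "roles": {"employee"}},
    "confidential": {"encrypt_at_rest": True,  "roles": {"manager", "hr"}},
    "restricted":   {"encrypt_at_rest": True,  "roles": {"security-officer"}},
}

def can_access(role: str, sensitivity: str) -> bool:
    """Allow access only if the role is approved for that sensitivity level."""
    allowed = CONTROLS[sensitivity]["roles"]
    return role in allowed or "everyone" in allowed

print(can_access("employee", "internal"))      # True
print(can_access("employee", "confidential"))  # False
```

The point is not the few lines of code but the discipline they encode: once every dataset carries a sensitivity label, access decisions become a lookup rather than a judgment call.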

 

Regularly Re-Evaluate

Information and technology are constantly in flux. The types of data you hold, their value, and who should have access to them will change regularly, and it’s important that your organization changes with them. Periodically re-evaluating your classification system and security controls will help you stay on top of your data.

Data Classification: It’s Good for Business

Keeping track of your data will benefit your company: not only will it make your data far more secure, it will also let you determine what you can do to streamline your operations and make your business more efficient.

The GDPR’s Got Teeth

This week, the UK’s Information Commissioner’s Office (ICO) proposed two massive fines against companies found in violation of the EU’s newly enacted General Data Protection Regulation (GDPR).  

The first came on Monday when the ICO announced the proposed £183.39m fine against British Airways for a data breach in September 2018. The breach began in June 2018 after users attempting to access British Airways’ website were diverted to a fraudulent site. The attackers used this site to harvest customer information, resulting in the personal data of approximately 500,000 customers being stolen. 

British Airways first notified the ICO of the cyber-attack in September 2018. According to the ICO’s statement, their investigation found that customer information was compromised due to “poor security arrangements at the company, including log in, payment card, and travel booking details as well [as] name and address information.”

Then on Tuesday the ICO put out another statement, this time proposing a £99.2m fine against Marriott International for a data breach that was discovered in November 2018. The breach was the result of a compromise in the Starwood Hotels’ systems dating back to 2014. Marriott acquired Starwood in 2016 but did not discover the vulnerability until 2018. It is believed that roughly 339 million guest records were exposed between the initial breach and the time it was discovered.  

With regard to the Marriott investigation, ICO Information Commissioner Elizabeth Denham stated, “The GDPR makes it clear that organizations must be accountable for the personal data they hold. This can include carrying out proper due diligence when making a corporate acquisition, and putting in place proper accountability measures to assess not only what personal data has been acquired, but also how it is protected.”

The GDPR is the EU’s wide-ranging privacy regulation, requiring companies to “implement appropriate technical and organizational measures… in an effective way… in order to meet the requirements of [the] Regulation and protect the rights of data subjects.” In addition, the regulation establishes broad privacy rights for consumers, including widened conditions of consent for companies to process personal information, the right of users to obtain information on how their data is being used, and even the right of users to request that companies delete their information.

Under the GDPR, organizations can be fined up to €20 million or 4% of annual global turnover (whichever is greater).
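That cap is simple arithmetic, and it is worth seeing how quickly the percentage clause dominates as a company grows. A minimal sketch (the turnover figures are illustrative, not any company’s actual numbers):

```python
def gdpr_max_fine(annual_global_turnover_eur: float) -> float:
    """Upper bound on a GDPR fine under Article 83(5):
    EUR 20 million or 4% of annual global turnover, whichever is greater."""
    return max(20_000_000, 0.04 * annual_global_turnover_eur)

# For a small firm, the flat EUR 20M cap applies...
print(gdpr_max_fine(100_000_000))     # 20000000
# ...but for a large one, the 4% clause quickly dominates.
print(gdpr_max_fine(13_000_000_000))  # 520000000.0
```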

Both incidents make clear that EU regulators are taking consumers’ privacy extremely seriously, and they’re sending a message that companies need to as well. From the perspective of the GDPR, businesses are not passive victims of cyber-attacks, but directly responsible for securing consumers’ information.

Every organization should take this news to heart, no matter where it does business. Lawmakers in the U.S. are beginning to pass regulations, such as the California Consumer Privacy Act, that are modelled after the GDPR. Fines such as those proposed against British Airways and Marriott could be devastating to a company. So it’s essential that all businesses take steps to ensure they are doing their utmost to protect their data. Now.

I AI, therefore I am. Surveillance Capitalism and its Discontents.

We are at a crossroads between what we typically call identity and digital identity.  In fact, some claim that the crossroads are actually train tracks and our personal identity has already been run over by our digitized identity created by Facebook’s, Google’s, and Amazon’s prediction algorithms.

Regardless, we still have time to pose the question: Do we want to retain (or regain) control over our digital identities?

This goes beyond what we typically think of as privacy. The prediction algorithms are not taking your social security number per se; they are, however, taking your behavior, your traits, your tendencies, your movements, your musings, your likes, your dislikes, who you listen to, who you associate with, your passions, your seethings, even your sleep patterns and driving habits. And not only are these algorithms mapping your behaviors; they are defining them. In a recent article, author and academic Shoshana Zuboff describes this as a migration from behavior monitoring to behavior actuation. “The idea is not only to know our behaviour but also to shape it in ways that can turn predictions into guarantees. It is no longer enough to automate information flows about us; the goal now is to automate us.” We are past traditional capitalism and into what she calls “surveillance capitalism”.

Kinda makes the social security number a quaint concept.

As you might suspect, Zuboff decries surveillance capitalism and the loss of our ability to define and protect our own personal experiences. Her well-written and thoroughly researched book, The Age of Surveillance Capitalism, reviews the extent to which that loss has already occurred and suggests that the only way to get it back is through regulation, similar to what was done to the robber barons at the turn of the twentieth century.

As bleak as Zuboff makes surveillance capitalism out to be, there is a counterpoint: in a post titled “In praise of surveillance capitalism”, James Pethokoukis of the American Enterprise Institute claims that most people are OK giving up privacy in return for free or inexpensive products and services. He goes on to state that Europe’s recently instituted privacy regulation, the GDPR, is hampering innovation and ceding more marketing dollars to Google and Facebook, while at the same time the most popular brands are by and large those that are winning at the surveillance-capitalism game.

In both gravitas and breadth, Pethokoukis is no match for Zuboff; it’s not even a fair fight. But it is this very whimsy that Pethokoukis brings to the argument that underlies the challenge we have in holding a cogent discussion, one that balances what I would call the sanctity of personal experience with the promise technology holds to enable me to realize my aspirations. I do think Big Tech has created a false choice between the two, while at the same time I do not think the issue can easily be addressed simply with more regulation.

More on this in future posts, but technology should enable us to be ourselves, and that means enabling us to choose which selves we share and which we keep private.

Nothing Up My FB Sleeve

Two weeks ago, Mark Zuckerberg penned an essay detailing Facebook’s shift towards a more privacy-focused platform. “As I think about the future of the internet,” he writes, “I believe a privacy-focused communications platform will become even more important than today’s open platforms.” For Zuckerberg, this predominantly means focusing efforts more on his private messaging services (Facebook Messenger, Instagram Direct, and WhatsApp) by including end-to-end encryption across all platforms.

But given the myriad privacy scandals plaguing Facebook over the past few years, it is important to look critically at what Zuckerberg is outlining. Many of the critiques of Zuckerberg focus primarily on the monopolistic power-grab that he introduces under the term “interoperability.” For Zuckerberg, this means integrating private communications across all of Facebook’s messaging platforms. From a security perspective, the idea is to standardize end-to-end encryption across a diversity of messaging platforms (including SMS), but, as the MIT Technology Review points out, this amounts to little more than a heavy-handed centralization of power: “If his plan succeeds, it would mean that private communication between two individuals will be possible when Mark Zuckerberg decides that it ought to be, and impossible when he decides it ought not to be.”

However, without downplaying this critique, what seems just as concerning, if not more so, is the concept of privacy that Zuckerberg is advocating. In the essay, he speaks about his turn towards messaging platforms as a shift from the town square to the “digital equivalent of a living room,” in which our interactions are more personal and intimate. Coupled with end-to-end encryption, the idea is that Facebook will create a space in which our communications are kept private.

But they won’t, because Zuckerberg fundamentally misrepresents how privacy works. Today, the content of what you say is perhaps the least important aspect of your digital identity. Instead, it is all about the metadata. In terms of communication, the who, the when, and the where can tell someone more about you than simply the what. Digital identities are constructed less by what we think and say about ourselves, and far more through a complex network of information that moves and interacts with other elements within that network. Zuckerberg says that “one great property of messaging services is that even as your contacts list grows, your individual threads and groups remain private,” but who, for example, has access to our contact lists? These are the types of questions that Zuckerberg sidesteps in his essay, but they are the ones that show how privacy actually functions today.

Like a living room, we can concede that end-to-end encryption will give users more confidence that their messages will only be seen by the person or people within that space. But digital privacy does not function on a “public vs. private sphere” model. If it is a living room, it has the equivalent of a surveillance team stationed outside, recording who enters, how long they stay, how the room is accessed, and so on. For all his failings, we would be wrong to assume that Zuckerberg is ignorant of the importance of metadata. In large part he has built his fortune on it. What we see in his essay, then, is little more than a not-so-subtle misdirect.
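The metadata point is easy to make concrete. In the sketch below (the field names are hypothetical, not any platform’s actual schema), the message body is encrypted, yet the operator can still build a social graph from everything that surrounds it:

```python
from dataclasses import dataclass

@dataclass
class Message:
    # With end-to-end encryption, the body is unreadable to the platform...
    ciphertext: bytes
    # ...but the envelope is not: the operator still sees who talked
    # to whom, when, and from where.
    sender: str
    recipient: str
    sent_at: float   # Unix timestamp
    location: str

msg = Message(b"\x8f\x02\xa1", "alice", "bob", 1552003200.0, "New York")

# Without decrypting a single byte, the operator can log a social-graph
# edge from the metadata alone.
edge = (msg.sender, msg.recipient, msg.sent_at)
print(edge)  # ('alice', 'bob', 1552003200.0)
```

End-to-end encryption protects the ciphertext field and nothing else; everything below it is exactly what the surveillance team outside the living room records.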

Do Androids Dream of Your Privacy?

Today, artificial intelligence is already playing a substantial role in our increasingly connected lives. As the European Commission stated in a report last April, “from using a virtual personal assistant to organise our working day, to travelling in a self-driving vehicle, to our phones suggesting songs or restaurants that we might like, AI is a reality.” And with tech giants like Google and Amazon investing millions of dollars in AI, it is a sure bet that innovation in artificial intelligence will only continue to advance.

 

It’s worth pausing over the consequences of this technology from a privacy standpoint. The key to successful AI is not just processing power, but also massive amounts of data. The larger and more in-depth the datasets AI has access to, the more accurate its decisions will be. Companies are therefore incentivized to collect or buy large and diverse amounts of data in order to advance AI technology. According to a report by the Center for Information Policy Leadership, artificial intelligence “broadens the types of and demand for collected data, for example, from the sensors in cell phones, cars and other devices.”


AI and De-Identification

 

There is therefore an apparent tension between the drive towards innovation in artificial intelligence and the right to privacy. New regulations, like the California Consumer Privacy Act of 2018 (CCPA) and the EU’s General Data Protection Regulation (GDPR), pose challenges to some of the collection techniques deployed to gather data for AI. Article 22 of the GDPR, for instance, addresses concerns surrounding AI and automated decision-making head on, stating that individuals “have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her,” unless that decision “is based on the data subject’s explicit consent.”

One keyword here is profiling. According to the GDPR, profiling is:

any form of automated processing of personal data evaluating the personal aspects relating to a natural person, in particular to analyse or predict aspects concerning the data subject’s performance at work, economic situation, health, personal preferences or interests, reliability or behaviour, location or movements.

 

The problem, however, is that AI is explicitly designed to process and compare large sets of data at lightning speed, which makes re-identifying an individual extremely simple. Researchers from MIT and Berkeley published a study in December in which they took de-identified data of test subjects’ step counts and were able to use machine learning to re-identify subjects with almost 95% accuracy. According to a lead researcher of the study, “[As] advances in AI make it easier for companies to gain access to health data, the temptation for companies to use it in illegal or unethical ways will increase. Employers, mortgage lenders, credit card companies and others could potentially use AI to discriminate based on pregnancy or disability status, for instance.”
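To see why de-identification offers so little protection, consider a toy sketch. The data here is made up, and the actual study used learned models rather than the simple nearest-neighbor match below, but the principle is the same: a behavioral pattern acts like a fingerprint.

```python
# Toy re-identification: match an "anonymized" record of daily step
# counts against an auxiliary dataset where identities are known.
# (Illustrative only -- names and numbers are invented.)

def distance(a, b):
    """Total absolute difference between two step-count sequences."""
    return sum(abs(x - y) for x, y in zip(a, b))

# Auxiliary data with known identities (e.g. from a fitness-app leak).
known = {
    "alice": [9000, 11000, 8500, 12000, 7000],
    "bob":   [3000,  2500, 4000,  3500, 2800],
}

# A "de-identified" record released for research.
anonymous = [8900, 11200, 8400, 12100, 7100]

# Re-identify by finding the closest known pattern.
best_match = min(known, key=lambda name: distance(known[name], anonymous))
print(best_match)  # alice
```

Scaled up from two people to millions, and from five days of steps to years of sensor data, this is why stripping the names out of a dataset offers so little protection on its own.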

 

Consent in Context

 

Given the unprecedented speed at which AI processes data, it is hard to see how consent provisions like the GDPR’s can actually be enforced. The challenge is that the GDPR doesn’t account for the diverse contexts in which AI processes data. A self-driving car must be able to recognize pedestrians, for instance, and cannot reasonably obtain consent in that context.

Something like the GDPR’s right to erasure will therefore start to play a larger role. The right to erasure states that data must be forgotten when it is no longer necessary in relation to the task performed, or when the individual has withdrawn consent. Placing the focus on the right to erasure would allow AI to first process the necessary data, then recognize the context that data is in, and then obtain consent relative to that context. Cisco’s Chief Privacy Officer, Michelle Dennedy, gives the example of a machine asking consent for specific tasks:

“It might say, ‘Okay, well I understand you have a dataset served with this platform… and this platform over here. Are you willing to actually have that data be brought together to improve your housekeeping?’ And you might say ‘no.’ It says, ‘Okay. But would you be willing to do it if your heart rate drops below a certain level and you’re in a car accident?’ And you might say ‘yes.’”

AI Innovation and Privacy in Tandem

 

Above all, the concern for privacy within AI technology comes down to the need to protect the individual’s rights within a free society, virtual or otherwise. Future development therefore needs to be built upon a framework that guides AI applications according to privacy principles. Artificial intelligence may very well save lives, but it must learn to do so without denying individuals the rights and freedoms they demand in every other aspect of their lives.