Privacy For Sale


The recently announced antitrust suit against Google is not about privacy, per se. It is about leveraging monopolistic power to secure a dominant position on mobile devices. One of Google’s claims is that it provides a free service to consumers, so in the end its actions cause no harm.

In fact, Google does not offer its services for free; it provides us its capabilities in return for our information and our behavioral tendencies. That data is fed into algorithms that predict our behavior, and those predictions are then sold to third parties.

What will be interesting is how much of this is exposed during the case. Google’s use of data has historically been opaque. It will also be interesting to see whether this case opens more eyes to the importance and value of privacy. Are we perfectly happy giving away our privacy in return for free search, or do we have no other choice because Google is so dominant that it permeates our digital worlds whether we want it to or not?

In the end, of course, there is no free lunch (or lemonade). The question is what price we are willing to pay.

Contact Tracing Technology Raises Privacy Concerns

As the COVID-19 pandemic continues, the world has turned to the tech industry to help mitigate the spread of the virus and, eventually, help transition out of lockdown. Earlier this month, Apple and Google announced that they are working together to build contact-tracing technology that will automatically notify users if they have been in proximity to someone who has tested positive for COVID-19. However, reports show a severe lack of evidence that these technologies can accurately report exposure. It is also unclear how effectively such apps can assist the marginalized populations where the disease seems to have the largest impact. Combined with the invasion of privacy involved, the U.S. needs to interrogate more seriously whether the potential rewards of app-based contact tracing outweigh the obvious, and potentially long-term, risks.

First among the concerns is the potential for the information collected to be used to identify and target individuals. For example, in South Korea, some have used the information collected through digital contact tracing to dox and harass infected individuals online. Some experts fear that the collected data could also be used as a surveillance system to restrict people’s movement through monitored quarantine, “effectively subjecting them to home confinement without trial, appeal or any semblance of due process.” Such tactics have already been used in Israel.

Apple and Google have taken some steps to mitigate the concerns over privacy, claiming they are developing their contact tracing tools with user privacy in mind. According to Apple, the tool will be opt-in, meaning contact tracing is turned off by default on all phones. They have also enhanced their encryption technology to ensure that any information collected by the tool cannot be used to identify users, and promise to dismantle the entire system once the crisis is over.
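
To make the privacy claim concrete, the general approach behind tools like this is for each phone to broadcast short-lived random identifiers over Bluetooth and keep a private, on-device log of the identifiers it hears; if a user tests positive and opts in, only their own broadcast tokens are published, and the matching happens locally on every other phone. The sketch below illustrates that idea only; it is not the actual Apple/Google protocol, and every name in it is hypothetical.

```python
# Illustrative sketch of rotating-identifier exposure notification.
# This is NOT the actual Apple/Google protocol; it only shows the general idea
# that phones exchange short-lived random tokens and match them locally.
import os

def new_token() -> bytes:
    """Generate a fresh random token with no link to the user's identity."""
    return os.urandom(16)  # in practice tokens would also rotate every few minutes

class Phone:
    def __init__(self):
        self.my_tokens = []        # tokens this phone has broadcast
        self.heard_tokens = set()  # tokens received from nearby phones

    def broadcast(self) -> bytes:
        token = new_token()
        self.my_tokens.append(token)
        return token

    def hear(self, token: bytes) -> None:
        self.heard_tokens.add(token)

    def check_exposure(self, published_positive_tokens) -> bool:
        """Matching happens on the device: compare locally heard tokens against
        tokens voluntarily published by users who tested positive."""
        return any(t in self.heard_tokens for t in published_positive_tokens)

alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())           # the two phones were near each other

published = alice.my_tokens           # Alice tests positive and opts in to publish
print(bob.check_exposure(published))  # True: Bob is notified without learning who
```

The design choice to match on the device, rather than upload everyone’s contact logs to a central server, is what allows the companies to claim that no central party learns who met whom.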

Risk

Apple and Google are not using the phrase “contact tracing” for their tool, instead branding it as “exposure notification.” However, changing the name to sound less invasive doesn’t do anything to ensure privacy. And despite the steps Apple and Google are taking to make their tool more private, there are still serious short- and long-term privacy risks involved.

In a letter sent to Apple and Google, Senator Josh Hawley warns that the impact this technology could have on privacy “raises serious concern.” Despite the steps the companies have taken to anonymize the data, Senator Hawley points out that by comparing de-identified data with other data sets, individuals can be re-identified with ease. This could potentially create an “extraordinarily precise mechanism for surveillance.”
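
Senator Hawley’s re-identification worry is easy to illustrate: records stripped of names can often be linked back to individuals by joining them with a second, identified data set on quasi-identifiers such as place and time. The toy example below uses entirely made-up data and field names and shows only the joining step.

```python
# Toy illustration of a linkage (re-identification) attack.
# All data and field names are made up; the point is that records stripped of
# names can still be matched to people through quasi-identifiers (here, a
# place visited during a given hour) shared with an identified data set.

deidentified_pings = [
    {"pseudonym": "user_83f2", "place": "clinic_12", "hour": "2020-04-20T09"},
    {"pseudonym": "user_17aa", "place": "cafe_3",    "hour": "2020-04-20T13"},
]

# A second, identified data set (e.g., a check-in log or a public post).
auxiliary_records = [
    {"name": "Jane Doe", "place": "clinic_12", "hour": "2020-04-20T09"},
]

def reidentify(pings, aux):
    """Link pseudonymous pings to names by matching on place and hour."""
    matches = []
    for ping in pings:
        for record in aux:
            if ping["place"] == record["place"] and ping["hour"] == record["hour"]:
                matches.append((ping["pseudonym"], record["name"]))
    return matches

print(reidentify(deidentified_pings, auxiliary_records))
# [('user_83f2', 'Jane Doe')] -- the "anonymous" pseudonym now has a name attached
```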

Senator Hawley also questions Apple and Google’s commitment to delete the program after the crisis comes to an end. Many privacy experts have echoed these concerns, worrying what impact these expanded surveillance systems will have in the long term. There is plenty of precedent to suggest that relaxing privacy expectations now will change individual rights far into the future. The “temporary” surveillance program enacted after 9/11, for example, is still in effect today and was even renewed last month by the Senate.

Reward?

Contact tracing is often heralded as a successful method to limit the spread of a virus. However, a review published by a UK-based research institute shows that there is simply not enough evidence to be confident in the effectiveness of using technology to conduct contact tracing. The report highlights the technical limitations involved in accurately detecting contact and distance. Because of these limitations, this technology might lead to a high number of false positives and negatives. What’s more, app-based contact tracing is inherently vulnerable to fraud and cyberattack. The report specifically worries about the potential for “people using multiple devices, false reports of infection, [and] denial of service attacks by adversarial actors.”
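
The distance problem the report describes stems from the fact that phones can only infer proximity from Bluetooth signal strength (RSSI), which is distorted by walls, pockets, bodies, and differences between handsets. A rough sketch of the commonly used log-distance estimate shows how easily a noisy reading crosses a notification threshold; the constants below are illustrative assumptions, not calibrated values.

```python
# Rough sketch of estimating distance from Bluetooth signal strength (RSSI)
# with a log-distance path-loss model. The constants are illustrative
# assumptions; real values vary by handset, orientation, and environment,
# which is why proximity estimates produce false positives and negatives.

def estimate_distance_m(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate distance in meters from a received signal strength reading."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# The same true separation can produce very different readings:
for rssi in (-65, -72, -80):  # clear air vs. phone in a pocket vs. behind a wall
    print(f"RSSI {rssi} dBm -> ~{estimate_distance_m(rssi):.1f} m")

# RSSI -65 dBm -> ~2.0 m   (would count as a contact)
# RSSI -72 dBm -> ~4.5 m   (borderline)
# RSSI -80 dBm -> ~11.2 m  (would not), even if neither phone ever moved
```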

Technical limitations aside, the effectiveness of digital contact tracing also requires both a high compliance rate and a high level of public trust and confidence in the technology. Nothing suggests Apple and Google can guarantee either. The lack of evidence for the effectiveness of digital contact tracing calls into question its use at the cost of serious privacy risks to individuals.

If we want to engage technology appropriately, we should determine the scope of the problem with an eye toward assisting the most vulnerable populations first, while ensuring that the intended outcomes can be achieved in a privacy-preserving manner. Governments need to lay out strict plans for oversight and regulation, coupled with independent review. Before compromising individual rights and privacy, the U.S. needs to thoroughly assess the effectiveness of this technology while implementing strict and enforceable safeguards to limit the scope and length of the program. Absent that, any further intrusion into our lives, especially if the technology is not effective, will be irreversible. In this case, the cure may well be worse than the disease.

How asking for little can mean a lot — integrating privacy into strategy

There’s been a lot of talk about privacy lately, whether it’s about how social media tracks and sells your every move online, video-conferencing privacy breaches, or regulations such as GDPR and CCPA. And now, with COVID-19, there are numerous conversations about the balance between effective mitigation through contact tracing and privacy rights (e.g., is it OK for the government to know my health status and track me if I’m positive?).

For Companies — Privacy Builds Trust and Trust Builds Value.

Conversations about privacy are healthy and important. And as a business, you should start those conversations early in your strategic planning. If you do it right, you can build brand value. If you do it wrong, or only do it when pressed by your clients or the press, you have an uphill battle. Just ask Zoom.

Privacy by Design creates the framework for building a brand based on respect

The best approach, therefore, is to get ahead of the curve and build a concept called Privacy by Design into your systems and operations planning. Privacy by Design is a set of foundational principles originally developed by the former Privacy Commissioner of Ontario, Ann Cavoukian, and it has subsequently been incorporated into the E.U.’s privacy regulation, the GDPR.

Privacy as the Default is key

A full review of the Privacy by Design principles is beyond the scope of this blog; they can be reviewed here. One principle I would like to highlight is Privacy as the Default. As the name implies, this principle states that every aspect of the system and its operational workflows assumes privacy first. For every piece of personal or sensitive information, we first ask why we need it at all. Is it actually crucial to the client’s use of our product or our ability to serve the client?

If we decide we need the data, we should then seek to limit how much we collect and how long we keep it. And we should be transparent with our clients about why and how their data will be used and disposed of, and to whom and under what conditions it may be shared.
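
One hypothetical way to make Privacy as the Default concrete is to encode it in the data model itself: optional collection stays off until the client explicitly turns it on, and every field that is collected carries a stated purpose and a retention limit. The sketch below is one illustration of that idea; the field names are assumptions, not a prescription.

```python
# Hypothetical sketch of "Privacy as the Default" expressed in the data model:
# optional collection is off unless explicitly enabled, and every collected
# field carries a stated purpose and a retention limit.
from dataclasses import dataclass, field

@dataclass
class FieldPolicy:
    purpose: str             # why we need this field at all
    retention_days: int      # how long we keep it before deletion
    shared_with: tuple = ()  # third parties it may be shared with; empty by default

@dataclass
class CollectionSettings:
    # Everything optional defaults to "do not collect".
    collect_location: bool = False
    collect_contacts: bool = False
    collect_usage_analytics: bool = False
    # Fields we do collect are justified and time-limited up front.
    policies: dict = field(default_factory=lambda: {
        "email": FieldPolicy(purpose="account login", retention_days=365),
        "payment_token": FieldPolicy(purpose="billing", retention_days=90),
    })

settings = CollectionSettings()          # a new client starts fully private
settings.collect_usage_analytics = True  # collection happens only on explicit opt-in
```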

Differentiation in a digital age is harder than ever.  Fortunately, you can demonstrate that you respect your clients and improve your brand value by being proactive with regards to privacy.

Zoom is leaning in to privacy and security

Much has been written about the security and privacy issues with the Zoom videoconferencing application.  What may be written more about over the next few months (and in numerous case studies) is how Zoom is responding to those issues.

To begin, the CEO, Eric Yuan, has apologized for Zoom’s prior lack of focus on privacy. Next, his team has halted all other development projects to focus exclusively on security and privacy issues. In addition, he has hired Alex Stamos as Zoom’s privacy and security advisor and has recruited top chief security officers from around the world to serve on an advisory board.

With a user base which has more than doubled since the beginning of the year, Zoom has benefited greatly from the WFH global environment.  It is incredible that it has been able to sustain its operability during this growth.  But it’s perhaps more impressive that the company, and its CEO in particular, is focusing seriously and aggressively on privacy.  This is particularly notable in an era that is unfortunately also fraught with profiteering, scamming and passing the buck.

It is hopefully a wake-up call for any company to take its privacy issues seriously and to recognize that by doing so, you are not only securing public trust, you are creating brand value.

In 1982, Tylenol responded to its own crisis, when some of its products were tampered with, leading to poisonings, by pulling every bottle off the shelves and owning the issue. Its response has since become a crisis-PR case study.

I think Zoom is on its way to becoming a case study as well.

Zoom’s Boom Raises Confidentiality Concerns

With stay-at-home orders in place across the globe, use of online video communication services has skyrocketed. In particular, the video platform Zoom is on a tear. The company’s shares are on the rise, and its mobile app is currently #1 in the Apple App Store. Families and friends use it to connect, and entire school systems rely on it to continue classes online. But with the increased use comes increased scrutiny.

According to the New York Times, the New York Attorney General is now looking into Zoom’s security practices. A letter sent from the state Attorney General’s office expresses concern “that Zoom’s existing security practices might not be sufficient to adapt to the recent and sudden surge in both the volume and sensitivity of data being passed through its network.” Zoom’s privacy issues have also been noted by Consumer Reports, Forbes, and Doc Searles.

Zoom for Telehealth and Legal Counsel

 Worries about Zoom’s privacy standards are of particular concern for industries that require confidentiality, such as medicine, therapy, and legal counsel.

Telehealth services have quickly become commonplace as more and more people stay at home. Zoom does provide a HIPAA-compliant version of its service. However, the recent compliance waiver for telehealth allows health care providers to opt for the far cheaper but less secure version of the software. Now that many insurers are allowing health care providers to bill for telehealth visits, the floodgates have opened for patients to meet with doctors and therapists over Zoom.

Given the concerns over Zoom’s privacy practices, it is an open question whether doctor-patient confidentiality and attorney-client privilege can be properly guaranteed. For example, Zoom boasts the use of end-to-end encryption, but recent reports show this is not entirely accurate. While Zoom does use end-to-end encryption in certain settings, video meetings use another form of encryption that does not restrict the company’s ability to access those communications.
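
The distinction those reports draw is worth spelling out: with transport encryption, content is encrypted only on its way to the provider’s servers, which can decrypt it before relaying it on; with true end-to-end encryption, only the meeting participants hold the keys. The sketch below uses symmetric keys purely for illustration (real systems rely on key exchange and asymmetric cryptography) and is not a description of Zoom’s implementation.

```python
# Minimal illustration of transport encryption vs. end-to-end encryption.
# Symmetric Fernet keys are used purely for illustration; this is not how
# Zoom or any real conferencing system manages keys.
from cryptography.fernet import Fernet

# Transport encryption: the client encrypts to a key the SERVER holds, so the
# provider can decrypt (and read) the content before relaying it onward.
server_key = Fernet(Fernet.generate_key())
to_server = server_key.encrypt(b"meeting audio/video")
provider_can_read = server_key.decrypt(to_server)

# End-to-end encryption: only the participants share the key; the server
# relays ciphertext it cannot decrypt.
participants_key = Fernet(Fernet.generate_key())
ciphertext = participants_key.encrypt(b"meeting audio/video")
recipient_reads = participants_key.decrypt(ciphertext)
# server_key.decrypt(ciphertext) would raise InvalidToken: the provider cannot read it
```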

And, of course, Zoom alone can’t stop trolls from invading your Zoom meetings, especially those you’ve shared publicly. Even the FBI is warning about Zoom-bombing. There are ways to limit that, though, by being prudent with your Zoom settings.

Zoom Responds, but is it Enough?

In response to mounting concerns, Zoom updated its privacy policy over the weekend, stating that customer content will not be used for advertising and that videos are only retained if users request it. This update is important, and it is good to know that the brand of wine we are toasting each other with during our Zoom happy hours won’t be sold to a digital marketer. However, in a blog post about the changes, Zoom’s chief legal officer, Aparna Bawa, said that the new policy only clarifies what information the company collects and does not change its practices. Zoom also removed code from its platform that sent analytics data to Facebook, after reports surfaced last week.

These changes, however, are minor and may not be enough to protect user privacy or guarantee confidentiality for the industries that require it. Zoom is, indeed, booming, and it’s hard to see it receding dramatically in a post-COVID world. Let’s hope it takes all the reasonable steps it should to respect privacy along the way.

Coronavirus and the Right to Privacy

The coronavirus has unquestionably changed the way we live. It has also forced us into strange and, until just a few weeks ago, unthinkable ethical dilemmas. Even whether to visit loved ones is now a question worth genuine ethical reflection. Modern nations, especially in the West, are built on an ethics of individual freedoms and the right to privacy. However, the current global health crisis is forcing us to rethink just how fundamental those ethics should be. While we already feel this with regard to freedom of movement, we are just beginning to contemplate how the coronavirus can and should affect our right to privacy.

Contact Tracing and Enforced Quarantine

In order to limit the spread of the coronavirus, experts emphasize the importance of tracking every contact infected patients have had with others. Countries such as China, Singapore, South Korea, and Taiwan have all taken aggressive measures to trace every potential contact infected people have had. These measures are widely considered a large part of why these countries have been successful in lowering the rate of transmission. However, those aggressive measures have come at the cost of individual privacy.

Taiwan and Singapore, for example, regularly post detailed information about everyone who tests positive, including where they live and work, what train stations they have used, and what bars and restaurants they frequent. South Korea now has an app that allows users to track the exact movements of those infected.

Countries are also using location data to enforce quarantine for those infected. Israel, for example, is now using data collection techniques previously used for counterterrorism efforts to identify anyone potentially exposed to the virus. The government uses this information to send text messages to those exposed ordering them to quarantine.

The European and U.S. Response

As the coronavirus spreads to Europe and the U.S., lawmakers are exploring the use of similar techniques. Italy now uses location data to monitor whether people are obeying quarantine orders. In the U.S., the White House is reportedly in conversations with tech companies about using anonymized location data to track the spread of the virus. HIPAA regulations are being waived to allow doctors and mental health providers to use telecommunication more freely to speak with patients. Companies in Italy, Austria, and Germany have also announced that they will provide location data to governments.

However, with privacy regulations such as the GDPR, it is unclear how aggressively European countries will be able to use personal information. The European Data Protection Board (EDPB) released a statement urging governments to continue to abide by privacy regulations in place. At the same time, however, the EDPB conceded that countries may suspend such regulations “when processing is necessary for reasons of substantial public interest in the area of public health.”

Consequences

Relaxing the right to privacy has garnered mixed responses by government officials and security experts. Many have pointed out that while the measures taken are extreme, personal information such as location data is highly effective in limiting the spread of the coronavirus. “We are stretched very thin in most states,” said the director of the Center for Global Health at Oregon State University, “so this kind of technology can help every state to prioritize, given their limited resources, which communities, which areas, need more aggressive tracking and testing.”

Others are concerned about how this could endanger those whose information is made public. In South Korea, some have used information released by the government to identify infected individuals and attack them online. This has led officials to question how the government uses this information, worrying it will discourage others from getting tested for fear of being publicly exposed.

While nearly all countries have explained that suspending the right to privacy is a temporary measure for the benefit of public health, many worry it will have a permanent effect on how governments and countries view privacy concerns. After 9/11, for example, the U.S. adopted highly invasive surveillance measures that have since become commonplace among law enforcement agencies. According to the New York Times, privacy experts worry something similar could happen after the current crisis.

What restrictions we, as a society, can tolerate, and what effect this will have after the current crisis, remain open questions. But it may also be a false choice. There are technologies that both assist contact tracing and preserve anonymity. Privacy by Design does not have to be put on pause as we develop these tools. In fact, if we want to encourage wide adoption, it may be required.
