As the COVID-19 pandemic continues, the world has turned to the tech industry to help mitigate the spread of the virus and, eventually, help transition out of lockdown. Earlier this month, Apple and Google announced that they are working together to build contact-tracing technology that will automatically notify users if they have been in proximity to someone who has tested positive for COVID-19. However, reports show a severe lack of evidence that these technologies can accurately report infection data. Questions also remain about whether apps of this kind can effectively assist the marginalized populations on whom the disease has had the largest impact. Combined with the invasion of privacy involved, the U.S. needs to interrogate more seriously whether the potential rewards of app-based contact tracing outweigh the obvious, and potentially long-term, risks.
First among the concerns is the potential for the information collected to be used to identify and target individuals. For example, in South Korea, some have used the information collected through digital contact tracing to dox and harass infected individuals online. Some experts fear that the collected data could also be used as a surveillance system to restrict people’s movement through monitored quarantine, “effectively subjecting them to home confinement without trial, appeal or any semblance of due process.” Such tactics have already been used in Israel.
Apple and Google have taken some steps to mitigate the concerns over privacy, claiming they are developing their contact tracing tools with user privacy in mind. According to Apple, the tool will be opt-in, meaning contact tracing is turned off by default on all phones. They have also enhanced their encryption technology to ensure that any information collected by the tool cannot be used to identify users, and promise to dismantle the entire system once the crisis is over.
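The general design behind this kind of privacy-preserving proximity detection is for each phone to broadcast short-lived random identifiers derived from a key that never leaves the device unless the user consents. The sketch below is a simplified illustration of that idea only; it is not the actual Apple/Google protocol, and the key sizes, rotation schedule, and derivation function here are assumptions for demonstration.

```python
import hmac
import hashlib
import os

def daily_key() -> bytes:
    # A fresh random key generated on-device each day; it never leaves
    # the phone unless the user consents after a positive diagnosis.
    return os.urandom(16)

def rolling_id(day_key: bytes, interval: int) -> bytes:
    # Derive a short-lived broadcast identifier from the daily key.
    # Nearby phones observe only these rotating IDs, not the key or
    # any user identity.
    return hmac.new(day_key, interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

def match(shared_keys, heard_ids, intervals_per_day=144):
    # After a diagnosed user publishes their daily keys, other phones
    # re-derive the identifiers locally and check for overlap with the
    # IDs they heard over Bluetooth. The matching happens on-device.
    heard = set(heard_ids)
    return any(rolling_id(k, i) in heard
               for k in shared_keys
               for i in range(intervals_per_day))
```

In this model, the server only ever sees the daily keys of users who test positive and opt in to sharing; everyone else's data stays on their phone.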
Apple and Google are not using the phrase “contact tracing” for their tool, instead branding it as “exposure notification.” Renaming it to sound less invasive, however, does nothing to ensure privacy. And despite the steps Apple and Google are taking to make their tool more private, serious short- and long-term privacy risks remain.
In a letter sent to Apple and Google, Senator Josh Hawley warns that the impact this technology could have on privacy “raises serious concern.” Despite the steps the companies have taken to anonymize the data, Senator Hawley points out that by comparing de-identified data with other data sets, individuals can be re-identified with ease. This could potentially create an “extraordinarily precise mechanism for surveillance.”
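The re-identification risk Senator Hawley describes is a well-known linkage attack: “anonymized” records are joined against a public auxiliary dataset on shared quasi-identifiers. The toy example below uses entirely invented data and field names to illustrate the mechanism, not any real dataset.

```python
# Toy linkage attack: de-identified records are re-identified by
# joining quasi-identifiers (here, ZIP code and birth year) against
# a public auxiliary dataset. All data below is invented.

anonymized = [
    {"id": "u1", "zip": "10001", "birth_year": 1975},
    {"id": "u2", "zip": "94107", "birth_year": 1988},
]
public_records = [
    {"name": "A. Smith", "zip": "10001", "birth_year": 1975},
    {"name": "B. Jones", "zip": "94107", "birth_year": 1988},
]

def reidentify(anon, aux):
    # Match each "anonymous" record to a named one when the
    # quasi-identifiers coincide in both datasets.
    matches = {}
    for a in anon:
        for p in aux:
            if (a["zip"], a["birth_year"]) == (p["zip"], p["birth_year"]):
                matches[a["id"]] = p["name"]
    return matches
```

With rich data like continuous location traces, far fewer attributes are needed to make such a join unique, which is why stripping names alone is not anonymization.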
Senator Hawley also questions Apple and Google’s commitment to delete the program after the crisis comes to an end. Many privacy experts have echoed these concerns, worrying what impact these expanded surveillance systems will have in the long term. There is plenty of precedent to suggest that relaxing privacy expectations now will change individual rights far into the future. The “temporary” surveillance program enacted after 9/11, for example, is still in effect today and was even renewed last month by the Senate.
Contact tracing is often heralded as a successful method to limit the spread of a virus. However, a review published by a UK-based research institute shows that there is simply not enough evidence to be confident in the effectiveness of using technology to conduct contact tracing. The report highlights the technical limitations involved in accurately detecting contact and distance. Because of these limitations, this technology might lead to a high number of false positives and negatives. What’s more, app-based contact tracing is inherently vulnerable to fraud and cyberattack. The report specifically worries about the potential for “people using multiple devices, false reports of infection, [and] denial of service attacks by adversarial actors.”
Technical limitations aside, the effectiveness of digital contact tracing also requires both a high compliance rate and a high level of public trust and confidence in the technology. Nothing suggests Apple and Google can guarantee either. The lack of evidence for the effectiveness of digital contact tracing calls into question using such technology at the cost of serious privacy risks to individuals.
If we want to engage technology appropriately, we should determine the scope of the problem with an eye toward assisting the most vulnerable populations first, while ensuring that the intended outcomes can be achieved in a privacy-preserving manner. Governments need to lay out strict plans for oversight and regulation, coupled with independent review. Before compromising individual rights and privacy, the U.S. needs to thoroughly assess the effectiveness of this technology while implementing strict and enforceable safeguards to limit the scope and length of the program. Absent that, any further intrusion into our lives, especially if the technology is not effective, will be irreversible. In this case, the cure may well be worse than the disease.
There’s been a lot of talk about privacy lately, whether it’s about how social media is tracking and selling your every move online, or video-conferencing privacy breaches, or regulations such as GDPR or CCPA. And now, with COVID-19, there are numerous conversations about the balance between effective mitigation through contact tracing and privacy rights (e.g., is it OK for the government to know my health status and track me if I’m positive?).
For Companies — Privacy Builds Trust and Trust Builds Value.
Conversations about privacy are healthy and important. And as a business, those conversations should start early in your strategic planning. If you do it right, you can build brand value. If you do it wrong, or only act when pressed by your clients or the press, you face an uphill battle. Just ask Zoom.
Privacy by Design creates the framework for building a brand based on respect
The best thing, therefore, is to get ahead of the curve and institute a concept called Privacy by Design in your systems and operations planning. Privacy by Design is a set of foundational principles originally developed by the former Privacy Commissioner of Ontario, Ann Cavoukian, and has subsequently been incorporated into the E.U.’s privacy regulation, the GDPR.
Privacy as the Default is key
A full review of the Privacy by Design principles is beyond the scope of this blog; they can be reviewed here. One principle I would like to highlight is Privacy as the Default. As the name implies, this principle states that all aspects of the system and operational workflows should assume privacy first. For every piece of personal or sensitive information, we first ask why we need it at all. Is it actually crucial to the client’s use of our product or to our ability to serve the client?
If we decide we need the data, we should then seek to limit how much and for how long we need to keep the data. And we should be transparent with our clients as to why and how their data will be used and disposed of and to whom and under what conditions it may be shared.
Differentiation in a digital age is harder than ever. Fortunately, you can demonstrate that you respect your clients and improve your brand value by being proactive with regards to privacy.
Much has been written about the security and privacy issues with the Zoom videoconferencing application. What may be written more about over the next few months (and in numerous case studies) is how Zoom is responding to those issues.
To begin, the CEO, Eric Yuan, has apologized for Zoom’s prior lack of focus on privacy. Next, his team has stopped all other development projects to focus exclusively on security and privacy issues. In addition, he has hired Alex Stamos as Zoom’s privacy and security advisor and recruited top chief security officers from around the world to serve on an advisory board.
With a user base which has more than doubled since the beginning of the year, Zoom has benefited greatly from the WFH global environment. It is incredible that it has been able to sustain its operability during this growth. But it’s perhaps more impressive that the company, and its CEO in particular, is focusing seriously and aggressively on privacy. This is particularly notable in an era that is unfortunately also fraught with profiteering, scamming and passing the buck.
It is hopefully a wake-up call for any company to take its privacy issues seriously and to recognize that by doing so, you are not only securing public trust, you are creating brand value.
In 1982, Tylenol responded to its own crisis, when some of its products were tampered with and caused poisonings, by pulling every bottle off the shelves and owning the issue. Their response has since become a classic case study in crisis PR.
I think Zoom is on its way to becoming a case study as well.
One can argue about the steps taken so far with regard to the coronavirus, but perhaps no other report has had a greater impact on what the United States is now doing to curb the spread of the virus than the one published on March 16 by the UK’s Imperial College COVID-19 Response Team. In plain, stark language, the report warns of the dangers of doing nothing and emphasizes that if we want to minimize the mortality rate, “combining all four interventions (social distancing of the entire population, case isolation, household quarantine and school and university closure) is predicted to have the largest impact.”
Key to this are case isolation and household quarantine, both of which are containment measures. Containment requires, at minimum, identification (you have to know who is symptomatic to make sure they are isolated, and you have to know whom the symptomatic were in contact with to make sure those contacts are quarantined) and communication (you have to know whether you’ve been in contact with someone if you are to self-quarantine).
The technologies exist to help with both identification and communication, but at a potential cost to privacy. There’s the impact on the privacy of the symptomatic individual, of those with whom they have been in contact, and even of locations (towns, neighborhoods, stores) through which the person traveled. These risks are not insubstantial. In the case of individuals, it could result in stigmatization, harassment, and even physical threats (if not harm); in the case of locations, it could result in severe economic losses as well as stigmatization. The key to leveraging technology for containment is to identify potential privacy risks and embed privacy practices into the technology to minimize them.
The MIT Media Lab is doing just this. Yesterday, it released an open-source application called Private Kit: Safe Paths, which uses your phone to log your location data, uses that data to trace where symptomatic individuals have been, and shares that information with others so they can determine whether they may have been in contact with those individuals. And the app does it in a privacy-preserving way. It works like this: it first logs your phone’s location data but keeps it on your phone, so you retain possession of it. If you are diagnosed, you can choose to consent to sharing your location data with health officials, who can make it public. Ultimately, the app will share symptomatic location data with others without the middleman of a health authority, so anyone can see whether they have been in recent contact with someone who has been symptomatic. It’s a powerful tool with the potential to materially affect containment efforts.
Of particular interest is the whitepaper MIT developed on this application, which outlines the various privacy risks pertaining to containment and how Private Kit addresses them. The report provides an instructive lesson for any organization conducting privacy risk assessments, evaluating privacy controls relative to GDPR or CCPA, or seeking to better serve the needs of its constituents.
When confronted with the enormity of something like the coronavirus, it’s both critical and refreshing to know that we don’t have to throw out our rights to deal with it. After all, in battling this virus, we are not only defending ourselves, we are preserving the very freedoms that define who we are.