Google Fined 50 Million Euros for Violations of EU’s New GDPR

The Commission nationale de l’informatique et des libertés (CNIL), France’s national data protection authority, has just levied a 50 million euro fine on Google for violations of the EU’s General Data Protection Regulation. The GDPR took effect in May of last year and, widely considered the strictest data protection regulation in force, notably gives much of the control back to the consumer, including requiring opt-in consent for the use of private information. Google is appealing the decision.

The CNIL found Google in violation of two aspects of the GDPR:

First, Google failed to make information about its use of consumer data properly transparent. According to the report, “essential information, such as the data processing purposes, the data storage periods or the categories of personal data used for the ads personalization, are excessively disseminated across several documents, with buttons and links on which it is required to click to access complementary information.”

Second, Google failed to obtain valid consent to process data for ad personalization. The key word here is valid. Google does in fact obtain consent from users, but the CNIL found this consent was not sufficiently informed: “The information on processing operations for the ads personalization is diluted in several documents and does not enable the user to be aware of their extent.” Moreover, the consent obtained was not considered to be “specific or unambiguous”. Google does allow users to access the ads configuration; however, “the user not only has to click on the button ‘More options’ to access the configuration, but the display of the ads personalization is moreover pre-ticked.” The consent is therefore not obtained with the “clear affirmative action from the user” required for it to be considered valid.
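
In implementation terms, the CNIL’s requirements boil down to purpose-specific, default-off, opt-in consent. The Python sketch below is a minimal, hypothetical illustration of that pattern, not Google’s actual consent flow; the ConsentRecord structure, its field names, and the "ads_personalization" purpose label are all invented for this example.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    """Consent captured for one specific purpose, never inferred or pre-ticked."""
    user_id: str
    purpose: str                          # e.g. "ads_personalization"
    granted: bool = False                 # default is always "no consent"
    granted_at: Optional[datetime] = None
    notice_version: Optional[str] = None  # which privacy notice the user was shown


def record_consent(user_id: str, purpose: str, notice_version: str,
                   user_clicked_agree: bool) -> ConsentRecord:
    """Grant consent only on a clear affirmative action by the user.

    Silence, inactivity, or a pre-selected checkbox never flips `granted` to True.
    """
    record = ConsentRecord(user_id=user_id, purpose=purpose,
                           notice_version=notice_version)
    if user_clicked_agree:
        record.granted = True
        record.granted_at = datetime.now(timezone.utc)
    return record


# Personalized ads run only if this specific purpose was affirmatively granted.
consent = record_consent("user-123", "ads_personalization",
                         notice_version="2019-01", user_clicked_agree=False)
show_personalized_ads = consent.granted  # False unless the user explicitly opted in
```

Because the granted flag defaults to False and only an explicit click flips it, consent is never inferred from a pre-ticked box or from inactivity, which is the “clear affirmative action” the CNIL found missing.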

While Google is not the first company to be fined for violating the GDPR, this is by far the largest fine levied under the new regulation. However, the damage could have been a lot worse for Google. Organizations can be fined up to 4% of their annual global revenue, and with $33.7 billion in revenue last quarter alone (roughly $135 billion annualized, which puts the theoretical maximum penalty north of $5 billion), Google might consider themselves lucky.

Google’s Appeal May Help Clarify the Scope of the CNIL’s Ruling

In a statement to Politico, however, Google confirmed they will be appealing the CNIL’s decision: “We’ve worked hard to create a GDPR consent process for personalized ads that is as transparent and straightforward as possible, based on regulatory guidance and user experience testing. We’re also concerned about the impact of this ruling on publishers, original content creators and tech companies in Europe and beyond. For all these reasons, we’ve now decided to appeal.”

Google’s claim that its consent process is based on regulatory guidance and user experience testing may point to its argument: that it followed regulatory guidance (whether specific, targeted guidance or public guidance) in good faith and validated its approach through user testing. The regulators may say the consent is not sufficiently informed, and Google might try to refute that with an analysis of its user testing.

Google is also appealing to the concerns of companies in other industries about how the GDPR may affect them. Echoing these concerns, Jon Slade, CCO of the Financial Times, told Digiday, “the interpretation of GDPR has been inconsistent at best, and in some cases has willfully chosen to ignore both the letter and the spirit of the regulation. The industry now can’t say it hasn’t been warned.” While Google is likely overstating its concern for other industries, the appeal process may at the very least help clarify the definition and scope of certain aspects of the GDPR.


The CNIL’s decision is therefore an essential reminder for any business that transparency and consent are increasingly the name of the game. As the example of Google makes clear, simply having information available to consumers is not enough; that information needs to be “intelligible and easily accessible.” While the United States has no federal data protection law with the same scope as the GDPR, states such as California are beginning to pass regulations similar to those in the EU. Companies not currently affected by such regulations should therefore still prioritize how they handle data and put in place a plan that would allow quick and easy compliance with any new regulations that may be implemented. Or, as Jon Slade puts it, “anyone handling data would be crazy not to look at this strong enforcement of GDPR and double-check themselves.”


Please Sir, May I Have Another? Why Businesses Should Better Understand Their Cybersecurity Risks

In the recent op-ed Maybe We Have the Cyber Security We Deserve, Roger Grimes makes the argument that, despite the failings of current cybersecurity practices, there is a certain lack of interest on the part of consumers with respect to the protection of their identity and data online. This attitude, Grimes argues, results from an increasing focus on incident response rather than prevention. That is, for the average consumer, the inconvenience of stolen data is decreasing: if credit card information gets stolen, you likely won’t be responsible for fraudulent charges; if your login information is compromised, you just change your password. The impact of data breaches on consumers is often rather low, and therefore, without a catastrophic event — a ‘digital 9/11,’ as Grimes puts it — there doesn’t seem to be much urgency for comprehensive change. “We are OK with OK security,” Grimes says. And maybe, he concludes, that’s good enough.

Grimes is not wrong in diagnosing the state of cybersecurity as reactive rather than proactive. And it is true that incident response should play an important role in cybersecurity; for businesses it is crucial for mitigating larger losses and reducing overall costs. The problem with Grimes’ argument, however, is that it places the focus on consumers rather than on businesses and their leadership. Instead of asking why users don’t care about security, we should ask why companies don’t care about security.

Organizations have to contend with and prioritize their position relative to risk every day. One would think that, given the scrutiny in the press after every new breach, companies would focus more on cybersecurity risk. But they don’t. The problem that Grimes neglects is the cultural and semantic disconnect between technology and business leadership.

The Equifax breach, for example, happened in large part because of a governance structure that stifled communication between security and IT. According to the report by the House Oversight Committee, the CSO used to report directly to the CIO, but because of personal differences Equifax decided to have the CSO report to the legal department instead. When others came to fill those positions, however, the structure remained the same. As a result, according to the report, “collaboration between IT and Security mostly occurred when required, such as when Security needed IT to authorize a change on the network. Communication and coordination between these groups was often inconsistent and ineffective at Equifax.”

While Equifax’s executive structure was particularly conducive to a breakdown in communication, even having the CSO report to the CIO might not have been good enough. The House’s report goes on to say that “Equifax’s CEO did not prioritize cybersecurity” and that “the CSO was not considered part of the senior leadership team.” This structure therefore excluded the CSO from quarterly senior leadership meetings.

Shifting the focus to prevention requires businesses to think about the handling of the private information consumers entrust to them within the context of their overall enterprise strategy. And this is something that can only start at the top. An article in the Harvard Business Review emphasizes including security executives in board meetings and other meetings about business priorities: “By including [security executives] in discussions about immediate and long-term business priorities, customer issues, and overall strategies, directors can ensure that the company’s security plan aligns with the company’s business goals.”

Roger Grimes is right: consumers have by and large accepted the inconveniences of data breaches. But it is up to businesses and their leadership to realize what is directly in front of them: technology and security are not just one aspect of the business, they are the business — and increasingly so. As cybersecurity expert Bruce Schneier states, consumers will change their views as “Automation, autonomy, and physical agency will make computer security a matter of life and death, and not just a matter of data.” Companies would therefore do well to bridge the gap between their business and their technology, creating a proactive culture that works to protect their most critical asset: their customers.


Privacy is coming out of the shadows. Should businesses be scared?

Just a few months after Facebook’s highly publicized data breach, California passed the strongest regulations on the collection and sale of personal information that the U.S. has ever seen. Around the same time, the EU’s General Data Protection Regulation (GDPR), which surpasses even the new California regulations, went into effect. Then, late last month, Google admitted to a breach of information on their Google+ platform that potentially affected over 500,000 users.

What businesses now need to realize is that such high-profile scandals will likely have direct impacts not simply in Silicon Valley, but on a national and even global scale.

In fact, on October 22, Google, Facebook, Apple and Microsoft came out in support of a federal privacy law based upon a framework developed by the Information Technology Industry Council.

To help businesses better understand the impact privacy regulation may have on them, we have put together the top three implications these new regulations could have for businesses in the coming months.

Consumers will play an active role in how companies collect and use personal information

Perhaps the strictest aspect of California’s new regulations is the central role consumers will now play in deciding how (or whether) their information is used. Consumers now have the right to ask companies not only what information is being collected (and even to request an accessible copy of that data), but also for what purpose. Moreover, the law allows consumers to request that companies delete their personal information and even to opt out of the sale of such information.

A broader definition of protected private data

The California Consumer Privacy Act substantially broadens what is considered ‘personal information’ and therefore extends the scope of the regulations well beyond what we generally consider tech companies. Under the new regulations, ‘personal information’ now includes a consumer’s internet activity, biometric data, education and employment information, as well as information on the consumer’s purchases and personal property. Broadening the definition of personal information therefore implicates far more businesses than the likes of Facebook and Google. Now, any company that collects or uses such consumer data will be subject to regulation.

Targeted advertising will become less effective

The effectiveness of targeted online advertising campaigns relies on the extreme specificity enabled by access to consumer data. As Dipayan Ghosh points out in the Harvard Business Review, these regulations will have an impact on any business that makes use of online advertising. Targeted campaigns will become less precise and may therefore “significantly cut into the profits [ ] firms currently enjoy, or force adjustments to [ ] revenue-growth strategies.”

Any business that has customers in California needs to be seriously considering how it will comply with these new regulations. What’s more, discussions of putting federal regulations in place are well underway, and it is possible that California’s new private information laws could form the basis of such regulations. It is therefore in the best interest of any business that makes use of consumer data to seriously consider what impact such regulations could have in the coming months and years.

What should businesses be doing now, even if they don’t fall under the California or GDPR privacy regulations?

1. Know what data you are capturing and where it is stored. Review your data flows across your customer, accounting, employee and other databases so you know what you are capturing, why you are capturing it and where you are storing it. Keeping an accurate data inventory is critical. And it simply makes good sense.
2. Be transparent with your users about what you are doing with their data. Review your privacy policies. Make sure they are free of legalese and clearly explain what you will be doing with the data, who (if anyone) you will share the data with and what rights users have if they want their data changed or removed. Try not to think of this as a compliance exercise. Think of it as customer engagement. By doing so, you can create a better relationship with your customers because you show that you respect them and their information.
3. Ask before you capture. Where possible, get the user’s consent prior to capturing the data. You will have better customers if they opt in to the relationship rather than finding themselves in one. (A rough sketch of this pattern appears below the list.)
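
As a rough illustration of points 1 and 3, here is a minimal Python sketch that pairs a simple data inventory with an opt-in check before anything is captured. The field names, purposes, and storage labels are made up for the example; treat it as a pattern sketch under those assumptions, not a compliance tool.

```python
from typing import Optional

# Hypothetical data inventory (point 1): for each field we capture,
# record why we capture it and where it is stored.
DATA_INVENTORY = {
    "email":        {"purpose": "account login and receipts", "store": "customers_db"},
    "purchase_log": {"purpose": "order history",              "store": "orders_db"},
    "marketing_id": {"purpose": "email marketing",            "store": "crm_db"},
}


def capture(field: str, value: str, user_opted_in: bool) -> Optional[dict]:
    """Capture a value only if the field is inventoried and the user opted in (point 3)."""
    if field not in DATA_INVENTORY:
        raise ValueError(f"'{field}' is not in the data inventory; document it before capturing it")
    if not user_opted_in:
        return None  # no consent, no capture
    entry = DATA_INVENTORY[field]
    return {"field": field, "value": value,
            "purpose": entry["purpose"], "store": entry["store"]}


# A user who has not opted in yields no record at all.
print(capture("marketing_id", "abc-123", user_opted_in=False))  # None
print(capture("email", "pat@example.com", user_opted_in=True))  # captured, with its purpose attached
```

The point of routing every capture through the inventory is that nothing gets collected without a documented purpose, a known storage location, and the user’s opt-in.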

Privacy does not have to be viewed as compliance or even a restriction on doing business. In fact, successful businesses going forward will use privacy as a tool for increased customer engagement.