By now it is commonly understood that free online services such as social media, search engines, and email are not actually free. Instead, we use those services in exchange for data about who we are and what we want, which can then be used to show us highly targeted advertising or simply sold to third-party companies. Our identities have become some of the most valuable commodities in the world, and we hand them over to tech giants every single day.
That’s why there is a growing movement among some lawmakers to make companies pay consumers for the data they use. The idea, known as data dividends, is now being pushed by politicians like Andrew Yang and California governor Gavin Newsom, who argue that by requiring companies to pay users for their data, consumers will be empowered to take more control of their online identity and privacy.
The problem, however, is that once you take a closer look at the concept Yang and Governor Newsom are pushing, it becomes clear that this project, which is meant to promote privacy, ends up reinforcing a system that commodifies consumer data and disincentivizes privacy-first practices. We are treading a dangerous path if we attempt to monetize identity.
Here is why:
Paying consumers for their data doesn’t protect their privacy. Instead, it ends up justifying the current practice of data mining that undermines the right to privacy. Certain companies are already using similar tactics. Amazon, for example, offered users $25 Amazon gift cards in exchange for full-body 3D scans. It’s a dramatic example, but fundamentally equivalent to what lawmakers are now proposing.
Privacy is a human right and, as such, cannot be bought and sold in a free and open society. It’s like saying that companies can take away your right to free expression so long as they compensate you for it. Making money off of user data and sharing it with third parties has already been normalized by tech companies, and data dividends only further validate these practices.
This isn’t to say that distributing the money earned from personal information back to the consumer is necessarily a bad thing; it’s simply an issue entirely separate from privacy. If companies are required to pay out data dividends, it would in no way lessen the importance of ensuring the privacy of our identities and data.
On Wednesday, the New York Department of Financial Services (NYDFS) announced its first-ever cybersecurity charges, against title insurance company First American, for a data breach that exposed hundreds of millions of records containing sensitive information over the course of nearly five years.
The First American data breach began in October 2014, after an error in an application update left 16 years’ worth of mortgage title insurance records available to anyone online without authentication. These documents included information such as Social Security numbers, tax records, bank statements, and driver’s license images. The error went undetected until December 2018, when First American conducted a penetration test that discovered the vulnerability. According to the NYDFS, however, First American did not report the breach and left the documents exposed for another six months, until a cybersecurity journalist discovered and published details of the breach.
The charges against First American mark the first time the NYDFS has enforced the department’s cybersecurity regulations, established in 2017. The regulation requires financial organizations licensed to operate in New York to establish and follow a comprehensive cybersecurity policy, provide training for all employees, implement effective access controls, and conduct regular vulnerability tests in line with a cybersecurity risk assessment.
First American is facing six charges, including failing to follow its internal cybersecurity policy, misclassifying the exposed documents as “low” severity, and failing to investigate and report the breach in a timely manner.
While the fine for a violation of the regulation is only up to $1,000, the NYDFS counts each exposed document as a separate violation. So, with up to 885 million records potentially exposed, First American could be looking at enormous fines if the charges stick.
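To see how those two figures compound, here is a minimal sketch of the penalty arithmetic described above, using only the numbers given in the article (a $1,000 statutory maximum per violation, with each exposed document counted as a separate violation). The result is a theoretical ceiling, not a prediction of the actual fine.

```python
# Penalty math from the article's figures -- a theoretical ceiling only.
MAX_FINE_PER_VIOLATION = 1_000      # dollars, statutory maximum per violation
EXPOSED_RECORDS = 885_000_000       # records potentially exposed per the NYDFS

# Each exposed document counts as its own violation, so the ceiling
# is simply the per-violation maximum times the record count.
theoretical_ceiling = MAX_FINE_PER_VIOLATION * EXPOSED_RECORDS
print(f"${theoretical_ceiling:,}")  # prints $885,000,000,000
```

Even a tiny fraction of that ceiling would dwarf typical regulatory fines, which is why the per-document counting rule is the most consequential part of the regulation.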
News of the charges should serve as a wake-up call to U.S. organizations unconcerned with cybersecurity regulations. While the U.S. does not have a comprehensive federal cybersecurity regulation, a number of state regulations have gone into effect in the past five years. First American is merely the first of what is likely to be many companies facing enforcement unless they take steps now to ensure compliance.
Last week the top court in the European Union found that Privacy Shield, the framework used to transfer data between the E.U. and the U.S., does not sufficiently protect the privacy of E.U. citizens and is therefore invalid. The court’s decision has left many businesses scrambling and throws the differences between E.U. and U.S. privacy standards into stark relief.
Privacy Shield was a data-sharing framework adopted by the E.U. in 2016. In 2018, however, the E.U.’s General Data Protection Regulation (GDPR) took effect, placing stricter privacy requirements on the processing of E.U. citizens’ data. According to the Washington Post, over 5,300 companies — including Facebook, Google, Twitter, and Amazon — that signed up to use the Privacy Shield framework now need to find a new way to handle the data of E.U. citizens in the United States.
The court made its decision after privacy activist Max Schrems filed a complaint against Facebook for violating his privacy rights under the GDPR when Facebook moved his information to the U.S. for processing. While the GDPR does allow the data of E.U. citizens to be transferred to other countries, that data must continue to comply with GDPR standards after it is transferred. The problem with Privacy Shield, according to the E.U. decision, is that the U.S. government has wide-reaching access to personal data stored in the United States. And while the E.U. acknowledges that government authorities may access personal information when necessary for public security, the court ultimately found that the U.S. does not meet the requirements of the GDPR “in so far as the surveillance programmes…. are not limited to what is strictly necessary.”
This decision starkly highlights the differences not only in E.U. and U.S. privacy regulations but also in the privacy standards applied to surveillance activities. In a statement to the Washington Post, Schrems said, “The court clarified…that there is a clash of E.U. privacy law and U.S. surveillance law. As the E.U. will not change its fundamental rights to please the [National Security Agency], the only way to overcome this clash is for the U.S. to introduce solid privacy rights for all people — including foreigners….Surveillance reform thereby becomes crucial for the business interests of Silicon Valley.”
Moving forward, U.S. companies processing E.U. citizens’ data will either need to keep that data on servers within the E.U. or use standard contractual clauses (SCCs). SCCs are legal agreements created by individual organizations that govern how data is used. Of course, any SCCs will need to be compliant with the GDPR.
The good news: Many companies these days are using cybersecurity controls and security training for their employees. The bad news: A lot of these businesses are putting in place the bare minimum needed to meet compliance requirements. The truth is, however, that you can be compliant but not secure. Remember the big Target breach in 2013? Hackers were able to take the debit and credit card information of millions of shoppers by accessing Target’s point-of-sale systems. The irony is that, just months before the attack, Target was certified PCI compliant. In the words of then-CEO Gregg Steinhafel, “Target was certified as meeting the standard for the payment card industry in September 2013. Nonetheless, we suffered a data breach.” Simply put: Target was compliant but not secure.
Creating a Culture
If your security awareness program is a “check the box” compliance program, you can bet your employees are going through the same motions as you are. How has that improved your security posture? It hasn’t. Instead, creating a strong security program is first and foremost about creating a culture around security. And this has to start at the top, with your executive officers and your board. If business leaders set a security-focused tone, then employees will likely follow suit.
The reason a business can be compliant but not secure is that cybersecurity isn’t a one-and-done deal. Compliance is a state; cybersecurity is an ongoing process that involves the entire organization — from the boardroom to the cubicle. Verizon’s Data Breach Investigations Report shows that the human factor is the largest contributor to breaches today. If that’s the case, then instead of checking off the boxes, and before investing in that new machine-learning intrusion detection gizmo, consider focusing on human learning, engagement, and the behaviors that can drive a mindful security culture.
The EU’s General Data Protection Regulation (GDPR), one of the most comprehensive privacy laws in the world, celebrated its two-year anniversary last month. The regulation establishes a range of privacy and data protection rights for EU citizens, such as widened conditions for consent and the right to request that companies delete user data, and requires organizations to implement technical safeguards. Along with the regulation come some pretty hefty fines. Google, for example, received a 50 million euro fine for failing to properly state how it uses consumer data. The law also requires the European Commission to release a report evaluating the regulation after the first two years, then every four years going forward. In compliance with the law, the commission released its report this month, broadly finding the regulation a success but also highlighting certain areas for improvement.
According to the GDPR report, one of the regulation’s main successes is increased awareness of privacy rights among EU citizens, who now feel empowered to exercise those rights. The report found that 69% of the EU population above age 16 has heard of the GDPR and 71% know about their country’s national data protection agency. One issue, however, is that this awareness has not fully translated into the use of these rights. The right to data portability, for example, which allows users to obtain and transfer their data, shows potential to “put individuals at the centre of the data economy” but, according to the report, is still underutilized.
Another area of success is the regulation’s flexibility: its ability to apply the law’s principles to emerging technologies. This has been especially important recently, with the rise of the COVID-19 pandemic and the numerous contact-tracing apps created in response. The report found that the GDPR has been successful in providing a framework that allows for innovation while ensuring that these new technologies are created with privacy in mind.
Areas of Improvement
Perhaps the biggest area of concern the report highlights is the uneven enforcement of the GDPR among EU states. All EU member states except Slovenia have adopted the law. However, the report notes that the law has not been applied consistently across the board. For example, the GDPR allows individual member states to set the age of consent for data processing, but this has created uncertainty for children and parents and made it more difficult for companies that conduct business across borders. The commission has recommended creating codes of conduct that apply across all member states in order to allow for more consistency.
The GDPR report also found some inconsistency when it comes to enforcing the regulation. Overall, the report found that the various data protection agencies were properly using their strengthened enforcement capabilities, but it worried that resources have not been evenly divided among the agencies. While some countries seen as tech hubs require additional resources, the commission found that the overall budget allocation was too inconsistent.
Taking a step back, the GDPR report largely shows that the new regulation has had a positive impact on the views towards privacy, and has empowered individuals to take control of their information. The law, however, is still relatively new, and will continue to require tweaks to better serve consumers. Privacy regulations continue to be a work in progress, but are at least headed in the right direction.
Other than California, New York now has some of the strictest cybersecurity regulations in the U.S. In 2017, New York passed a bill regulating data privacy for the financial services industry. Now, the Stop Hacks and Improve Electronic Data Security Act (SHIELD Act) is in effect as of March 21. Unlike previous legislation, compliance is not limited to specific industries; the act pertains to any business that processes the personal information of New York residents. And, despite the current pandemic, lawmakers have not delayed the implementation of the new law.
Here is what you need to know to ensure compliance with the SHIELD Act.
Much of the data protected under the SHIELD Act is already covered by the state’s breach notification laws. This includes Social Security numbers, driver’s license numbers, account numbers, and debit and credit card numbers. However, the new regulation expands the definition of protected data to also include biometric data, as well as email addresses in combination with passwords or security questions and answers.
The SHIELD Act also expands the definition of a security breach. A breach is considered to occur not just when an unauthorized person takes or uses private information, but also when that data is accessible to anyone not authorized to view it. This could happen in many ways, including granting third-party vendors access to sensitive information they do not need, or having the credentials of an email account compromised even if no sensitive data was stored in the mailbox.
The SHIELD Act also lays out a series of cybersecurity protections needed to maintain compliance with the regulation. Broadly, the act requires businesses to put in place “reasonable safeguards” to ensure the privacy of their information. The regulation also requires organizations to maintain a written cybersecurity policy. One of the policy’s unique requirements is that organizations must have at least one employee dedicated to maintaining cybersecurity procedures. In addition, cybersecurity policies need to address the following:
Identification of internal and external security risks
Assessment of the ability of technical safeguards to protect against identified risks
The training of employees on security practices
Reviewing security practices of third party vendors
Proper detection and response to unauthorized access
Regular testing of security controls
Secure disposal of protected information within a reasonable time frame
Certain businesses do not need to meet these exact security requirements. Small businesses with under 50 employees, for example, are exempt if they can demonstrate they have taken reasonable steps to ensure the privacy of their information. In addition, organizations already regulated by other privacy laws, such as HIPAA, the Gramm-Leach-Bliley Act, or New York Department of Financial Services regulations, are covered as long as they maintain compliance with those regulations.
Because the scope of the SHIELD Act is so broad and could affect many businesses outside of New York, it is very important for all organizations to carefully review the new regulation. New York is likely to begin enforcing the regulations soon, and non-compliant businesses may receive fines of $5,000 per violation with no penalty cap.
However, even businesses not affected by the SHIELD Act should think seriously about implementing some of the recommended security measures. More and more states are beginning to implement similar regulations, and the burden of implementation could be costly if it is left to the last minute.