Privacy Law Forecast for 2019
This past year was quite a whirlwind for privacy and cybersecurity watchers. Just to sum up a few of the top events of last year:
- Facebook’s Cambridge Analytica scandal rocked political headlines
- Europe’s GDPR, the most comprehensive data protection legislation in the world to date, took effect
- California enacted the California Consumer Privacy Act, becoming the first US state to create GDPR-style rules
- Google came under fire for letting app developers read users’ Gmail, and for tracking users’ locations (even with location tracking turned off!)
- Marriott’s guest reservation system was hacked, exposing the personal information of up to 500 million guests, including passport numbers and payment card numbers for some of them
What will happen in 2019? Here are our top 5 predictions:
- Congress will move closer towards passage of a federal privacy law and will toughen up the FTC’s enforcement powers
- Facebook is only the first target of a series of state-led privacy actions
- Californians will resist efforts to water down the California Consumer Privacy Act
- India’s new data protection law will pose challenges for US businesses that outsource services
- Privacy will butt heads with free speech in the realm of “Deep Fakes”
1. Bipartisan Support for a Federal Privacy Bill and a Tougher FTC
The passage of the California Consumer Privacy Act, the Facebook Cambridge Analytica scandal, and large-scale data breaches have all bolstered bipartisan support for a federal privacy bill.
In September of 2018, senators on both sides of the aisle urged the administration to take initiative on national privacy legislation. Senators Wicker (R-Mississippi), Moran (R-Kansas), Blumenthal (D-Connecticut), and Schatz (D-Hawaii) – all leading members of Commerce Committee subcommittees – sent a joint letter to Commerce Secretary Wilbur Ross urging concerted action to “provide consumers with more transparency and control over the collection and use of their personal data while preserving the innovation at the heart of the internet.”
Currently, two senators have introduced draft legislation. Senator Wyden (D-Oregon) has suggested a tough Consumer Data Protection Act, complete with Sarbanes-Oxley-style disclosure requirements and criminal penalties. Senator Schatz (D-Hawaii) has the support of 15 Democrats for a draft Data Care Act that would impose duties of care, loyalty, and confidentiality on online service providers that collect personal data.
This issue speaks to Republicans and Democrats alike, for different reasons. From the Republican point of view, a federal data privacy law would address national security concerns about foreign hackers, and a single national standard would ease the compliance burden that the current patchwork of state rules imposes on businesses. From the Democrats’ point of view, a federal data privacy law would safeguard consumer rights and respond to growing concerns about social media manipulation.
Yet, with such a polarized Congress, the passage of a federal privacy bill will be difficult in 2019. Democrats will want strict privacy legislation and expansive consumer rights, while Republicans will want preemption of California’s new law and the avoidance of onerous regulations on businesses. If Congress is able to pass a bill (I predict a 60% chance of success), it will likely occur at the tail end of 2019 or early 2020, spurred by the January 1, 2020 effective date of California’s law.
Regardless of the passage of a federal privacy law, the FTC will obtain bipartisan support for increased enforcement of current privacy, false advertising, and unfair competition laws against businesses that mismanage data. We already received a preview of this in late 2018, when the FTC testified before the Senate Commerce Subcommittee, calling for more resources and personnel to combat privacy infractions.
2. More Privacy Enforcement Actions by State Attorneys General
On December 19, 2018, Washington, D.C. Attorney General Karl Racine filed the first government lawsuit against Facebook over the Cambridge Analytica scandal.
In the absence of a federal privacy law, Racine filed the complaint under D.C.’s Consumer Protection Procedures Act (CPPA). The CPPA prohibits “unfair and deceptive trade practices in connection with the offer, sale, and supply of consumer goods and services.” In essence, the D.C. attorney general’s complaint alleges that Facebook misrepresented the level of privacy protection it provided to customers. Furthermore, the complaint alleges that Facebook failed to adequately disclose the extent of its data sharing practices, thereby misleading customers.
The language of Washington, D.C.’s consumer protection statute mimics state unfair competition and false advertising laws across the country. Thus, if Racine’s lawsuit gains traction, expect similar actions against Facebook, Google, and other tech companies by state officials, or PAGA-style claims, in California, New York, Massachusetts, and elsewhere. (Indeed, on January 10, the Cook County State’s Attorney’s office filed a lawsuit against Facebook under the Illinois Consumer Fraud and Deceptive Business Practices Act, with a class-action attorney appointed as a Special Assistant State’s Attorney.)
In addition to consumer protection statutes, state officials will increasingly sue under existing privacy laws like HIPAA and COPPA, and exact heftier penalties. New York’s $4.95 million settlement with AOL this past December, over the tracking of children online, is just a signal of things to come.
3. Californians will resist efforts to water down the California Consumer Privacy Act
The California attorney general’s office is hosting public forums on the California Consumer Privacy Act throughout the month of January and early February. After receiving comments through these public forums, the attorney general will craft the governing regulations under the CCPA. This rulemaking process will essentially be a “second bite at the apple” for the CCPA, as the attorney general tries to find the right balance between consumer rights and the CCPA’s burden on California businesses.
Despite heavy opposition to the CCPA’s rules from local chambers of commerce and the tech industry, it is highly unlikely that the CCPA will be watered down for several reasons:
First, the CCPA was already passed as compromise legislation between state regulators and consumer groups, in light of a much tougher privacy law ballot initiative spearheaded by Alastair Mactaggart. If the new privacy regulations deviate too far from the spirit of the Act, then Mactaggart has already promised to put the ballot initiative back on the table.
Second, the largest tech companies (with the largest lobbying budgets) are no longer fighting the CCPA. Though Facebook, Google, and Verizon initially lobbied heavily against the passage of a California privacy law, they have quietly withdrawn those efforts in the wake of growing scrutiny of their own privacy practices. Furthermore, most multinational technology companies are already subject to the GDPR, Europe’s data protection regime, making CCPA compliance less of an operational leap.
Third and finally, with the CCPA acting as a sword of Damocles over Congressional passage of a federal privacy bill, national consumer rights groups will fight hard to keep the CCPA intact in the hopes of broader legislation.
4. India’s new data protection law will pose challenges for US businesses that outsource services
India has largely been forgotten in American data privacy headlines, but this will no longer be the case in 2019. India’s Supreme Court, in a landmark decision in August 2017, ruled that privacy is a fundamental right. Following that decision, India’s parliament has been working in earnest on a data protection law just as comprehensive (and onerous) as the GDPR.
The most recent draft Personal Data Protection Bill, released last summer by the Indian government, garnered almost 600 comments from the tech industry, consumer rights groups, and even the US government. This bill is slated to be introduced in parliament in June 2019, after India’s general election.
The draft bill imposes many of the same requirements as the GDPR, including “privacy by design”, consent management, security safeguards, and rights to data access, portability, and deletion.
What’s more controversial, however, are the bill’s data localization requirements. As currently drafted, businesses must store at least one copy of Indian personal data records on a server within India and cannot transfer certain sensitive personal data overseas. If passed, these requirements will be particularly onerous for US and other international companies that outsource vital human resources, administrative, and customer service functions to India. They will also undercut the economies of scale of hosting data in the cloud or on centralized servers outside of India.
5. Balancing Privacy and Free Speech in the Era of Deep Fakes
In 2019, two separate technology developments will challenge our views on privacy and free speech: the emergence of “deep fakes” and the growing accuracy of facial recognition.
What is a deep fake? Deep fakes are video and audio recordings fabricated by deep-learning applications. AI techniques swap out faces and voices, creating realistic footage of celebrities in pornographic videos or of Nicolas Cage uncovering the Ark of the Covenant.
Obviously, this technology raises privacy concerns. An individual’s face or likeness can be inserted into a sex tape, CCTV footage of a crime, or YouTube clip of a neo-Nazi rally – all without the individual’s consent. And as more video data becomes public, these fake recordings will become more prevalent. “Seeing is believing” will no longer apply to online video.
Improvements in facial recognition technology will only amplify the effects of deep-fake technologies. According to a series of NIST studies, “facial recognition software got 20 times better at searching a database to find a matching photograph” between 2014 and 2018. Thus, not only will deep fakes get better (they rely on facial recognition technology), but it will also be easier to search for a person’s image. Anyone will be able to Google a video of you – regardless of whether the video is real or not.
Despite the potentially overwhelming personal and societal implications of deep fakes, current laws do not adequately address this technology. Deep fakes generally use an individual’s publicly posted image, not any private or sensitive images, and so fall outside the purview of most privacy legislation – including revenge porn statutes. Defamation claims will also be difficult to pursue, as plaintiffs will first need to prove that a deep-fake video is in fact fake, an increasingly challenging task.
Furthermore, any attempt to legislate deep fakes away will run afoul of the First Amendment. The technology has both artistic and satirical applications. Political cartoons and caricatures (like those wildly popular JibJab election cartoons) have long superimposed the faces of politicians and celebrities onto comical or even pornographic bodies – and deep fakes are arguably just an extension of the same tradition.
This year, expect to see politicians and courts wrangle over the best ways to regulate deep fakes. The first attempts will likely occur in Europe, as individuals try to exercise the GDPR’s “right to be forgotten” with respect to deep fakes. In the meantime, take a closer look at that next YouTube video or Snap. You may not believe what you see.