
U.S. Privacy Regulation: Glimmer of Hope? (Strategic Privacy Part 2)

U.S. privacy regulation: glimmer of hope? (Getty Images license)

After reviewing the sad state of the GDPR here in Part 1, we now turn to the U.S. privacy situation. On the state level, all eyes are on the California Consumer Privacy Act, which takes effect January 1, 2020. Even though Forbes Magazine called it a “regulatory disaster”, California lawmakers and lobbyists are scrambling to pass a series of 9 bills before September 2019 to clarify dozens of ambiguities in that law.

Two data protection & privacy bills are pending in New York. The SHIELD Act contains concrete guidelines to help businesses comply with its data security requirement. This is a welcome contrast to the GDPR, which is a broth of fuzzy, principle-based legal alphabet soup. The New York Privacy Act, meanwhile, contains a concept called the “data fiduciary” that has the potential to shape the future of U.S. privacy regulation, for better or worse.

The brightest glimmer of hope in U.S. privacy regulation comes from two sources. First, the U.S. Federal Trade Commission (“FTC”) recently testified before Congress asking for very narrow powers to regulate privacy (unlike the European Commission and the EDPB, which seem to have an unlimited appetite for extremely broad administrative discretion). Second, U.S. industry groups like the U.S. Chamber of Commerce and the National Association of Manufacturers have extremely active litigation centers that challenge unfair laws and regulations and their uneven enforcement. (In fact, both of these NGOs cited my comment letter as the only winning argument in their legal challenge to the conflict minerals rule.) It is unlikely that the mess the GDPR finds itself in could occur in the U.S., where there is a long history of legal activism protecting corporate interests.

Part 2 of the series on “Strategic Privacy Management” discusses key U.S. state and federal privacy initiatives.

Section A examines the California Consumer Privacy Act (“CCPA”) and New York’s SHIELD Act on data security.

Section B explores New York’s proposed Privacy Act and its key differences from the CCPA, especially its emphasis on “data fiduciary” obligations to be imposed on data monetizers.

Section C sets forth the federal position on privacy and several bills pending on the issue.

Section D explains the history of U.S. corporate legal activism and how this tradition can be supplemented by a new data industry advocacy group that specializes in regulatory affairs relating to the new digital economy.

Section A. California & New York Privacy Regimes

Section A will focus on the key state privacy regimes: California and New York. Please see this summary of other privacy legislation being rolled out in other states.

  1. California Consumer Privacy Act

The CCPA becomes effective January 1, 2020, with accompanying rules to be published by the State Attorney General’s Office by July 2020. It is essentially a disclosure law giving California residents the right to know what personal information is collected about them, the business purpose for such collection, and with whom such information will be sold or shared. Most of the data subjects’ other rights mirror those of the GDPR, such as consent, disclosure requirements, and honoring consumers’ access, opt-out and deletion rights. But the CCPA goes further than the GDPR by giving California residents the right to opt out of the sale of their data and informing them about what data monetizers know about them.

California Flag.jpg

The CCPA is essentially a disclosure law.

(creative commons license)

The CCPA (California Civil Code §1798.100, et seq.) applies to any business that collects or asks others to collect personal information on California residents and meets any one of the following thresholds (an illustrative sketch of the test follows the list):

  • has annual gross revenue over US$25,000,000;

  • annually buys, receives, sells or shares the personal information of 50,000 or more California residents, households or devices; or

  • earns 50% or more of its annual revenue from selling personal information of California residents. (See here for flowchart).
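For readers who prefer to see the logic spelled out, here is a minimal, purely illustrative sketch of that applicability test in Python. The function name, parameters, and simplified inputs are mine, not the statute’s; the real analysis also turns on whether the entity “does business in California” and qualifies as a “business” under Section 1798.140(c).

```python
# Purely illustrative sketch of the CCPA's three applicability thresholds.
# Function and parameter names are hypothetical; the statutory definitions
# (e.g., of "sell" and "consumer") are far more nuanced than these inputs.

def ccpa_likely_applies(annual_gross_revenue_usd: float,
                        ca_consumers_households_devices: int,
                        share_of_revenue_from_selling_pi: float) -> bool:
    """Return True if any one of the three CCPA thresholds appears to be met."""
    return (
        annual_gross_revenue_usd > 25_000_000              # revenue test
        or ca_consumers_households_devices >= 50_000       # volume test
        or share_of_revenue_from_selling_pi >= 0.50        # revenue-mix test
    )


# Example: a business with US$10M in revenue that buys data on 60,000
# California households is still caught by the volume test.
print(ccpa_likely_applies(10_000_000, 60_000, 0.10))  # True
```

Note that the three prongs are alternatives: a business that clears any one of them is covered, regardless of the other two.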

Two of the key definitions of the CCPA are “personal information” and “sell”, both of which are broadly defined.

“Personal information” includes information that “identifies, relates to, describes, or is capable of being associated with a consumer or household” as well as “internet or other electronic network activity information, including, but not limited to, browsing history, search history, and information regarding a consumer’s interaction with an Internet website, application, or advertisement.” See Section 1798.140(o)(1)(F). So, a cookie ID and a Social Security number both qualify as personal information.

The definition of “sell” broadly includes “selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to another business or a third party for monetary or other valuable consideration.” See Section 1798.140(t)(1). Digital advertisers who conduct targeted marketing need to be aware of these potential pitfalls. If their data brokering activities are caught within the CCPA’s definition of “sale”, then the CCPA applies to their digital marketing business. (Part 10 of this series will discuss the effect of both the GDPR and CCPA on targeted marketing.)

It is widely acknowledged that it is too early for businesses to finalize their compliance protocols: a series of 9 bills attempts to fix ambiguities in the CCPA, and the State Attorney General’s Office has yet to issue its implementing regulations (expected within the first half of 2020).

Some of these bills are as follows.

  • AB 873 aims to exclude from “personal information” any consumer data that is de-identified or stored in the aggregate, so that it cannot reasonably be traced back to an individual;

  • AB 25 excludes a business’s employees or contractors from the definition of “consumers” so long as their information is processed within the context of the employment relationship; and

  • another bill seeks to exempt some insurance companies already covered by the Insurance Information and Privacy Protection Act.

Despite the efforts of pro-consumer lobbyists, the CCPA does not allow a general private cause of action for violations of its provisions. See Section 1798.150(c). Private lawsuits may only be filed for a negligent data breach concerning personal information, such as when “nonencrypted or nonredacted personal information” has been subject to “an unauthorized access and exfiltration, theft, or disclosure as a result of the business’s violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information...” See Section 1798.150(a)(1).

Private plaintiffs in such lawsuits may seek statutory damages of between US$100 and US$750 per consumer per incident, or actual damages, whichever is greater. In assessing statutory damages, courts are to consider, among other things, the nature and seriousness of the misconduct, the number of violations, the persistence of the misconduct, the length of time over which the misconduct occurred, the willfulness of the defendant’s misconduct, and the defendant’s assets, liabilities, and net worth.
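To put the per-consumer arithmetic in perspective, here is a hypothetical back-of-the-envelope exposure calculation (the breach size is invented purely for illustration):

```python
# Hypothetical illustration of statutory damages exposure under
# Section 1798.150(a): US$100-US$750 per consumer per incident
# (or actual damages, whichever is greater).

affected_consumers = 10_000      # hypothetical breach size
low, high = 100, 750             # statutory range per consumer per incident

min_exposure = affected_consumers * low    # US$1,000,000
max_exposure = affected_consumers * high   # US$7,500,000
print(f"Statutory exposure: ${min_exposure:,} to ${max_exposure:,}")
```

Even a modest breach therefore creates seven-figure exposure before any actual damages or the Attorney General’s separate per-violation penalties (discussed below) are considered.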

Before filing such a lawsuit, the private plaintiff must provide 30 days’ notice to the defendant to cure any violation. But no 30-day cure period applies to consumer lawsuits seeking actual monetary loss rather than statutory damages. Data breach class actions under the CCPA can start as early as January 1, 2020.

The CCPA authorizes the State Attorney General to file lawsuits on behalf of the people of California to claim up to US$7,500 for each intentional violation or US$2,500 for each non-intentional violation. There is ambiguity as to whether the law will aggregate incidents involving multiple consumers into a single violation or whether “per violation” will evolve into a per-consumer standard, as with the penalties available in private consumer lawsuits. Courts will also need to clarify what constitutes an “intentional” violation.

Before filing such lawsuits, the State Attorney General must provide 30 days’ notice to the defendant to cure any violation. There is a moratorium on such lawsuits by the Attorney General until July 1, 2020.

Part 3 of this series will identify ways data monetizers may begin to comply with the CCPA.

2. New York SHIELD Act

New York has been crafting twin pieces of legislation on data protection and privacy: the SHIELD Act (discussed here) and the New York Privacy Act (discussed in Section B below).

The New York legislature passed the SHIELD Act (“Stop Hacks and Improve Electronic Data Security Act”) in early July 2019 (amending New York’s Information Security Breach and Notification Act of 2005), and it is on the governor’s desk for final review. In theory, the SHIELD Act applies to any person or business that collects private information of a New York resident.

SHIELD introduces 3 major changes. It: (a) expands the definition of “private information”; (b) imposes a new requirement for data monetizers to implement reasonable security measures to protect and/or dispose of that data; and (c) revises the current law’s data breach disclosure provisions.

Regarding (a), SHIELD would expand “private information” to include biometric data, as well as account numbers and credit or debit card numbers if they could be used to access a data subject’s financial account without additional identifying information.

Regarding (b), SHIELD imposes a new data security duty: “Any person or business that owns or licenses computerized data which includes private information of a resident of New York shall develop, implement and maintain reasonable safeguards to protect the security, confidentiality and integrity of the private information including, but not limited to, disposal of data.” See Section 899-bb(2)(a).

This requirement is based on the standard of reasonableness, not strict liability. “Reasonableness” is also the GDPR standard but DPAs have been treating data breaches as strict liability. See Part 1 here.

Data Security Safe Harbors

Data monetizers are not left without guidance, because SHIELD provides two safe harbors with special accommodations for “small businesses”. This is very important because data monetizers (especially small businesses, which form the backbone of the U.S. economy) can craft their data security compliance protocols around these safe harbors (and connect them to the privacy protocols to be discussed in Part 3 of this series).

NewYorkFlag.jpg

New York’s SHIELD Act offers 2 safe harbors for data security.

(Creative Commons license)

Section 899-bb(2)(b) sets forth two safe harbors and states that “[a] person or business shall be deemed to be in compliance with the reasonable security requirement if:

  • it is subject to and compliant with U.S. federal and New York State financial and health privacy rules and any other data security rules as the relevant competent agency or court may interpret (Section 899-bb(2)(b)(i)); or

  • it “implements a data security program that includes …reasonable administrative, technical, and physical safeguards” (See Section 899-bb(2)(b)(ii)).

A data security program must include “reasonable administrative safeguards” such as the following, in which the person or business:

(1) “designates one or more employees to coordinate the security program;

(2) identifies reasonably foreseeable internal and external risks; 

(3) assesses the sufficiency of safeguards in place to control the identified risks;

(4) trains and manages employees in the security program practices and procedures;

(5) selects service providers capable of maintaining appropriate safe-guards, and requires those safeguards by contract; and

(6) adjusts the security program in light of business changes or new circumstances”. (See Section 899-bb(2)(b)(ii)(A)(1-6)).

A data security program must also include “reasonable technical safeguards” such as the following, in which the person or business:

(1) “assesses risks in network and software design; 

(2) assesses risks in information processing, transmission and storage;

(3) detects, prevents and responds to attacks or system failures; and 

(4) regularly tests and monitors the effectiveness of key controls, systems and procedures”. (See Section 899-bb(2)(b)(ii)(B)(1-4)).

A data security program must also include “reasonable physical safeguards” such as the following, in which the person or business:

(1) “assesses risks of information storage and disposal; 

(2) detects, prevents and responds to intrusions; 

(3) protects against unauthorized access to or use of private information during or after the collection, transportation and destruction or disposal of the information; and

(4) disposes of private information within a reasonable amount of time after it is no longer needed for business purposes by erasing electronic media so that the information cannot be read or reconstructed”. (See Section 899-bb(2)(b)(ii)(C)(1-4)).

Special Treatment of Small Businesses

SHIELD provides special accommodations for how small businesses may satisfy the data security requirement. This is important to foster a thriving start-up culture and not stifle innovation with costly data security requirements. (A rough sketch of the definition appears after the list below.)

A “small business” is defined as “any person or business with:

(i) fewer than fifty employees;

(ii) less than three million [U.S.] dollars in gross annual revenue in each of the last three fiscal years; or

(iii) less than five million dollars in year-end total assets, calculated in accordance with generally accepted accounting principles”. See Section 899-bb(1)(c).
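As a purely illustrative sketch (the function and parameter names are mine), the definition reads as a disjunctive test, so satisfying any one prong appears sufficient:

```python
# Illustrative sketch of SHIELD's "small business" definition
# (Section 899-bb(1)(c)). Read literally, the "or" suggests that
# satisfying any one prong suffices. Names and inputs are hypothetical.
from typing import List


def is_shield_small_business(employees: int,
                             revenue_last_three_years_usd: List[float],
                             year_end_total_assets_usd: float) -> bool:
    return (
        employees < 50                                                # prong (i)
        or all(r < 3_000_000 for r in revenue_last_three_years_usd)   # prong (ii)
        or year_end_total_assets_usd < 5_000_000                      # prong (iii)
    )


# Example: a 40-person start-up qualifies on the headcount prong alone.
print(is_shield_small_business(40, [6_000_000, 7_000_000, 8_000_000], 12_000_000))  # True
```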

Assuming a data monetizer satisfies the small business definition, SHIELD allows its security program to contain “reasonable administrative, technical and physical safeguards that are appropriate for the size and complexity of the small business, the nature and scope of the small business's activities, and the sensitivity of the personal information the small business collects from or about consumers.” See Section 899-bb(2)(c).

The SHIELD Act also expands the meaning of “data breach” to include unauthorized access to computerized data; the current law applies only to unauthorized acquisition. It also requires the data breach victim to make certain written findings before it may conclude that a breach disclosure notice is not required.

The New York State Attorney General may bring civil enforcement actions. There are no private rights of action. (See Sections 899-bb(2)(d)&(e)).

Section B. New York Privacy Act

Based on the draft as of July 15, 2019, the proposed New York Privacy Act (“NYPA”) is still in its early legislative stages. As of the date of this article, it is one of the most forward-thinking privacy bills, with the potential to introduce substantive protections for consumers and to make it more likely that the U.S. federal government will need to intervene by passing a supervening federal privacy regime that preempts it. In essence, it assists both sides of the privacy debate: consumer groups like it because it is tough; pro-business groups like it because it is so much tougher than every other state’s privacy bill or law that Congress will need to intervene.

At the center of the controversy, the NYPA treats data monetizers as “data fiduciaries”, a term made popular by Yale Law School Professor Jack Balkin. It incorporates fiduciary law into the privacy space by holding data monetizers to one of the highest legal standards possible.

The concept of “data fiduciary care” is enshrined in Section 1102(1) which prohibits any use, processing or transfer of personal data, “unless the consumer provides express and documented consent”. All data monetizers “shall exercise the duty of care, loyalty and confidentiality expected of a fiduciary with respect to securing the personal data of a consumer against a privacy risk; and shall act in the best interests of the consumer, without regard to the interests of the data monetizer, in a manner expected by a reasonable consumer under the circumstances”.

This goes much further than the CCPA, which does not hold data monetizers to one of the highest legal duties, akin to that owed by a trustee to its beneficiaries. It will be extremely difficult for Google or Facebook to even operate in New York or monetize New York residents’ personal data unless the concept of “data fiduciary” is watered down significantly.

Section 1102(1)(a) imposes an express duty of care to protect data subjects. Data monetizers and their affiliates shall “(i) reasonably secure personal data from unauthorized access; and (ii) promptly inform a consumer of any breach of the duty … with respect to personal data of such consumer.”

Section 1102(1)(b) imposes a duty of loyalty to protect data subjects. Data monetizers and their affiliates “may not use personal data, or data derived from personal data, in any way that: (i) will benefit the online service provider to the detriment of an end user; and (ii) (A) will result in reasonably foreseeable and material physical or financial harm to a consumer; or (B) would be unexpected and highly offensive to a reasonable consumer.”

Section 1102(1)(c) imposes a duty of confidentiality to protect data subjects. Data monetizers and their affiliates “(i) may not disclose or sell personal data to, or share personal data with, any other person except as consistent with the duties of care and loyalty” under Sections 1102(1)(a)&(b) above; (ii) may not disclose or sell personal data to, or share personal data with, any other person unless that person enters into a contract that imposes the same duties of care, loyalty, and confidentiality toward the consumer as are imposed under Section 1102; and (iii) shall “take reasonable steps to ensure” that the recipient complies with the obligations assumed under such contract, including by auditing, on a regular basis, the data security and data information practices of such recipient.

Data monetizers owe data subjects a fiduciary duty to protect their data against “privacy risks” which are broadly defined.

“Privacy risk" means “potential adverse consequences to consumers and society arising from the processing of personal data, including, but not limited to:

(a) direct or indirect financial loss or economic harm;

(b) physical harm;

(c) psychological harm, including anxiety, embarrassment, fear, and other demonstrable mental trauma;

(d) significant inconvenience or expenditure of time;

(e) adverse outcomes or decisions with respect to an individual's eligibility for rights, benefits or privileges in employment (including, but not limited to, hiring, firing, promotion, demotion, compensation), credit and insurance (including, but not limited to, denial of an application or obtaining less favorable terms), housing, education, professional certification, or the provision of health care and related services;

(f) stigmatization or reputational harm;

(g) disruption and intrusion from unwanted commercial communications or contacts;

(h) price discrimination;

(i) effects on an individual that are not reasonably foreseeable, contemplated by, or expected by the individual to whom the personal data relates, that are nevertheless reasonably foreseeable, contemplated by, or expected by the controller assessing privacy risk, that:

(A) alters that individual's experiences;

(B) limits that individual's choices;

(C) influences that individual's responses; or

(D) predetermines results; or

(j) other adverse consequences that affect an individual's private life, including private family matters, actions and communications within an individual's home or similar physical, online, or digital location, where an individual has a reasonable expectation that personal data will not be collected or used.” Section 1102(2).

Fiduciary Duty to Data Subjects vs. Duty to Data Monetizer’s Shareholders

It is a basic principle of corporate law that a business owes a duty to its shareholders. But the NYPA seeks to change that by making the data fiduciary duty dominant: Section 1102(3) states that the data fiduciary duty owed to a consumer “shall supersede any duty owed to owners or shareholders” of a data monetizer.

Lina Khan, the famed antitrust academic, believes this reversal of the duty to shareholders conflicts with Delaware law, where most American businesses are incorporated. It was reported that an article she co-authored states that “[i]nsofar as the interests of stockholders and users diverge, the officers and directors of these companies may be put in the untenable position of having to violate their fiduciary duties (to stockholders) under Delaware law in order to fulfill their fiduciary duties (to end users).”

Like the SHIELD Act, the NYPA does not try to undercut the primacy of federal medical and financial privacy laws. Unlike the CCPA, which applies only to businesses that meet certain thresholds (such as more than US$25 million in annual revenue), the NYPA would apply to businesses of all sizes. Section 1103 sets forth 6 consumer rights similar to those granted under the GDPR. The NYPA also resembles the CCPA in that New York residents would have the right to know what data monetizers are collecting about them, know the recipients of their data, and object to having their data shared or sold to third parties.

The NYPA empowers the Attorney General to enforce its provisions. In stark contrast to the CCPA, the NYPA permits private lawsuits, and industry leaders fear an uptick in class action lawsuits against data monetizers. As an added incentive, courts are authorized to award attorneys’ fees to the prevailing party. (Section 1109(3)).

The NYPA goes further than any state privacy regime to date by imposing fiduciary duties to protect the data subject that trump the duty of businesses to their shareholders. Congress will likely need to intervene. That would be a positive development if industry ends up able to comply with a single federal privacy framework administered by a federal regulator that speaks softly and carries a big stick: the Federal Trade Commission.

Section C. Federal Glimmer of Hope?

All five commissioners (3 Republicans and 2 Democrats) of the U.S. Federal Trade Commission (U.S. privacy regulator) asked a House subcommittee on May 8, 2019 to "[g]ive [them] targeted rulemaking authority, so that [they] can keep up to date and make technical changes for developments in technology or business methods." The commissioners flatly rejected any desire to have unfettered administrative discretion: “Do not give us broad rulemaking authority. Please do not do it."

Even more relevant for our discussion here, the FTC chairman told Congress: “When you give broad rulemaking authority, you're asking five of us or maybe even just three of us to decide what we want…that is not a substitute for the democratic process."

federal-trade-commission-seal-36081_640.png

Respect for the democratic process and limited rulemaking powers helps ensure fair privacy regulation.

(Pixabay Image license)

The FTC’s dedication to upholding the democratic process and narrow rulemaking stands in stark contrast to European regulatory institutions such as the European Commission and the European Data Protection Board (“EDPB”), which have been accused of being politically unaccountable and power-hungry. These are the very grievances that fueled discontent with European institutions among the UK voters who approved Brexit. Article 69 of the GDPR undercuts political accountability by ensuring the independence of the EDPB, which, in the performance of its tasks and the exercise of its powers, shall “neither seek nor take instructions from anybody”. This lack of political accountability still resonates today and may be worsening. In July 2019, Ursula von der Leyen was selected as president of the European Commission with the smallest majority ever: a meager 9 votes, and only by relying on the support of far-right and populist-left politicians, giving her a very narrow and disparate support base.

Whereas the U.S. FTC desires limited rulemaking power to adapt the law to changing technology, the EU Commission seeks to bury constituents in “too many detailed rules”. Whereas the FTC must work through the Department of Justice and the courts to impose penalties (institutions that sit within branches politically accountable to the electorate), the EDPB is, by design, an independent body insulated from any politically accountable institution.

Contrasting philosophical mindsets are not the only differences that set the FTC apart from EU institutions; there are important operational differences as well. For one, the FTC does not possess the power to unilaterally impose fines. Although there is no comprehensive federal privacy legislation yet (several bills are pending), the FTC currently enforces privacy expectations through the Federal Trade Commission Act’s prohibition on unfair or deceptive trade practices. The FTC must take each violator to court and rely on the court in assessing any fines. The recently proposed US$5 billion payment assessed against Facebook is not technically a fine imposed unilaterally by the FTC; it has no power to do that. It does not even have the right of final approval of that settlement; the Department of Justice does. Instead, Facebook agreed to pay that amount to settle alleged violations of its 2011 consent decree and to avoid being sued in court by the FTC and risking a much larger penalty.

There is another important difference that distinguishes the FTC: it is vulnerable to administrative law challenges launched by a very active corporate legal culture.

Section D. Corporate Legal Activism

Not all jurisdictions permit legal challenges to overturn entire statutes and regulations (or parts thereof) on constitutional or administrative law grounds. UK courts do not have the power to overturn primary legislation because Parliament is supreme. The U.S. legal system allows and encourages such challenges because judicial review of laws is part of the duties of the federal judicial branch. Rule-making can also be challenged: the federal Administrative Procedure Act allows regulatory actions to be challenged on the basis that they are “arbitrary, capricious, an abuse of discretion, or otherwise not in accordance with law”.

Private trade associations regularly file legal challenges against statutes and rules in the U.S. This comes as a shock to some of my professional legal friends in other jurisdictions. The U.S. Chamber of Commerce and the National Association of Manufacturers both have dedicated litigation centers to coordinate and file legal challenges against legislative and regulatory acts that they deem contrary to the interests of their constituents. Indeed, these groups cited my comment letter on conflict minerals in their challenge to the U.S. SEC’s conflict minerals rule.

This rich tradition of corporate legal activism informs both state and federal agencies not only about what they regulate but, more importantly, about how they regulate, lest they face rebuke through legal challenges. There is also a practical reason the FTC does not want broad rule-making powers: broad rule-making increases the likelihood of such legal challenges.

Conclusion

Deep-rooted respect for the democratic process and limited rule-making powers is the crucial difference between the FTC and the EU Commission/EDPB. Part 1 of this series peeled away the layers of problems plaguing the GDPR. Here in Part 2, we find that at the core of those problems is not a lack of understanding of what to do to regulate privacy; it is the inability or unwillingness to change how to regulate privacy. The European top-heavy approach emphasizes broad rule-making powers unaccountable to the democratic process (and unresponsive to advances in technology); it tries to manage the tremendous social changes brought about by technology through sheer breadth of rule-making power. That tendency was a central Brexit grievance. The U.S. FTC’s light-but-agile approach, by contrast, aims to manage technology-induced social changes while respecting the democratic process. That is the essence of the glimmer of hope: fair, balanced privacy regulation that is checked by administrative lawsuits against whimsical or arbitrary rule-making.

The attitude the FTC expressed before Congress is the right mindset for taking on the challenge of regulating data protection & privacy in the 21st-century digital economy, in which technology is ever changing and the democratic process is ever more important for managing the social effects of such unsettling and invasive technological change.

iStock-926708322.jpg

Regulation of privacy is ultimately managing technology-induced societal changes.

(Getty Images license)








