Privacy Primer: Gramm-Leach-Bliley Act (GLBA)

GLBA, sometimes called the Financial Services Modernization Act of 1999, is a U.S. banking law that has important privacy and data security requirements for institutions that are subject to the law.  The law applies to “any institution the business of which is engaging in financial activities.”

GLBA’s primary purpose was to remove the barriers in the Glass-Steagall Act of 1933 and the Bank Holding Company Act that prevented a single organization from operating as any combination of a commercial bank, an investment bank, and an insurance company.  Nevertheless, concerns arose over the need to protect consumer information as institutions merged these traditionally separate functions, thereby aggregating massive amounts of customer data.  Therefore, GLBA provided for a Safeguards Rule and a Privacy Rule to help protect customer data.

First, the Safeguards Rule requires financial institutions to put in place administrative, technical, and physical safeguards to protect personal information.  This rule requires financial institutions to develop a comprehensive, written information security program that is appropriate for the size and scope of the institution and the sensitivity of the personal information at issue.  Institutions must specifically designate an employee or employees to coordinate this program.  The information security program must identify risks to the security, confidentiality, and integrity of personal information and implement controls to guard against those risks.  The rule also requires institutions to test and evaluate the controls they put in place and appropriately modify their information security program in light of the results.

Next, the Privacy Rule requires financial institutions to provide certain notices with regard to how they share information.  The rule distinguishes between consumers and customers.  For example, an individual who discloses nonpublic personal information on a loan application is a consumer of the institution under GLBA, regardless of whether the institution ultimately approves the loan.  If the institution approves the loan and extends the requested credit, thereby establishing an ongoing relationship with the individual, the individual becomes a customer of the institution.

Under the Privacy Rule, financial institutions must provide “clear and conspicuous” notice of their privacy policies in several situations.  They must provide notice to a consumer before they disclose any nonpublic personal information about that consumer to a nonaffiliated third party.  They must provide notice to a customer no later than the time at which the customer relationship is established, and at least annually thereafter for as long as the customer relationship continues.

In general, these notices must describe the categories of nonpublic personal information the institution collects and shares with affiliated and nonaffiliated third parties and explain the right to opt out of certain disclosures.  With limited exceptions, an institution cannot share an individual’s nonpublic personal information with a nonaffiliated third party without providing the required notice and affording the individual a reasonable opportunity to exercise his or her opt-out rights.  Additionally, if an institution revises its privacy policy to allow it to disclose nonpublic personal information that it did not disclose under the old policy, the institution must provide a new privacy notice and afford consumers a reasonable opportunity to opt out before disclosing their information.

GLBA disperses enforcement power across a number of agencies, depending on the institution at issue.  For example, the Board of Governors of the Federal Reserve System has enforcement authority over member banks of the Federal Reserve System, the Securities and Exchange Commission has enforcement authority over brokers and dealers, and the Board of the National Credit Union Administration has enforcement authority over federally insured credit unions.  The Federal Trade Commission has enforcement authority over any financial institution that is not specifically under the authority of any other agency.  State insurance regulators have enforcement authority over insurance providers domiciled in their state.  In addition, while the Consumer Financial Protection Bureau does not have explicit power to enforce the GLBA Safeguards Rule or Privacy Rule, it has used its general authority over unfair, deceptive, or abusive acts or practices to bring enforcement actions against regulated entities that fail to abide by those rules.

Posted in Legislation, Regulations

Case Update: Wakefield v. ViSalus, Inc.

A couple of months ago, I wrote about how a jury found multilevel marketing company ViSalus, Inc. responsible for making over 1.8 million robocalls in violation of the Telephone Consumer Protection Act.  Given the TCPA’s minimum statutory damages of $500 per call, ViSalus was looking at a minimum of $925 million in damages.  If those violations were found to be willful or knowing, however, damages could be tripled to nearly $2.8 billion, at the discretion of the court.
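The arithmetic behind those figures is easy to sanity-check.  Here is a minimal sketch, using the 1,850,436 violating calls the jury found; the exact totals are my own calculation from the reported numbers:

```python
# TCPA statutory damages sketch: $500 minimum per violating call,
# with damages trebled at the court's discretion for willful or
# knowing violations.
CALLS = 1_850_436      # violating calls found by the jury
PER_CALL_MIN = 500     # TCPA statutory minimum per violation
TREBLE = 3             # multiplier for willful/knowing violations

minimum = CALLS * PER_CALL_MIN
trebled = minimum * TREBLE

print(f"Minimum: ${minimum:,}")   # Minimum: $925,218,000
print(f"Trebled: ${trebled:,}")   # Trebled: $2,775,654,000
```

Hence the figures in the post: a floor of “over $925 million” and a trebled exposure of “nearly $2.8 billion.”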

On Monday, June 24, however, U.S. District Court Judge Michael Simon denied plaintiff Lori Wakefield’s request for enhanced damages.  Judge Simon determined that the case did not call for an assessment of damages above the statutory minimum.  He pointed out that ViSalus did not have a history of TCPA violations and stopped making unlawful calls shortly after it was put on notice that it might be violating the law.  He ultimately determined that the statutory minimum award of $925 million “is sufficient to deter Defendant, and others, from committing future violations of the TCPA and that a further award of enhanced damages are not warranted.”

Although this ruling is certainly a win for ViSalus, it is likely not one that the company will be enthusiastically celebrating.  As Judge Simon recognized, $925 million is a significant award.  The more critical issue for ViSalus is a pending post-trial motion to decertify the class, which could actually reduce the damages award if granted.  Coincidentally, the court’s ruling came out on the same day that the FTC announced that it would be cracking down on robocalls.  All in all, it was not a good day for robocallers.

Posted in TCPA

Senate Bill Seeks to Protect Health Information Gathered from Wearable Devices

I wear a fitness tracker.  I rarely take it off.  Throughout the course of the day, it collects a bevy of information about me: my heart rate, my exercise habits, the length and quality of my sleep.  When aggregated and observed over time, this information certainly reveals quite a bit of insight into my personal health.  Yet this health information is not Protected Health Information under HIPAA because the device manufacturer is not a HIPAA-regulated entity.

Senators Amy Klobuchar (D-MN) and Lisa Murkowski (R-AK) recently introduced legislation that recognizes this issue.  The “Protecting Personal Health Data Act” seeks to “protect the personal health data of all Americans.”  It would apply to consumer devices, services, applications, and software that are primarily designed for or marketed to consumers and a substantial purpose of which is to collect personal health data.  This would include direct-to-consumer genetic testing services, wearable fitness trackers, and social media sites that are designed for users to share health conditions and experiences.

The proposed law directs the Secretary of Health and Human Services, in consultation with the Chairman of the Federal Trade Commission and others, to promulgate regulations to strengthen privacy and data security protections for personal health information that is collected by consumer devices.  In doing so, the Secretary would have to account for differences in the nature and sensitivity of the data collected or stored on the consumer device.  Not all personal health data is created equal.

Among other things, the Secretary would also have to consider (i) standards for consent related to the handling of genetic, biometric, and personal health data with potential exceptions for law enforcement, academic research, emergency medical treatment, or determining paternity, (ii) minimum security standards for collected personal health data, and (iii) standards for the de-identification of personal health data.  These standards would include limitations on transferring personal health data to third parties.  They would also include an individual’s right to withdraw consent and access and delete his or her personal health data.

The proposed law would also establish a National Task Force on Health Data Protection to:

(1) study the long-term effectiveness of de-identification methodologies for genetic and biometric data;

(2) evaluate and provide input on the development of security standards, including encryption standards and transfer protocols, for consumer devices, services, applications, and software;

(3) evaluate and provide input with respect to addressing cybersecurity risks and security concerns related to consumer devices, services, applications, and software;

(4) evaluate and provide input with respect to the privacy concerns and protection standards related to consumer and employee health data; and

(5) provide advice and consultation in establishing and disseminating resources to educate and advise consumers about the basics of genetics and direct-to-consumer genetic testing, and the risks, benefits, and limitations of such testing.

Under the bill, the Task Force would have one year to report its findings to Congress, after which the Secretary would have six months to promulgate appropriate regulations.  The bill has been referred to the Committee on Health, Education, Labor, and Pensions.

Posted in Data Security, Internet of Things

Pennsylvania County Faces Up To $67 Million In Damages For Distribution Of Criminal Record Information

A suburban Philadelphia county is facing a judgment of up to $67 million after a Pennsylvania federal jury found that it violated the Pennsylvania Criminal History Record Information Act.

Pennsylvania’s Criminal History Record Information Act (“CHRIA”) governs the dissemination of records held by criminal justice agencies.  It requires criminal justice agencies to expunge criminal history record information under certain circumstances.  It also contains detailed restrictions on when a criminal justice agency can distribute criminal record information to agencies other than criminal justice agencies or to individuals.  It provides that any person “aggrieved by a violation” of CHRIA “shall be entitled to actual and real damages of not less than $100 for each violation” and “not less than $1,000 nor more than $10,000” for each willful violation.

The Plaintiff in the case alleged that he had been arrested by the Bensalem Police Department in September of 1998 and was subsequently processed through the Bucks County Correctional Facility (“BCCF”).  He then successfully completed a pre-trial rehabilitation program, which allowed him to file a petition for expungement under state law.  He filed that petition, and the court issued an order of expungement in January of 2000.

Nevertheless, in 2007 BCCF created a website that made available to the public criminal history record information, including mug shots and booking photos, of individuals who had been placed in BCCF after their arrest, going back some 70 years.  The information accessible on the website included information for individuals whose criminal records had later been expunged or whose charges had been dismissed.  Plaintiff’s information was accessible on the website.  Plaintiff alleged that a private business running websites named BustedMugshots.com and Mugshotsonline.com was able to gather the information from the BCCF website and make it available on its own websites for a fee, without the consent of the affected individuals.

As a result, Plaintiff filed a class action complaint on his own behalf and on behalf of others whose records had been expunged but whose information was nonetheless published on the BCCF website.  He asserted claims under CHRIA against BCCF and the private websites.  He also asserted claims against the private websites for the unauthorized use of his name or likeness and for false light invasion of privacy.

Plaintiff’s claims against the private websites ultimately failed.  The Court dismissed the CHRIA and unauthorized use of name or likeness claims at the outset.  The Court ruled that CHRIA, by its terms, applied only to criminal justice agencies.  The websites, on the other hand, were private actors.  Therefore, the Court concluded, the websites could have no liability under CHRIA.  The Court also dismissed Plaintiff’s claim for unauthorized use of name or likeness, because Plaintiff failed to show that his name and likeness had “commercial value” as required under the relevant statute.  While the Court allowed Plaintiff’s claim against the websites for false light invasion of privacy to move past the motion to dismiss stage, it ruled in favor of the websites on that claim at summary judgment.  The Court ruled that Plaintiff had failed to produce evidence that the websites acted with actual knowledge of, or with reckless disregard for, the falsity of the information about Plaintiff.  To the contrary, the Court found that the websites had no obvious reason to suspect that the information provided on the BCCF website included expunged information.

Consequently, the case moved forward only with respect to the CHRIA claim against BCCF.  The Court granted summary judgment in favor of Plaintiff on liability under CHRIA, finding that the distributed information was criminal record history information under “the unambiguous definition in CHRIA, Pennsylvania’s rules of statutory construction, relevant decisions by Pennsylvania courts, and the Attorney General’s CHRIA Handbook.”

Therefore, the only issues for trial were whether BCCF “willfully” violated CHRIA and the amount of damages.  The jury ultimately found that the violations were willful under CHRIA.  It fixed punitive damages at the statutory minimum of $1,000 per violation for the nearly 67,000 individuals whose records were unlawfully accessible on the website.  The potential $67 million verdict, however, is the ceiling.  The Court will later determine the exact number of class members who are eligible for the award.  The ultimate number is likely to decrease once deceased class members are removed from the equation.
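The ceiling arithmetic is straightforward; a quick sketch (the 5,000-member reduction below is a purely hypothetical illustration of how the final award shrinks, not a figure from the case):

```python
# CHRIA willful-violation damages sketch: the jury fixed damages at the
# statutory minimum of $1,000 per violation, one violation per individual.
PER_VIOLATION = 1_000
individuals = 67_000          # approximate class size reported here

ceiling = individuals * PER_VIOLATION
print(f"Ceiling: ${ceiling:,}")          # Ceiling: $67,000,000

# The final award falls as ineligible (e.g., deceased) class members are
# removed; the 5,000 figure is hypothetical, for illustration only.
eligible = individuals - 5_000
print(f"Illustrative final award: ${eligible * PER_VIOLATION:,}")
```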

Posted in Privacy

The Value Of Quickly Disclosing A Data Breach

One of the first questions a company must answer after it discovers and remediates a data breach is, “What do we tell our customers?”  Companies may delay publicly announcing a data breach out of fear that doing so will harm their reputation with customers, leading to a loss of business.  They may take an inordinate amount of time to make a public announcement, thinking their public statement must be “just right.”  This is backward and outdated thinking.  Rather, a quick public announcement of a data breach is an essential part of saving and rebuilding a company’s reputation after a data breach.

First, it is important to recognize that a company whose data systems have been breached is not in control of when the breach will be revealed to the public.  There are tools available for individuals to see if their email addresses, passwords, social security numbers, credit card numbers, and the like have been posted on the dark web.  There are cybersecurity companies and ethical hackers who are constantly on the lookout for information demonstrating a new data breach.  Not speaking publicly about the problem will not make it go away.  If a system has been compromised, that fact is going to become known sooner rather than later, regardless of whether the owner of the compromised system announces it.

Therefore, the compromised company needs to get ahead of things to control the narrative.  We oftentimes forget that a company that has been hacked is a victim.  A timely public announcement can help to remind the public of that fact.  An announcement that acknowledges the problem, provides a meaningful recourse for those affected, and emphasizes the company’s commitment to work with law enforcement can help to shift the focus toward those who invaded the company’s systems.  Delaying announcement until after a breach is already publicly discovered robs the company of the opportunity to frame itself as part of the solution rather than part of the problem.

Indeed, recent experience shows that the way a company responds to a breach is more likely to cause reputational harm than the breach itself.  As a general matter, the public accepts that data breaches are an unfortunate reality of the digital age despite best efforts to prevent them.  Moreover, given the number and size of data breaches over the past decade, many people are resigned to the fact that much of their personal data has already been compromised.  They want to know of any additional breaches so that they can remain vigilant and spot potential fraud when it occurs.  A timely announcement by the owner of the breached system gives them the information they need.  Unnecessary delay can lead them to believe that the company is not taking its customers’ interests seriously.

Posted in Data Breach

Jury Verdict in TCPA Case Puts Over $925 Million In Damages On The Table

On April 12, 2019, an Oregon federal jury returned a Friday evening verdict in a Telephone Consumer Protection Act (TCPA) class action that could put the defendant on the hook for $925 million in damages.

The TCPA makes it unlawful to make a telephone call to any cell phone or residential telephone line using an artificial or prerecorded voice without the prior express consent of the called party.  It provides a private right of action that subjects violators to actual damages or $500 for each violation, whichever is greater.  A court can award treble damages for any violation it determines to be willful or knowing.

The case of Wakefield v. ViSalus, Inc. started back in 2015 when plaintiff Lori Wakefield filed a class action complaint in the U.S. District Court for the District of Oregon against ViSalus, Inc.  ViSalus is a multilevel marketing company that markets weight loss products, dietary supplements, and energy drinks.  Ms. Wakefield alleged that ViSalus engaged in a marketing campaign that consisted of calling phone numbers that were on the Do Not Call Registry and placing robocalls initiated with a prerecorded message, both without prior express consent.  Eventually, Ms. Wakefield sought certification of three classes: a “Do Not Call Class,” a “Robocall Class,” and an “Oregon-Stop-Calling Class” based on alleged violations of state law.

In June of 2017, the Court certified the Robocall Class but denied certification of the Do Not Call Class and the Oregon-Stop-Calling Class.  With respect to the Do Not Call Class, the Court determined that Ms. Wakefield had not provided reliable evidence to show the number of potential class members.  With respect to the Oregon-Stop-Calling Class, the Court ruled that the claim did not involve common evidence across class members.  Therefore, only the claims with respect to the Robocall Class moved to trial.

After a 3-day trial that began on April 10, 2019, the jury returned its verdict.  The jury found that Ms. Wakefield proved that ViSalus made 4 calls to her in violation of the TCPA and 1,850,436 calls to class members in violation of the TCPA.  When asked to distinguish on the verdict sheet between how many of those calls were made to cell phones and how many were made to residential telephone lines, however, the jury wrote, “We cannot tell.”  Nor was the jury tasked with making a determination on damages.  Nevertheless, multiplying the number of calls by the $500 statutory minimum per violation puts potential damages at over $925 million.

The verdict, however, is likely just the beginning of the next chapter in the case.  Defense counsel is likely to seize on the jury’s inability to distinguish between calls to cell phones and calls to residential landlines.  While the TCPA prohibits prerecorded voice calls to both, it does not prohibit prerecorded voice calls to business landlines.  Therefore, the jury’s statement of “We cannot tell” may give defense counsel an opening to argue that the jury based its verdict on mere speculation.  In any event, given the amount of damages at stake, an appeal appears inevitable.

Posted in Litigation, TCPA

5 Ways in Which Your Company’s Privacy Policy is Insufficient

Well thought-out internal privacy policies and procedures are an essential part of any company’s information management program.  These internal policies should not be confused with a company’s external privacy notice, which informs the company’s customers as to how it may process, store, and share their personal data.  Rather, the company’s internal privacy policy sets forth company goals with respect to protected data and defines company procedures to ensure that those goals are met.  Here are five top ways in which such policies are deficient.

  1. The privacy policy isn’t properly documented

It goes without saying that a company’s privacy policies and procedures themselves should be written down and stored in an accessible location.  But all of the underlying information giving rise to those policies and procedures should be thoroughly documented as well.  This documentation should include a comprehensive description of the company’s systems, data, and data flows.  Documenting this information along with the privacy policy will make it easier to identify when the circumstances underlying the policy have changed so that the policy is in need of an update (See #2 below).  It will also ease any transition when new employees become responsible for the company’s information management program.  Spending extra time up front to thoroughly inventory, understand, and document company data will pay dividends down the road.

  2. The privacy policy hasn’t been appropriately updated

Businesses change over time.  A company may enter a new line of business in which it gathers a new category of customer data.  Or a company’s use of personal information may shift between aggressive and conservative over time.  For example, a company may see an opportunity to position itself as a privacy leader in its industry, or it may have to tighten up its data protection practices to minimize reputational harm after a data breach.  Such internal changes warrant a re-examination of the company’s privacy policy.

External changes happen as well.  New laws and regulations in the field of data privacy are a seemingly daily occurrence.  Businesses must account for these changes by appropriately revising their privacy policies.  Moreover, even if a business periodically updates its privacy policy when a new law or regulation is passed, it must occasionally look at its privacy policy more holistically to ensure that it is in accordance with the company’s goals and the regulatory scheme as a whole.

  3. There is one blanket policy that applies to all categories of data

Given the alphabet soup of laws that apply to privacy and data protection, a blanket privacy policy is often insufficient.  Privacy laws differ in their definitions of what constitutes protected information.  For example, a company may hold personally identifiable information under a state privacy law and also hold protected health information under HIPAA.  These different categories of data may require separate privacy policies.  Similarly, laws such as the GDPR categorize personal data separately from sensitive personal data with different grounds for processing each.  Therefore, privacy policies must separately account for and deal with all of the categories of data that a company processes and place appropriate procedures and safeguards around each.

  4. The policy does not appropriately limit defined user roles

Even where a privacy policy properly accounts for all categories of data within an organization, it still must ensure that only appropriate users and systems have access to that data.  Any privacy policy must therefore establish appropriate access barriers across departments and lines of business.  For example, while it may be appropriate to give a certain category of employee (e.g., managers) high-level access to company data within their department, it may not be appropriate to give that category of employee high-level access to company data across the organization.  The privacy policy must account for this by ensuring that employees only have the access to company data necessary to carry out their job functions.  While this adds a layer of complexity to the administration of user accounts and access rights, it is necessary to ensure that only those with a need to know have access to sensitive data.

  5. The policy hasn’t been adequately communicated to the workforce

Even the best-conceived and most comprehensive privacy policy won’t do much good if it isn’t communicated throughout the organization.  Moreover, simply posting the company’s privacy policy on the company intranet or including it in an employee handbook may be insufficient.  Appropriate employees need training on the policy, with refresher training as policies evolve.  Client- or customer-facing employees in particular warrant special attention, as they have to be able to externally communicate the contours of the company’s privacy policies and procedures.  Regular internal communication about the company privacy policy also ensures that privacy is at the forefront of employees’ minds, rather than just an afterthought.

Developing a comprehensive internal company privacy policy and implementing procedures to put that policy into action is certainly not an easy task.  It requires input from multiple stakeholders and buy-in from all levels of the corporate structure.  Moreover, once a privacy policy is in place, it must be viewed as a living document that is regularly reviewed, analyzed, and updated.  Nevertheless, having a complete and updated policy in place is essential to protect your company and your customers’ data.

Posted in Policies and Procedures, Privacy

U.S. Supreme Court Refuses to Search Google Settlement Agreement for Fairness

The U.S. Supreme Court on Wednesday remanded a class action against Google so that the lower courts could determine whether any of the named plaintiffs have standing under Spokeo, Inc. v. Robins.

The underlying suit alleged violations of the Stored Communications Act (“SCA”).  The SCA prohibits “a person or entity providing an electronic communications service to the public” from “knowingly divulg[ing] to any person or entity the contents of a communication while in electronic storage by that service.”  The plaintiffs alleged that Google violated this provision by sending users’ search terms to the server hosting the webpage that users clicked on from the search results page.

Google eventually negotiated a settlement with the class for $8.5 million.  Absent class members, however, were to receive no payment from the settlement fund.  Rather, over $2 million was to go to class counsel.  More than $5 million was to be donated to a number of cy pres recipients, nonprofit organizations whose work would indirectly benefit the class members.

The District Court approved the proposed settlement agreement over the objection of two named plaintiffs, who contended that the settlement was not “fair, reasonable, and adequate” as required under the Federal Rules of Civil Procedure.  After the Ninth Circuit affirmed the District Court’s approval, the Supreme Court granted certiorari on the question of whether the proposed settlement agreement was fair, reasonable, and adequate.

The Court, however, deferred on the question, stating that it could not rule on the propriety of the proposed settlement agreement because “there remain substantial questions about whether any of the named plaintiffs has standing to sue in light of our decision” in Spokeo.

The Court noted that the District Court had previously rejected Google’s standing argument, relying on the Ninth Circuit case Edwards v. First American Corp.  In Edwards, the Ninth Circuit ruled that the violation of a statutory right automatically satisfies the injury-in-fact element of standing when an individual sues to vindicate that right.

After the District Court’s ruling, however, the Supreme Court handed down the Spokeo decision.  Spokeo abrogated Edwards, holding that “Article III standing requires a concrete injury even in the context of a statutory violation.”  Neither the District Court nor the Ninth Circuit re-examined the standing question in light of Spokeo.

Indeed, Google apparently never re-raised the standing issue after the Spokeo ruling.  Nevertheless, the Court noted that it had an independent “obligation to assure ourselves of litigants’ standing under Article III.”  It also stated, “A court is powerless to approve a proposed class settlement if it lacks jurisdiction over the dispute, and federal courts lack jurisdiction if no named plaintiff has standing.”  The Court therefore vacated the approval of the proposed settlement agreement and remanded for further proceedings on the standing question.

The Court’s ruling serves as a strong reminder of just how powerful a standing defense Spokeo can provide in suits alleging a violation of a privacy statute.

Posted in Litigation, Privacy

Third Circuit Affirms Dismissal of FACTA Suit on Standing Grounds

A three-judge panel of the Third Circuit recently affirmed a district court ruling that dismissed a suit for violation of the Fair and Accurate Credit Transaction Act of 2003 (FACTA) for lack of Article III standing.  The plaintiff, Ahmed Kamal, alleged that receipts he received from J. Crew showed the first six and last four digits of his credit card number in violation of FACTA.  The panel, applying the Supreme Court’s ruling in Spokeo, Inc. v. Robins, ruled that absent more, such an allegation of a “technical violation” is insufficient to demonstrate the concrete harm required to demonstrate Article III standing.

Congress enacted FACTA to combat identity theft.  The statute prohibits businesses from printing more than the last five digits of a credit or debit card number on a receipt provided to the cardholder at the point of sale.  FACTA also prohibits businesses from printing the card’s expiration date on the receipt.  FACTA provides for actual damages and attorneys’ fees for negligent violations and statutory damages up to $1,000, punitive damages, and attorneys’ fees for willful violations.

Kamal alleged that J. Crew willfully violated FACTA.  On three separate occasions, at three separate J. Crew stores, he received a receipt that showed the first six and the last four digits of his card number.  Kamal did not allege that anyone else saw these receipts, that his identity was stolen, or that his credit card number was compromised.  He filed a class action suit against J. Crew, which the district court ultimately dismissed for lack of Article III standing.

Article III standing is a component of the Constitution’s case or controversy requirement.  To maintain suit under this requirement, plaintiffs must show that 1) they suffered an injury in fact, 2) it is fairly traceable to the challenged conduct of the defendant, and 3) it is likely to be redressed by a favorable judicial decision.

On appeal, the issue before the panel was whether Kamal had sufficiently pled an injury in fact.  To do so, Kamal was required to “allege an invasion of a legally protected interest that is concrete and particularized and actual or imminent, not conjectural or hypothetical.”

Kamal argued that he had pled concrete injury for two reasons.  First, he argued that the violation of FACTA’s plain text was an intangible concrete harm in itself.  Second, he argued that the increased risk of identity theft from the violation was concrete harm.  After discussing Spokeo and a number of its own decisions, the panel rejected both arguments.

The panel discussed whether the alleged intangible harm had a close relationship to a harm that traditionally formed the basis of a common law action.  It discussed a number of privacy torts and concluded that Kamal’s alleged harm did not have a close relationship to them because they all required disclosure of some personal information to a third party.  Here, however, Kamal did not allege that any third party saw the offending receipts.

Next, the panel discussed whether Kamal had alleged an increased risk of the concrete harm of identity theft to satisfy Article III standing requirements.  The panel noted that the first six digits of a credit card number identify the bank and card type, information that is permitted to be printed elsewhere on the receipt under FACTA.  Therefore, J. Crew’s alleged violation did little to increase any risk of identity theft.

The panel also noted that for the alleged harm of identity theft to become realized, Kamal would have to lose or throw away the receipt, and then a would-be identity thief would have to find it and figure out the remaining digits along with additional information such as the expiration date, the Card Verification Value (CVV), or the billing zip code.  The panel agreed with the district court that this chain of events was too attenuated and speculative to present the degree of risk necessary to meet the concreteness requirement.
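The panel’s point about “figuring out the remaining digits” can be quantified.  With the first six and last four digits exposed, six digits of a sixteen-digit card number remain unknown, giving one million raw candidates; the Luhn checksum that card numbers carry narrows that to 100,000 valid completions (one in ten).  The sketch below, with hypothetical function names but the standard Luhn algorithm, brute-forces the count for a masked number:

```python
from itertools import product

def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum used by payment card numbers."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:        # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def count_luhn_completions(masked: str) -> int:
    """Count digit strings that fill the '*' positions and pass Luhn."""
    holes = [i for i, ch in enumerate(masked) if ch == "*"]
    count = 0
    for fill in product("0123456789", repeat=len(holes)):
        candidate = list(masked)
        for pos, digit in zip(holes, fill):
            candidate[pos] = digit
        if luhn_valid("".join(candidate)):
            count += 1
    return count
```

For a Kamal-style mask with six unknown digits, the brute force yields 10^5 Luhn-valid completions.  That narrowing is real but, as the panel emphasized, the thief would still need the expiration date, CVV, or billing zip code to commit fraud.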

The decision puts the Third Circuit in line with the Second Circuit, which ruled in Katz v. Donna Karan Co. that printing the first six credit card digits on a receipt was a “bare procedural violation” that “did not raise a material risk of harm to identity theft.”

The Eleventh Circuit, however, is on the other side of the issue.  In Muransky v. Godiva Chocolatier, Inc., it ruled that printing the first six digits on a receipt created concrete injury because it was similar to a common law action for breach of confidence.  The Kamal panel expressed its disagreement with the Eleventh Circuit because a breach of confidence action required disclosure to a third party, which Kamal had not alleged.

By requiring disclosure to a third party to show a close relationship to a traditional tort action, however, the panel essentially closed the door on one option to show concrete harm.  Under the panel’s reasoning, even printing the full credit card number on the receipt would not have a close relationship to traditional privacy torts so long as the merchant gave the receipt to only the customer.

Even so, under that set of facts, the plaintiff would likely be able to show concrete harm through the increased risk of identity theft.  The panel admitted that its “analysis would be different” had Kamal “alleged that the receipt included all sixteen digits of his credit card number, making the potential for fraud significantly less conjectural.”  But that raises the question of where courts should draw the line.  What about a receipt that shows 12 or 13 digits?  Is the risk of identity theft that much more appreciable to satisfy the concrete harm requirement?  And will this standard shift as identity thieves employ more sophisticated means to get the information they need?

Stay tuned, as this issue of FACTA standing is sure to get murkier as lower courts continue to grapple with the Supreme Court’s Spokeo decision.


Congress Holds Hearings on Privacy and Data Protection

With all of the hubbub swirling around Capitol Hill last week with the Michael Cohen hearings, you can’t be blamed if you missed the fact that two important congressional hearings on privacy and data protection took place as well, one in the House and one in the Senate.

First, on February 26, the House Energy and Commerce Committee’s Subcommittee on Consumer Protection and Commerce held a hearing titled “Protecting Consumer Privacy in the Era of Big Data.”  It was the first hearing on the topic in the 116th Congress.  Committee members expressed bipartisan support for enacting comprehensive legislation that would set a national standard for data protection, but differed on what that standard might be.  Republican committee members expressed concern that overly strict standards could burden and disadvantage small businesses.  They focused on how the European Union’s General Data Protection Regulation (GDPR) has advantaged companies with the largest market shares at the expense of smaller businesses.  Democrats, meanwhile, expressed concern over the discriminatory effects of a data marketplace without strong enough standards.

In opening statements, Representative Frank Pallone (D-NJ), Chairman of the full committee, said that dense and lengthy privacy policies mean that we can no longer rely on a system of notice and consent, and advocated for a shift toward a strong, comprehensive model of data protection.  Representative Greg Walden (R-OR), Ranking Member of the full committee, expressed a desire to work toward federal privacy legislation that focuses on 1) transparency and accountability, 2) protecting innovation and small businesses, and 3) setting a single national standard.  A number of witnesses testified before the subcommittee, including representatives from Color of Change, the largest online civil rights organization in the U.S., the American Enterprise Institute, and the Center for Democracy and Technology.

Then, on February 27, the Senate Commerce Committee held a hearing titled, “Policy Principles for a Federal Data Privacy Framework in the United States.”  Committee members from both parties expressed support for strong, comprehensive legislation to protect the privacy of consumer data.  They differed, however, on what preemptive effect any federal privacy law should have.  Republican committee members tended to support the idea of preemption to avoid the potential burden of complying with a patchwork of state laws with varying standards.  Democrats, on the other hand, expressed concern that passing a preemptive federal law could lead to a lower overall standard of data protection by nullifying stricter state laws.  The preemption issue is sure to remain a hot topic as at least some of the push to pass comprehensive federal privacy legislation is being driven by concerns over the California Consumer Privacy Act (CCPA), which is scheduled to become operative on January 1, 2020.

In opening statements, Chairman Roger Wicker (R-MS) advocated for a data privacy framework that is “uniquely American.”  This framework, he said, should preempt state law and interoperate with international laws to reduce the burdens of compliance.  He made it clear that “a national framework does not mean a weaker framework than what’s being developed in the states.”  Ranking Member Maria Cantwell (D-WA) described recent data breaches as part of a larger trend rather than one-off incidents.  She suggested that the GDPR and the CCPA could provide valuable insights to congressional efforts to create comprehensive federal data protection legislation.  She stated her position that “we cannot pass a weaker federal law at the expense of the states.”  Witnesses from several organizations testified before the committee, including representatives from the 21st Century Privacy Coalition, the Retail Industry Leaders Association, and the Interactive Advertising Bureau.

While potential comprehensive federal privacy legislation has gotten a lot of attention lately, any move from the current sectoral model of U.S. data protection to a comprehensive model will be a heavy lift and will require careful analysis and balancing of privacy rights and regulatory burden.  And all the while, technologies and techniques for exploiting security vulnerabilities will continue to evolve.  Therefore, statutory and regulatory regimes must provide ample protections while also remaining flexible enough to apply to evolving technologies.  As Senator Cantwell expressed, it will be no easy task.

Gregory is a Research Professional with the firm and is not an attorney.
