Pennsylvania County Faces Up To $67 Million In Damages For Distribution Of Criminal Record Information

A suburban Philadelphia county is facing a judgment of up to $67 million after a Pennsylvania federal jury found that it violated the Pennsylvania Criminal History Record Information Act.

Pennsylvania’s Criminal History Record Information Act (“CHRIA”) governs the dissemination of records held by criminal justice agencies.  It requires criminal justice agencies to expunge criminal history record information under certain circumstances.  It also contains detailed restrictions on when a criminal justice agency can distribute criminal record information to agencies other than criminal justice agencies or to individuals.  It provides that any person “aggrieved by a violation” of CHRIA “shall be entitled to actual and real damages of not less than $100 for each violation” and “not less than $1,000 nor more than $10,000” for each willful violation.

The Plaintiff in the case alleged that he had been arrested by the Bensalem Police Department in September of 1998 and was subsequently processed through the Bucks County Correctional Facility (“BCCF”).  He then successfully completed a pre-trial rehabilitation program, which allowed him to file a petition for expungement under state law.  He filed that petition, and the court issued an order of expungement in January of 2000.

Nevertheless, in 2007 BCCF created a website that made available to the public criminal history record information, including booking photographs (mug shots), of individuals who had been placed in BCCF after their arrest, going back some 70 years.  The information accessible on the website included information for individuals whose criminal records had later been expunged or whose charges had been dismissed.  Plaintiff’s information was accessible on the website.  Plaintiff alleged that a private business running websites named BustedMugshots.com and Mugshotsonline.com was able to gather the information from the BCCF website and make it available on its own websites for a fee, without the consent of the affected individuals.

As a result, Plaintiff filed a class action complaint on his own behalf and on behalf of others whose records had been expunged but whose information was nonetheless published on the BCCF website.  He asserted claims under CHRIA against BCCF and the private websites.  He also asserted claims against the private websites for the unauthorized use of his name or likeness and for false light invasion of privacy.

Plaintiff’s claims against the private websites ultimately failed.  The Court dismissed the CHRIA and unauthorized use of name or likeness claims at the outset.  The Court ruled that CHRIA, by its terms, applied only to criminal justice agencies.  The websites, on the other hand, were private actors.  Therefore, the Court concluded, the websites could have no liability under CHRIA.  The Court also dismissed Plaintiff’s claim for unauthorized use of name or likeness, because Plaintiff failed to show that his name and likeness had “commercial value” as required under the relevant statute.  While the Court allowed Plaintiff’s claim against the websites for false light invasion of privacy to move past the motion to dismiss stage, it ruled in favor of the websites on that claim at summary judgment.  The Court ruled that Plaintiff had failed to produce evidence that the websites acted with knowledge of, or reckless disregard as to, the falsity of the information about Plaintiff.  To the contrary, the Court found that the websites had no obvious reason to suspect that the information provided on the BCCF website included expunged records.

Consequently, the case moved forward only with respect to the CHRIA claim against BCCF.  The Court granted summary judgment in favor of Plaintiff on liability under CHRIA, finding that the distributed information was criminal record history information under “the unambiguous definition in CHRIA, Pennsylvania’s rules of statutory construction, relevant decisions by Pennsylvania courts, and the Attorney General’s CHRIA Handbook.”

Therefore, the only issues for trial were whether BCCF “willfully” violated CHRIA and the amount of damages.  The jury ultimately found that the violations were willful under CHRIA.  It fixed punitive damages at the statutory minimum of $1,000 per violation for the nearly 67,000 individuals whose records were unlawfully accessible on the website.  The potential $67 million verdict, however, is the ceiling.  The Court will later determine the exact number of class members who are eligible for the award.  The ultimate number is likely to decrease once deceased class members are removed from the equation.
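The scale of the exposure follows directly from CHRIA’s per-violation figures.  As a rough illustration (the class size below is the approximate number reported in the case, and the statutory range is the one quoted above):

```python
# CHRIA damages for willful violations: not less than $1,000 nor more
# than $10,000 per violation.  Class size is the approximate figure
# reported in the case, before deceased members are removed.
class_size = 67_000
floor_per_violation = 1_000
ceiling_per_violation = 10_000

minimum_award = class_size * floor_per_violation        # $67,000,000
maximum_exposure = class_size * ceiling_per_violation   # $670,000,000

print(f"At the statutory floor:   ${minimum_award:,}")
print(f"At the statutory ceiling: ${maximum_exposure:,}")
```

The jury chose the statutory floor, which is what produces the headline figure of up to $67 million.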

Posted in Privacy

The Value Of Quickly Disclosing A Data Breach

One of the first questions a company must answer after it discovers and remediates a data breach is, “What do we tell our customers?”  Companies may delay publicly announcing a data breach out of fear that doing so will harm their reputation with customers, leading to a loss of business.  They may take an inordinate amount of time to make a public announcement, thinking their public statement must be “just right.”  This is backward and outdated thinking.  Rather, a quick public announcement of a data breach is an essential part of saving and rebuilding a company’s reputation after a data breach.

First, it is important to recognize that a company whose data systems have been breached is not in control of when the breach will be revealed to the public.  There are tools available for individuals to see if their email addresses, passwords, social security numbers, credit card numbers, and the like have been posted on the dark web.  There are cybersecurity companies and ethical hackers who are constantly on the lookout for information demonstrating a new data breach.  Not speaking publicly about the problem will not make it go away.  If a system has been compromised, that fact is going to become known sooner rather than later, regardless of whether the owner of the compromised system announces it.

Therefore, the compromised company needs to get ahead of things to control the narrative.  We oftentimes forget that a company that has been hacked is a victim.  A timely public announcement can help to remind the public of that fact.  An announcement that acknowledges the problem, provides a meaningful recourse for those affected, and emphasizes the company’s commitment to work with law enforcement can help to shift the focus toward those who invaded the company’s systems.  Delaying announcement until after a breach is already publicly discovered robs the company of the opportunity to frame itself as part of the solution rather than part of the problem.

Indeed, recent experience shows that the way a company responds to a breach is more likely to cause reputational harm than the breach itself.  As a general matter, the public accepts that data breaches are an unfortunate reality of the digital age despite best efforts to prevent them.  Moreover, given the number and size of data breaches over the past decade, many people are resigned to the fact that much of their personal data has already been compromised.  They want to know of any additional breaches so that they can remain vigilant and spot potential fraud when it occurs.  A timely announcement by the owner of the breached system gives them the information they need.  Unnecessary delay can lead them to believe that the company is not taking its customers’ interests seriously.

Posted in Data Breach

Jury Verdict in TCPA Case Puts Over $925 Million In Damages On The Table

On April 12, 2019, an Oregon federal jury returned a Friday evening verdict in a Telephone Consumer Protection Act (TCPA) class action that could put the defendant on the hook for $925 million in damages.

The TCPA makes it unlawful to make a telephone call to any cell phone or residential telephone line using an artificial or prerecorded voice without the prior express consent of the called party.  It provides a private right of action that subjects violators to actual damages or $500 for each violation, whichever is greater.  A court can award treble damages for any violation it determines to be willful or knowing.

The case of Wakefield v. ViSalus, Inc. started back in 2015 when plaintiff Lori Wakefield filed a class action complaint in the U.S. District Court for the District of Oregon against ViSalus, Inc.  ViSalus is a multilevel marketing company that markets weight loss products, dietary supplements, and energy drinks.  Ms. Wakefield alleged that ViSalus engaged in a marketing campaign that consisted of calling phone numbers that were on the Do Not Call Registry and placing robocalls initiated with a prerecorded message, both without prior express consent.  Eventually, Ms. Wakefield sought certification of three classes: a “Do Not Call Class,” a “Robocall Class,” and an “Oregon-Stop-Calling Class,” the last based on alleged violations of state law.

In June of 2017, the Court certified the Robocall Class but denied certification of the Do Not Call Class and the Oregon-Stop-Calling Class.  With respect to the Do Not Call Class, the Court determined that Ms. Wakefield had not provided reliable evidence to show the number of potential class members.  With respect to the Oregon-Stop-Calling Class, the Court ruled that the claim did not involve common evidence across class members.  Therefore, only the claims with respect to the Robocall Class moved to trial.

After a 3-day trial that began on April 10, 2019, the jury returned its verdict.  The jury found that Ms. Wakefield proved that ViSalus made 4 calls to her in violation of the TCPA and 1,850,436 calls to class members in violation of the TCPA.  When asked to distinguish on the verdict sheet between how many of those calls were made to cell phones and how many were made to residential telephone lines, however, the jury wrote, “We cannot tell.”  Nor was the jury tasked with making a determination on damages.  Nevertheless, the number of calls and the $500 statutory minimum per violation put potential damages at more than $925 million.
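The statutory arithmetic behind the headline figure is simple multiplication, with trebling for a willful or knowing violation shown for comparison:

```python
# TCPA damages: the greater of actual damages or $500 per violation;
# a court may treble damages for willful or knowing violations.
class_calls = 1_850_436   # calls to class members found by the jury
plaintiff_calls = 4       # calls to Ms. Wakefield herself
per_call = 500

statutory_minimum = (class_calls + plaintiff_calls) * per_call
trebled = statutory_minimum * 3

print(f"Statutory minimum: ${statutory_minimum:,}")   # $925,220,000
print(f"If trebled:        ${trebled:,}")             # $2,775,660,000
```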

The verdict, however, is likely just the beginning of the next chapter in the case.  Defense counsel is likely to seize on the jury’s inability to distinguish between calls to cell phones and calls to residential landlines.  While the TCPA prohibits prerecorded voice calls to both, it does not prohibit prerecorded voice calls to business landlines.  Therefore, the jury’s statement of “We cannot tell” may give defense counsel an opening to argue that the jury based its verdict on mere speculation.  In any event, given the amount of damages at stake, an appeal appears inevitable.

Posted in Litigation, TCPA

5 Ways in Which Your Company’s Privacy Policy is Insufficient

Well thought-out internal privacy policies and procedures are an essential part of any company’s information management program.  These internal policies should not be confused with a company’s external privacy notice, which informs the company’s customers as to how it may process, store, and share their personal data.  Rather, the company’s internal privacy policy sets forth company goals with respect to protected data and defines company procedures to ensure that those goals are met.  Here are five top ways in which such policies are deficient.

  1. The privacy policy isn’t properly documented

It goes without saying that a company’s privacy policies and procedures themselves should be written down and stored in an accessible location.  But all of the underlying information giving rise to those policies and procedures should be thoroughly documented as well.  This documentation should include a comprehensive description of the company’s systems, data, and data flows.  Documenting this information along with the privacy policy will make it easier to identify when the circumstances underlying the policy have changed so that the policy is in need of an update (See #2 below).  It will also ease any transition when new employees become responsible for the company’s information management program.  Spending extra time up front to thoroughly inventory, understand, and document company data will pay dividends down the road.

  2. The privacy policy hasn’t been appropriately updated

Businesses change over time.  A company may enter a new line of business in which it gathers a new category of customer data.  Or a company’s use of personal information may shift between aggressive and conservative over time.  For example, a company may see an opportunity to position itself as a privacy leader in its industry, or it may have to tighten up its data protection practices to minimize reputational harm after a data breach.  Such internal changes warrant a re-examination of the company’s privacy policy.

External changes happen as well.  New laws and regulations in the field of data privacy are a seemingly daily occurrence.  Businesses must account for these changes by appropriately revising their privacy policies.  Moreover, even if a business periodically updates its privacy policy when a new law or regulation is passed, it must occasionally look at its privacy policy more holistically to ensure that it is in accordance with the company’s goals and the regulatory scheme as a whole.

  3. There is one blanket policy that applies to all categories of data

Given the alphabet soup of laws that apply to privacy and data protection, a blanket privacy policy is often insufficient.  Privacy laws differ in their definitions of what constitutes protected information.  For example, a company may hold personally identifiable information under a state privacy law and also hold protected health information under HIPAA.  These different categories of data may require separate privacy policies.  Similarly, laws such as the GDPR categorize personal data separately from sensitive personal data with different grounds for processing each.  Therefore, privacy policies must separately account for all of the categories of data that a company processes and place appropriate procedures and safeguards around each.

  4. The policy does not appropriately limit defined user roles

Even where a privacy policy properly accounts for all categories of data within an organization, it still must ensure that only appropriate users and systems have access to that data.  Any privacy policy must therefore establish appropriate access barriers across departments and lines of business.  For example, while it may be appropriate to give a certain category of employee (e.g., managers) high-level access to company data within their department, it may not be appropriate to give that category of employee high-level access to company data across the organization.  The privacy policy must account for this by ensuring that employees only have the access to company data necessary to carry out their job functions.  While this adds a layer of complexity to the administration of user accounts and access rights, it is necessary to ensure that only those with a need to know have access to sensitive data.
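As a minimal sketch of what department-scoped access rules can look like when translated into practice (the roles, data categories, and policy table here are hypothetical, chosen only to mirror the manager example above):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Employee:
    name: str
    role: str         # e.g., "manager", "analyst"
    department: str   # e.g., "marketing"

# Hypothetical policy table: which roles may read which data categories,
# and whether that grant is limited to the employee's own department.
POLICY = {
    ("manager", "customer_pii"): "own_department",
    ("analyst", "aggregate_stats"): "any_department",
}

def may_access(emp: Employee, category: str, data_department: str) -> bool:
    scope = POLICY.get((emp.role, category))
    if scope is None:
        return False                          # default deny: no grant, no access
    if scope == "own_department":
        return emp.department == data_department
    return True                               # "any_department"

alice = Employee("Alice", "manager", "marketing")
may_access(alice, "customer_pii", "marketing")   # permitted: own department
may_access(alice, "customer_pii", "finance")     # denied: cross-department
```

The key design choice is the default-deny rule: access that is not affirmatively granted in the policy is refused, which is what “only those with a need to know” requires.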

  5. The policy hasn’t been adequately communicated to the workforce

Even the best-conceived and comprehensive privacy policy won’t do much good if it isn’t communicated throughout the organization.  Moreover, simply posting the company’s privacy policy on the company intranet or including it in an employee handbook may be insufficient.  Appropriate employees need training on the policy with refresher training as policies evolve.  Client or customer-facing employees in particular warrant special attention, as they have to be able to externally communicate the contours of the company’s privacy policies and procedures.  Regular internal communication about the company privacy policy also ensures that privacy is at the forefront of employees’ minds, rather than just an afterthought.

Developing a comprehensive internal company privacy policy and implementing procedures to put that policy into action is certainly not an easy task.  It requires input from multiple stakeholders and buy-in from all levels of the corporate structure.  Moreover, once a privacy policy is in place, it must be viewed as a living document that is regularly reviewed, analyzed, and updated.  Nevertheless, having a complete and updated policy in place is essential to protect your company and your customers’ data.

Posted in Policies and Procedures, Privacy

U.S. Supreme Court Refuses to Search Google Settlement Agreement for Fairness

The U.S. Supreme Court on Wednesday remanded a class action against Google so that the lower courts could determine whether any of the named plaintiffs have standing under Spokeo, Inc. v. Robins.

The underlying suit alleged violations of the Stored Communications Act (“SCA”).  The SCA prohibits “a person or entity providing an electronic communications service to the public” from “knowingly divulg[ing] to any person or entity the contents of a communication while in electronic storage by that service.”  The plaintiffs alleged that Google violated this provision by sending users’ search terms to the server hosting the webpage that users clicked on from the search results page.

Google eventually negotiated a settlement with the class for $8.5 million.  Absent class members, however, were to receive no payment from the settlement fund.  Rather, over $2 million was to go to class counsel.  More than $5 million was to be donated to a number of cy pres recipients, nonprofit organizations whose work would indirectly benefit the class members.

The District Court approved the proposed settlement agreement over the objection of two named plaintiffs, who contended that the settlement was not “fair, reasonable, and adequate” as required under the Federal Rules of Civil Procedure.  After the Ninth Circuit affirmed the District Court’s approval, the Supreme Court granted certiorari on the question of whether the proposed settlement agreement was fair, reasonable, and adequate.

The Court, however, declined to decide the question, stating that it could not rule on the propriety of the proposed settlement agreement because “there remain substantial questions about whether any of the named plaintiffs has standing to sue in light of our decision” in Spokeo.

The Court noted that the District Court had previously rejected Google’s standing argument, relying on the Ninth Circuit case Edwards v. First American Corp.  In Edwards, the Ninth Circuit ruled that the violation of a statutory right automatically satisfies the injury-in-fact element of standing when an individual sues to vindicate that right.

After the District Court’s ruling, however, the Supreme Court handed down the Spokeo decision.  Spokeo abrogated Edwards, holding that “Article III standing requires a concrete injury even in the context of a statutory violation.”  Neither the District Court nor the Ninth Circuit re-examined the standing question in light of Spokeo.

Indeed, Google apparently never re-raised the standing issue after the Spokeo ruling.  Nevertheless, the Court noted that it had an independent “obligation to assure ourselves of litigants’ standing under Article III.”  It also stated, “A court is powerless to approve a proposed class settlement if it lacks jurisdiction over the dispute, and federal courts lack jurisdiction if no named plaintiff has standing.”  The Court therefore vacated the approval of the proposed settlement agreement and remanded for further proceedings on the standing question.

The Court’s ruling serves as a strong reminder of just how powerful a standing defense Spokeo can provide in suits alleging a violation of a privacy statute.

Posted in Litigation, Privacy

Third Circuit Affirms Dismissal of FACTA Suit on Standing Grounds

A three-judge panel of the Third Circuit recently affirmed a district court ruling that dismissed a suit for violation of the Fair and Accurate Credit Transaction Act of 2003 (FACTA) for lack of Article III standing.  The plaintiff, Ahmed Kamal, alleged that receipts he received from J. Crew showed the first six and last four digits of his credit card number in violation of FACTA.  The panel, applying the Supreme Court’s ruling in Spokeo, Inc. v. Robins, ruled that absent more, such an allegation of a “technical violation” is insufficient to demonstrate the concrete harm required for Article III standing.

Congress enacted FACTA to combat identity theft.  The statute prohibits businesses from printing more than the last five digits of a credit or debit card number on a receipt provided to the cardholder at the point of sale.  FACTA also prohibits businesses from printing the card’s expiration date on the receipt.  FACTA provides for actual damages and attorneys’ fees for negligent violations and statutory damages of up to $1,000, punitive damages, and attorneys’ fees for willful violations.

Kamal alleged that J. Crew willfully violated FACTA.  On three separate occasions, at three separate J. Crew stores, he received a receipt that showed the first six and the last four digits of his card number.  Kamal did not allege that anyone else saw these receipts, that his identity was stolen, or that his credit card number was compromised.  He filed a class action suit against J. Crew, which the district court ultimately dismissed for lack of Article III standing.

Article III standing is a component of the Constitution’s case or controversy requirement.  To maintain suit under this requirement, plaintiffs must show that 1) they suffered an injury in fact, 2) it is fairly traceable to the challenged conduct of the defendant, and 3) it is likely to be redressed by a favorable judicial decision.

On appeal, the issue before the panel was whether Kamal had sufficiently pled an injury in fact.  To do so, Kamal was required to “allege an invasion of a legally protected interest that is concrete and particularized and actual or imminent, not conjectural or hypothetical.”

Kamal argued that he had pled concrete injury for two reasons.  First, he argued that the violation of FACTA’s plain text was an intangible concrete harm in itself.  Second, he argued that the increased risk of identity theft from the violation was concrete harm.  After discussing Spokeo and a number of its own decisions, the panel rejected both arguments.

The panel discussed whether the alleged intangible harm had a close relationship to a harm that traditionally formed the basis of a common law action.  It discussed a number of privacy torts and concluded that Kamal’s alleged harm did not have a close relationship to them because they all required disclosure of some personal information to a third party.  Here, however, Kamal did not allege that any third party saw the offending receipts.

Next, the panel discussed whether Kamal had alleged an increased risk of the concrete harm of identity theft to satisfy Article III standing requirements.  The panel noted that the first six digits of a credit card number identify the bank and card type, information that is permitted to be printed elsewhere on the receipt under FACTA.  Therefore, J. Crew’s alleged violation did little to increase any risk of identity theft.

The panel also noted that for the alleged harm of identity theft to become realized, Kamal would have to lose or throw away the receipt, and then a would-be identity thief would have to find it and figure out the remaining digits along with additional information such as the expiration date, the Card Verification Value (CVV), or the billing zip code.  The panel agreed with the district court that this chain of events was too attenuated and speculative to present the degree of risk necessary to meet the concreteness requirement.
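The panel’s point about “figuring out the remaining digits” can be made concrete.  A receipt showing the first six and last four digits of a 16-digit card number leaves six digits unknown, or one million combinations, and only the card number’s built-in Luhn check digit narrows that field.  The brute-force sketch below (using a hypothetical issuer prefix and receipt tail) counts how many candidates survive the checksum:

```python
import itertools

def luhn_checksum(number: str) -> int:
    """Luhn mod-10 checksum of a digit string (0 means a valid card number)."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:        # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10

# Hypothetical first six digits (the issuer BIN) and last four from a receipt
first_six, last_four = "411111", "1111"

# Try all 10**6 possibilities for the six hidden middle digits
valid = sum(
    1
    for mid in itertools.product("0123456789", repeat=6)
    if luhn_checksum(first_six + "".join(mid) + last_four) == 0
)
print(valid)  # 100000: the check digit rules out only 9 of every 10 guesses
```

Even after the checksum, a thief faces 100,000 candidate numbers and still lacks the expiration date and CVV, which is the attenuation the panel relied on; a receipt printing all sixteen digits collapses that search space to one.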

The decision puts the Third Circuit in line with the Second Circuit, which ruled in Katz v. Donna Karan Co. that printing the first six credit card digits on a receipt was a “bare procedural violation” that “did not raise a material risk of harm to identity theft.”

The Eleventh Circuit, however, is on the other side of the issue.  In Muransky v. Godiva Chocolatier, Inc., it ruled that printing the first six digits on a receipt created concrete injury because it was similar to a common law action for breach of confidence.  The Kamal panel expressed its disagreement with the Eleventh Circuit because a breach of confidence action required disclosure to a third party, which Kamal had not alleged.

By requiring disclosure to a third party to show a close relationship to a traditional tort action, however, the panel essentially closed the door on one option to show concrete harm.  Under the panel’s reasoning, even printing the full credit card number on the receipt would not have a close relationship to traditional privacy torts so long as the merchant gave the receipt to only the customer.

Even so, under that set of facts, the plaintiff would likely be able to show concrete harm through the increased risk of identity theft.  The panel admitted that its “analysis would be different” had Kamal “alleged that the receipt included all sixteen digits of his credit card number, making the potential for fraud significantly less conjectural.”  But that raises the question of where courts should draw the line.  What about a receipt that shows 12 or 13 digits?  Is the risk of identity theft that much more appreciable to satisfy the concrete harm requirement?  And will this standard shift as identity thieves employ more sophisticated means to get the information they need?

Stay tuned, as this issue of FACTA standing is sure to get murkier as lower courts continue to grapple with the Supreme Court’s Spokeo decision.

Posted in Litigation, Privacy, Regulations

Congress Holds Hearings on Privacy and Data Protection

With all of the hubbub swirling around Capitol Hill last week with the Michael Cohen hearings, you can’t be blamed if you missed the fact that two important congressional hearings on privacy and data protection took place as well, one in the House and one in the Senate.

First, on February 26, the House Energy and Commerce Committee’s Subcommittee on Consumer Protection and Commerce held a hearing titled, “Protecting Consumer Privacy in the Era of Big Data.”  It was the first hearing on the topic in the 116th Congress.  Committee members expressed bipartisan support for enacting comprehensive legislation that would set a national standard for data protection, but differed on what that standard might be.  Republican committee members expressed concern that overly strict standards could burden and disadvantage small businesses.  They focused on how the European Union’s General Data Protection Regulation (GDPR) has advantaged companies with the largest market shares at the expense of smaller businesses.  Democrats, meanwhile, expressed concern over the discriminatory effects of a data marketplace without strong enough standards.

In opening statements, Representative Frank Pallone (D-NJ), Chairman of the full committee, said that dense and lengthy privacy policies mean that we can no longer rely on a system of notice and consent and advocated for a shift toward a strong, comprehensive model of data protection.  Representative Greg Walden (R-OR), Ranking Member of the full committee, expressed a desire to work toward federal privacy legislation that focuses on 1) transparency and accountability, 2) protecting innovation and small businesses, and 3) setting a single national standard.  A number of witnesses testified before the subcommittee, including representatives from Color of Change, the largest online civil rights organization in the U.S., the American Enterprise Institute, and the Center for Democracy and Technology.

Then, on February 27, the Senate Commerce Committee held a hearing titled, “Policy Principles for a Federal Data Privacy Framework in the United States.”  Committee members from both parties expressed support for strong, comprehensive legislation to protect the privacy of consumer data.  They differed, however, on what preemptive effect any federal privacy law should have.  Republican committee members tended to support the idea of preemption to avoid the potential burden of complying with a patchwork of state laws with varying standards.  Democrats, on the other hand, expressed concern that passing a preemptive federal law could lead to a lower overall standard of data protection by nullifying stricter state laws.  The preemption issue is sure to remain a hot topic as at least some of the push to pass comprehensive federal privacy legislation is being driven by concerns over the California Consumer Privacy Act (CCPA), which is scheduled to become operative on January 1, 2020.

In opening statements, Chairman Roger Wicker (R-MS) advocated for a data privacy framework that is “uniquely American.”  This framework, he said, should preempt state law and interoperate with international laws to reduce the burdens of compliance.  He made it clear that “a national framework does not mean a weaker framework than what’s being developed in the states.”  Ranking Member Maria Cantwell (D-WA) described recent data breaches as part of a larger trend rather than one-off incidents.  She suggested that the GDPR and the CCPA could provide valuable insights to congressional efforts to create comprehensive federal data protection legislation.  She stated her position that “we cannot pass a weaker federal law at the expense of the states.”  Witnesses from several organizations testified before the committee, including representatives from the 21st Century Privacy Coalition, the Retail Industry Leaders Association, and the Interactive Advertising Bureau.

While potential comprehensive federal privacy legislation has gotten a lot of attention lately, any move from the current sectoral model of U.S. data protection to a comprehensive model will be a heavy lift and will require careful analysis and balancing of privacy rights and regulatory burden.  And all the while, technologies and techniques for exploiting security vulnerabilities will continue to evolve.  Therefore, statutory and regulatory regimes must provide ample protections while also remaining flexible enough to be applicable to evolving technologies.  As expressed by Senator Cantwell, it will be no easy task.

 

Gregory is a Research Professional with the firm and is not an attorney.

Posted in Data Security, Legislation, Privacy

FTC Announces Record Settlement for Children’s Privacy Violations

On February 27, the FTC announced that the operators of the video social networking application Musical.ly, now known as TikTok, agreed to pay $5.7 million to settle allegations that they violated the Children’s Online Privacy Protection Act (COPPA). According to the FTC, this is the largest civil penalty obtained in a children’s privacy case. The proposed consent order also requires TikTok to destroy all user data for users under the age of 13 and for users who are over 13 but were under 13 when TikTok collected their data, unless TikTok has verifiable parental consent to collect, use, and disclose such data.

The application at issue allows users to create videos, edit them and synchronize them to music clips, and then share them with other users.  To register, users had to provide their email address, phone number, first and last name, bio, and profile picture. User accounts were set to “public” by default, meaning other users could search for and view the user’s profile. And for a period of time, the application collected geolocation data and had a “my city” function that allowed users to view a list of other users within a 50-mile radius. According to the FTC, the defendants had received thousands of parental complaints and there were numerous public reports of adults attempting to contact children through the application.

The FTC’s complaint, which the Department of Justice filed on its behalf, alleged several longstanding COPPA violations. Specifically, the FTC alleged that the defendants failed to provide required privacy notices, failed to obtain parental consent before collecting children’s personal information, failed to delete children’s personal information upon parental request, and retained children’s personal information for longer than was reasonably necessary to fulfill the purpose for which the information was collected.

The FTC also alleged that the application was directed, at least in part, to children under the age of 13 and that the defendants knew that children were using the application. Many users stated their age in their profile bio or provided grade school information that showed they were under 13. Many of the music clips available from the application’s library are popular with children, such as clips from Disney movies or clips from musical artists who are popular with “tweens and younger children.” And while the application began requesting age information in July 2017 and prevented users under 13 from creating accounts, it had no restrictions in place before then, and it did not request age information from users who created accounts before that date.

In conjunction with the announced civil penalty, Commissioners Rohit Chopra and Rebecca Kelly Slaughter issued a notable joint statement advocating for greater individual accountability for COPPA violators. The statement expressed Chopra’s and Slaughter’s belief that the violations showed “the company’s willingness to pursue growth even at the expense of endangering children.” They went on to say, “When any company appears to have made a business decision to violate or disregard the law, the Commission should identify and investigate those individuals who made or ratified that decision and evaluate whether to charge them.”

Of course, Chopra and Slaughter are only two of the five Commissioners. It therefore remains to be seen whether the Commission as a whole will adopt a more aggressive approach to enforcement against individuals or will reserve individual enforcement for only the most egregious cases. But the record civil penalty makes clear that the Commission is wary of companies that elevate profits over privacy.

Posted in FTC, Privacy, Regulations, Social Media

Is it Time to Rethink Notice and Choice as a Fair Information Privacy Practice?

Since the 1970s, fair information practices (FIPs) or fair information privacy practices (FIPPs) have formed the framework around which organizations structure their policies on data collection, use, disclosure, and retention.  The cornerstone of individual privacy rights under the FIPs is notice and choice, sometimes called notice and consent.  That is, an organization should inform individuals about how their personal information will be processed and shared and proceed only when an individual agrees to such use.  At first glance, these dual concepts may appear to adequately protect individual privacy.  As the digital landscape has evolved, however, it has become apparent that the notice and choice paradigm fails to protect individual privacy in important ways.

First, the concepts of notice and choice assume that the choice is informed, but that is often not the case.  Privacy notices are frequently buried in terms of service that are lengthy, confusing, and difficult to read.  They are often full of legalese and written from the perspective of protecting the organization from legal liability rather than from the perspective of genuinely and clearly informing users as to how their personal information might be shared.  The term “privacy notice” may give users the impression that it contains information on how the organization is going to protect personal information rather than how it is going to disclose that information, which further disincentivizes a close read.  All of this leads to the conclusion that a substantial number of individuals have no idea how companies are using or sharing their personal information.

That leads to a second related problem with the notice and choice framework.  Notice and choice adequately protect individual privacy only if the choice is meaningful and consent is freely given.  Yet accepting an organization’s privacy notice or terms of service is usually presented as a take it or leave it threshold requirement to access a website, web service, or application.  When faced with the choice of access or no access, users will choose access, no matter how draconian an organization’s information sharing practices may be.  In other words, conditioning user access on providing personal information and agreeing to an organization’s privacy policy gives the user a choice only in the most literal sense.  But given human nature and the presence of information technology in our daily lives, it really presents the user with no choice at all.

Both of these issues, difficult-to-understand privacy notices and conditioning access on acceptance, have real effects. Users are constantly inundated with lengthy terms of use that they know they have no choice but to accept if they want to access the website or application at issue. They soon become desensitized and simply click “accept.” Indeed, a 2017 Deloitte consumer survey concluded that 91% of consumers accept legal terms and conditions without reading them, a number that jumps to 97% for consumers age 18 to 34. These statistics show that while notice and choice may sound good in theory, the framework has real shortcomings in practice.

Recognizing that notice and choice may no longer be sufficient to protect individual data privacy rights, some privacy professionals have signaled a move away from the notice and choice paradigm. For example, in a September 2018 request for comments, the National Telecommunications and Information Administration (NTIA) noted, “To date, [mandates on notice and choice] have resulted primarily in long, legal, regulator-focused privacy policies and check boxes, which only help a very small number of users who choose to read these policies and make binary choices.” Fortunately, there are a number of things that a company can do to get out in front of this transition away from a strict notice and choice regime.

First, an organization can build consumer trust by posting an easy-to-understand, layered privacy notice.  A layered privacy notice starts with a short and simple statement of what personal information the organization collects and why it collects it.  This first-layer notice then contains a link to a fuller statement of the organization’s privacy policy.  This second layer can be a broader “highlights” document as well, with a further link to the full privacy policy or perhaps an FAQ page.  Short, top-layer notices also help users and protect the organization because they are more easily read on the smaller screens of mobile devices.  Moreover, being transparent and using plain language in its privacy notice will help the organization build goodwill with its customers.

Second, an organization can protect its customers’ privacy rights by minimizing the amount of data it collects on those customers.  Organizations should give serious thought before collecting more personal information than is necessary to provide the good or service in question.  Data is not only an asset, but also a potential liability.  While a data breach is never a pleasant experience, the harm to a company’s reputation will be amplified if the breach discloses personal information that has no rational connection to the good or service the organization provides to its customers.

Third, an organization can give its customers multiple options as to how their personal information is used and shared.  For example, customers may be fine with having their email addresses added to a company’s internal marketing list, but may not want that same information sold to a third-party mailing list.  True consumer choice requires more than an all-or-nothing approach.

As the practical shortcomings of the notice and choice framework become more apparent, lawmakers and regulators likely will begin to mandate a more holistic approach that looks more fully at what an organization does to protect individual privacy rights, rather than focusing on whether the organization simply complied with notice and choice requirements.  By thinking about this shift now, organizations can better prepare themselves for this transition while building trust and confidence with their customers at the same time.

Posted in Privacy, Standards

Privacy Primer: The Children’s Online Privacy Protection Act (COPPA)

COPPA is a U.S. law enacted by Congress in 1998 to address concerns regarding the online collection and disclosure of children’s personal information. Children (defined by COPPA as individuals under the age of 13) may not appreciate the significance of sharing their personal information online. Therefore, the goal of COPPA is to put the power of children’s online personal information into the hands of their parents.

COPPA tasked the FTC with promulgating rules to define what an unfair or deceptive trade practice is under the law. The current Children’s Online Privacy Protection Rule applies to operators of commercial websites or online services (including mobile applications) that are directed to children and to operators who have actual knowledge that they are collecting or maintaining children’s personal information. Under the Rule, such an operator:

(a) Must provide notice on its website of what information it collects from children, how it uses that information, and how it might disclose such information;
(b) Must obtain verifiable parental consent prior to collecting, using, or disclosing a child’s personal information;
(c) Must provide a reasonable means for a parent to review the personal information the operator has collected from a child and to refuse to permit further use of that information;
(d) Must establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children; and
(e) Cannot condition a child’s participation in a game, prize offering, or other activity on the child disclosing more personal information than is reasonably necessary to participate in such activity.

As a general matter, verifiable parental consent includes any method reasonably calculated, in light of available technologies, to ensure that the person providing consent is the child’s parent. The Rule lays out a list of methods that the FTC has determined to meet that requirement, such as providing a consent form for the parent to sign and return via mail, fax, or email. Violations of the Rule carry civil penalties of up to $41,484 per violation.

A number of states have passed legislation to fill the gap left by COPPA regarding teenagers. For example, the Delaware Online Privacy and Protection Act extends COPPA-like provisions to all Delaware residents who are under 18. Therefore, website operators and online service providers should be aware of potentially applicable state laws even if they do not believe that COPPA applies.

Posted in Legislation, Privacy, Regulations
About Cyber Law Monitor
In the new digital world, individuals and businesses are almost entirely dependent on computer technology and electronic communications to function on a daily basis. Although the power of modern technology is a source of opportunity and inspiration—it also poses huge challenges, from protecting privacy and securing proprietary data to adhering to fast-changing statutory and regulatory requirements. The Cyber Law Monitor blog covers privacy, data security, technology, and cyber space. It tracks major legal and policy developments and provides analysis of current events.