Industries

E-Commerce - Data Privacy

How companies collect and use consumer data – and how they inform consumers of those practices – is increasingly in the public eye. After several high-profile data-mining debacles, online privacy is a hot topic among government regulators, consumer advocacy groups, and industry leaders. It is also drawing attention from consumers, who now more regularly read the websites’ and apps’ terms and conditions they once hastily clicked through. Meanwhile, government agencies and private groups are policing whether companies actually comply with their own privacy policies.

New regulatory initiatives – such as the FTC’s proposed revisions to the Children’s Online Privacy Protection Act (COPPA) rules and the Obama Administration’s Privacy Bill of Rights – along with new self-regulatory projects, like the Digital Advertising Alliance’s and the World Wide Web Consortium’s “Do Not Track” standards, mean that companies doing business online need to think hard about their privacy policies. And with enforcement actions on the rise, companies also need to vet their actual data collection and usage practices against those policies.

Ifrah Law has extensive experience drafting privacy policies for a wide range of clients and website operators, as well as litigating privacy matters involving data and information. Our attorneys counsel businesses and individuals on information storage and rights of retrieval, including criminal-background websites and children’s online privacy protections under COPPA.

Our team includes a former staff attorney for the Federal Trade Commission’s Bureau
of Consumer Protection and we have managed high-profile FTC and CFTC investigations and enforcement actions.

Takedown Notice Success for an International VIP

When a blogger posted information from a hacked computer about an important Middle Eastern leader, Ifrah Law was asked to help. The site contained threats to the national security of the politician’s country as well as the United States – and to the life of the politician. The matter needed immediate attention and we responded to get the site (and others where the statements had been posted) taken down.

The case was made more complex when two web hosting companies created a sub-domain for the website and, citing their privacy policies, refused to disclose their users’ identities.

But the law offers some remedies of its own. One is the Digital Millennium Copyright Act (DMCA), a U.S. copyright statute that implements two World Intellectual Property Organization (WIPO) treaties. The DMCA shields an Internet Service Provider (ISP) from liability for transmitting information that may infringe a copyright, but conditions that protection on the ISP removing apparently infringing material from users’ websites. To that end, the DMCA provides for a takedown notice to be sent to an infringer’s ISP.

Ifrah Law successfully used takedown notices with two of the blogger’s ISPs as well as Facebook, and is pursuing other sites. We impressed upon the web hosting companies that the content posed an immediate threat to national security, and we emphasized to one company that the blogger had violated its terms and conditions.

Privacy issues on the Internet may be rampant, but they do not have to be a fact of life.

 

Keep It Short and Prosper

What a difference two words can make. Just ask the Center for Competitive Politics (CCP) or Americans for Prosperity (AFP), two organizations that filed separate lawsuits against the same defendant, California Attorney General Kamala Harris, over the same issue: whether Harris’s office had the right to access the organizations’ donor information. (The cases are Center for Competitive Politics v. Harris and Americans for Prosperity v. Harris.)

The plaintiffs’ arguments in each case were basically the same: the state’s request to access donor information would violate the First and Fourteenth Amendments of the U.S. Constitution. But there the similarities stopped: the CCP never got to trial, whereas the AFP did—and won!  Was the CCP the victim of a miscarriage of justice? Nah. It all came down to two words: “as applied.”

You know the saying “go big or go home”? Well, unfortunately, the CCP did both: it tried to get the court to rule that Harris’s probe of donor information would be unconstitutional for all organizations. The AFP took a different approach: it asked the court to call the probe unconstitutional “as applied” to the AFP alone.

Ding!

The AFP’s narrower approach enabled the court to provide relief without upsetting Harris’s authority and potentially affecting thousands of other organizations. Courts generally hesitate to invalidate a state’s actions when they can provide individual relief to the plaintiff instead. If the CCP had taken this course, it might have had a fighting chance. Instead, it took on the added burden of proving how the state’s actions would adversely affect every organization subject to the same request.

Meanwhile, the AFP coasted without having to prove any such thing. All it had to show was how the state’s request had already affected the organization and could continue to do so. This was no pleasant task, though. Several individuals testified that they suffered reprisals, assaults, and even death threats due to their association with the AFP—a strongly conservative organization. Clearly, being publicly linked to the AFP could lead to serious fallout. For her part, Harris tried to argue that the state would keep donor information confidential, but the AFP was able to show how this had failed before, citing over one thousand instances of donor information being improperly disclosed on the AG’s own website!

The AFP showed that the risk of scaring, and therefore discouraging, would-be donors was real. The chilling effect on individuals’ freedom of association would be too steep a price to pay for a nominal benefit to the state.

It was a strong case—unlike the defendant’s. Harris claimed that accessing donor information was in the state’s best interest; reviewing the findings would help uncover potential irregularities tied to fraud, waste, or abuse. Maybe it would—but it doesn’t pass the “exacting scrutiny” test, which requires states to protect their interests by the least restrictive means in situations like this. More importantly, Harris could not produce any evidence or testimony to corroborate her argument that access to donor information was important to state law enforcement. Although several state-employed investigators and attorneys took the stand, none could claim that they needed, or even used, donor information to do their work—and if they did need it, they could generally get it elsewhere. This evidentiary failure undercut Harris’s arguments and called into question the state’s overall scheme.

In the end, it was not a tough decision: with so strong a case from the plaintiff, and so weak a one from the state, the court sided plainly with the plaintiff. It could have gone a step further and declared the state’s actions broadly unconstitutional, but instead it judged them improper as applied to the AFP alone. That was the wiser course, because Harris will have a harder time challenging the decision on appeal.

So the AFP trial didn’t set a huge precedent for everyone—but that’s kind of the point. If you’re going to file suit, and there’s a path of least resistance, take it. Those sweeping courtroom victories you see in the movies are rare. In real life, justice takes baby steps.

The post Keep It Short and Prosper appeared first on Crime In The Suites.


Data Breach Lawsuits: Challenges Persist After Spokeo v. Robins


Data breaches are as common as the common cold—unfortunately, just as incurable. Run a news search on “data breaches” and you’ll find that all kinds of institutions—major retailers, tech companies, universities, even government agencies—have been vulnerable at some point. Now run a search on “data breaches,” but include the word “lawsuit.” You’ll find that many of these cases are going to court, but ultimately getting dismissed. What’s going on?

First, you should look at some of these lawsuits more closely: are they filed against the alleged perpetrators of the data breach? Many of them aren’t; those perpetrators are usually hackers who live outside the country or are unable to pay a money judgment. (In legal parlance, that’s known as being judgment proof.) Faced with those limitations, individual victims of data breaches frequently settle for the next best thing: going after the institutions that suffered the breach.

Often, this isn’t fair—the institutions are victims too. The point here is that although going after the institutions looks like an easy win from “deep pockets,” that seldom turns out to be the case.

Plaintiffs in data breach cases, which are usually class actions, need to demonstrate liability on the part of the institution. Much of the time, they rest their case on either negligence or breach of contract claims. Both legal theories require the plaintiff to show the same things: 1) that the defendant had a clear duty to protect the plaintiff’s data, 2) that the defendant breached that duty, and 3) that the plaintiff sustained injury as a result. (For breach of contract, plaintiffs must point to an express or sufficiently implied contract that binds the institution to the stated responsibilities; often this is the institution’s privacy policy.) Plaintiffs typically argue that the institution had an obligation to take precautionary measures against data breaches but failed, and therefore caused injury to the plaintiffs.

It’s with the third and final point—demonstrating injury—that plaintiffs have the most trouble. Why? Because courts view injury in fiscal terms; you need to show that you actually lost something, not simply that you might. So even if you were the victim of a data breach, as long as your data hasn’t yet been misused, it doesn’t really count as injury.

There have been exceptions, when courts greenlit cases based mainly on speculative injury, but these usually ended in a settlement before a legal precedent could be set. (See cases against Home Depot, Target, Adobe, and Sony.) For the most part, the fiscal view of injury has prevailed—reinforced in 2013, when the Supreme Court, weighing in on Clapper v. Amnesty International, determined that a plaintiff cannot proceed unless he or she can demonstrate actual injury or at least an imminent threat of injury, each measurable as economic loss. Otherwise, mere perception of injury is too tenuous to establish legal standing, which a case requires to go forward, and the lawsuit will probably get tossed.

The challenge of establishing legal standing recently made its way to the Supreme Court in Spokeo v. Robins. In that case, a plaintiff filed suit against the “people search engine” Spokeo for publishing false information about him. The issue before the Court was this central question of how much injury must be shown for a case to go forward. Prospective plaintiffs were optimistic that the high court would affirm a lower court’s decision that speculative injury was indeed enough. Alas, the Supreme Court sidestepped the issue and punted it back to the lower court for further review. The Court nonetheless reinforced the general tenet that, for a plaintiff to have standing to bring a case, he must allege an “injury in fact” that is both “concrete and particularized.” There is still room for the lower court to broaden the approach to what constitutes an injury, but the Supreme Court’s ruling keeps the status quo in place.

For now, individuals whose data has been compromised generally must be satisfied with what the institutions offer them after a breach occurs: free credit checks and/or access to credit monitors. Do checks and monitoring seem inadequate? Not if you think about what type of harm people face after a data breach. Individuals can detect and report problems in the event someone actually misuses their data. If they keep on top of it, their credit scores will not be impacted. Moreover, credit card companies and other financial institutions will bear the cost of any unapproved charges. In the event of further problems, plaintiffs can then take their injury to the legal system and have their day in court. But at this point, the courts are right to keep this type of class action litigation at bay.


Judge Flunks Case Against LabMD, FTC Appeals


In March 2015, I wrote about the ongoing dispute between the FTC and LabMD, an Atlanta-based cancer screening laboratory, and looked at whether the FTC has the authority to take enforcement action over data-security practices alleged to be insufficient and therefore “unfair” under section 5(n) of the Federal Trade Commission Act (“FTCA”). On November 13, 2015, an administrative law judge ruled that the FTC had failed to prove its case.

In 2013, the FTC filed an administrative complaint against LabMD, alleging it had failed to secure personal, patient-sensitive information on its computer networks. The FTC alleged that LabMD lacked a comprehensive information-security program, and had therefore failed to (i) implement measures to prevent or detect unauthorized access to the company’s computer networks, (ii) restrict employee access to patient data, and (iii) test for common security risks.

The FTC linked this absence of protocol to two security breaches. First, an insurance aging report containing personal information about thousands of LabMD customers was leaked from the billing manager’s computer onto the peer-to-peer file-sharing platform LimeWire, where it was available for download for at least eleven months. Second, Sacramento police reportedly discovered hard copies of LabMD records in the hands of unauthorized individuals, who were later charged with identity theft in an unrelated case of fraudulent billing and pleaded no contest.

Incriminating as it all might seem, Administrative Law Judge D. Michael Chappell dismissed the FTC’s complaint entirely, citing a failure to show that LabMD’s practices had caused substantial consumer injury in either incident.

Section 5(n) of the FTCA requires the FTC to show that LabMD’s acts or practices caused, or were likely to cause, substantial injury to consumers. The ALJ held that “substantial injury” means financial harm or unwarranted risks to health and safety. It does not cover embarrassment, stigma, or emotional suffering. As for “likely to cause,” the ALJ held that the FTC was required to prove “probable” harm, not simply “possible” or speculative harm. The ALJ noted that the statute authorizes the FTC’s regulation of future harm (assuming all statutory criteria are met), but that unfairness liability, in practice, applies only to cases involving actual harm.

In the case of the insurance aging report, the evidence showed that the file had been downloaded just once—by a company named Tiversa, which did so to pitch its own data-security services to LabMD. As for the hard copy records, their discovery could not be traced to LabMD’s data-security measures, said the ALJ. Indeed, the FTC had not shown that the hard copy records were ever on LabMD’s computer network.

The FTC had not proved—either with respect to the insurance aging report or the hard copy documents—that LabMD’s alleged security practices caused or were likely to cause consumer harm.

The FTC has appealed the ALJ’s decision to a panel of FTC Commissioners, who will render the agency’s final decision on the matter. The FTC’s attorneys argue that the ALJ took too narrow a view of harm, and that a substantial injury occurs whenever an act or practice poses a significant risk of concrete harm. According to the FTC’s complaint counsel, LabMD’s data-security measures posed a significant risk of concrete harm to consumers when the billing manager’s files were accessible via LimeWire, and that risk amounts to an actual, substantial consumer injury covered by section 5(n) of the FTCA.

The Commissioners heard oral arguments in early March and will probably issue a decision in the next several months. On March 20th, LabMD filed a related suit in district court seeking declaratory and injunctive relief against the Commission for its “unconstitutional abuse of government power and ultra vires actions.”


Police Make iPhone Public Enemy No. 1


FBI Director James Comey took a rare break from the posturing typical of investigators and prosecutors in the current showdown between Apple and the FBI.  While prosecutors argue that Apple’s privacy concerns are a smokescreen to avoid “assist[ing] the effort to fully investigate a deadly terrorist attack,” Comey posted a statement over the weekend in which he took the position that the tension between security and privacy “should not be resolved by corporations that sell stuff for a living.  It also should not be resolved by the FBI, which investigates for a living.  It should be resolved by the American people deciding how we want to govern ourselves in a world we have never seen before.”

Comey’s statement highlights a crucial problem with the development of privacy law: it often is developed in the context of important criminal cases.  This comes at a real cost.  We all know that Syed Farook committed a horrific crime, and any rights he once had against government searches are now forfeit.  But though Apple may have chosen to serve as a limited proxy for its consumers in the San Bernardino case, often the interests of private citizens are wholly absent from the courtroom (or, often, judge’s chambers) when issues of fundamental privacy are debated.

This leads to a serious imbalance: Apple is talking about the diffuse privacy rights of its consumers and the risks of potential incursions by more restrictive, less democratic governments such as China.  On the other hand, Manhattan District Attorney Cyrus Vance can point to 175 Apple devices that he cannot physically access even though those devices may contain evidence helpful to the government.

New York Police Commissioner Bill Bratton and one of his deputies put an even finer point on it in an Op-Ed in The New York Times, citing a specific case of a murder victim in Louisiana (more than one thousand miles outside of Mr. Bratton’s jurisdiction) whose murder is unsolved because officers cannot unlock her iPhone, which is believed to contain her killer’s identity. “How is not solving a murder, or not finding the message that might stop the next terrorist attack, protecting anyone?” asks Bratton.

But in assuming that private citizens have no greater fear than whether the police can investigate and prevent crimes, Bratton begs the question.  In reality, citizens may see law enforcement as a threat in itself.  Learning that the NSA was engaging in comprehensive warrantless surveillance likely has given many law-abiding Americans a greater incentive to protect their data from being accessed by the government.  Indeed, in light of the NYPD’s record over the last few years—including a finding by a federal judge that it was systematically violating the rights of black New Yorkers and a lawsuit over religion-based spying on Muslims—it is not hard to see why citizens might want protection against Bratton’s police force.

But even if the police were the angels they purport to be, opening a door for a white hat can easily allow access to a black one.  Less than a year ago, hackers used a “brute force” approach to exploit a flaw in iCloud’s security, and dozens of celebrities had their private photos shared with the world.  These sex crimes are all but forgotten in the context of the San Bernardino shootings, even though the security weakness the FBI wants installed in Farook’s iPhone is markedly similar to that exploited with respect to iCloud.

Nor do those who wish for privacy need to invoke hackers or criminals.  A private, intimate moment with a spouse or loved one; a half-finished poem, story, or work of art; or even a professional relationship with a doctor or mental health professional cannot exist unless they can remain private.  Once these interactions took place in spoken, unrecorded conversations or on easily discarded paper; now many of our daily activities are carried out on our mobile devices.  Even if one has nothing to hide, many citizens might balk at the prospect of having to preserve their private conversations in a format readily accessible by the police.

But if Mr. Comey has shown unusual insight, Mr. Bratton’s one-sided, myopic question illustrates the importance of Apple’s position and the inability of law enforcement officials to be objective about the interests at stake.  Police and prosecutors are not always your friends or your defenders.  Their goals are—and always will be—investigating and solving crimes and convicting suspected criminals.  The less an officer knows, the harder it will be to investigate a case.  As a result, privacy rights—even when asserted by innocent, law-abiding citizens—make their job more difficult, and many officers see those rights as simply standing in their way.

This is hardly news.  Nearly sixty years ago the Supreme Court observed that officers, “engaged in the often competitive enterprise of ferreting out crime,” are simply not capable of being neutral in criminal investigations.  For precisely that reason, the Fourth Amendment requires them to seek approval from a “neutral and detached magistrate” before a search warrant may issue.

That is why Mr. Comey’s acknowledgement that the FBI is not a disinterested party is so refreshing.  Pro-law-enforcement voices have been clamoring to require Apple to compromise the security it built into the iPhone, invoking their role as public servants to buttress their credibility.  But when it comes to privacy, the police do not—and cannot—represent the public interest.  As Comey acknowledged, they are “investigators,” and privacy rights will always stand as an obstacle to investigation.


FBI Recruits Apple to Help Unlock Your iPhone


It is a well-known maxim that “bad facts make bad law.”  And as anybody even casually browsing social media this week likely has seen, the incredibly tragic facts surrounding the San Bernardino attack last December have led to a ruling that jeopardizes the privacy rights of all law-abiding Americans.

First, it is important to clearly understand the ruling.  After the horrific attack in San Bernardino on December 2, 2015, the FBI seized and searched many possessions of shooters Syed Rizwan Farook and Tashfeen Malik in its investigation of the attack.  One item seized was Farook’s Apple iPhone 5C.  The iPhone itself was locked and passcode-protected, but the FBI was able to obtain backups from Farook’s iCloud account.  These backups stopped nearly six weeks before the shootings, suggesting that Farook had disabled the automatic backup feature and that his phone may contain additional information helpful to the investigation.

Under past versions of iOS, the iPhone’s operating system, Apple had been able to pull information off of a locked phone in similar situations.  However, Farook’s iPhone—like all newer models—contains security features that make that impossible.  First, the data on the phone is encrypted with a complex key that is hardwired into the device itself.  This prevents the data from being transferred to another computer (a common step in computer forensics known as “imaging”) in a usable format.  Second, the iPhone itself will not run any software that does not contain a digital “signature” from Apple.  This prevents the FBI from loading its own forensic software onto Farook’s iPhone.  And third, operating the iPhone requires a numeric passcode; each incorrect passcode will lock out a user for an increasing length of time, and the tenth consecutive incorrect passcode entry will delete all data on the phone irretrievably.  This prevents the FBI from trying to unlock the iPhone without a real risk of losing all of its contents.
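The escalating lockout and auto-erase behavior described above can be sketched in a few lines of code. This is purely an illustration of the mechanism, not Apple’s actual implementation; the specific delay values and class names here are hypothetical.

```python
# Hypothetical model of the passcode protections described above:
# escalating delays after repeated failures, and irreversible erasure
# on the tenth consecutive wrong entry. Delay values are illustrative.
ESCALATING_DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600]  # seconds
MAX_ATTEMPTS = 10

class LockedDevice:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self.failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("data erased; nothing left to unlock")
        if guess == self._passcode:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_ATTEMPTS:
            self.wiped = True  # irreversible loss of all user data
        return False

    def next_delay(self) -> int:
        """Seconds the user must wait before the next attempt."""
        if self.wiped:
            return 0
        return ESCALATING_DELAYS[min(self.failed_attempts,
                                     len(ESCALATING_DELAYS) - 1)]
```

The point of the model is the last branch: once the tenth wrong guess lands, no amount of further effort recovers the data, which is exactly why the FBI cannot simply guess passcodes on the real device.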

As Apple CEO Tim Cook has explained, this system was created deliberately to ensure the security of its users’ personal data against all threats.  Indeed, even Apple itself cannot access its customers’ encrypted data.  This creates a unique problem for the FBI.  It is well-settled that, pursuant to a valid search warrant, a court can order a third party to assist law enforcement agents with a search by providing physical access to equipment, unlocking a door, providing camera footage, or even giving technical assistance with unlocking or accessing software or devices.  And, as the government has acknowledged, Apple has “routinely” provided such assistance when it has had the ability to access the data on an iPhone.

But while courts have required third parties to unlock doors, they have never required them to reverse-engineer a key.  That is what sets this case apart: to assist the government, Apple would have to create something that not only does not exist, but that it deliberately declined to create in the first instance.

On February 16, Assistant U.S. Attorneys in Los Angeles filed an ex parte motion (that is, without providing Apple with notice or a chance to respond) in federal court seeking to require Apple to create a new piece of software that would (1) disable the auto-erase feature triggered by too many failed passcode attempts and (2) eliminate the delays between failed passcode attempts.  In theory, this software is to work only on Farook’s iPhone and no other.  This would allow the FBI to use a computer to simply try all of the possible passcodes in rapid succession in a “brute force” attack on the phone.  That same day, Magistrate Judge Sheri Pym signed what appears to be an unmodified version of the order proposed by the government, ordering Apple to comply or to respond within five business days.
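Some back-of-the-envelope arithmetic shows why stripping out the delays and the auto-erase feature matters so much. The per-attempt time below is an assumption standing in for whatever hardware-enforced floor remains on the device, not an official figure.

```python
# Rough estimate of a "brute force" attack once the software protections
# are removed: enumerate every numeric passcode of a given length.
# SECONDS_PER_ATTEMPT is an assumed hardware-imposed floor per guess.
SECONDS_PER_ATTEMPT = 0.08

def worst_case_hours(digits: int) -> float:
    """Hours needed to exhaust every numeric passcode of this length."""
    keyspace = 10 ** digits          # e.g. 10,000 codes for 4 digits
    return keyspace * SECONDS_PER_ATTEMPT / 3600

print(f"4-digit passcode: {worst_case_hours(4):.2f} hours")  # well under an hour
print(f"6-digit passcode: {worst_case_hours(6):.1f} hours")  # roughly a day
```

Under these assumptions a four-digit passcode falls in minutes and even a six-digit one in about a day, which is why the auto-erase limit, rather than the passcode itself, does most of the protective work.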

Though Apple has not filed a formal response, CEO Tim Cook already has made waves by publicly stating that Apple will oppose the order.  In a clear and well-written open letter, Cook explains that Apple made the deliberate choice not to build a backdoor into the iPhone because to do so would fatally undermine the encryption measures built in.  He explains that the notion that Apple could create specialized software for Farook’s iPhone only is a myth, and that “[o]nce created, this technique could be used over and over again, on any number of devices.  In the physical world, it would be the equivalent of a master key . . . .”

This has re-ignited the long-standing debate over the proper balance between individual privacy and security (and the debate over whether the two principles truly are opposed to one another).  This is all to the good, but misses a key point: Judge Pym’s order, if it stands, has not only short-circuited this debate, it ignores the resolution that Congress already reached on the issue.

Indeed, a 1994 law known as the Communications Assistance for Law Enforcement Act (“CALEA”) appears to prohibit exactly what the government requested here.  Though CALEA preserved the ability of law enforcement to execute wiretaps after changing technology made that more complicated than physically “tapping” a telephone line, it expressly does not require that information service providers or equipment manufacturers do anything to open their consumers to government searches.  But instead of addressing whether that purpose-built law permits the type of onerous and far-reaching order that was granted here, both the government and the court relied only on the All Writs Act—the two-century-old catch-all statute that judges rely on when ordering parties to unlock doors or turn over security footage.

Though judges frequently must weigh in and issue binding decisions on fiercely contested matters of great importance, they rarely do so with so little explanation, or after such short consideration of the matter.  Indeed, when the government sought an identical order this past October in federal court in Brooklyn, N.Y., Magistrate Judge James Orenstein asked for briefs from Apple, the government, and a group of privacy rights organizations and, four months later, has yet to issue an opinion.  Yet Judge Pym granted a similar order, without any stated justification, the same day that it was sought.

An order that is so far-reaching, so under-explained, and so clearly legally incorrect is deeply concerning.  And yet, but for Apple’s choice to publicize its opposition, this unjustified erosion of our privacy could have happened under the radar and without any way to un-ring the bell.  Fortunately, we appear to have avoided that outcome, and we can hope that Apple’s briefing will give the court the additional legal authority—and the additional time—that it will need to revisit its ruling.

