You are viewing an archived webpage. The information on this page may be out of date. Learn about EPIC's recent work at epic.org.

In re Facebook and Facial Recognition (2018)

Summary

On April 6, 2018, EPIC and a coalition of consumer privacy organizations filed a complaint with the Federal Trade Commission, charging that Facebook's facial recognition practice lacks privacy safeguards and violates the 2011 Consent Order with the Commission.

The complaint addresses a change in Facebook's business practices that took effect in early 2018, which enables Facebook to routinely scan photos posted by users for biometric facial matches without the consent of either the image subject or the person who uploaded the photo. EPIC and the consumer groups emphasized to the FTC that "the scanning of facial images without express, affirmative consent is unlawful and must be enjoined."

In the complaint, EPIC and the groups ask the FTC to investigate Facebook, determine the extent of the harm to consumer privacy and safety, require Facebook to cease the collection and use of users' biometric data without their affirmative and express opt-in consent, prohibit the deployment of further facial recognition techniques, delete all facial templates and biometric identifiers wrongly obtained, establish appropriate security safeguards, limit the disclosure of user information to third parties, and seek appropriate injunctive and compensatory relief. The following organizations signed onto the complaint: Electronic Privacy Information Center, Campaign for a Commercial Free Childhood, Center for Digital Democracy, Constitutional Alliance, Consumer Action, Consumer Federation of America, Consumer Watchdog, Cyber Privacy Project, Defending Rights & Dissent, Government Accountability Project, Patient Privacy Rights, Southern Poverty Law Center, U.S. Public Interest Research Group.

Top News

  • EPIC Amicus: Unlawful Collection of Biometric Data Establishes Standing: EPIC has filed an amicus brief in a case concerning Facebook's collection of facial images in violation of the Illinois Biometric Information Privacy Act. In Patel v. Facebook, EPIC argued that the violation of the privacy law was sufficient for Facebook users to sue the company. EPIC said that the legal doctrine of standing "simply requires plaintiffs to demonstrate that a defendant has invaded a concrete interest protected by the law—nothing more." Earlier in 2018, EPIC filed an amicus brief in Rosenbach v. Six Flags, another case about the Illinois biometric privacy law. EPIC routinely submits briefs in support of standing in privacy cases. EPIC has also long advocated for limits on the use of biometric data and has opposed Facebook's use of facial recognition software. (Dec. 18, 2018)
  • UPDATE - EPIC, Consumer Groups Urge FTC to Investigate Facebook's Use of Facial Recognition: EPIC and a coalition of consumer groups have filed a complaint with the FTC, charging that Facebook's use of facial recognition techniques threatens user privacy and "in multiple ways" violates the 2011 Consent Order with the Commission. "The scanning of facial images without express, affirmative consent is unlawful and must be enjoined," the groups wrote. Last week the organizations urged the Federal Trade Commission to reopen the 2009 investigation of Facebook, arguing that the disclosure of user data to Cambridge Analytica violated the consent order, and noting that the order also prohibited Facebook from "making misrepresentations about the privacy or security of consumers' personal information." In 2011 EPIC and consumer groups urged the FTC to investigate Facebook's facial recognition practices. In 2012 EPIC advised the FTC: "Commercial actors should not deploy facial techniques until adequate safeguards are established. As such safeguards have not yet been established, EPIC would recommend a moratorium on the commercial deployment of these techniques." EPIC President Marc Rotenberg said today, "Facebook should suspend further deployment of facial recognition pending the outcome of the FTC investigation." (Apr. 6, 2018)

  • EPIC, Consumer Groups to Urge Federal Trade Commission to Investigate Facebook's Use of Facial Recognition (Apr. 5, 2018)
    EPIC and a coalition of consumer groups will file a complaint with the FTC on Friday charging that Facebook's use of facial recognition techniques threatens user privacy and violates the 2011 Consent Order with the Commission. "The scanning of facial images without express, affirmative consent is unlawful and must be enjoined," the groups wrote. Last week the organizations urged the Federal Trade Commission to reopen the 2009 investigation of Facebook, arguing that the disclosure of user data to Cambridge Analytica violated the consent order, and noting that the order also prohibited Facebook from "making misrepresentations about the privacy or security of consumers' personal information." The FTC has confirmed that an investigation is now underway. The FTC said, "Companies who have settled previous FTC actions must also comply with FTC order provisions imposing privacy and data security requirements." Facebook CEO Mark Zuckerberg will testify next week before the Senate Judiciary Committee and the House Commerce Committee. In 2011 EPIC urged the FTC to investigate Facebook's facial recognition practices. In 2012 EPIC advised the FTC: "Commercial actors should not deploy facial techniques until adequate safeguards are established. As such safeguards have not yet been established, EPIC would recommend a moratorium on the commercial deployment of these techniques."

Background

EPIC's Previous Complaints on Facebook and Facial Recognition

EPIC has previously urged the Commission to prohibit Facebook's facial recognition techniques on multiple occasions.

In June 2011, EPIC and a coalition of consumer organizations filed a complaint with the FTC alleging that Facebook's covert deployment of its facial recognition technology was unfair and deceptive. EPIC stated that Facebook's "Tag Suggestions" technique "converts the photos uploaded by Facebook users into an image identification system under the sole control of Facebook. This has occurred without the knowledge or consent of Facebook users and without adequate consideration of the risks to Facebook users." EPIC warned that "unless the Commission acts promptly, Facebook will routinely automate facial identification and eliminate any pretense of user control over the use of their own images for online identification." EPIC emphasized that the Commission's "failure to act on pending consumer complaints concerning Facebook's unfair and deceptive trade practices may have contributed to Facebook's decision to deploy facial recognition."

In December 2011, EPIC urged the Commission to strengthen its proposed settlement with Facebook by requiring it to "cease creating facial recognition profiles without users' affirmative consent." EPIC contended that while the Order's broad prohibition on privacy misrepresentations already covered Facebook's deceptive use of facial recognition, the Order should have been amended to proscribe the practice explicitly.

In January 2012, EPIC submitted extensive comments in response to the FTC's workshop "Facing Facts: A Forum on Facial Recognition Technology." EPIC again emphasized that Facebook's facial recognition practice "entirely fails at informing users how their photo data will be used or to provide any meaningful consent for use," as required by the Order. EPIC advised the Commission: "Commercial actors should not deploy facial techniques until adequate safeguards are established. As such safeguards have not yet been established, EPIC would recommend a moratorium on the commercial deployment of facial recognition techniques."

2018 FTC Complaint

EPIC's 2018 FTC complaint on Facebook and Facial Recognition is signed by a number of other consumer privacy organizations, including the Campaign for a Commercial Free Childhood, Center for Digital Democracy, Constitutional Alliance, Consumer Action, Consumer Federation of America, Consumer Watchdog, Cyber Privacy Project, Defending Rights & Dissent, Government Accountability Project, Patient Privacy Rights, Privacy Rights Clearinghouse, Southern Poverty Law Center, and U.S. Public Interest Research Group.

The complaint concerns recent changes in Facebook's business practices that threaten user privacy and violate the 2011 Consent Order with the Federal Trade Commission. Facebook has begun to routinely scan photos posted by users for biometric facial matches without the consent of either the image subject or the person who uploaded the photo.

The complaint alleges that Facebook seeks to perfect its facial recognition techniques by enlisting Facebook users in the process of confirming their image identity. This automated, deceptive, and unnecessary identification of individuals undermines user privacy, ignores the privacy settings of Facebook users, and is contrary to law in many parts of the world. The Commission is required by law to undertake an investigation, to enjoin these unlawful practices, and to provide appropriate remedies to users of the service.

Facebook's Deployment of Facial Recognition

On December 19, 2017, Facebook announced in a press release that the facial recognition changes would roll out to users in the United States in early 2018 on an opt-out basis. Facebook automatically enrolled users whose privacy settings enabled Tag Suggestions, a feature that Facebook turned on by default for all users in 2013 (1.19 billion monthly active users at the time) without their knowledge or affirmative express consent.

Facebook's Tag Suggestions technique converts the photos uploaded by Facebook users into an image identification system under the sole control of Facebook. Many users remain unaware that Tag Suggestions was applied to them by default in 2013 and that they have a choice to opt out. Facebook's reliance on this prior setting to infer consent for additional facial recognition practices, which gives Facebook unprecedented control over facial templates, therefore demonstrates a disregard for consumer privacy and underscores the urgent need for FTC intervention.

The change in business practice subjects consumers to secretive, unnecessary, and undesired facial recognition and to the disclosure of personal data to third parties without affirmative express consent or clear and prominent notice. It therefore violates the 2011 FTC Consent Order, which requires Facebook to "obtain consumers' affirmative express consent before enacting changes that override their privacy preferences."

Privacy Implications of Facebook Facial Scanning

Biometric data can be analyzed to produce sensitive inferences about personal traits, demographics, and behaviors. With more than 350 million photos uploaded to Facebook every day, Facebook's unprecedented repository of facial recognition data puts the sensitive personal information of 2.13 billion users at risk of misuse by app developers, advertisers, data brokers, foreign state actors, and government agencies. The interests of approximately 214 million Facebook users in the United States fall within the jurisdiction of the Federal Trade Commission.

Facebook's changes to its facial recognition techniques operate through an expansive machine learning system that automatically scans photos and notifies users when their biometric face print is detected in an image, even if the uploader has not tagged them. Users are notified so that they can "find" photos they appear in but have not been tagged in, as long as the photo's privacy settings (Friends, Public, or a Custom Audience) allow them to view it.
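
For illustration, a facial recognition match of this general kind is typically implemented by computing a numeric "embedding" for each detected face and comparing it against a database of stored facial templates, with a similarity threshold deciding whether to report a match. The Python sketch below is a minimal illustration under those common assumptions, not Facebook's actual system; the function names, the 128-dimensional embeddings, and the 0.6 threshold are hypothetical.

    import numpy as np

    def cosine_similarity(a, b):
        # Cosine similarity between two face-embedding vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def match_face(query_embedding, stored_templates, threshold=0.6):
        # Return the user ID whose stored facial template is most similar to
        # the query embedding, or None if no similarity clears the threshold.
        best_id, best_score = None, threshold
        for user_id, template in stored_templates.items():
            score = cosine_similarity(query_embedding, template)
            if score > best_score:
                best_id, best_score = user_id, score
        return best_id

    # Random vectors stand in for the embeddings a real face-recognition model
    # would compute from photos; "user_a" and "user_b" are hypothetical IDs.
    rng = np.random.default_rng(0)
    templates = {"user_a": rng.normal(size=128), "user_b": rng.normal(size=128)}
    query = templates["user_a"] + rng.normal(scale=0.05, size=128)
    print(match_face(query, templates))  # prints "user_a"

Applied automatically to every uploaded photo at scale, a matching step like this is what turns a photo archive into the kind of image identification system described above.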

After the 2018 changes, Facebook users can identify themselves in an otherwise anonymous image without any tags. Serious privacy implications arise from enabling photo subjects to contact the uploader about a facial recognition notification: a stranger could learn an individual's precise identity on Facebook merely by taking a picture in which that person appears in the background. The unnecessary identification of individuals also reveals personal and private details about where the image subject was, who they were with, and at what time. This technology imperils both consumer privacy and physical safety, exposing individuals to harassment and stalking.

Prohibitions of Facebook Facial Recognition

Facebook's extension of facial recognition technology excludes users in Canada and Europe, where regulators have imposed strict limits on how companies can collect and store biometric data. Facebook's deployment of its facial recognition technology would be an illegal invasion of citizens' privacy rights in Canada and Europe, yet it continues to proliferate in the U.S. without restriction.

Facebook's invasive facial recognition techniques may also contravene several state privacy laws that prohibit or limit the collection, use, and dissemination of biometric data. Illinois has enacted the Biometric Information Privacy Act, under which Facebook faces a class action lawsuit over its Tag Suggestions technology. Texas and Washington also have biometric identifier laws that allow the state attorneys general to bring legal action against prohibited biometric data practices.

Violations of 2011 Consent Order

Facebook has violated Part I(A)-(B) of the Consent Order by misrepresenting the extent to which users can control the privacy of their biometric information and the extent of Facebook's collection and disclosure of facial templates and photo comparison data to third parties.

Facebook has violated Part II(B) of the Consent Order by failing to obtain affirmative express consent before implementing business changes to its facial recognition techniques. Any claim of inferred or continuing consent based on the user's prior Tag Suggestions setting is invalid, as Facebook has never given users a choice to opt in to facial recognition.

Facebook's recent notice to users about the expanded facial recognition does not conspicuously present an opt-out option; it merely links to a "Go to Settings" button. The announcement appeared at the top of the news feed as a text box, but it did not clearly and prominently identify itself as notice of a significant change to the user's privacy settings.

Facebook did not persistently remind users to manage their privacy settings in response to this significant change in biometric data practices. The notice at the top of the news feed disappeared once the page was refreshed during the day.

Contrary to Part II(A) of the 2011 Consent Order, Facebook did not provide "clear and prominent" notice to users prior to disclosing their nonpublic information to third parties and materially exceeding the restrictions imposed by their privacy settings.

The Importance of Enforcing Consent Orders for Consumer Privacy

The effectiveness of the FTC depends primarily upon the agency's willingness to enforce the legal judgments it obtains. However, the FTC routinely fails to enforce its consent orders, which fosters industry disregard for the agency. Companies under consent decree have no incentive to protect consumer data if they do not expect the FTC to hold them accountable when they violate consent decrees.

EPIC has routinely called attention to the numerous changes Facebook has made to its privacy settings without obtaining users' affirmative consent, in violation of the terms of its FTC consent decree.

FTC Authority to Act

The Commission has a non-discretionary obligation to enforce a final order.

To date, the FTC has failed to take any action with respect to Facebook's changes in biometric privacy practices. Critically, the Commission has not filed a lawsuit pursuant to the Federal Trade Commission Act, which states that the FTC "shall" obtain injunctive relief and recover civil penalties against companies that violate consent orders. 15 U.S.C. § 45(l).

The FTC has exclusive authority over the enforcement of its consent orders. The enforcement provision of the FTC Act, Section 5(l), makes clear that agency action is not discretionary; a violating party "shall forfeit" a penalty and be subject to an enforcement action.

The FTC is charged with performing a "discrete agency action." A "discrete agency action" is a "final agency action" under the Administrative Procedure Act. "Agency action unlawfully withheld" is defined as "discrete agency action that [the agency] is required to take."

Agency action is the "whole or part of an agency rule, order, license, sanction, relief, or the equivalent or denial thereof, or failure to act." 5 U.S.C. § 551(13). Agency action, including a "failure to act," is subject to judicial review.

EPIC may "compel agency action unlawfully withheld" pursuant to the Administrative Procedure Act. 5 U.S.C. § 706(1).

Legal Documents

EPIC Work on Facebook Privacy

News Stories
