
FTC Cracks Down on Privacy Policy Transparency, Signaling Need for Clear Data-Sharing Disclosures

    Client Alerts
  • April 29, 2026

The Federal Trade Commission's recent enforcement action against the online dating platform OkCupid highlights a perennial compliance question for any consumer-facing business: Do the company's actual data practices match its privacy representations?

On March 30, 2026, the FTC filed a complaint and proposed stipulated order in federal court against Match Group Americas LLC and Humor Rainbow Inc., which together own and operate OkCupid. The complaint alleges that OkCupid shared users' personal information, including profile photos and location data, with an unrelated third party in a manner inconsistent with its privacy policy, and that this conduct amounted to a deceptive practice in violation of Section 5 of the FTC Act.

At the relevant time, OkCupid's privacy policy represented that it would share personal information only with service providers, business partners, and entities within its family of companies, or when it notified consumers of third-party data sharing and offered a corresponding opt out. According to the FTC, OkCupid instead shared personal data with Clarifai Inc., an artificial intelligence (AI) company that did not fit within any of those disclosed categories. Clarifai develops facial recognition and other computer vision systems, and datasets of real human faces are a core input to training those models.

According to the FTC, in September 2014, Clarifai's CEO emailed an OkCupid founder requesting access to large datasets of OkCupid photos. OkCupid's president and chief technology officer facilitated the transfer, which ultimately gave Clarifai nearly three million user photos along with associated demographic and location information. Clarifai paid nothing for the data, provided no services in return, and operated under no written agreement or contractual restrictions on its use of the information. Users received no notice of the transfer and no opportunity to opt out. The FTC alleges that OkCupid's founders were personal investors in Clarifai, and that when The New York Times later inquired about the arrangement, the companies issued a statement that, in the FTC's words, "obscured" the relationship.

Under the proposed order, OkCupid and Match Group are prohibited from misrepresenting:

  • The extent to which they collect, maintain, use, disclose, delete, or protect personal information
     
  • The purposes for which they do so, and
     
  • The way their privacy controls function.

Notably, the order does not impose the kind of affirmative privacy-program, assessment, or consumer-notice obligations that have accompanied many prior FTC privacy consent orders, a point of contrast worth watching as the current Commission continues to define its enforcement posture.

How the Enforcement Action Signals the FTC's Priorities

Although the allegations are fact-specific, the action reflects the FTC's continued emphasis on transparency and accuracy in privacy disclosures, particularly where personal information is shared with third parties. It is also the first Section 5 privacy enforcement action under FTC Chair Andrew Ferguson, which makes it a useful early signal of the current Commission's priorities: even amid heightened attention to AI and emerging technologies, the agency is continuing to police its core "say what you do, do what you say" territory.

The OkCupid action reinforces a point the FTC has made repeatedly across industries: A privacy policy is not merely an informational document. It is a set of representations to consumers, and under Section 5, those representations are enforceable commitments. The 12-year lookback between the 2014 conduct and the 2026 settlement is a reminder that historical data-sharing decisions do not age out of regulatory risk.

Takeaways for Businesses

  • Align privacy disclosures with actual practices. Run a documented data-mapping exercise at least annually, and revisit the privacy policy whenever the map changes. Confirm that each described use and recipient of personal information corresponds to an actual data flow, and that any new vendor, integration, analytics tool, or data-sharing arrangement triggers a review of the policy before it goes live.
     
  • Account for all third-party data sharing. Build a third-party inventory that identifies every recipient of consumer data, the category in the privacy policy that covers it, and the legal or contractual basis for the transfer. Define what "service provider," "business partner," and "affiliate" mean in the policy itself, and do not rely on those labels to cover recipients that have only a financial or investor relationship with the company or its principals. If a recipient does not clearly fit an existing category, either add disclosure and an opt out or do not share the data.
     
  • Paper the relationship. Every third-party data transfer should be governed by a written agreement before data moves, executed through counsel or procurement rather than by ad hoc email. At a minimum, the agreement should limit the recipient to specified purposes, prohibit onward disclosure and use for the recipient's own product development or model training without separate consent, require deletion or return at termination, and include security, audit, and breach-notification commitments. Informal or founder-level exchanges of user data, without a contract, are the pattern the FTC flagged here.
     
  • Treat AI training data as a distinct use case. If consumer data will be used to train, fine-tune, or evaluate AI or computer vision models, whether by the company or by a third party, say so in the privacy policy, describe the categories of data involved, and address it explicitly in the contract with any recipient. Governing terms should specify whether the recipient may retain model weights or derived outputs after the underlying data is deleted, and should prohibit use of the data to build competing or unrelated models. Generic "business partner" or "analytics" language will not carry this weight.
     
  • Recognize heightened risk for sensitive data. Photos, precise location, biometric identifiers, and data that can be used to infer health, sexual orientation, or other sensitive characteristics should be called out specifically in the privacy policy rather than buried in a general category. Apply stricter internal controls to these data sets, including access restrictions, logging, and a higher threshold for approving any external sharing. Treat a request to share sensitive data with a party outside the existing vendor roster as an event that requires legal review, not a routine procurement decision.

For more information, please contact us or your regular Parker Poe contact. Click here to subscribe to our latest alerts and insights.