Imagine uploading a photo to a dating app, hoping to find a connection. Now imagine that same photo being used, without your knowledge, to train a facial recognition system that can guess your age, sex, and race. For millions of OkCupid users, this isn't a dystopian fantasy—it's what happened.
According to a new report from Reuters, the AI company Clarifai has finally deleted **3 million user photos** it obtained from OkCupid over a decade ago. The data was used to build controversial AI tools, in a deal that flouted OkCupid's own privacy promises to its users.
## The "Awesome Data" Email That Started It All
How did this happen? The story begins with a startlingly casual email. In 2014, Clarifai's founder, Matthew Zeiler, reached out to OkCupid's co-founder, Maxwell Krohn. "We’re collecting data now and just realized that OKCupid must have a HUGE amount of awesome data for this," Zeiler wrote, according to court documents.
OkCupid, whose executives had invested in Clarifai, then provided the trove of personal photos, along with other sensitive user data such as demographics and location. This exchange later formed the basis of an FTC investigation, which found that the sharing **should have been prohibited** by OkCupid's own privacy policies at the time.
## A Twelve-Year Secret, Exposed by a Newspaper
This secret deal might have remained buried if not for a 2019 New York Times article. The report mentioned Clarifai's use of OkCupid images, finally triggering the FTC's investigation into a data grab that had occurred five years prior.
The fallout culminated last month in a settlement. While OkCupid and its parent company, Match Group, did not admit to deceiving users, Clarifai's confirmation that it deleted the data and models speaks volumes. The FTC also alleged that Match Group and OkCupid **deliberately concealed the behavior** and tried to obstruct the investigation.
## What This Means for Your Digital Future
So, what's the consequence? The FTC has now permanently prohibited OkCupid and Match from misrepresenting their data practices. In essence, they are banned from doing something they should never have done in the first place.
This case is a stark reminder: the intimate details you share on an app are never just between you and potential matches. They are a potential asset, a line of code in a training dataset, traded in emails between founders. The real shock isn't just the breach—it's how easily and casually it was set in motion.