A researcher has discovered thousands of Tinder users’ images publicly available for free online.
Aaron DeVera, a cybersecurity researcher who works for security company White Ops and also for the NYC Cyber Sexual Assault Taskforce, uncovered a collection of over 70,000 photographs harvested from the dating app Tinder, on several undisclosed websites. Contrary to some press reports, the images are available for free rather than for sale, DeVera said, adding that they found them via a P2P torrent site.
The number of photos doesn’t necessarily represent the number of people affected, as Tinder users may have more than one picture. The data also contained around 16,000 unique Tinder user IDs.
DeVera also took issue with online reports saying that Tinder was hacked, arguing that the service was probably scraped using an automated script:
In my own testing, I observed that I could retrieve my own profile pictures outside the context of the app. The perpetrator of the dump likely did something similar on a larger, automated scale.
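The kind of large-scale, automated retrieval DeVera describes can be illustrated with a minimal sketch. The CDN host, URL pattern, and user IDs below are hypothetical, invented for illustration only; the point is that if static images are addressable by predictable, unauthenticated URLs, a short script can enumerate them in bulk.

```python
# Hypothetical sketch of URL enumeration for bulk image scraping.
# The host "images.example-cdn.com" and the path scheme are invented;
# real services differ. No network requests are made here -- this only
# shows how predictable URLs make large-scale collection trivial.

def image_urls(user_id: str, photo_count: int = 5) -> list[str]:
    """Build candidate image URLs for one user (hypothetical scheme)."""
    base = "https://images.example-cdn.com"
    return [f"{base}/{user_id}/{n}.jpg" for n in range(photo_count)]

user_ids = ["a1b2c3", "d4e5f6"]  # a scraper would iterate over thousands
all_urls = [url for uid in user_ids for url in image_urls(uid)]
print(len(all_urls))  # 2 users x 5 candidate photos = 10 URLs
```

With ~16,000 user IDs and several photos each, the same loop reaches the scale of the dump described above.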
What would someone want with these images? Training facial recognition for some nefarious scheme? Possibly. People have taken faces from the site before to build facial recognition data sets. In 2017, a researcher scraped 40,000 images from Tinder using the company’s API and posted the data set to Kaggle, the Google-owned data science platform. He uploaded his scraping script to GitHub, although it was subsequently hit by a DMCA takedown notice, and released the image set under CC0, the most permissive Creative Commons licence, effectively dedicating it to the public domain.
However, DeVera has other ideas:
This dump is actually very valuable for fraudsters seeking to operate a persona account on any online platform.
Hackers could create fake online accounts using the images and lure unsuspecting victims into scams.
We were sceptical of this claim, because generative adversarial networks (GANs) already enable people to create convincing deepfake images at scale, so fraudsters would seem to have little need for stolen photos. The site ThisPersonDoesNotExist, launched as a research project, generates such images for free. However, DeVera pointed out that deepfakes still have notable problems.
First, the fraudster is limited to only a single picture of the unique face. They’re going to be hard-pressed to find a similar face that isn’t indexed by reverse image searches like Google, Yandex, and TinEye.
The online Tinder dump contains multiple candid shots for each user, and it comes from a non-indexed platform, meaning that those images are unlikely to turn up in a reverse image search.
There’s another gotcha facing those considering deepfakes for fraudulent accounts, they point out:
There is a well-known detection method for any photo generated with This Person Does Not Exist. Many people who work in information security are aware of this method, and it is at the point where any fraudster looking to build a better online persona would risk detection by using it.
In some cases, people have used photos from third-party services to create fake Twitter accounts. In 2018, Canadian Facebook user Sarah Frey complained to Tinder after someone stole photos from her Facebook page, which was not open to the public, and used them to create a fake account on the dating service. Tinder told her that as the photos were from a third-party site, it couldn’t handle her complaint.
Tinder has hopefully changed its tune since then. It now features a page asking people to contact it if someone has created a fake Tinder profile using their pictures.
We asked Tinder how this happened, what measures it was taking to prevent it happening again, and how users should protect themselves. The company responded:
It is a violation of our terms to copy or use any members’ images or profile data outside of Tinder. We work hard to keep our members and their information safe. We know that this work is ever evolving for the industry as a whole and we are constantly identifying and implementing new best practices and measures to make it more difficult for anyone to commit a violation like this.
DeVera had more concrete advice for sites serious about protecting user content:
Tinder could further harden against out of context access to their static image repository. This might be accomplished by time-to-live tokens or uniquely generated session cookies generated by authorised app sessions.