Facial Recognition for Porn Stars Is a Privacy Nightmare Waiting to Happen
By Jason Koebler | Wed, 11 Oct 2017 19:11:23 +0000
The robots are watching porn. And amateur porn stars should be concerned.
Pornhub announced that it is using machine learning and facial recognition to detect over 10,000 porn stars across the site in a bid to make searching for your favorite star or fetish more efficient. So far, it’s scanned 50,000 videos in the last month of beta testing, and Pornhub plans to scan all five million videos currently hosted on the site by early next year. Eventually—with the help of users, who will upvote and downvote the AI’s guesses—the company says it’ll get to the point where the computer vision can discern hair color, sex position, and fetish.
Pornhub declined to tell me what computer vision company created the AI, citing a non-disclosure agreement, meaning we don’t know if, for instance, the company that created the AI will have access to the data the AI gleans. While the tech seems cool at first glance, and will no doubt be a boon to Redditors shouting “sauce” on every porn gif, there are two major things to consider about this technology: porn piracy and amateur porn star privacy.
Copyright and piracy
Like regular videos, porn clips almost never stay in one place. Just as a single video may be posted to YouTube, Vimeo, and other video sites, porn scenes are often uploaded to Pornhub, YouPorn, xvideos, xHamster, and so on, frequently by people who are not the original copyright holders. For instance, the Ted Cruz-approved clip, which was reposted on a porn aggregation Twitter account (and other streaming sites), actually belonged to Reality Kings, which shares a parent company with Pornhub.
This means that, like the rest of the entertainment industry, the porn industry has a piracy problem. In theory, machine learning could automatically detect when copyrighted content is uploaded and remove it from Pornhub, creating a system similar to YouTube’s Content ID, which matches audio and video fingerprints to identify copyrighted content uploaded by people who don’t own it.
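To get a feel for how fingerprint matching of this kind can work, here is a toy sketch using an "average hash" over downsampled grayscale frames. This is a generic perceptual-hashing technique, not Pornhub's or YouTube's actual (and proprietary) algorithm, and the frame data is invented for illustration.

```python
# Toy perceptual-hash matching: a re-encoded or slightly altered copy of a
# frame produces a near-identical hash, so copies can be flagged even when
# the files are not byte-for-byte identical.

def average_hash(pixels):
    """Hash a downsampled grayscale frame (list of 0-255 ints) to a bit tuple:
    each pixel becomes 1 if it is brighter than the frame's mean, else 0."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    """Count the bit positions where two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def is_match(frame_a, frame_b, threshold=5):
    """Treat frames as copies if their hashes differ in at most `threshold` bits."""
    return hamming(average_hash(frame_a), average_hash(frame_b)) <= threshold

# A uniformly brightened re-encode still matches; an unrelated frame does not.
original = [10, 200, 30, 220, 15, 210, 25, 230,
            12, 205, 35, 215, 18, 208, 28, 225]
reencoded = [p + 4 for p in original]   # slight brightness shift
unrelated = [128] * 16

print(is_match(original, reencoded))  # True
print(is_match(original, unrelated))  # False
```

A production system would fingerprint many frames per video (and the audio track) and search the hashes in an index, but the core idea of tolerant, near-duplicate matching is the same.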
For porn actors and companies that produce porn, an AI-fueled content ID system would be a lot more efficient than hunting down every pirated instance of their work and sending a takedown notice to the website owner or host—something independent artists of all kinds deal with constantly, to a distracting and soul-crushing degree.
In August, when we spoke about a user-created program that automatically crawls streaming sites to download videos locally, cam model Charley Hart told me that porn piracy infringes not only on performers’ creative autonomy, but also on their livelihood. People often seek out a favorite actor, and when viewers are “trying to get that fantasy for free,” even in the form of fewer views on the original host site, it hurts their bottom line, and the industry as a whole.
But according to Pornhub, this new AI effort isn’t about preventing DMCA infringements. “We already use a digital fingerprinting technology provider similar to YouTube’s ContentID to scan our videos,” a spokesperson for Pornhub told me. With facial and position recognition technology, this system could get better—from that standpoint, Pornhub’s AI system may be enticing for porn stars and porn companies. But experts warn that there are also grave privacy concerns for amateur porn models.
Actress and model privacy and revenge porn
Hart told me safety and privacy are of paramount concern for lesser-known actors, who often don’t want their true identities known. Pornhub says it will only use facial recognition on professional porn stars already in its database.
But even if Pornhub deploys this technology in an ethical way, its existence should be concerning. Such technology is unlikely to stay proprietary for long, and given that some people on the internet make a habit of identifying amateur or unwitting models, the underlying tech could supercharge some of these efforts.
Indeed, services like Pornstar.id and Megacam’s sex search engine already exist, making it easier to find someone acting in porn with a reverse image search. These services frame themselves as tools for finding “doppelgangers” for your fetish, but they could easily be used by abusive partners trying to track down their IRL victims working online.
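The mechanics behind a reverse-image face search are, at their core, simple: a face is reduced to a numeric "embedding," and a query face is matched against a database by similarity. The sketch below illustrates that idea with cosine similarity; the three-number embeddings and performer names are invented for illustration (real systems use embeddings with hundreds of dimensions produced by a trained neural network).

```python
# Toy nearest-neighbor face search: rank database entries by cosine
# similarity between face embeddings. Embeddings here are made up.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical database mapping names to face embeddings.
database = {
    "performer_a": [0.9, 0.1, 0.3],
    "performer_b": [0.2, 0.8, 0.5],
}

def best_match(query):
    """Return the database entry whose embedding is most similar to the query."""
    return max(database, key=lambda name: cosine(query, database[name]))

print(best_match([0.85, 0.15, 0.35]))  # performer_a
```

The privacy worry the article raises is exactly this pipeline pointed at a larger database: swap in embeddings scraped from social media profiles and the same few lines of logic link a face in a video to a real name.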
Neil Brown, a lawyer specializing in technology and internet law, told me that the distinction between professionals and amateurs in the database is very important.
“If the capability was applied to (genuinely) non-professional content, the possibility for harm is considerably higher,” Brown told me via Twitter message.
“This would be particularly true if attempts were made to include identifiers from other sources, such as shared photographs on social networking sites,” he said. “Imagine if a video of you had been uploaded by a former partner, for example, without your permission and, as a result of the application of AI to a site’s content library, it not only grouped together all videos involving you, but included your name, Facebook or Twitter profile, and so on.”
It’s easy to imagine a future in which a third party uses machine learning, facial recognition, and social media accounts to identify people in a giant database of nudes. Even if Pornhub takes the proper precautions, this isn’t great news for anyone.