A Frightening AI Can Determine Whether a Person Is Gay With 91 Percent Accuracy
By Louise Matsakis | September 8, 2017
Artificial intelligence can now guess a person’s sexual orientation from photos of their face with startling accuracy, according to new research from Stanford University. In a peer-reviewed paper to be published in the Journal of Personality and Social Psychology, researchers demonstrate a novel machine learning algorithm that can tell from a handful of photos whether a person identified as gay or straight on a dating website.
The work, conducted by Yilun Wang and Michal Kosinski, is raising serious ethical concerns about how it could be abused to further marginalize members of the LGBT community. For one, the researchers’ pessimistic view of privacy is at odds with the work of individuals like Sarah Jamie Lewis, a cybersecurity researcher who studies privacy, particularly for queer people.
“All this paper does is reinforce stereotypes and categories that the queer community is fighting so hard to break,” Lewis told me over Twitter direct message. “But even if we accepted the paper’s premise that someone can appear visually queer then the paper still has major ethical issues around participant consent and the overall aim of the research,” she explained, referring to how the photos used were pulled from a publicly available dating site instead of gathered with the participants’ knowledge.
When given a single image, the deep neural network used by the researchers correctly distinguished between straight and gay men 81 percent of the time, and between straight and gay women 71 percent of the time. When the algorithm had five images of a person to analyze, it could predict whether a man was gay 91 percent of the time and whether a woman was gay 83 percent of the time. It was trained on a sample of 35,326 facial images scraped from an unnamed US dating website, and used people’s stated preferences as evidence of whether they were gay or straight. The algorithm was also tested on a separate sample of Facebook data.
The research found that both gay men and women tended to have “gender-atypical facial morphology, expression, and grooming styles.” Gay men were found to have narrower jaws, longer noses, and larger foreheads than their straight counterparts. Gay women were found to have larger jaws and smaller foreheads compared to straight women.
“The differences in femininity observed in this study were subtle, spread across many facial features, and apparent only when examining averaged images of many faces,” the researchers wrote in the paper.
The research was limited in terms of what it could determine about human gender and sexuality. For one, people of color were not included, nor were transgender or bisexual people. In fact, the paper does not mention transgender individuals at all. Still, the research has troubling implications for an already marginalized group, and demonstrates how artificial intelligence can be trained to perform ethically dubious tasks.
It’s not hard to imagine the algorithm being used by an anti-LGBTQ authoritarian regime to target or track individuals, or by parents who want to determine the sexual orientation of their children. But the researchers insist they’re aware of the implications of their work.
“We were really disturbed by these results and spent much time considering whether they should be made public at all. We did not want to enable the very risks that we are warning against,” Kosinski and Wang wrote in author notes published along with the paper.
But they say that they ultimately decided to publish the work in order to warn LGBTQ activists of the risks they are facing: “We believe that people deserve to know about these risks and have the opportunity to take preventive measures,” the researchers said in the notes. “We did not create a privacy-invading tool, but rather showed that basic and widely used methods pose serious privacy threats.”
Wang and Kosinski argue that privacy on the internet is already dead. “Essentially, we believe that further erosion of privacy is inevitable, and the safety of gay and other minorities hinges not on the right to privacy but on the enforcement of human rights, and tolerance of societies and governments,” they write in their study notes. “In order for the post-privacy world to be safer and hospitable, it must be inhabited by well-educated people who are radically intolerant of intolerance.”
“There may very well be useful, fascinating research to be done here that will benefit the queer community—but it needs to be done ethically and consensually and by people who understand the issues at hand,” Lewis told me.