
Controversial AI 'Gaydar' Study Spawns Backlash, Ethical Debate

Research suggesting that artificial intelligence tools could accurately predict a person's sexuality by assessing a photo is now under ethical review.
Image: Profile outline of man's head over circuit board. Gary Waters / Getty Images / Ikon Images

Following a backlash from academics, technology experts and LGBTQ advocates, a controversial study suggesting artificial intelligence can predict a person's sexual orientation by analyzing a photo of his or her face is now facing additional scrutiny.

The study — which was conducted by Stanford University researchers, peer reviewed and accepted for publication by the American Psychological Association's "Journal of Personality and Social Psychology" — came under fire soon after The Economist first reported on it last week. A spokesperson from the American Psychological Association confirmed to NBC News on Wednesday that the organization is taking a "closer look" at the research given its "sensitive nature."


The study, titled “Deep neural networks are more accurate than humans at detecting sexual orientation from facial images,” involved training a computer model to recognize what the researchers refer to as the "gender-atypical" traits of gay men and lesbians.

"We show that faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain," says the abstract of the paper, written by researchers Yilun Wang and Michal Kosinski. "Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 74% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women."

"Consistent with the prenatal hormone theory of sexual orientation, gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles," the paper's abstract continued.Related: 'Trans Women Are Women': Single-Gender Schools Revisit Admissions Policies

Among those taking issue with the research are LGBTQ advocacy groups GLAAD and the Human Rights Campaign. The organizations released a joint statement slamming the study and how its findings could potentially be used.

“Technology cannot identify someone’s sexual orientation. What their technology can recognize is a pattern that found a small subset of out white gay and lesbian people on dating sites who look similar," GLAAD Chief Digital Officer James Heighington stated, referring to the method the researchers used to obtain the images used in their study.

“At a time where minority groups are being targeted, these reckless findings could serve as [a] weapon to harm both heterosexuals who are inaccurately outed, as well as gay and lesbian people who are in situations where coming out is dangerous," Heighington continued.

"Blaming the technology deflects attention from the real threat which is prejudice, intolerance and the other demons of human nature."

Jae Bearhat, who identifies as gay and nonbinary, expressed personal fears about this type of technology, saying it could be dangerous for LGBTQ people.

"At the very least, it resurrects discussions over 'gay genes' and the concept of homosexuality and queerness as physically identifiable traits," Bearhat said. "Setting it within that sort of strictly biological framework can easily lead to perpetuation of ideas around curing, preventing and natal identification of homosexuality, which can backslide into precedents around it as a physiological deviation or mental illness that needs 'treatment.'"

Also sounding the alarm are academics like Sherry Turkle, a professor at the Massachusetts Institute of Technology and author of the book "Reclaiming Conversation."

"First of all, who owns this technology, and who has the results?" Turkle said in a phone interview. "The issue now is that 'technology' is a catchphrase that really means 'commodity.'"

"What it means is, your technology can tell my sexuality from looking at my face, and you can buy and sell this information with purposes of social control."

Turkle also speculated that such technology could be used to bar LGBTQ people from employment and could make institutional discrimination more efficient.

"If it turns out the military doesn't want anyone like me, they or any other organization can just buy the data," she said. "And what about facial recognition that could tell if you have Jewish ancestry? How would that be used? I am very, very not a fan."

Alex John London, director of the Center for Ethics and Policy at Carnegie Mellon University, said the research out of Stanford underscores the urgency of promoting human rights and strengthening antidiscrimination law and policy within the U.S. and around the globe.

"I think it is important to emphasize that this research was carried out with tools and techniques that are widely available and relatively easy to use," London said. "If the reported findings are accurate, it is another stunning example of the extent to which AI techniques can reveal deeply personal information from the accumulation of otherwise mundane items that we willingly share online."

He added, "I can’t imagine how anyone could put the genie of big data and AI back into the bottle and blaming the technology deflects attention from the real threat which is prejudice, intolerance and the other demons of human nature."

For his part, Kosinski has defended his research, saying on Twitter that he's glad his and Wang's work has "inspired debate."

The two also pushed back in a statement, in which they characterized criticism of their findings as coming from lawyers and communication officers lacking in scientific training.

"If our findings are wrong, we merely raised a false alarm," the statement reads. "However, if our results are correct, GLAAD and HRC representatives’ knee-jerk dismissal of the scientific findings puts at risk the very people for whom their organizations strive to advocate."
