This Algorithm Can Supposedly Tell You If You’re Gay or Straight

A new development out of Stanford University proves controversial.

Two researchers at Stanford University claim to have produced an algorithm that can determine your sexuality from a single photograph.

Michal Kosinski and Yilun Wang describe their findings in a new study, which is still in draft form and has yet to be peer-reviewed but has been accepted for publication by the Journal of Personality and Social Psychology. “[These] findings advance our understanding of the origins of sexual orientation and the limits of human perception,” the researchers write.

Here’s how it supposedly works: Kosinski and Wang lifted 36,640 photos of men and 38,593 of women from online dating profiles and funneled the shots through their program, which they coded to pick up on characteristics like weight, hairstyle, jaw width, and nose length. They now claim that, given a single photo, the program can identify the subject’s sexuality with 81 percent accuracy for men and 74 percent for women. (When the program was provided with five images, those numbers jumped to 91 and 83 percent, respectively.)
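
The article doesn’t detail the researchers’ actual pipeline, but the general setup it describes (training a binary classifier on numeric facial features, then pooling predictions across several photos of the same person) can be sketched in a few lines. The following is a minimal, hypothetical illustration using scikit-learn; the feature columns, the synthetic data, the logistic-regression model, and the probability-averaging step are all assumptions for demonstration, not the study’s actual method.

```python
# Hypothetical sketch: a binary classifier over pre-extracted facial features.
# The feature names and random data below are illustrative stand-ins, not the
# researchers' actual inputs or pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Fake dataset: 1,000 "photos", each reduced to four numeric features
# (imagine columns such as jaw width, nose length, and so on).
X = rng.normal(size=(1000, 4))
y = rng.integers(0, 2, size=1000)  # binary labels; random here, so expect ~50%

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

clf = LogisticRegression().fit(X_train, y_train)
print("single-photo accuracy:", accuracy_score(y_test, clf.predict(X_test)))

# Pooling several photos of one subject (cf. the five-image result):
# average the per-photo probabilities, then threshold once.
probs = clf.predict_proba(X_test[:5])[:, 1]  # pretend these are one person
print("pooled prediction:", int(probs.mean() > 0.5))
```

With real labeled data, that pooling step is one plausible explanation for why the reported numbers rise when the program sees five images instead of one: averaging probabilities across photos smooths out per-photo noise.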

“Consistent with the prenatal hormone theory of sexual orientation, gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles,” Kosinski and Wang write of their findings. In other words, they argue that gay men and women have a naturally more androgynous appearance than straight men and women.

Suffice it to say, the study has raised more than a few eyebrows. Mere hours after it was published, GLAAD and the Human Rights Campaign put out a joint statement condemning the research.

“At a time whe[n] minority groups are being targeted, these reckless findings could serve as [a] weapon to harm both heterosexuals who are inaccurately outed, as well as gay and lesbian people who are in situations where coming out is dangerous,” said Jim Halloran, GLAAD’s chief digital officer. Ashland Johnson, the HRC’s director of public education and research, echoed that sentiment: “Stanford should distance itself from such junk science rather than lending its name and credibility to research that is dangerously flawed and leaves the world—and [in] this case, millions of people’s lives—worse and less safe than before.”

The statement goes on to point out a few critical flaws in the study—that it included only white people, that it didn’t account for bisexual individuals, and that the authors did not verify the age or sexual orientation of any photo subjects—before concluding that “media headlines that claim AI can tell if someone is gay by looking [at] one photo of [their] face are factually inaccurate.”

In an update to the study’s abstract—notably added this morning, three days after the GLAAD and HRC statement was released—the duo notes that, given the growing use of computer-vision algorithms by governments and corporations to determine “people’s intimate traits,” their “findings expose a threat to the privacy and safety of gay men and women.”

