Scary Software Uses Your Face Pic To Determine if You Are Gay


Condemned by HRC and GLAAD

Just when you think you’ve heard everything, something like this comes along.

According to a report in the Economist, researchers Michal Kosinski and Yilun Wang of Stanford University have developed software that can detect a person's sexual orientation from a photo of their face.

According to the article, the researchers used a total of 35,326 pictures of men and women, both gay and straight, all sampled evenly from an unnamed dating website.


The images were then fed into a piece of software called VGG-Face, a deep neural network that converts each face into a "faceprint," a long sequence of numbers representing its features.

A predictive statistical model, a logistic regression classifier, was then trained on these faceprints to predict each person's sexual orientation.
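For the technically curious, here is a minimal sketch of that "faceprint plus logistic regression" idea in Python with scikit-learn. It assumes the VGG-Face embeddings have already been extracted into fixed-length numeric vectors (4,096 numbers per face is typical for VGG-Face); the data below is random stand-in data so the example runs on its own, and none of the names or numbers come from the actual study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: in the real study each row would be a VGG-Face "faceprint";
# here we use random vectors purely so the sketch runs end to end.
n_faces, n_features = 1000, 4096
faceprints = rng.normal(size=(n_faces, n_features))
labels = rng.integers(0, 2, size=n_faces)  # 0/1 stand-in for the two classes

X_train, X_test, y_train, y_test = train_test_split(
    faceprints, labels, test_size=0.2, random_state=0
)

# Logistic regression trained on the embeddings, as the article describes.
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

# predict_proba yields a probability per face rather than a hard yes/no label.
probabilities = clf.predict_proba(X_test)[:, 1]
print("Sample predicted probabilities:", probabilities[:5])
```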

You may be wondering how accurate it is.

When fed one picture of a gay man and one picture of a straight man, both chosen at random, the software guessed correctly 81% of the time.

But there’s more …

When fed five pictures of each man, accuracy increased to 90%. Apparently, the more photos that are uploaded, the better the technology is at figuring out someone's sexual orientation.

The software was less accurate with women. When fed a photo of a straight woman and one of a lesbian, it chose correctly approximately 70% of the time, and that number increased when five photos were uploaded.
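For readers curious how a single accuracy number comes out of pairs of photos: the figures above describe a pairwise test, where the model scores one face from each group and is counted correct when the higher score goes to the right one. Below is a rough illustration in Python using made-up scores; averaging scores over several photos of the same person is one plausible reason the multi-photo numbers are higher, though the exact procedure the researchers used isn't described here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-image classifier scores for two groups of faces.
scores_group_a = rng.beta(6, 4, size=500)   # stand-in for one group
scores_group_b = rng.beta(4, 6, size=500)   # stand-in for the other group

def pairwise_accuracy(a_scores, b_scores, n_pairs=10_000):
    """How often a randomly drawn score from group A beats one from group B."""
    a = rng.choice(a_scores, size=n_pairs)
    b = rng.choice(b_scores, size=n_pairs)
    return float(np.mean(a > b))

print(f"Single-photo accuracy: {pairwise_accuracy(scores_group_a, scores_group_b):.0%}")

# Averaging scores over several photos of the same person (here, five) reduces
# noise, which is one plausible reason multi-photo accuracy comes out higher.
five_photo_a = rng.beta(6, 4, size=(500, 5)).mean(axis=1)
five_photo_b = rng.beta(4, 6, size=(500, 5)).mean(axis=1)
print(f"Five-photo accuracy: {pairwise_accuracy(five_photo_a, five_photo_b):.0%}")
```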

That said, the technology seems to predict sexual orientation better than human beings.

As reported by the good folks at Queerty:

“When researchers gave the same images to people, they could only tell the gay man from the straight man 61% of the time, and only 54% of the time for the women.

“Kosinski tells the Economist that he conducted the experiment primarily to remind people of the power and potential dangers of machine vision. He believes that there will be a day when privacy as we know it no longer exists and that the AI systems may eventually be trained to determine other intimate traits, such as a person’s IQ or political views.”

The Human Rights Campaign and GLAAD have both condemned the software, calling it anti-gay. Both statements appear on Towleroad’s website.

From the HRC website:

“This is dangerously bad information that will likely be taken out of context, is based on flawed assumptions, and threatens the safety and privacy of LGBTQ and non-LGBTQ people alike. Imagine for a moment the potential consequences if this flawed research were used to support a brutal regime’s efforts to identify and/or persecute people they believed to be gay.

Stanford should distance itself from such junk science rather than lending its name and credibility to research that is dangerously flawed and leaves the world — and in this case, millions of people’s lives — worse and less safe than before.”

Obviously, the implications of this kind of technology are profound and dangerous. GPB will provide updates on this story as new information becomes available.

h/t: Queerty and Towleroad
