A computer algorithm can tell if a person is gay or straight simply by scanning a picture of their face, researchers have shown.
Artificial intelligence software developed at Stanford University can predict a person’s sexuality with far greater accuracy than humans can, suggesting a “gaydar” app may not be far away.
The research has potentially troubling implications for citizens’ privacy and safety.
Police around the world, including in countries with questionable human rights records, are increasingly turning to facial recognition to monitor crowds. The combination of CCTV and an algorithm that can detect sexuality could have worrying consequences in states where homosexuality is outlawed.
The algorithm was able to tell whether a man was gay or straight from a single picture 81pc of the time, and could determine a woman’s sexuality 74pc of the time. Humans were far less accurate, guessing correctly just 61pc of the time for men and 54pc for women.
When the computer was given five pictures of a person, it answered correctly 91pc of the time for men and 83pc for women.
The researchers trained the AI using pictures of 36,630 men and 38,593 women, taken from online dating profiles of gay and straight people. The algorithm was able to detect subtle differences in facial structures that humans are incapable of picking up.
The differences may relate to the level of hormones such as testosterone that foetuses are exposed to in the womb, which may determine sexuality, the researchers told The Economist.
Facial recognition technology is becoming increasingly speedy, reliable and accurate. It is being included in the latest smartphones as a security feature and being employed by governments to tackle crime.
The Metropolitan Police has used facial recognition technology during the Notting Hill Carnival for the last two years, albeit with limited success, while crowds around the Champions League final in Cardiff were also monitored.
Homosexuality is illegal in dozens of countries, and hate crimes against gay, lesbian, bisexual and transgender people in the UK have skyrocketed in recent years, so the technology could put gay people at risk.
“Given that companies and governments are increasingly using computer vision algorithms to detect people’s intimate traits, our findings expose a threat to the privacy and safety of gay men and women,” Michal Kosinski and Yilun Wang, the researchers behind the project, said.
The researchers found that the computer program was less reliable in the real world, outside the confines of the experiment using dating-site photos. However, when it was asked to pick out the people it was most confident were gay, nine out of 10 proved to be right.