Scientists at Google debunk “gaydar” AI

During the second half of 2017, researchers at Stanford University claimed they had developed an artificial intelligence that could determine a person’s sexual orientation using only a photo.

The AI was able to determine whether a person was straight or gay with 71 percent accuracy, earning it the moniker of “gaydar”.

Now, researchers at Google and Princeton University are saying that the research is about as accurate as a game of Pin the Tail on the Donkey.

The team consisted of Margaret Mitchell and Blaise Agüera y Arcas from Google, as well as Alexander Todorov, a psychology professor at Princeton University.

Together, the team of researchers set out to debunk the sexual orientation AI and, in a manner of speaking, they did.

The team found a number of flaws in the original study, such as the fact that many of the faces the AI examined were wearing glasses, or belonged to women wearing more eye shadow. Essentially, the team posited that the original AI was looking at patterns in grooming, presentation and lifestyle rather than facial structure, as the study claimed.

The researchers found that a number of behaviours and choices might visually imply a person’s sexual orientation, but as a guide to anything innate they are about as accurate as a coin flip.

For one, the original paper suggested that the way straight men take selfies might account for the AI determining that those individuals are straight.

The crux of the matter is that the Google team found that the apparent differences between gay and straight faces can be traced to whether the photo contains makeup, eye shadow, facial hair or glasses. The selfie angle and the amount of sun exposure can also sway the verdict.

Simply put, Stanford’s AI is not as good at determining sexual orientation as previously thought, and it appears to be looking for correlations shaped by our day-to-day choices rather than unchangeable aspects of our physiology.
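To see why this kind of confounding matters, here is a minimal synthetic sketch (not code from either study, and all features here are invented for illustration): a simple classifier is trained on data where a superficial feature, such as wearing glasses, happens to correlate with the label, while a “facial structure” feature is pure noise. The model scores well above chance without ever learning anything about faces.

```python
# Hypothetical illustration of confounding: the classifier scores well
# above chance by latching onto a correlated grooming feature (glasses),
# even though the "facial structure" feature carries no signal at all.
# Synthetic data only; this is not either study's actual method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 10_000
label = rng.integers(0, 2, size=n)  # the attribute being "predicted"

# Grooming feature: correlates with the label purely by construction
# (one group in this dataset happens to wear glasses more often).
glasses = (rng.random(n) < np.where(label == 1, 0.7, 0.3)).astype(float)

# "Facial structure" feature: random noise, no relationship to the label.
structure = rng.normal(size=n)

X = np.column_stack([glasses, structure])
X_tr, X_te, y_tr, y_te = train_test_split(X, label, random_state=0)

clf = LogisticRegression().fit(X_tr, y_tr)
print(f"accuracy: {clf.score(X_te, y_te):.2f}")  # roughly 0.70, far above chance
print(f"coefficients: {clf.coef_}")              # weight sits on 'glasses'
```

The model’s apparent skill comes entirely from the correlated presentation cue, which is the Google team’s point: strip out grooming and lifestyle signals and there is nothing left for the classifier to read.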

“This doesn’t negate the privacy concerns the authors and various commentators have raised, but it emphasizes that such concerns relate less to AI per se than to mass surveillance, which is troubling regardless of the technologies used,” said the Google team.

So AI can’t really determine whether an individual walking through a city centre is straight or gay based on their bone structure, but that shade of eye shadow might give them away.

 
