“The face is an observable proxy for a wide range of factors, like your
life history, your development factors, whether you’re healthy,” Michal
Kosinski, an organizational psychologist at the Stanford Graduate School
of Business, told the
Guardian earlier this week. The photo of Kosinski accompanying the interview
showed the face of a man beleaguered. Several days earlier, Kosinski and
a colleague, Yilun Wang, had reported the results of a
study, to be published in the Journal of
Personality and Social Psychology, suggesting that facial-recognition
software could correctly identify an individual’s sexuality with uncanny
accuracy. The researchers culled tens of thousands of photos from an
online-dating site, then used an off-the-shelf computer model to extract
users’ facial characteristics—both transient ones, like eye makeup and
hair color, and more fixed ones, like jaw shape. Then they fed the data
into their own model, which classified users by their apparent
sexuality. When shown two photos, one of a gay man and one of a straight man,
Kosinski and Wang’s model could distinguish between them eighty-one per
cent of the time; for women, its accuracy dropped to
seventy-one per cent. Human viewers fared substantially worse. They
correctly picked the gay man sixty-one per cent of the time and the gay
woman fifty-four per cent of the time. “Gaydar,” it appeared, was little
better than a random guess.
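The pipeline the article describes can be sketched in rough outline. What follows is a minimal illustration, not the study's actual code: extract_face_features is a hypothetical stand-in for the off-the-shelf feature extractor, and the logistic-regression classifier and the pairwise scoring are assumptions about how such a model might be built and tested.

```python
# A minimal sketch of the pipeline described above, not the study's
# actual code. extract_face_features() is a hypothetical stand-in for
# the off-the-shelf feature extractor; logistic regression is an
# assumed choice of classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression

def extract_face_features(image):
    """Hypothetical: return a feature vector for one profile photo,
    e.g. the penultimate layer of a pretrained face-recognition CNN."""
    raise NotImplementedError

def train_classifier(X, y):
    """X: one feature vector per photo. y: 1 if the profile sought
    same-sex partners (the study's proxy for orientation), else 0."""
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, y)
    return clf

def pairwise_accuracy(clf, X_gay, X_straight):
    """The test the article describes: shown one gay face and one
    straight face, pick the gay one. Counted correct when the gay
    photo gets the higher score; averaged over all cross pairs."""
    score_gay = clf.predict_proba(X_gay)[:, 1]
    score_straight = clf.predict_proba(X_straight)[:, 1]
    wins = score_gay[:, None] > score_straight[None, :]
    return wins.mean()
```

One thing this sketch makes plain is that the pairwise test measures relative ranking, which is why a model can post high numbers on matched pairs while being far less reliable at labeling a single photo drawn from a population in which gay people are a small minority.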
The study immediately drew fire from two leading L.G.B.T.Q. groups, the
Human Rights Campaign and GLAAD, for “wrongfully suggesting that
artificial intelligence (AI) can be used to detect sexual orientation.”
They offered a list of complaints, which the researchers rebutted point
by point. Yes, the study was in fact peer-reviewed. No, contrary to
criticism, the study did not assume that there was no difference between
a person’s sexual orientation and his or her sexual identity; some
people might indeed identify as straight but act on same-sex attraction.
“We assumed that there was a correlation . . . in that people who said
they were looking for partners of the same gender were homosexual,”
Kosinski and Wang wrote. True, the study consisted entirely of white
faces, but only because the dating site had served up too few faces of
color to permit meaningful analysis. And that didn’t diminish the
point they were making—that existing, easily obtainable technology could
effectively out a sizable portion of society. To the extent that
Kosinski and Wang had an agenda, it appeared to be on the side of their
critics. As they wrote in the paper’s abstract, “Given that companies
and governments are increasingly using computer vision algorithms to
detect people’s intimate traits, our findings expose a threat to the
privacy and safety of gay men and women.”