This Artificial Intelligence Is Going to TRIGGER LGBT Leftists After What It Is Revealing

In a controversial new study, an AI system was able to correctly identify a person’s sexuality just by analyzing their photos. The accuracy is high enough to alarm the study’s authors, and it points very clearly to a binary system of sexuality, even though gay and lesbian activists like to say that sexuality is a fluid spectrum.

Artificial intelligence can accurately guess whether people are gay or straight based on photos of their faces, according to new research that suggests machines can have significantly better “gaydar” than humans.

The study from Stanford University – which found that a computer algorithm could correctly distinguish between gay and straight men 81% of the time, and 74% for women – has raised questions about the biological origins of sexual orientation, the ethics of facial-detection technology, and the potential for this kind of software to violate people’s privacy or be abused for anti-LGBT purposes.

According to The Independent, the researchers were “very disturbed” by the accuracy of their artificial intelligence, which came in at 91 percent for correctly identifying homosexual men and 83 percent for correctly identifying homosexual women.

The outlet added that the system’s “gaydar” was more accurate than that of humans, and that the study’s findings were published in a paper titled “Deep neural networks are more accurate than humans at detecting sexual orientation from facial images.”

“We used deep neural networks to extract features from 35,326 facial images. These features were entered into a logistic regression aimed at classifying sexual orientation,” the study explained.

The paper was co-authored by Stanford University’s Yilun Wang and Michal Kosinski, and it found that after analyzing the photos, “the accuracy grew significantly with the number of images available per person, reaching 91 percent for five images.”

An illustrated depiction of facial analysis technology similar to that used in the experiment. Illustration: Alamy

“The accuracy was somewhat lower for women, ranging from 71 percent (one image) to 83 percent (five images per person),” the study added.

The study found that gay men and women had “gender-atypical” faces, with homosexual men having “narrower jaws, longer noses and larger foreheads than straight men,” and homosexual women having “larger jaws and smaller foreheads than straight women.”

What it essentially found is that gay men looked more feminine and gay women looked more masculine, meaning that these two sexualities are very clearly and distinctly binary.

Gay activists would have us believe that sexuality is not only predetermined from birth but also exists on a sliding scale. This technology debunks that by correctly identifying these two sexual preferences as clearly categorical.

The study did not attempt to tackle the transgender issue, but our guess is that if it did, it would correctly guess that a “trans man” was actually a woman and a “trans woman” was actually a man, because facial features don’t change that significantly with hormone therapy.

The study’s researchers warned that this kind of software could be used to identify people’s sexuality without their consent, using publicly uploaded images.

“It’s certainly unsettling. Like any new tool, if it gets into the wrong hands, it can be used for ill purposes,” said Nick Rule, an associate professor of psychology at the University of Toronto. “If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that’s really bad.”

“What the authors have done here is to make a very bold statement about how powerful this can be. … Now we know that we need protections,” he added.

The CEO of Kairos, a face recognition company, suggested that with enough information, AI could tell you anything about a person.

“AI can tell you anything about anyone with enough data,” Brian Brackeen said. “The question is, as a society, do we want to know?”

He also added that since the data from the Stanford study was “startlingly correct,” there needs to be more of a focus on privacy tools.

Rule also said that “we should all be collectively concerned” about this kind of AI being used to discriminate against people based on interpretations of their faces.

So much for their argument. Liberals like to pretend to use science to debunk anything they can, but now science has debunked them.

Boom.

Sources:

The Guardian
