Monday 11 September 2017

Everything that's wrong with that study which used AI to 'identify sexual orientation'


Advances in artificial intelligence can be deeply worrying, especially when serious questions of intimacy and privacy are at stake.

A study from Stanford University, first reported in the Economist, has sparked controversy after claiming that AI can deduce whether people are gay or straight by analysing side-by-side images of a gay person and a straight person.

LGBTQ advocacy groups and privacy organisations have slammed the report as "junk science", calling it "dangerous and flawed" because of its lack of representation, its racial bias, and its reduction of the sexuality spectrum to a binary.


http://ift.tt/2w1TIZs from Tech http://ift.tt/2wVrLjz
via IFTTT
