> I thought revealing the ugliness of the system was part of the point of this project?
It 100% is. From the link to the artist's website:
"Things get strange: A photograph of a woman smiling in a bikini is labeled a “slattern, slut, slovenly woman, trollop.” A young man drinking beer is categorized as an “alcoholic, alky, dipsomaniac, boozer, lush, soaker, souse.” A child wearing sunglasses is classified as a “failure, loser, non-starter, unsuccessful person.” You’re looking at the “person” category in a dataset called ImageNet, one of the most widely used training sets for machine learning."
I was just about to edit and correct myself: it was ImageNet that decided to delete the offensive images, after the reaction to the artists' work. It's too bad the artists didn't make their own mirror/cache of the dataset. Judging from some tweets I saw, I think this project really helped people understand how much of current artificial intelligence is human-driven. It's not a sentient computer deeming you to be a "slant-eye"; it's a bunch of random Internet users. (Not that this makes you feel better about the world, but at least the hate's coming from an expected source.)