Because if it were able to recognize dick-pics, or whatever other manifestation of that "one subject" you might have in mind, then it would sometimes recognize them wrongly and respond to a picture of a toothbrush as if it were a picture of a penis. If the person who drew the toothbrush were of a sensitive disposition, they might get Very Upset at this and make a big fuss. Public attitudes to that One Subject being what they are, this might well blow up into a big public highly-visible fuss, at which point Google might start losing advertisers keen to sell things to people of a sensitive disposition.
Google would prefer not to risk losing a shedload of money just so that their sketch-processing neural network can amuse people by correctly recognizing penises.
It's just this one subject because few others get people so upset.
(And contemplating others that might suggests that actually it's not strictly just this one subject. False positives for "decapitated corpse" or "big pile of excrement", say, might be just as problematic. Want to guess whether the system is good at recognizing decapitated corpses and piles of excrement?)
Oh, I understand the PR reasons. But given the importance of sex as a motivating factor in art (not least if you count the entire activity as a variety of reproductive signaling), I am very troubled by the notion of building a tool that restricts the scope of acceptable subject matter.
I know this is just an experiment right now, but I want to put sex/nudity on the table as a subject of debate, for two reasons: it is central to artistic endeavor, and arbitrary standards can become almost universal and institutionalized through path dependence (like the QWERTY keyboard you are probably using right now). Imagine a not-too-distant future with a Magic Brush that lets you easily paint the colors and shapes of your choice with the aid of some technological wizardry, but prevents the creation of nude or sexualized figures. That would not be a healthy development.