It seems you've got it backwards: "tendency for images portraying different professions to align with Western gender stereotypes" means that they are calling out their own work precisely because it is skewed in the direction of Western American biases.


You think there are homogeneous gender stereotypes across the whole Western world? You say “woman” and one person will imagine a stay-at-home mom, while another will imagine a you-go-girl CEO with tattoos and pink hair.

What they really mean is people who don't think like them.


The very act of mentioning "western gender stereotypes" starts from a biased position.

Why couldn't they be "northern gender stereotypes"? Is the world best explained as a division of west/east rather than north/south? The northern hemisphere has a far larger population than the southern, and almost all rich countries are in the northern hemisphere. And it's precisely these rich countries that push the concept of gender stereotypes; in poor countries, nobody cares about "gender stereotypes".

Actually, the lines dividing the earth into northern/southern and eastern/western hemispheres are arbitrary, so maybe they shouldn't use the word "western" at all, to avoid propagating stereotypes about regions of the earth.

Or why couldn't they be western age stereotypes? Why are there no kids or very old people depicted as nurses?

Why couldn't they be western body shape stereotypes? Why are there so few obese people in the images? Why are there no obese people depicted as athletes?

Are all of these really stereotypes or just natural consequences of natural differences?


The bulk of the training data comes from Western sources: technology, images, books, television, movies, photography, media. That's where the very real and recognized biases come from. They're the result of a gap in the data, nothing more.

Look at how DALL-E 2 produces little bears rather than bear-sized bears, because its data doesn't give it much context for how large bears actually are. So you wind up having to say "very large bear" to DALL-E 2.

Are DALL-E 2 bears just a "natural consequence of natural differences"? Or is the model not reflective of reality?
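
A quick way to see this for yourself, as a minimal sketch assuming access to OpenAI's public images endpoint (the prompt strings are my own illustration, not taken from the thread):

    # Compare a plain prompt against the "very large bear" workaround
    # described above. Assumes the openai Python package and an
    # OPENAI_API_KEY in the environment.
    from openai import OpenAI

    client = OpenAI()

    for prompt in ["a bear standing next to a person",
                   "a very large bear standing next to a person"]:
        result = client.images.generate(
            model="dall-e-2",
            prompt=prompt,
            n=1,
            size="512x512",
        )
        print(prompt, "->", result.data[0].url)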


That's true for some things, but the "gender bias for some professions" is likely just reflecting reality.


We don't really know that, either. They said they didn't do an empirical analysis of it. For example, it may show a few male nurses across hundreds of prompts, or it may show none across thousands; they don't give examples. Hopefully they release a paper quantifying the biases, because that would be an interesting discussion.
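
For what it's worth, the counting step of such an analysis is simple to sketch. Here generate_images and classify_presentation are hypothetical stand-ins for the image model and a labeling step (human or automated); the random labels are placeholders, not real data:

    # Estimate how often a prompt yields each gender presentation.
    import random
    from collections import Counter

    def generate_images(prompt, n):
        return [f"{prompt} #{i}" for i in range(n)]  # stub for the model

    def classify_presentation(image):
        return random.choice(["female", "male"])  # stub for annotation

    def depiction_rates(prompt, n=500):
        counts = Counter(classify_presentation(img)
                         for img in generate_images(prompt, n))
        return {label: c / n for label, c in counts.items()}

    print(depiction_rates("a nurse"))  # rates are meaningless with the stubs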


Yes, the idea is that just because the output doesn't align with Western ideals of what seems unbiased doesn't mean the same is true for other cultures. By withholding the model because it doesn't conform to Western, left-wing cultural expectations, the authors are ignoring the diversity of cultures that exist globally.


No, it's coming from a perspective of moral realism. It's an objective moral truth that racial and ethnic biases are bad. Yet most cultures around the world are racist to at least some degree, and to the extent that they are, they are bad.

The argument you're making, paraphrased, is that the idea that biases are bad is itself situated in particular cultural norms. While that is true to some degree, from a moral realist perspective we can still objectively judge those cultural norms to be better or worse than alternatives.


You're confused by the double meaning of the word "bias".

Here we mean bias in the mathematical sense.

For example, a good mathematical model will correctly tell you that people in Japan (a geographical term) are more likely to be Japanese (an ethnic/racial bias). That's not "objectively morally bad"; it's simply correct.
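
To make that concrete, a toy calculation with invented counts: a model that matches the empirical conditional probability favors the majority class, and that is exactly what makes it accurate.

    # P(ethnicity | location) from hypothetical counts; the numbers
    # are made up for illustration, not census data.
    population = {
        ("japan", "japanese"): 970,
        ("japan", "other"): 30,
        ("elsewhere", "japanese"): 20,
        ("elsewhere", "other"): 980,
    }

    def p_ethnicity_given_location(ethnicity, location):
        total = sum(v for (loc, _), v in population.items() if loc == location)
        return population.get((location, ethnicity), 0) / total

    print(p_ethnicity_given_location("japanese", "japan"))      # 0.97
    print(p_ethnicity_given_location("japanese", "elsewhere"))  # 0.02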


Although what you stated is true, it's actually shorthand for the commonly repeated but untrue claim that “98% of Japan is ethnically Japanese”.

1. That figure comes from a report from 2006.

2. It's a misreading: the figure means “Japanese citizens”, and the government in fact doesn't track ethnicity at all.

Also, the last time I was in Japan (Jan ‘20) there were literally ten times more immigrants everywhere than on my previous trip. Japan is full of immigrants from the rest of Asia these days, and they all speak perfect Japanese too.


Well, that's not the issue here. The problem is examples like image searches for "unprofessional hair" returning mostly Black people in the results. That is something we can judge as objectively morally bad.


Did you see the image in the linked article? Clearly the “unprofessional hair” results are people with curly hair, and some are white! It's not the algorithm's fault that P(curly | black) > P(curly | white).
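
A toy Bayes calculation (all numbers invented) shows how that inequality alone skews the results, with no explicit race signal involved:

    # If a ranker keys on curly hair and P(curly|black) > P(curly|white),
    # Bayes' rule over-represents Black people in curly-hair results
    # relative to their population share.
    p_black, p_white = 0.13, 0.87                          # hypothetical shares
    p_curly_given_black, p_curly_given_white = 0.80, 0.15  # hypothetical

    p_curly = p_curly_given_black * p_black + p_curly_given_white * p_white
    p_black_given_curly = p_curly_given_black * p_black / p_curly

    print(f"P(black | curly) = {p_black_given_curly:.3f}")  # ~0.443 vs. 0.13 prior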


It absolutely is the responsibility of the people making the algorithm available to the general public.


Western liberal culture says discriminating against one set of minorities to benefit another (affirmative action) is a good thing. What constitutes a racial or ethnic bias is not objective, so Google shouldn't pretend that it is, either.

> from a moral realist perspective we can still objectively judge those cultural norms to be better or worse than alternatives

No, because which set of biases counts as "better" depends entirely on the set of values you start from. The entire point is that it should not be Google's role to make that judgement; people should be able to make it for themselves.



