
Yes, the idea is that just because a model doesn't align with Western ideals of what counts as unbiased doesn't mean the same is true for other cultures. By refusing to release the model because it doesn't conform to Western, left-wing cultural expectations, the authors are ignoring the diversity of cultures that exist globally.


No, it's coming from a perspective of moral realism. It's an objective moral truth that racial and ethnic biases are bad. Yet most cultures around the world are racist to at least some degree, and to the extent that they are, they are bad.

The argument you're making, paraphrased, is that the idea that biases are bad is itself situated in particular cultural norms. While that is true to some degree, from a moral realist perspective we can still objectively judge those cultural norms to be better or worse than alternatives.


You're confused by the double meaning of the word "bias".

Here we mean mathematical biases.

For example, a good statistical model will correctly tell you that people in Japan (a geographical term) are more likely to be ethnically Japanese (an ethnic/racial category). That's a "bias" in the statistical sense, but it's not "objectively morally bad"; it's simply correct.
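The distinction can be made concrete with a toy sketch. All the probabilities below are assumed illustrative numbers, not real demographic data; the point is only that a model whose outputs differ by group can still be well calibrated:

```python
# Toy model of P(ethnically Japanese | location). The base rates are
# assumed for illustration only -- they are not real census figures.

def predict_prob_japanese(located_in_japan: bool) -> float:
    """Return the model's estimate of P(ethnically Japanese | location)."""
    P_GIVEN_JAPAN = 0.97       # assumed toy base rate
    P_GIVEN_ELSEWHERE = 0.02   # assumed toy base rate
    return P_GIVEN_JAPAN if located_in_japan else P_GIVEN_ELSEWHERE

# The outputs differ by group -- a "bias" in the statistical sense --
# but if they match reality, the model is simply calibrated.
print(predict_prob_japanese(True))   # 0.97
print(predict_prob_japanese(False))  # 0.02
```

Whether such a calibrated difference is also *morally* objectionable is exactly the question the thread is arguing about; the math itself is neutral.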


Although what you stated is true, it's usually offered as shorthand for the commonly repeated but untrue claim that "98% of Japan is ethnically Japanese".

1. That figure comes from a report from 2006.

2. It's a misreading: the figure refers to "Japanese citizens", and the government in fact doesn't track ethnicity at all.

Also, the last time I was in Japan (Jan ‘20) there were literally ten times more immigrants everywhere than my previous trip. Japan is full of immigrants from the rest of Asia these days. They all speak perfect Japanese too.


Well, that's not the issue here. The problem is examples like image searches for "unprofessional hair" returning mostly Black people in the results. That is something we can judge as objectively morally bad.


Did you see the image in the linked article? Clearly the "unprofessional hair" results are people with curly hair. Some are white! It's not the algorithm's fault that P(curly|black) > P(curly|white).
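The mechanism behind this is just Bayes' rule: even when two groups are equally represented in the corpus, the group with the higher rate of the queried feature dominates the results for that feature. All the rates below are assumed toy numbers for illustration:

```python
# Toy Bayes calculation (all numbers assumed): how unequal conditional
# probabilities skew the composition of feature-filtered search results.

def posterior_share(p_feature_given_a: float,
                    p_feature_given_b: float,
                    prior_a: float = 0.5) -> float:
    """P(group A | feature) via Bayes' rule, assuming exactly two groups."""
    prior_b = 1.0 - prior_a
    joint_a = p_feature_given_a * prior_a
    joint_b = p_feature_given_b * prior_b
    return joint_a / (joint_a + joint_b)

# Assumed rates: P(curly | A) = 0.6, P(curly | B) = 0.2, equal priors.
# Group A ends up as 75% of the "curly" results despite being only
# 50% of the corpus.
print(round(posterior_share(0.6, 0.2), 2))  # 0.75
```

So a result set can be heavily skewed toward one group without the ranking itself treating the groups differently; whether surfacing that skew is acceptable is the separate moral question being debated here.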


It absolutely is the responsibility of the people making the algorithm available to the general public.


Western liberal culture says discriminating against one set of minorities to benefit another (affirmative action) is a good thing. What constitutes a racial and ethnic bias is not objective. And therefore Google shouldn't pretend like it is either.

> from a moral realist perspective we can still objectively judge those cultural norms to be better or worse than alternatives

No, because which set of biases counts as better depends entirely on the values you start from. The entire point is that it should not be Google's role to make that judgement; people should be able to make it for themselves.



