Study: ChatGPT has 'undeniable gender bias' in its views on jobs (asahi.com)
5 points by rntn on Nov 10, 2023 | 5 comments



So, for a job that is statistically dominated by a particular gender, ChatGPT was more likely to mention that gender in its response. What an unexpected revelation.


I hope this is not surprising to anyone. The dataset it was trained on is not bias-free; it was produced by a society that is very much not bias-free.

A headline like "human language has undeniable gender bias" wouldn't get much attention though.


A surprising number of people, including many people on this very website, seem to believe that these things aren’t just stochastic parrots, that they’re in some way comprehending or reasoning about the world. If you’re inclined that way (I’m not myself, but it does seem to be a thing), then this sort of research makes some sense.


It's worth noting that a bot trained on general data will be as biased, bigoted, and superstitious as the data it was trained on.

This is a warning to those looking to AI for sensible advice: any advice it gives may be similarly warped.

Worth mentioning. I don't know if it was worth spending research money on; still, it's a wake-up call for folks embracing AI uncritically.


Did someone, anyone, pay for this research? I really hope not.





