Wouldn't it make more sense to add more images rather than remove existing data?


If I understand correctly, they are removing incorrect tags, not the data. It's only news because some of the incorrect tags reflect racist bias. If they removed a bunch of cats tagged as dogs, you wouldn't hear about it.
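To make the distinction concrete, here's a minimal sketch of what "removing tags, not data" could look like, assuming a toy annotation format (the filenames and tag names are hypothetical, not ImageNet's actual schema):

    # Hypothetical annotation map: image filename -> list of tags.
    # This is an illustrative toy format, not ImageNet's real one.
    annotations = {
        "img_001.jpg": ["person", "flagged_tag"],
        "img_002.jpg": ["cat"],
        "img_003.jpg": ["dog", "flagged_tag"],
    }

    # Tags the curators decided to remove (incorrect or biased labels).
    removed_tags = {"flagged_tag"}

    # Strip only the flagged tags; every image stays in the dataset.
    cleaned = {
        filename: [t for t in tags if t not in removed_tags]
        for filename, tags in annotations.items()
    }

    print(cleaned)
    # {'img_001.jpg': ['person'], 'img_002.jpg': ['cat'], 'img_003.jpg': ['dog']}

The point is that the images themselves are untouched; only the labels attached to them change.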


IIUC, the database is intended to identify the actual contents of images, not to collect data about stereotypes. Even keeping the categorized face images in a separate database wouldn't be that useful, since Mechanical Turk annotators are not a random sample of the population.


I think it's temporary: sensible defaults to protect users who don't understand the potential weaknesses of the model. Also, if someone does train an AI on a bunch of photos and ends up with a biased result, the liability won't fall on ImageNet's maintainers.


It seems to be largely the tags that are biased. Long term, I'm sure people want big, unbiased databases of images of people, but it will take a lot of time and effort to build them and to verify that the bias really has been removed.



