
I wonder where it's going to end.

Once you have started taking sides, started taking down videos you find spreading incorrect points of view, you can keep extending it. Videos that doubt the efficacy of masks. Videos that say the quarantine is a plot by a world government of reptiloid aliens. Videos that promote the idea of a flat Earth. Well, this is a dead end.

How about videos claiming that Taiwan is in fact independent from China? Videos that allege that unelected bureaucrats have more power than elected officials? Videos that suggest there were problems with vote counting during US elections?

You can easily end up doing way, way more censorship than is strictly required to keep your resource legally clean (hate speech, copyright violations, etc.). If I were Google (YouTube, Alphabet, whoever is responsible), I would stay as far away as possible from this kind of censorship, purely because of the cost of doing it, and the constant risk of getting into hot water for not censoring enough, or for censoring too much. Being a neutral pipe could simply be better for the business.

Edited: typos.




I would have agreed with you 5 or 10 years ago, full stop.

Today, we live in a society where crackpot ideas are being shared by acquaintances, but unlike 10 years ago, there is little common sense or logic applied to the value and factual nature of the information. The systems are designed and layered in such a way that a person gets a bit of information, that information is heavily reinforced in their world view, and other ideas are not in their funnel. Even if they try to validate it (which they may not, because everything else in their information streams supports and reinforces the view), there are whole markets of pseudo-news: entertainment channels masquerading as news that support the same insane ideas.

These systems behave like a schizophrenic thought process, slowly detaching participants from reality as the information they see reinforces their world view.


I see the problem. But I don't think that filtering the input is a solution. Fixing the processing, that is, critical thinking, looks more effective to me.

I think Bertrand Russell, as quoted elsewhere in the comments at https://news.ycombinator.com/item?id=25362730, explained it well.

That is, to me the solution is to help people become less religious / partisan / rooting for the in-group, and become more independent and critical. It's indeed pretty hard to do when most major information streams work in the opposite direction.

Some of my hope is in the built-in rebelliousness of teenagers and young people; some is in the idea that people with developed taste will stay away from echo chambers and outrage machines for aesthetic reasons, among others.



