Hacker News

Doesn't this just feed into people's fears about conspiracies? I can't understand why people are signing off on this. Let the Like/Dislike button do its job.


YouTube's recommendation algorithm means that the people who see these videos are already predisposed to believe what is in them. That overrides the usefulness of the Like and Dislike buttons.


Also, even if people outside of those bubbles see and dislike those videos, the Dislike button is more likely to feed and reinforce their primary audience's persecution complex than to cause them to reconsider.


I am reminded of the videos we have probably all seen of Flat Earthers running experiments that are supposed to show whether the Earth is curved; the experiment does show a curve, and then they start to question why their experiment is "failing" rather than question their beliefs. Beliefs are a pretty strong thing.

It is much easier to stop harmful ideas from spreading rather than trying to argue down these harmful ideas once they have already taken a foothold in someone's mind.


I know at least 2 flat earthers who were given anti-psychotic medication and stopped being flat earthers.


> It is much easier to stop harmful ideas from spreading rather than trying to argue down these harmful ideas once they have already taken a foothold in someone's mind.

It’s fucking terrifying that anyone holds this view, considering the implications. That’s CCP logic. Pro-censorship, “for their own good”?

Have we lost our fucking minds?


Do you think Germany lost their fucking minds when they outlawed Holocaust denialism?

You have full access to facts here. Nothing is being truly hidden. Hell, all of these videos can still be shared a million other ways. It’s insane, systematic misinformation that YouTube doesn’t want to actively help spread.


Downvotes cause a whole set of reactions, many of which prompt further inquiry and research. Humans have a wide range of reactions to things. An analysis that lumps them into a narrow set of reactions is useless.


So perhaps the problem actually lies with the recommendation algorithm, and youtube should do something about that.


No argument here. I would imagine a good deal of the people who think misinformation is a serious problem on Youtube would agree that the recommendation algorithm is a huge problem.


I wonder if this would be less of an issue if these videos were unlisted or shadowbanned instead. I'm fairly sure YouTube already does this (either intentionally or through unintended consequences of the recommendation engine) so why couldn't they do that here?


Not really. I see a couple videos about guns and YouTube fills my recommendations with nut jobs. I see a couple of video game reviews and YouTube assumes I want to watch videos about SJWs performing white genocide. And so on.


I think you're overestimating the average internet user. A lot of people think that because they view something online, it gives it authority. They fail to understand anyone can (and does) post content online.

I've often thought this way of thinking is a remnant of the media we consumed in the past, when standards were higher because producing media was more exclusive. Think print, radio, TV. In those media, not just anyone could produce content.


It is so strange to me specifically because the cohort that seems to be eating this up (my parents' age, I mean) were the same ones telling me NOT to believe anything I read on the internet when I was young and the internet was new.

What happened?


They started getting targeted with stuff aimed at their existing biases.

(Along with an endless stream of Minions memes, it would seem.)


It's easy to disbelieve what you read on the internet when it's one voice in a thousand. But increasingly, all thousand voices are telling you what you want to hear.


[flagged]


And forcing a private company to carry content it disagrees with is what exactly?


Age-related cognitive decline.


You mean those dumb boomers and generation X'ers who were raised on the boob tube rather than YouTube.


This plays exactly into their narrative. "If they didn't steal it, they wouldn't have to work so hard to cover it up!"

Megacorps should never become arbiters of truth.


Publishers have always been the arbiters of truth. "Anyone can publish anything targeting anyone" is a very recent development.


Sure, but are they publishers, or hosters? Isn't that the whole CDA Section 230 debate? Either be an impartial host (other than legal requirements) or be a publisher with all the trappings that come with that.


> Isn't that the whole CDA Section 230 debate? Either be an impartial host (other than legal requirements) or be a publisher with all the trappings that come with that.

No. That is not how 230 works at all. 230 does not add any requirements for web hosts. They don't have to conform to some "impartial host" or "publisher" distinction. It just means that websites cannot be sued for illegal content on their platforms, regardless of whether they attempt to remove other similar illegal content.


You (and Ted Cruz) are under the illusion that platforms must be neutral. There is nothing in Section 230 mandating neutrality. Section 230 deals with liability of user content. The First Amendment gives platforms the right to censor.


I think we'll see if anyone is willing to fight the case that removing disinformation campaigns is enough to trip Section 230 immunity.

I can't imagine a judge hearing "we removed malicious and harmful disinformation campaigns using our platform that were trying to undermine election integrity and trust in government and scientific bodies during a global pandemic" and saying "Yep, obviously that's editorial slant."


> Sure, but are they publishers, or hosters?

If a platform chooses a video to play next, based on "what we know you like, what we think you'll like" then it cannot be a mere hoster. Choosing means making a choice.

Youtube has been called a "radicalisation engine" with some evidence for that (search terms: "youtube radicalization", "Algorithmic Extremism"). This might just be a terrible consequence of "maximising engagement" rather than pushing a fixed political agenda. But evidently, they want to do better.


I would footnote truth to show it means whatever that publisher believes. There are plenty of older published texts that contain incorrect information, and sometimes outright lies (propaganda).


"Anyone can publish anything targeting anyone" happened in the past when the printing press was developed and paper got cheap enough that people could just print up pamphlets or newspapers and give them away on the streets. Around the 1880s. Yellow journalism. All sorts of different points of view getting spread. Communism, anarchism, etc. Then going farther back, at the very beginning of the printing revolution we had Protestantism and religious wars that lasted a century (the Thirty Years' War, etc.). The internet is this but with even lower hurdles to publication and extremely wide and inexpensive distribution.


The invention of the press certainly made broad publishing much more viable than it was previously, but the capital requirements to distribute a broadsheet to the entire world, even in 1990 were beyond the reach of pretty much everyone except a few very large corporations and governments. Hence the aphorism "Freedom of the press belongs to those who own one."

The capital requirements today are an internet connection, a phone, and about ten minutes of typing. For the vast majority of people today, the ability to publish to the world is beyond the wildest dreams of William Randolph Hearst.


YouTube was originally about "you" publishing whatever you want to the "tube". Now it's turning into some kind of censored truth filter, which is not very appealing.


It is about being appealing to the advertisers. Nothing else matters.


Why would an advertiser not like this type of content? They would know exactly who to target if they were watching this content.


Legally, YouTube is not a publisher. If it were, it would lose the Section 230 protections. That is the whole point of that law: it separates who is a publisher (like a news company) from who is just a "dumb wire" (an ISP, for example, was the original target).

Section 230 was created after people sued ISPs using laws intended to allow lawsuits against newspapers.

Section 230 is, overall, a good law, but big tech's abuse of it greatly risks getting it killed for good. They are supposed to be either a platform or a publisher, not Frankenstein's monster with parts of both, having the protections of a platform while wielding the powers of a publisher.


>Section 230 was created after people sued ISPs using laws intended to allow lawsuits against newspapers.

No, they sued Prodigy as a forum operator, not as an ISP. The distinction is critical. ISPs are closer to common carriers.

https://en.wikipedia.org/wiki/Stratton_Oakmont,_Inc._v._Prod....


Facts haven't influenced their narrative one bit. The "overturn the election" effort is about 1-51 in the courts, across states, Federal levels, and partisanship of judges.

There's a point where enough is enough, and a blatantly dishonest treasonous war on the very fabric of the government is over the line.


For existing users maybe, but there's an argument to be made that you're saving newer users that haven't yet seen this content from being brainwashed.


I'd remove this kind of content from a public pinboard anytime if I was in charge of one, so I personally find it hard to object if Youtube removes it.

However, I also believe that any algorithmic removal without human oversight should be prohibited by law. The abuse of DMCA requests should be addressed, namely by prohibiting algorithmically generated requests and by treating repeated bogus DMCA takedown requests in a way similar to commercial copyright infringement.

Just letting companies take down whatever they like in whatever way they want is not the right solution. They are basically "information utilities" and provide public goods whose value to society goes beyond those companies' own advertising income.


For the existing folks, almost certainly, but it prevents the spread of misinformation; and more importantly, it prevents the _repetitive reinforcement_ of misinformation.


Maybe for that one platform on this one issue. It wouldn't prevent the major networks from repeatedly spreading misinformation, nor on unidentified issues. I remember that for years the major networks stated that the gender wage gap was the difference between a man and a woman in the same job with all else being equal. Yet the BLS study they were getting the numbers from showed a fundamentally different issue.

YouTube can censor whatever and however they want. I just don't see this as making much of an impact either way.


Assuming it is misinformation: this time, but especially in the future.


Likes and dislikes are way too easy to game, imo


Right, if you looked at likes/dislikes you'd think Trump won re-election by a landslide and Bernie won the primary easily.

Joe Biden, who won the most votes of any candidate in American history with 66% turnout and 51.3% of the vote, still had his announcement video [0] at 50/50 like/dislike.

[0]: https://www.youtube.com/watch?v=VbOU2fTg6cI&feature=youtu.be


If he only won 51.3% of the vote then a 50/50 like/dislike sounds rather representative actually, much more so than likes/dislikes on Youtube usually are.


The point is that Likes are feelings, not truthometers.


But what if people don’t “Like” the truth?


This is the critical question, and it baffles me that people don't reflect on this a bit more. The "marketplace of ideas", whether it be through back-and-forth dialogue expressed through media platforms, or through likes or comment sections or follower amounts... there's no reason to believe that that process leads to the best ideas coming to the top.

So that can't and shouldn't be cited as a reason not to be concerned over the spread of misinformation.


> there's no reason to believe that that process leads to the best ideas coming to the top.

I disagree. The reason is that when people have access to multiple ideas, they can choose which ideas are best for themselves, and other people can judge the outcome based on both the ideas and their effects on people.

> So that can't and shouldn't be cited as a reason not to be concerned over the spread of misinformation.

Without the free and open exchange of ideas, it's not nearly as easy to even tell what is misinformation.


The dislike button might inform The Algorithm, and it might inform users who are using the YouTube web UI (if they care to look); but it remains silent on video embeds on Facebook/Twitter/etc, and on "less-interactive" watching interfaces like the YouTube Apple/Android TV apps.


> Let the Like/Dislike button do its job.

Community moderation has completely and totally failed. Giant, enormous lies are being told to half the population without barrier. People who want to believe a completely fabricated story about election fraud can find all of it their eyeballs can take in.

Fine. I get the free speech angle. But... who cares about free speech when democracy is dying in front of us? Free speech was supposed to have prevented this. It made it worse.

I don't have the answer here. But all I can see is people trying to deny the problem.


Millions of people believe the election was rigged in the US. This is a total failure of building trust in elections, regardless of whether it's true or not.

I personally believe in "trust but verify". There is no way to verify this election. The process is made impossible, because signature checks were not done on mail-in ballots in any meaningful way, and have not been done in any of the recounts either. We only know that currently 0.02% of ballots were rejected vs 6% in previous elections.

To my common sense, this election could have been spammed with mail-in ballots, with almost everything accepted. No meaningful signature match was performed if only 0.02% of ballots were rejected.


> This is a total failure of building trust in elections

Rather it's a success in tearing down that trust. And the tools used in that effort were lies like this one:

> We only know that currently 0.02% of ballots were rejected vs 6% previous elections.

See: https://www.reuters.com/article/uk-factcheck-georgia-rejecte...

It's lies. It's all lies. You're reading lies and choosing to believe them[1]. So is it any wonder that Youtube is thinking that maybe they shouldn't have a hand in spreading those lies?

[1] As proven by your reply that the link does not "prove a lie"! That's not the way the burden of proof works. You are choosing to believe this categorical statement given to you without support. That is a "lie" in the vernacular.


The fact check does not prove a lie. At most it proves that Trump is not making the statement based on publicly available data, since the final rejection rate for 2020 has not yet been released.

From the fact check:

"Georgia rejected 6.42% of mail-in ballots in total in the 2016 general election "

"The higher percentage he mentions for past years is likely based off the total rejected ballots (here) which can not be compared with 2020, as this information is not available."


This site seems to have the most up-to-date numbers: "Rejected absentee/mail-in ballots as a percentage of total absentee/mail-in ballots returned, 2016-2020"

Georgia 2016 6.42%

Georgia 2020 .60%

Source https://ballotpedia.org/Election_results,_2020:_Analysis_of_...

Which cites the official Georgia election website as source of the raw data.


The Reuters link specifically addresses that: it's a tabulation error. The older data shows ballots rejected for all causes, while the 2020 data is limited to those rejected for signature mismatches only; they haven't released the all-cause number yet. And the signature rejection numbers broadly match up, both being sub-1%.


It isn't just Georgia. The other states reporting have an order of magnitude drop in absentee ballot rejections. Seems like a problem?


I mean, if you even tried to use logic you could understand why there would be drastically fewer rejected mail-in ballots than in normal years. There were massive information campaigns leading up to this election about how to correctly fill out ballots and what to be careful of, because millions of people were going to be voting by mail this year. In previous years there was essentially no information about it at all. Also, most people learned about their mistakes when their ballots got rejected in the primary.


I don't think you could get 99.98% of people to tie their shoes correctly, let alone fill out a paper ballot and match their signatures. Have you ever worked with below-average-intelligence people?


When we covered the 1st Amendment in school and talked about censorship I always thought “who could possibly be against the 1st Amendment?”.

Apparently a lot of people!


It literally begins “Congress shall make no law...”


Oh, don't try and weasel out of it. America's respect for freedom of speech is codified in the constitution, but goes far beyond that and is a core tenet of our social contract.

Yes, Google is entirely within their right to censor whatever they want on their platform. Does it make them look like anti-free speech goons? Yeah, it does.


> America's respect for freedom of speech is codified in the constitution

Is it? Where is this implication that the constitution is simultaneously both a legal document and also a list of broad values held by the nation's people?


I mean, it's a founding document? It wouldn't be in there unless there was general agreement it is important?

That's a bit of a silly question.


There are lots of things that I believe to be important for the government but not at all important for other contexts. I suspect that's true for you too.

You probably support democratic control over the government where each citizen gets one vote. But very few Americans support democratic control over private corporations (that would be socialism after all). There are enormous numbers of such examples.

I think it is unreasonable to claim that, because the constitution limits the state's ability to restrict speech, Americans believe other actors should not be able to restrict speech in places where they exhibit control. It certainly could be the case that Americans support free speech more broadly, but it definitely does not follow just from its being in the constitution.


I mean, I’m not basing it off the constitution alone, but also the pervasive reference to free speech across American discourse. I mean, some joker will put a giant middle finger on his lawn in defiance of a developer who wants to buy him out and Americans will rally around his right to do so.

The 1st amendment doesn’t exist in a vacuum.


> I’m not basing it off the constitution alone

IMO, this does not come across in the posts I responded to.


Perhaps? But those people probably already think Youtube's part of any number of conspiracies already.


I doubt this policy is based on any sort of broad principle. Youtube just doesn't want its platform to play any part in the currently ongoing coup attempt.


If there is widespread election fraud (which has not been proven, and "widespread" is an undefined term used as a persuasion tool anyway), then that is the coup attempt you refer to. Their platform hosts countless instances of people claiming a certain person has been elected, and now there will be no counter-information for people to consider the alternatives.


Yes, that's true. Again, I don't think this is about any higher principle; many people just aren't willing to maintain editorial neutrality with respect to a coup, and I don't think it's reasonable to expect them to.


[flagged]


Hell yeah. I'm all for it.

I think we can all agree that putting information on the internet was a mistake. Ask Google who invented running, or how many legs a horse has. The answers are clear, authoritative, and obviously wrong.

And they make billions of dollars. We're expected to believe some Podunk in the middle of Podunkstan has more access to facts? Please.

The internet, as in its companies and its users, cannot separate truth from fiction. Thus, we need to simply remove all facts. An internet without the pretence of truth would be a much more honest place.


> We're expected to believe some Podunk in the middle of Podunkstan has more access to facts?

Certain facts, yes. A lot of great minds are disconnected from institutions and corporations (Gwern comes to mind-https://www.gwern.net/index). And when some of the things they've taken the time to research or present arguments on go against the preferred narrative of those institutions, now, they can be disappeared with this decision as precedent.

Putting that into an example: say Corporation X releases a new frozen food that triggers a rare autoimmune disease leading to death in certain people. A small-town doctor ("Podunk") has experience with that disease, understands why the food triggers the response it does, and speaks out.

If Corporation X doesn't like that message, with moves like this, now it will be justified (or at least, rationalized) to get rid of that doctor's information. The argument you're presenting here—that they're not credentialed enough—means more people are ultimately in harm's way.


But what could possibly separate this from a hit piece by a competitor? Or lies made up by health-food advocates? Or someone with too much time on their hands making up things for fun?

If it's amplified enough, if it agrees with what people already think, the actual truth or reliability is lost. It's impossible to tell a doctor from someone pretending to be a doctor from someone pretending to be someone who saw a doctor. The impenetrable layers of deception practicable by average people, combined with the unfortunate necessity of credulity for the average person's sanity, means there's no way that outrageous lies won't be hopelessly amplified in an endless feedback loop.

I'm sure you can fill in your own example (flat earth, 5G-covid lies, people who believe wifi gives them headaches...) When it's to something someone has vested interest in? The world has no chance.


> But what could possibly separate this from a hit piece by a competitor? Or lies made up by health-food advocates? Or someone with too much time on their hands making up things for fun?

Ultimately? Nothing but personal responsibility.

If people want to be ignorant, they should be free to, whether you or I like it. Adults can and should make their own decisions. There is no need for babysitting or thought policing. Arguably, that's the beauty of natural selection: the water finds its level.

The solution, and arguably where efforts should be focused (as opposed to censorship), is education. Very few people know how to think critically, primarily because it's not taught. The ability to parse truth from fiction, or conversely, to be critical of what you're told is fact, should be core curriculum.

It's not taught because, well, it doesn't benefit the system. A dumbed-down mass is easy to control and cheap to employ, but that manufacturing of the mind has consequences.

The reality is that companies like YouTube (and other social media co's) are reaping what they sowed. They built their networks and platforms on encouraging short-form, entertaining content and designing experiences that take advantage of human psychology. This is just the end result.

Sadly, instead of working on remedying this—by educating critical thought—now they're just taking the 1984 route.

This interview is telling: https://www.youtube.com/watch?v=d6e1riShmak


So, what, Google should be funding schools? Facebook needs a Critical Thought Corner? Twitter should have enshrined the Steak-umm tweets?

Social media can't solve people being uneducated. It simply adds to the noise.


They certainly could. I'd love to see a guide to navigating social media/social networks that gets delivered to school teachers for addition to their curriculum. These things are a part of our world and who better to suggest how to navigate them than the people making them?

Far more constructive than censorship.



