
What is too outlandish to you is not too outlandish to others. Some believe the 9/11 attacks were orchestrated by the CIA. Some believe the earth is flat. Some believe a Nigerian prince got their email address from their dead friend’s cousin’s mother’s embassy and is about to send them a billion dollars.

How can you be so totally confident that your own bullshit meter is accurate? Why do you think your confidence in your own ability to decipher fact from fiction is any different than people who believe complete bullshit?

If your main objection to censorship is that it is used to hide the truth, why don’t you object equally to people with megaphones intentionally flooding the zone with shit and distributors that enable that behaviour?

Just as the left–right political spectrum is actually more of a circle than a straight line, it seems to me that the spectrum of censorship vs free speech is also a circle. A bad actor may obfuscate truth through censorship; a bad actor may also obfuscate truth through massive volumes of misinformation. The latter is what is happening here. It’s a denial of service attack on human brains, and like a DoS attack, the only way to weather it is to filter the malicious traffic—you can’t just add capacity in the form of “good counterpoints” since human brains can’t scale like that.




" why don’t you object equally to people with megaphones intentionally flooding the zone with shit"

That analogy is simply incorrect. You can spend your life on YouTube watching cat videos, no conspiracy theory in sight. Nobody can force you to watch their video on YouTube. So nobody has a megaphone on YouTube. Only YouTube itself has the megaphone; they can choose what to push to people.

It is NOT TV, where there is a single stream that everybody watches, and if you insert shit, everybody watches it.


https://www.dailymail.co.uk/news/article-8360073/More-60-peo... (caution Daily Mail; mildly NSFW sidebar)

"internal ⁦@Facebook⁩ research that found over 60% of people who joined groups sharing extremist content did so at Facebook’s recommendation."

So you're right: YouTube has the megaphone. I wonder what proportion of people watch extremist/disinformation content on YouTube because of suggestions. In some ways it's worse than TV, because if you publish shit on TV you get people writing to the regulator to complain (q.v. the Janet Jackson Super Bowl nipple ridiculousness). On YouTube you may never know what your fellow citizens are watching until they say "of course the world is ruled by lizards, here's the video that proves it".

I wonder if people would accept the compromise that YouTube would host this content but force it to be "unlisted". After all, the recommendations are their speech, not yours.


... and Facebook's recommendation would be based on their prior activity. If their interests on Facebook (what they like, follow, etc) were mostly cat videos, Facebook wouldn't be recommending extremist groups.

In the meantime, there's a heck of a gulf between whether or not Facebook lets a group be recommended, and actively censoring content dissenting to the chosen narrative.


> If their interests on Facebook (what they like, follow, etc) were mostly cat videos, Facebook wouldn't be recommending extremist groups.

The whole point of recommendation algorithms is to find missing edges in the graph, so it can easily lead you to misinformation in 1 or 2 hops.

Think of it this way: the misinformation content is highly valuable - it generates a lot of engagement. There is always a “potential energy” (people like you also liked...) between low-value content and high-value content that the platforms are attempting to convert to “kinetic energy” (engagement - views, clicks, comments) in order to monetize it. The goal is to find the shortest path to the high value content.
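
To make that concrete, here is a toy sketch of common-neighbour link prediction over an engagement graph (all names and data are hypothetical, and this is not any platform's actual algorithm): a user who has never touched fringe content is still only one or two hops away from it through users who overlap with them.

    from collections import defaultdict

    # Hypothetical engagement graph: user -> set of items they engaged with.
    engagements = {
        "alice": {"cat_video_1", "cat_video_2", "diy_video"},
        "bob":   {"cat_video_2", "diy_video", "fringe_video"},
        "carol": {"diy_video", "fringe_video", "extremist_video"},
    }

    def recommend(user, engagements, top_n=3):
        # Score unseen items by how many overlapping users engaged with them.
        # This walks the "missing edge": user -> shared item -> similar user -> new item.
        seen = engagements[user]
        scores = defaultdict(int)
        for other, items in engagements.items():
            if other == user or not (seen & items):
                continue  # only traverse through users we share something with
            for item in items - seen:
                scores[item] += 1
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

    print(recommend("alice", engagements))
    # ['fringe_video', 'extremist_video'] -- alice only watched cat and DIY
    # videos, yet fringe content is reachable in two hops via bob and carol.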


Proof required: from my personal experience moderating a political forum, and from that of other mods, the issue is the flooding of our information networks with information prions and viruses targeting our limbic systems. Social media is currently heavily polluted.


> If their interests on Facebook (what they like, follow, etc) were mostly cat videos, Facebook wouldn't be recommending extremist groups.

You're right, it's not that obvious; it's far more sinister. Cat videos are unlikely to end with you being recommended extremist groups, because there likely isn't much overlap in engagement between cat video viewers and extremist groups.

People who are deeply unsatisfied with life, however, might engage if they see it as a way out of their dissatisfaction, inadvertently training the recommendation algorithm to promote extremist content to dissatisfied people. That strikes me as at least plausible, though I don't know if the data is out there to find out who gets recommended what content under what criteria.

> In the meantime, there's a heck of a gulf between whether or not Facebook lets a group be recommended, and actively censoring content dissenting to the chosen narrative.

I disagree with this part. I don't have numbers handy for Facebook, but YouTube gets 500 hours of video uploaded every minute. It is physically impossible for you to see everything that gets uploaded to YouTube. Even if they stopped accepting uploads right now, you'd probably still die before you saw a significant portion of the content available.

Removing something from recommendations is, in most cases, tantamount to censoring it. If there are 500 hours uploaded per minute, and we assume that each video is 15 minutes long (which is likely longer than the reality), that's 2000 videos uploaded per minute. Assuming random distribution of views (which it's not, because of the recommendations), your video has a 0.05% chance of being viewed out of the videos uploaded in the same minute as yours. If you widen that to videos uploaded in the same hour, it goes down to a 0.00083% chance. Widen it to a day and you're down to a 0.0000347% chance. You would get 1 view per 2.8M views if YouTube deleted everything before that day, and killed recommendations entirely. I don't know how typical my usage patterns are, but I only search for probably 1 out of every 25 or 50 YouTube videos I watch. If that's a typical usage pattern, then you would actually get 1 view per 75M - 150M views. If everyone in the US logged on and watched a random video, you would get ~2-4 views.
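
As a sanity check, here is the same back-of-the-envelope calculation in code; every figure is the assumption from the paragraph above (500 hours/minute, 15-minute average length, uniform random viewing), not measured data:

    # Napkin math from the comment above; nothing here is measured.
    uploads_hours_per_minute = 500           # assumed upload rate
    avg_video_minutes = 15                   # assumed average video length
    videos_per_minute = uploads_hours_per_minute * 60 / avg_video_minutes  # 2000

    def chance_of_random_view(window_minutes):
        # Chance your video is picked if one video is chosen uniformly at
        # random from everything uploaded in the window.
        return 1 / (videos_per_minute * window_minutes)

    print(f"same minute: {chance_of_random_view(1):.5%}")        # 0.05000%
    print(f"same hour:   {chance_of_random_view(60):.5%}")       # 0.00083%
    print(f"same day:    {chance_of_random_view(60 * 24):.7%}")  # 0.0000347%

    # ~1 view per 2.88M random views; if only 1 in 25-50 views comes from
    # search rather than recommendations, that stretches to roughly 1 in
    # 72M-144M views, i.e. about 2-4 views if everyone in the US watched one.
    views_per_hit = videos_per_minute * 60 * 24   # 2,880,000
    print(int(views_per_hit), int(views_per_hit * 25), int(views_per_hit * 50))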

It's all theoretical napkin math, but there is a staggering amount of data in the hands of Facebook, Google, et al. I do agree that actually removing the content is more significant, but removing the content and making it so obscure that it's hard to see unless you're looking for it are basically the same thing. It's like if newspapers would agree to publish your stuff, but only if you encoded it as the first letter of each line of text. They have technically published your views; they've just made it hard enough to find that the only people who see it are people who already knew it was there.

I don't know what to suggest though. This is almost an inevitable outcome of collecting this amount of content; a lot of it is going to be relegated to some esoteric corner where no one ever sees it.


The 60% number sounds big, but how many people actually joined groups with extremist content? Without that context, the 60% doesn't say much.

It also doesn't say why people joined those groups. Maybe they are just curious to see what the crazy people are up to.


It's crazy. I started writing why I disagreed based on my visceral reaction to the topic. But as I constructed my arguments, they were not sound. So I suppose I agree? Hm.

The above paragraph is sincere--that did happen. And interestingly, it shows the power of consuming the opposing viewpoint. We all know that the government is currently spewing lies, and it is indeed a disgusting and corrosive thing. I want to silence it, but it's easy enough to hold it in contempt from a distance. Let the truth and the lies be heard so that we as a people will grow wise to it all.


Personally I am not from the US, and I don't know that the government is spewing lies. What makes you so sure? And if you are so sure, why are you worried people could be swayed by the lies - why not make them equally sure with the information you have?

However, I am happy with letting the courts decide. Where is the problem?

I have seen lies from all big political parties in the US.


More broadly, I think the trouble is that "lies" are often more appealing than truths by design, while truth is what it is. For example, some Americans may have been swayed to support the Gulf War by the Nayirah testimony, or in 2003 by Saddam Hussein's alleged people shredder. I don't think this justifies censorship, but the ability to sharpen people's BS filter and the amount of bunk they may receive are somewhat asymmetrical, echoing Goering's quote from the Nuremberg trials.[1]

[1] http://www.mit.edu/people/fuller/peace/war_goering.html


But weren't those lies perpetuated by mainstream media? Where, if not YouTube, would you find the counter-narratives? And wouldn't people who believe the MSM then have considered the YouTube debunking to be "lies" and called for censorship?


Unfortunately, as YouTube and the rest of the internet have grown larger and more consolidated,[1] the positions allowed have narrowed in scope and counter-narratives have become less acceptable. While websites with counter-narratives (WikiSpooks, for example) do exist, they're generally not very visible anymore. I think what you're describing is largely what is happening and will happen: those who present counter-narratives will be de-legitimized, including and conflating both those who are genuinely illegitimate (Dr. Gene Ray/Time Cube) and those who aren't.

The closest alternative I can see is reading media with opposing spin (People's Daily, RT) alongside your own and hoping that together they form a clearer picture. For example, I would not expect to see this [2] headline in a US paper.

[1] https://www.ncta.com/sites/default/files/platform-images/wp-...

[2] https://canada.constructconnect.com/dcn/news/others/2020/05/...


It's funny that the lies this time are coming not from the government but from the opposition and their propaganda machinery.

I'd never consider myself a government supporter, but with Trump it's like the last bastion before the country is overrun with far-left SJW hordes swayed by misinformation.

It's ironic the ultimate win for democracy manifests itself in stolen elections.


It's funny how, from my perspective, the reality is quite nearly the exact opposite of what you purport.


yeah, there are 80M people like you and just 74M like me, congratulations.

Both groups are influenced by media and social circles, but the first group tends to trust others' opinions more, methinks.

The fact that mainstream media was pretty much unified in its anti-Trump stance strengthens that theory.

If every day for 4 years you hear just how bad the orange man is (from someone you trust), it would definitely shape a certain reality in one's mind.


Well, I have a belief system that's coherent and arrived at through my personal experience, which had me thinking very poorly of the 'orange man' LONG before it became a political thing. In fact, I'm damn horrified at how far the guy got, and I think I understand quite well how it was done.

It's not just some abstract 'otherwise neutral orange man' whose identity is entirely constructed by news media, and that's a strange argument to make. I think many people thought 'Al Capone bad' too, particularly if he'd robbed them or shot somebody they liked. I'm sure the greedy news media HELPED people get mad at Al Capone, and that there were redeeming factors in the guy, but the notion that there were automatically as many redeeming factors in Capone as in everybody else is NOT sensible. Maybe he just was mean, and sucked.

Likewise with 'orange man'. Way before he was a political figure, he was mean and sucked REALLY bad relative to my sense of how things work in the world. Some people just suck very, very much.

If you assume anyone who has success automatically does not suck, I admire your optimism but I sure don't share it. Seems to me that without considerable oversight, the opposite is usually true, and that the worst people, entities, companies etc. win. Hence, the invention of means of oversight, and the attempt to codify what's good and bad.


Yeah, I think this comes back to the false balance. Just because a large portion of the mainstream news dislikes someone doesn't make them biased. Should you trust every story they write about him? Probably not. Is he clearly a demagogue, as can be seen in his unedited speeches? Absolutely. Do other politicians lie? Yeah. Does he lie a lot more brashly and obviously? I'd say so. So it's a bit of a crying wolf situation. It fits his behavior patterns quite clearly to pick up on conspiracy theories, simultaneously exploiting them for his own benefit and seemingly being convinced by them. It also fits the behavior patterns of established Republicans to avoid speaking out against him lest their radical base turns against them, without making strong stances unless it fits their agenda as well. If this so happened to be an instance where orange man right, then I think a lot of reasonable people have dismissed that possibility long ago because of the firehose of misinformation he has historically put out.


"I have a belief system that's coherent and arrived at through my personal experience"

Other people also have coherent belief systems they arrived at through their personal experience, that contradict yours.


Never said I was automatically right, timeeater. All belief systems are coherent to the believer.

They're tested by reality. It seems to me that a lot of the people who say 'orange man bad' and think that's the heart of my position are currently dying of COVID or giving it to others. And that is their experience, though a lot of those same people are sticking with their belief systems UNTO death, not being shaken from them by their experience.

I will keep an eye out for when things in my belief system seem to be not lining up with reality. I wish those 'other people' would do likewise, but I think I'm better at it.


[flagged]


You're again misrepresenting their statements. They aren't saying that only Trump fans get covid, but that an outsized portion of Trump fans get covid due to fictional ideas about the virus.


Same type of claim, that is not supported by data. If you have the data, please provide it.

In the same vein, you could assume Democrats are more at risk because they put too much faith in masks, thereby entering more risky situations. Not saying that's the case. The point is, your expectation of who gets infected is merely your partisan belief, not anything rooted in evidence.



25 million people participated in the BLM protests this summer... This paper then goes and picks on Trump supporters.


BLM has a purpose, and the protests happened in spite of covid. Trump rallies are entirely pointless, and everyone there makes a statement by not wearing masks.


If you look at it from a neutral point of view, you’re making a very politically biased statement.


No. You don't have to agree with the purpose, but my statement is factual.


Give me a break...that's obviously your opinion.

First of all, the disease doesn't care about your political opinion; it will spread in protests whether you are a crusader or an infidel. So it doesn't matter what you are protesting about; what matters is disease spread.

Now, regarding BLM's purpose, which was presumably police violence: police kill around 1000 Americans a year. Not an insignificant number, but it pales in comparison to the pandemic.

Calling Trump rallies pointless is entirely your biased viewpoint. They were protesting the lockdown, which has crippled the economy, shut down a massive number of small businesses, made tons of people lose their jobs, and, come January, will evict tons of people. Their protests had a point, but you're obviously misrepresenting them to fit your biases.

So no, your statements were not factual.


No, there have been protests against lockdowns... Trump rallies are not it. He's the president of the US of A; he has actual power to affect things. He just doesn't like responsibility. It's a purely vain exercise.

And I'm not defending some logic around the numbers of BLM vs covid, and it's unfortunate they coincided. I'm saying that the BLM protests had been bubbling for years and through a few incidents came to a real boil this year. I fully agree that it's irresponsible covid-wise to be out in the streets.

Feel free to disagree about scale, but what if the March on Washington of 1963 had coincided with a viral outbreak? Should it not have happened? I'll respect your opinion; I'm merely stating that it served a real purpose, and it's hard to pick the right time for it.


You are trying to ignore facts.

Fact:

- He never shared his tax returns
- He is a sexual predator
- He supports white supremacy
- He's incredibly corrupt

Let’s talk about facts.


thanks for proving my point.


I exercise care and critical judgement in choosing my sources of information, and do my best to be educated and aware.

So no, I don’t prove your point. You just dislike the facts I state.


I think that anyone who doesn't believe that the election was legitimate - i.e. that no meaningful fraud occurred - should be banned from the HN community.

Their "arguments" are full of shit and are a bunch of pseudo-philosophical, pseudo-analytical, pseudo-objective cant.

Their "evidence" is literally disinformation / propaganda.

They act exactly like those crypto-racists who know that their true belief would be deemed unpalatable or unacceptable by the community so they'll blow as much smoke around as possible without ever stepping out to say what they're actually driving towards.

Enough is enough. It's clearly dangerous to allow these views the shroud of legitimacy by giving them space in the public square. It's gone so far that anything other than a rejection functions as a legitimization.

By allowing the lie of election fraud to be presented as just another thing to be discussed on HN, HN is enabling those people and their cause, which is to overturn the results of a legitimate election.

The irony is that if anyone was to be banned it would probably be me for making this comment. Think on that! ; )


You have a good point, but I think you're downplaying the power of clickbait. I consider myself a relatively smart, educated, and rational person, and I can't tell you the number of outlandish headlines I've clicked on just to see what they say.


The thing with censorship is they aren't going to censor the clickbait. Google knows exactly what clickbait looks like and they could have purged it years ago with a few algorithm tweaks that nobody would have minded. They're going to censor stuff that makes people ask hard-to-answer questions and/or challenge consensus positions.

That sounds like a good idea until it clicks that good scientists ask hard-to-answer questions and challenge consensus positions. Censorship is fundamentally anti-evidence. People can't present evidence that the channel owners don't like, and people can't model how to handle untrue opinions in conversation because they never come up.


Even if you click on the clickbait, it doesn't imply that you automatically believe everything it delivers.


Indeed, you probably clicked on it because it seemed unbelievable.


We are talking about an adult audience, right? They are allowed to vote. So you have to assume they are independent enough to make their own decisions. Period. Your argument for censorship sounds a lot like patronisation.


Black-or-white thinking like this is false and damaging. The world has nuance, and the idea that “[people are] independent enough to make their own decisions. Period” rejects that nuance and substitutes a simplified and idealistic model of human behaviour which is alluring but does not reflect how human brains actually work.

It is not patronising to say that people are not all created with the same set of skills, beliefs, and values, and that some will engage with obvious bullshit. Many are not capable or interested in engaging with complex topics in a way that does not self-reinforce their pre-existing opinions. I have discussed this elsewhere[0].

When a group of people are motivated to exploit the weaknesses of others in order to get them to do things that are damaging to our democratic institutions, and they use misinformation to do it, it is unpleasant but not unreasonable to me to suggest that spreading lies through misinformation is as serious as suppressing truth through censorship and that they should both be treated equally seriously.

[0] https://news.ycombinator.com/item?id=25359003


I think you can argue that that assumption is wrong, and that that is also the biggest problem with democracy and voting. You are asking people to vote on a topic they are often not well informed on, or are unwilling/unable to spend time to educate themselves thoroughly on. Also, anybody of a certain age is allowed to vote, regardless of their level of education and their ability to make sound decisions.

If you want to drive a car, you have to prove you are able to do so. Why not demand the same if you want to vote?


In the US you vote on people, not policy. Most people can't understand all the ramifications of a policy, but a lot of people can spot a crooked liar if they know them a little; we do it every day. That is why I like the older idea of picking state legislators and letting them pick the president and maybe the senators. And up the number of people in the house. The idea is that you should know, or know a lot of people who know, the person you are voting for. Media won't come into it, civics education won't come into it. It will be like picking a plumber in your neighborhood.


If both of the candidates are crooked liars, how do you choose? I picked the party that I think is closer to being right.


Again, this is patronising. Apparently, you believe your opinion is more worthy than the opinion of others because you assume you are more intelligent and more informed than they are. So in essence you are saying you are the better human. This is a very slippery slope. In fact, I would consider this borderline fascism. But hey, if a lefty utters such nonsense, nobody seems to notice.


Another wrong assumption. On many occasions I haven't voted because I felt I didn't know enough about the topic to make the right vote.


Good question, and the answer is that no one knows how to create rules about who is or isn't fit to vote (beyond the most basic things like age, and even with that there's an interesting history) without also giving so much power to whoever is able to set those rules that the result isn't a democracy at all.


Exactly my point, and the same applies to corporate censorship. Since we have allowed corporations to perform censorship completely independently and on their own, free speech is a thing of the past.

Another way to see this is that those with less education or influence are basically the new women. The argument against giving women the right to vote was basically based on the same viewpoint: they don't know about the world, so all they can do is harm if allowed to vote.


I think I'm confused, then.

> If you want to drive a car, you have to prove you are able to do so. Why not demand the same if you want to vote?

Not sure if you meant this as a rhetorical question, an ironic one, or a real suggestion about creating restrictions around who can vote.

Either way, "free speech is a thing of the past" is a bit all-or-nothing. It's always been a battle, never as good as we thought it was (such as the argument Noam Chomsky makes in 'Manufacturing Consent'), and always been tricky to figure out what to do when, and where.

But it's not a thing of the past.


Indeed, you must be confused, since I never wrote anything about cars.


Driving on public roads is a privilege. Voting is a right.


Consider an opposite suggestion: if we need an educated voting group for educated decisions, we should move to educate the entire voting bloc.


Agreed, and that's one thing I do vote for: free/cheap education for everyone. I benefit from other people getting educated, so it's a no-brainer that government should invest in this.


To be fair, while it's very distasteful to our current sensibilities, this was not always the assumption in the US, at least among some of the Founders. My understanding of the rationale for supporting the requirement of land ownership was that those who were not financially independent or secure would instead vote for those who issued wild campaign promises (giant walls, closing Guantanamo Bay, or withdrawing troops, for example) in order to get themselves elected.


> How can you be so totally confident that your own bullshit meter is accurate? Why do you think your confidence in your own ability to decipher fact from fiction is any different than people who believe complete bullshit?

I don't. I also don't believe the people at companies like YouTube are likely to be any better at it than myself or the average person.


It’s nice to think that everyone is on equal footing, but even ignoring differences in education and genetics, the universal existence of motivated reasoning, emotional reasoning, confirmation bias, and other cognitive distortions tends to refute your assertion that everyone is equally good at determining whether or not any given idea is bullshit.

All humans—myself included—will go to great lengths to reject reality when it feels like it is a threat to a core value. This is especially the case when people have tied their identity too tightly to a given subject (i.e. hyper-partisans.) The average person, sufficiently prejudiced toward believing a given falsehood, is not going to be able to determine that it is false because their brain will start to play tricks on them. I have written about this elsewhere[0], and The Story of Us[1][2] goes into this process in much more detail.

[0] https://news.ycombinator.com/item?id=25359003

[1] https://waitbutwhy.com/2019/12/political-disney-world.html

[2] https://waitbutwhy.com/2019/08/story-of-us.html (table of contents)


> It’s nice to think that everyone is on equal footing, but even ignoring differences in education and genetics, the universal existence of motivated reasoning, emotional reasoning, confirmation bias, and other cognitive distortions tends to refute your assertion that everyone is equally good at determining whether or not any given idea is bullshit.

The universal existence of motivated reasoning, emotional reasoning, confirmation bias, and other cognitive distortions is precisely why no person or group can be expected to perform better at evaluating evidence and determining which ideas are too dangerous to allow other people to be exposed to.

> All humans—myself included—will go to great lengths to reject reality when it feels like it is a threat to a core value. This is especially the case when people have tied their identity too tightly to a given subject (i.e. hyper-partisans.) The average person, sufficiently prejudiced toward believing a given falsehood, is not going to be able to determine that it is false because their brain will start to play tricks on them.

This is precisely why no person can be expected to perform this role as a gatekeeper of truth.

Thanks for your links, I'm a big fan of waitbutwhy.com. With regard to [1], there's no reliable test to see where a person is on the psych spectrum; even heavy doses of introspection can lead to limited and imperfect insight, and it's likely the case that the same person will move up and down on the psych spectrum depending on a variety of factors. Because cognition is costly, if someone reaches a conclusion while they are in tribal mode, it's going to be difficult to reevaluate their position later when they are in scientist mode. This is why it's so important to have access to a variety of opinions, thinkers, and perspectives. Even the best of us are vulnerable to motivated reasoning and other cognitive biases.


> The universal existence of motivated reasoning, emotional reasoning, confirmation bias, and other cognitive distortions is precisely why no person or group can be expected to perform better at evaluating evidence and determining which ideas are too dangerous to allow other people to be exposed to.

This isn’t what I was hoping folks would take away from my comment.

Have you ever been too invested in some problem, or too upset by something, to see the truth of the situation? And when you ask someone else, who is not emotionally invested, they easily point out the way forward? That is the kind of phenomenon I am talking about.

Most humans have the capability to objectively assess ideas in general, but when someone has a strong emotional attachment to a specific idea—such as partisans who are predisposed to believe voter fraud misinformation—they are not going to be capable of evaluating that specific idea as well as someone who isn’t predisposed to believe it.

I am not claiming that there is any single entity that is capable of being a gatekeeper of all truths. I am saying that there are people who are going to be less capable than others to evaluate the truthfulness of a specific idea. In this case, YouTube moderators are almost certainly going to be more capable of objectively evaluating whether or not a video is proof of widescale voter fraud than a poster who is strongly motivated to lie (intentionally or not), or an audience member who is strongly motivated to agree with that lie. And in that way, there are some people who are more capable of evaluating the objective truth than others.

> Because cognition is costly, if someone reaches a conclusion while they are in tribal mode, its going to be difficult to reevaluate their position later when they are in scientist mode. This is why its so important to have access to a variety of opinions, thinkers, and perspectives. Even the best of us are vulnerable to motivated reasoning and other cognitive biases.

Yes. Exactly. I’m confused how we are arriving at such different conclusions from the same basic understanding. I hope that this added explanation helps.


> Have you ever been too invested in some problem, or too upset by something, to see the truth of the situation? And when you ask someone else, who is not emotionally invested, they easily point out the way forward? That is the kind of phenomenon I am talking about.

Yes I have. Unfortunately, I am not aware of a reliable test to identify whether a person is capable of rational thought on a given subject. This means that when we witness a dispute between people about an issue, we are unable to reliably and provably evaluate the disputants' meta-level reasoning. This is complicated by the fact that evaluating someone's reasoning on the meta level is nearly always entangled with object-level concerns, including our own cognitive biases.

> Most humans have the capability to objectively assess ideas in general, but when someone has a strong emotional attachment to a specific idea—such as partisans who are predisposed to believe voter fraud misinformation—they are not going to be capable of evaluating that specific idea as well as someone who isn’t predisposed to believe it.

That's why it's so harmful to put people in a position of being an information gatekeeper. You have no way of ensuring that your information is being filtered by a person who is emotionally detached. In fact, due to the potential for influencing other people, those positions are much more likely to be occupied by partisans who will then use them to advance their own perspective, often without even realizing it.

> In this case, YouTube moderators are almost certainly going to be more capable of objectively evaluating whether or not a video is proof of widescale voter fraud than a poster who is strongly motivated to lie (intentionally or not)

This is exactly the problem. There is no reason to suppose that a youtube moderator is non-partisan, a high rung thinker, or emotionally detached from the subject they are evaluating.

> And in that way, there are some people who are more capable of evaluating the objective truth than others.

Yes but there is not a reliable, objective way to identify them or to check their reasoning process for errors.

> Yes. Exactly. I’m confused how we are arriving at such different conclusions from the same basic understanding. I hope that this added explanation helps.

I feel exactly the same way. I'm at a loss to explain this except that either we have some different premises that have not been revealed in this discussion so far, or perhaps Aumann's agreement theorem [0] is not applicable to this issue for some reason.

[0] https://en.wikipedia.org/wiki/Aumann%27s_agreement_theorem


> There is no reason to suppose that a youtube moderator is non-partisan, a high rung thinker, or emotionally detached from the subject they are evaluating.

The other way to frame this is that there is a reason to suppose that, at this point, someone uploading videos to YouTube claiming evidence of widespread voter fraud—which is all that YouTube are prohibiting here—is almost certainly not a high rung, emotionally detached thinker.

> Yes but there is not a reliable, objective way to identify them or to check their reasoning process for errors.

No, but you can make a reasonable deduction on which of these two groups of people—YouTube moderators or mass voter fraud video uploaders—is more likely to succeed at being objective. There are no certainties when dealing with humans and we just have to roll with it using the best heuristics we can.

> I'm at a loss to explain this except that either we have some different premises that have not been revealed in this discussion so far,

I don’t know. This may be a wrong assessment on my part, but it feels to me like you are extrapolating negatively into the future far more than I am in terms of the potential harm this one change to YouTube’s terms could cause.

Believe me, I totally understand the potential for harm caused by powerful interests controlling a narrative—there’s a limitless number of examples to choose from, many of which have been targeted directly at people like me—but the harm that is happening right now is that other powerful interests (the President of the United States & his political party) have made up a story about voter fraud and are encouraging a bunch of their partisans to flood the zone with shit, and they are doing that on YouTube.

In my eyes, this ban might help to disrupt the ongoing voter fraud misinformation campaign enough that it ends up collapsing before permanent damage is done to our democracy. It seems vanishingly unlikely that it is the start down some path that ends with people being brainwashed by YouTube into believing a bunch of falsehoods, especially compared to the alternative of letting people continue to upload this garbage to their site.

I’m far more concerned about YouTube’s recommendations algorithm and the way it is deliberately designed to funnel people toward more and more extreme content. I suspect YouTube would not need to ban this content at all if their algorithm hadn’t spent the last decade optimising for engagement over all else.


> The other way to frame this is that there is a reason to suppose that, at this point, someone uploading videos to YouTube claiming evidence of widespread voter fraud—which is all that YouTube are prohibiting here—is almost certainly not a high rung, emotionally detached thinker.

I'm not aware of that reason. Could you explain further?

> No, but you can make a reasonable deduction on which of these two groups of people—YouTube moderators or mass voter fraud video uploaders—is more likely to succeed at being objective.

I disagree, but I'm really saying that I'm unable to make that deduction. Perhaps if you shared your reasoning process I might agree.

> I don’t know. This may be a wrong assessment on my part, but it feels to me like you are extrapolating negatively into the future far more than I am in terms of the potential harm this one change to YouTube’s terms could cause.

Almost certainly that is the case. I'm of the "sunshine is the best disinfectant" persuasion. When you tell people they aren't allowed to question the integrity of an election, it is absolutely consistent with the interpretation that those questions might lead to unwanted answers. It's clearly consistent with other interpretations too, but the problem with closing down debate is that you lose the opportunity to compare those interpretations.

> powerful interests (the President of the United States & his political party) have made up a story about voter fraud

If they have made it up, wouldn’t the best response be to ask them for evidence so everyone could see how the allegations were baseless?

> encouraging a bunch of their partisans to flood the zone with shit, and they are doing that on YouTube.

If you consider that partisans of the other side are also flooding social media with their own claims about the integrity of the election, it starts to look more like a healthy debate that needs to be hashed out rather than suppressed.

> In my eyes, this ban might help to disrupt the ongoing voter fraud misinformation campaign enough that it ends up collapsing before permanent damage is done to our democracy.

There has already been a considerable amount of damage. I don’t know how permanent it is, but allowing President-elect Biden to take office under this cloud of suspicion while suppressing the means by which the suspicion can be removed would be devastating to the perception of legitimacy of the government. The only hope for our democracy is to investigate these allegations and demonstrate that our election system is robust enough to handle the challenge of people asking questions about facts they interpret to suggest fraud or misconduct.

> It seems vanishingly unlikely that it is the start down some path that ends with people being brainwashed by YouTube into believing a bunch of falsehoods, especially compared to the alternative of letting people continue to upload this garbage to their site.

It's not clear to everyone that these videos are garbage, yet suppressing them will prevent people from critically examining them while providing evidence that the other side has something to hide.

> I’m far more concerned about YouTube’s recommendations algorithm and the way it is deliberately designed to funnel people toward more and more extreme content.

Platforms that do not prioritize engagement will retain fewer users than those that do. Sadly, this is the same process in effect all over the world as a result of late capitalism converting everything into attractive commodities. This is the same emergent process that creates addictive snack foods and Netflix. If youtube didn’t prioritize engagement then they would be replaced by a platform that did.


>> What is too outlandish to you is not too outlandish to others

Sounds like we have an education problem then.

Censorship of a viewpoint and media that displays a giant FACT/FALSE in its headline amplifies exactly the problem you wish to solve.


These are extreme cases where you think there is consensus on any single example, but it is still not a good argument for the general case.



