
While the sarcasm is dripping, it's also true. There clearly are people that can view a video and realize that it is 100% BS. Yet, at the same time, there are other people that will watch it and 100% believe it. If it wasn't effective, it wouldn't be used.


At the same time, getting used to curation of informal speech and having intermediaries censoring this type of content has its own, potentially greater risks.


First, Google is already heavily curating information on their platform. Don't kid yourself into thinking otherwise. Secondly, Youtube is not a public good. It is a product owned by a company that answers to its shareholders. Google is free (and should remain free) to censor whatever they want on the platform that they own, and users/content creators are free to leave for other platforms. Finally, before folks say that Youtube is a monopoly in video content on the internet, no they aren't. Are they the biggest? Sure. But by no means are they the only platform.


Yes, obviously.

The point is, we have a whole bunch of mores and laws that work together to make a free society work, and they were formed in times when most interpersonal communication didn't go through a few concentrated intermediaries. This new equilibrium is a really big concern for free communication and freedom of speech: just letting whatever dangerous stuff circulate at massive scale isn't good, but fully empowering corporate entities to squash and marginalize categories of speech isn't great either.


I doubt most people (like myself) disagree with anything you say. We can still think it's absolutely ridiculous that they're doing it.

I don't think they should face legal sanctions or anything. I think they should step out of politics. But yeah, they don't have to.


Generally, a company need not be the only one on the market in order to be considered dominant for the purpose of anti-trust laws. Anti-trust laws apply to non-monopolies as well.


> Google is free (and should remain free) to censor whatever they want on the platform that they own

So, since ISPs also own their platform, should they also be free to block content that competes with their services along with content they’re required to block by law?


If only we had a government mechanism for holding service providers to higher standards for neutral delivery of services to citizens regardless of economic incentives.


I see what you did there.


On the other hand, we have a sufficiently large number of people believing that Trump actually won this year - enough for it to be a potential political issue. I wouldn't count out someone getting killed over this.

So, the relative merits and dangers of "censorship" should be evaluated against this. The bar for the censorship being "potentially greater risk" is reasonably high.


"Potentially saving lives" is a really dangerous way to choose whether censoring speech is ethical.

Squashing what one might think is dubious criticism of police potentially saves lives, too.

Not too long ago Youtube was removing videos criticizing the CDC for not advocating for masks... Squashing criticism of the CDC during a pandemic seems like an action that could potentially save lives, too.


Maybe I chose a poor example - I didn't mean that "potentially saving lives" is the only important metric.

What I meant was that giving a platform to Trump's deranged claims is harming the fabric of society, eroding trust in each other and in society itself, sowing discord, and widening the opportunity for an aspiring tyrant to seize the public's interest, declare a bunch of "undesirables" enemies of the state, and infringe upon individual rights while his supporters cheer him on. (And also cost some lives along the way.)

Just like censorship could potentially lead to a similar kind of outcome.

So, the danger of "censorship", in our current context, should be gauged against the danger of "non-censorship" in the same context.


I just don't believe that having powerful intermediaries squashing views works well for democracy.

It doesn't work so well to stop dangerous ideas, and it has a whole lot of risk of being used against unpopular ones.

Trump's ideas are doing damage to democracy, but attempting to suppress them in this way adds credence to many peoples' belief that there's some dangerous tech/liberal cabal squashing the true majority conservative opinion, too. And even attempts to use these in a bright-line, careful fashion have often squashed the wrong speech and silenced the wrong people; this is before any actual malicious use which is sure to come if the precedent strengthens.

In our rush to get rid of the most repugnant ideas, too, we tend to lose our ability to discuss nuance. To avoid COVID quackery, we can't have an open discussion of the merits of specific policies and interventions. In our rush to squash the dangerous idea of election fraud, we can't have the discussion about how to make our actual election system robust and beyond reproach.

> So, the danger of "censorship", in our current context, should be gauged against the danger of "non-censorship" in the same context.

Yes, which is why I acknowledged those risks in my first comment and said that I believed that these risks are potentially greater.

Think about how much important speech from the past few centuries has seemed dangerous and repugnant at the time it was uttered.


You have the potential of getting only curated information any time you only use one source for that information.


Here we have many social platforms effectively colluding and setting common standards for curation.

Concentrated power is scary. I'd agree it's mostly being used for good here...


I have news for you. Radio, TV, movies, etc have been censored to hell and back since they were invented. The roof hasn't collapsed.

If anything, they're less censored than they used to be.


I'll take the potentially greater risks over the immediate risks in this context.


It is funny how short people's memories are. YT was removing videos of people saying you should wear a mask, and the YT censorship gods said "no, that is misinformation, you cannot say that."


This is correct.

> The Australian white supremacist who killed 51 people at two mosques in Christchurch, New Zealand was radicalized by YouTube, according to a 792-page report on the March 2019 shooting.

https://www.theverge.com/2020/12/8/22162779/christchurch-sho...


I have a feeling he'd say that was ridiculous, based on this quote from his manifesto:

Q: Were you taught violence and extremism by video games, music, literature, cinema?

Yes, Spyro the dragon 3 taught me ethno-nationalism. Fortnite trained me to be a killer and to floss on the corpses of my enemies. No.

People are more intelligent than these criticisms give them credit for. Their beliefs come from opinions and thoughts, not just from above. See Ross Douthat's recent column, "Why Do So Many Americans Think the Election Was Stolen?"

"The potency of this belief has already scrambled some of the conventional explanations for conspiratorial beliefs, particularly the conceit that the key problem is misinformation spreading downward from partisan news outlets and social-media fraudsters to the easily deceived. As I watch the way certain fraud theories spread online, or watch conservatives abandon Fox News for Newsmax in search of validating narratives, it’s clear that this is about demand as much as supply. A strong belief spurs people to go out in search of evidence, a lot of so-called disinformation is collected and circulated sincerely rather than cynically, and the power of various authorities — Tucker Carlson’s show or Facebook’s algorithm — to change beliefs is relatively limited."

https://www.nytimes.com/2020/12/05/opinion/sunday/trump-elec...


I invite you to read up on the psychological effect known as "priming."

Top-down disinformation primes the minds of the less-scrupulous, who then seek to rationalize and harden their half-baked beliefs through their everyday experiences.


When you read up on priming you will discover that it is something people want to believe in, but does not replicate scientifically.

https://replicationindex.com/2017/02/02/reconstruction-of-a-...

People used to believe in brainwashing too. We like to think people who disagree with us are easily swayed idiots. It keeps our opinions safe and justifies censoring their opinions instead of engaging with them.


Please do not quote a spree killer as an authoritative source.


He is an unreliable narrator, for sure. But setting aside his claim about whether he was brainwashed, does he sound like the kind of person who just mindlessly clicked from Youtube video to Youtube video, accepting every claim uncritically? Does he sound like the kind of person who can't be trusted to think for themselves? Engineer is a common profession for terrorists. Regular, thoughtful people can radicalize themselves, and censoring the masses to stop that is a power grab in search of a problem.


Everyone has to be trusted to think for themselves. We can put barriers in place to harmful actions resulting from those thoughts but I would not want to live in a world so dystopian and authoritarian that we intentionally take away someone's ability to think.


I find this grossly offensive. Please, please, stop quoting and attempting to normalise a serial killer.


I don't really get this. People regularly quoted Osama bin Laden's statements about how he did 9/11 to force the US into overreacting. It may have "normalised" him to see that he had rational reasons for what he did, rather than being a religious maniac. But it didn't make it right, and it didn't lead to people flocking to his cause. If it had been censored, we might have had more people believing the harmful "religious war" frame, just from not knowing any better.

"All that we have to do is to send two mujahedeen to the furthest point east to raise a piece of cloth on which is written al Qaeda, in order to make generals race there to cause America to suffer human, economic and political losses without their achieving anything of note other than some benefits for their private corporations" -- Osama bin Laden


I don't get why you think it's insightful to repeatedly quote terrorists at all, spreading their preferred messages for them.

You are only confirming that terrorism allows someone to be powerfully heard.


He is not trying to normalize a serial killer. He is trying to show that his actions are not the result of watching Youtube.

I do not think there is any reason to be offended.


It's been pretty widely dissected and explained why he said and wrote all of that idiotic internet leetspeak in his "manifesto".

There is also a reason why many of these folks who finally decide to kill people also consume "alt-lite" content like Ben Shapiro / Jordan Peterson etc, it isn't a coincidence. Regulating that and keeping people from spreading racist propaganda, or content that just exists to undermine democracy is fine in my book.


I tried googling "why did christchurch use leetspeak" and didn't get results. I presume it was to get more views from incels and because he found it funny. Not because he was radicalized by the Navy Seal copypasta.

With Peterson and Shapiro I feel like you're trending toward a standard of "anything terrorists like to read, but I don't, should be censored". They're close to as popular as, say, Rachel Maddow, which means almost all of their readers are peaceful.



Thanks for the Vox link, it really explains what you meant.


So, this is one example of many, many users who haven't killed anyone.

If "one incident is too many", then marijuana must stay strictly illegal and violent video games must be banned, because both have nonzero body count IRL. Remember Harris and Klebold?


Yeah, I just don't know what to do anymore. It's clear that the idea of good information rising to the top of vigorous discourse among an informed populace just isn't working. We're tearing the country apart right now with lies.

Is this fixable at all? Do we just give up? Do we fight the lies directly by not spreading them? No good options as long as the lies are spreading.

I have a very hard time seeing Youtube or Twitter as the bad guys here. At least they're trying. Screaming about censorship as the civilization falls apart seems to really be missing the point.

If freedom of speech won't save us, what will?


If we take that position then how is that reconcilable with democracy?


Making some assumptions about what you mean by "democracy", that's actually a good question. We believe in freedom of expression because we don't want the government suppressing information for the purpose of manipulating us, but in the information age it has become obvious that one can manipulate people by spreading information just as easily as by suppressing it.

For this reason, I am forced to wonder if democracy can survive the information age at all.


Hypothetically, you can think that you need institutions to properly frame and prioritize issues in public discourse, and only then is democracy ideal. And at some scale it's true: if a bunch of us were suddenly dropped into a locked room and told to discuss and then vote for the best mayor of Mumbai with no resources, we'd fail.

Democracy isn't looking too healthy nowadays though, I agree.


Traditionally, it's been extremely reconcilable. News agencies had a responsibility to be more accurate than not. The government itself keeps a very hands-off approach to that information flow, and the people reward papers that succeed with subscriptions and eyeballs. This detente is why the media has traditionally been referred to as "The fourth estate."

Unclear what the solution is as that system breaks down because the subscription-free ad-fueled Internet dwarfs the traditional model.


Ultimately, people (the audience) will go to sources that are credible and abandon sources that are not, regardless of how the source pays their bills.


You can't; that is why the US is not a democracy, nor should it become one.

The US is a constitutional republic whose institutions were designed so that the ONLY democratic part of the system of governance was the US House of Representatives.

Unfortunately for those who value individual freedom and see democracy as mob rule (people like myself), there is a strong push for more and more democracy in the system, which is a net negative for individual liberty. Things like abolishing the electoral college, and even past changes like electing senators via popular vote, have been net negatives for liberty.


Americans largely have trouble distinguishing facts from opinions, sadly.

https://www.journalism.org/2018/06/18/distinguishing-between...

>The main portion of the study, which measured the public’s ability to distinguish between five factual statements and five opinion statements, found that a majority of Americans correctly identified at least three of the five statements in each set. But this result is only a little better than random guesses. Far fewer Americans got all five correct, and roughly a quarter got most or all wrong.
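For context on "only a little better than random guesses": someone flipping a coin for each of the five statements has exactly a 50% chance of labeling at least three of them correctly, so "a majority got at least three right" barely beats pure chance. A quick back-of-the-envelope check (my own arithmetic, not from the study):

```python
from math import comb

# Chance of labeling at least 3 of 5 statements correctly when each
# fact/opinion guess is a fair coin flip (p = 0.5 per statement).
p_at_least_3 = sum(comb(5, k) for k in range(3, 6)) / 2**5
print(p_at_least_3)  # 0.5
```

So a coin-flipper clears the "at least three of five" bar half the time, which is why only the "all five correct" figure says much about genuine discernment.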


That's a mighty broad brush you're painting with. "A frighteningly large percentage of people have trouble..." would be much more accurate. You make it sound like Europeans have never been hoodwinked by people with agendas. History is rife with examples of charismatic people duping followers.


Europeans are no better than Americans.

Source: am European, speak Czech, German and (worse, but usable) Polish. We are all human, with all the expected flaws.


It's pretty accurate. I would never want a jury trial because of how easily the common American can be manipulated. Reports now indicate that 3/4 of GOP members won't accept the outcome even though the closest state is nowhere near the 500 vote margin Gore lost by in 2000. That's how the modern propaganda machine leads the gullible.


Okay, but the survey only surveyed Americans, I can't use it to make conclusions about Europeans.


fact or opinion:

there are two biological sexes?

do you want a corporation determining that for you and banning you from seeing anything otherwise? what if their view of the facts is different?


It would be classified as a factual statement for the survey study even though it's an inaccurate statement (intersex people exist), as:

>Respondents were asked to determine if each was a factual statement (whether accurate or not) or an opinion statement (whether agreed with or not).

I never gave my opinion on the matter, but since you asked, no, I don't personally care if corporations ban me because I disagree with their TOS (which had inaccurate statements of fact in it), that's their right. I probably wouldn't be doing business with them in the first place, though it's also my right to petition them to change it if I wanted to. Example: When I was a kid Walmart only sold censored CDs so everyone knew not to buy CDs from Walmart.


Sex and gender aren't the same thing.


then go somewhere else for content


Says the same people who cry if someone makes a mean tweet or a comedian tells an off-color joke, and call for them to be deplatformed.

How about I argue for them not to censor, and you go to another platform if you want your safe space.


Remarkable to see this comment unironically on HN.


It’s a fact that a lot of people believe the misinformation. I don’t understand what’s so remarkable about pointing this out.


Who decides what information is


Why does the public get to decide what a private company hosts?


They do via proxy of government laws. The public vs megacorp is not a balanced relationship anyway and will eventually get regulated.


Actually, shareholders do get a say in what a publicly traded company hosts!


That's not the public. But I'm fine with owners of a company deciding what they would like to host.


It often does. You want it to, in many cases.


Because this publicly traded company (not private) invites the public to view and share content. Now the public is voicing concern over who they are censoring.

If Google doesn't like it they are free to shutdown the service.


The same people who decide that 2+2=4 and that the sky is blue.


If only the virgin mary were contrary. Science has lost its power. Someone will choose an interpretation that benefits themselves over the truth. It's normal. Too many tech people are coasting on the libertarian dream of the past.


So I was correct when I said you don't believe in science.


You are the person choosing your own interpretation.

I believe in science, what I don't believe is that people will listen to truth when it is presented. They are two different things. Science has lost its power to object to a higher cultural malaise.

If you understood what it meant for the virgin mary to be contrary, you would understand the words I wrote on the page and my meaning.

You were not correct, you didn't understand my post. Stop dreaming. There are real truths being expressed here.


Great, I agree on both those things and now I'm deciding that the election was frauded by Elvis.


Uncited fact?

Anyways, I honestly can't blame "them". Less than a year ago, the mainstream media was full of "information" like: masks don't work, it's racist to think that hugging a Chinese person could make you infected, and people questioning the origin of the virus were deemed conspiracy theorists. Now some of these are suddenly considered "misinformation" or "COVIDiots", and you have "conspiracy theory" on the front page of the Washington Post.

What's someone who's not 100% plugged in supposed to think?!


Let this be a lesson that a lot of people are susceptible to all sorts of "remarkable" ideas, including objectively dangerous ones like believing that Covid is a hoax.


I'm unironically inclined to agree with it, but I'd be interested in reading a more substantial refutation.


There's no refutation because there's no argument to refute. It's always some form of "There are bad people doing bad things and we need to stop them."

Notice how every thread here is about acting against alleged disinformation agents? It's not about any argument at all. It's a generic outgroup argument.

Should we expect people to just never question elections? It's completely bizarre.


Come on. That same group questioning elections 1) has spent 4 years decrying the questioning of an election 2) actively prevented measures being taken to secure elections and 3) hasn't come up with any proof of vote changes. We should expect people to come at things in good faith, which is clearly not the case in this discussion.


Why? It's true.


If you don't believe it's true, then explain to me how measles, something that was more or less a solved problem, suddenly had a huge resurgence, leading to a 50% increase in the number of deaths from 2016 to 2019?

While flat earthers are mostly harmless, other misinformation has real, tangible costs. People are literally dying due to anti-vaxx and anti-mask misinformation. Not blocking these videos is equivalent to having blood on your hands.


You do realize that the largest outbreak of measles in 2019 was not caused by anti-vax YouTube videos or other misinformation, right? It was directly linked to religious fundamentalism that barred a large group of people from getting vaccinated.

So all the YT censorship in the world will not stop that, unless you are going to advocate prohibition of religion as well, which I suspect will get even less support.

YT and online misinformation are a good scapegoat, much like Usenet was in the 1990s, for people who do not understand the real, actual problem.

You are not going to fix these problems with censorship.


> It was directly linked to religious fundamentalism that barred a large group of people from getting vaccinated

And it just happened to be perfectly in sync with the rise of anti-vaxx content on social media? You do realize these religious fundamentalists with fringe ideas also use Youtube, FB and other sites to share their anti-vaxx ideas and have them re-affirmed?


The issue is that people see it as a failure of YouTube, when it is in fact, a failure of our education system.


There's some role for education but Youtube driving tons of people to watch these kinds of videos is Youtube's mistake, not anyone else's.


Sure, but improvements to the education system won't come into effect for decades, and in the short term, real people are dying from anti-vaxx and anti-mask misinformation. I know 2020 has been wild, but let's not forget the huge surge in measles deaths, something that was mostly solved beforehand. Blocking certain content now literally saves people's lives.


For sure, but I think it's important not to lose sight of the fact that YouTube is merely dealing with the symptoms of a much broader societal problem. We can't truly fix the issue until we address the root cause. Everything else is whack-a-mole.


Again, why not both? This is a short term solution, while we work in parallel in improving the education system. Youtube itself can be a great source of education if you promote scientific videos over trashy conspiracy videos.


Correct, the exact same way "some people" think indenting with tab is 100% BS. Yet, at the same time, there are "other people" that will indent TAB + 2 spaces every new line.


> there are "other people" that will indent TAB + 2 spaces every new line.

I hope I never encounter these psychopaths!


> Yet, at the same time, there are other people that will watch it and 100% believe it

When has that ever not been the case? Fortunately there are natural barriers preventing most people with wild ideas from gaining enough support to be a detriment to society. Should I really care if someone else chooses to believe x, y, or z in the face of contradictory evidence? If they try and do something public with a wrong idea, their failure will be the teacher, not me.


> Fortunately there are natural barriers preventing most people with wild ideas from gaining enough support to be a detriment to society.

The internet has all but nullified these barriers. Just look at all the QAnon garbage. That would have gone nowhere 30 years ago, but today a sizable portion of this country believes it because there is no real difference in authority between one Facebook post, Youtube video, or Tweet and another.

>Should I really care if someone else chooses to believe x, y, or z?

You shouldn't, unless that view is harmful to society. I don't care if you think the COVID vaccine has a microchip in it. But if you delay our return to a normally functioning society because you refuse to get that vaccine, your stupidity is starting to infringe on my rights.


"That would have gone nowhere 30 years ago"

Oh, the entire history of religious and pseudoreligious movements begs to differ.

A relatively recent, but decidedly pre-Internet example with a high death toll: https://en.wikipedia.org/wiki/Boxer_Rebellion


I don't understand the comparison you are making. How are those equivalent to QAnon?


"The Boxers, armed with rifles and swords, claimed supernatural invulnerability towards blows of cannon, rifle shots, and knife attacks."

The combination of the absurdity of the movement's core claims and the breadth of the audience it gained seems fairly similar to me.


So really you are just lumping in QAnon with all religion. I'm not sure who would be more upset with that QAnon folks or religious non-QAnon folks.


I know a lot of religious people, but none of them think themselves exempt from basic laws of nature (e.g. immune to artillery fire). This kind of belief goes way beyond usual religion.


Does it really matter if some people believe in QAnon?

If enough people decide that the COVID vaccine is important enough for everyone to receive (regardless of their personal beliefs), then that will be codified into law. No need for censorship to try and manipulate public perception.


>Does it really matter if some people believe in QAnon?

When it starts to get dangerous, yes. People are inspired toward violence when they believe that other people are killing children to drink their blood. For those unaware, that is at the heart of QAnon beliefs.

>If enough people decide that the COVID vaccine is important enough for everyone to receive (regardless of their personal beliefs), then that will be codified into law. No need for censorship to try and manipulate public perception.

Do you realize this is an exact analogy to what Youtube is doing here? They tried to let the people decide. We ended up with a result that was bad for society. So they instead tried to codify the "right" choice into the laws of their platform.


> When it starts to get dangerous, yes. People are inspired toward violence when they believe...

I see this argument a lot, but it fails to address the clear distinction between beliefs and actions. If people are actually violent, we have clear laws to deal with those actions.

If the argument is that certain beliefs shouldn't be allowed because they could be construed as "inspiring violence", then I'd love to hear about how tolerant you are towards Islam's idea of jihad or countless others who believe violence is justified in circumstances that you disagree with.


>I see this argument a lot, but it fails to address the clear distinction between beliefs and actions. If people are actually violent, we have clear laws to deal with those actions.

Why outlaw threats and fighting words then? They aren't violence.

Some of us want to stop easily predictable violence before it gets to the point of actual violence.

>If the argument is that certain beliefs shouldn't be allowed because they could be construed as "inspiring violence", then I'd love to hear about how tolerant you are towards Islam's idea of jihad or countless others who believe violence is justified in circumstances that you disagree with.

It is curious that you use Islam as your example here. Various religions preach violence. The Old Testament establishes the death penalty for people who break the Sabbath. What matters is the actual practice and how likely they are to inspire violence. The QAnon conspiracies are more dangerous in this regard than thousand plus year old religions.


> Some of us want to stop easily predictable violence before it gets to the point of actual violence.

Are you being serious? I honestly can't tell. This has played out in countless movies and books, and the result is never good. It has also played out in real life, and the result is even worse.

> What matters is the actual practice...

Bingo! Sounds like maybe you're beginning to see the error in trying to police thoughtcrime. It's the actions that matter, not the beliefs alone.


Yes, I am serious. The problem is there is no clear delineated line between "thoughtcrime" and plain old crime prevention. Where is the line for you when a threat of violence is equivalent to violence? When does a thought become a plan? Threats are just words, so I imagine I can threaten to kill you. What about if those threats are through deliberate and premeditated actions like mailing you a death threat? Is it any different if I tell other people to attack you? Those are just words, right? Is it different if I pay them? Can I brandish a knife if I am 20 feet away from you? I don't pose an immediate threat in that instance. Can I pull a gun on you without any fear of reprisal? That isn't a direct act of violence either yet. Do I need to pull the trigger before you respond?


> Where is the line for you when a threat of violence is equivalent to violence?

The line is "imminent lawless action" [1], with case law clarifying that "advocacy of illegal action at some indefinite future time" is not considered "imminent" (and therefore protected free speech). It's a pretty clear line, and one that most of the censored material being discussed objectively does not cross.

Google, Twitter, Facebook, etc. are within their rights as private companies to enforce content rules as they wish, but these recent censorship actions have strong implications as to their protections under Section 230, and are alarming insofar as they represent a trend that crosses the line of free speech protections normally recognized by the government and content platforms.

[1] https://en.wikipedia.org/wiki/Imminent_lawless_action


> Google, Twitter, Facebook, etc. are within their rights as private companies to enforce content rules as they wish, but these recent censorship actions have strong implications as to their protections under Section 230,

No, they don't; Section 230 exists to promote censorship, not to bar it.

> and are alarming insofar as they represent a trend that crosses the line of free speech protections normally recognized by the government and content platforms.

They aren't the government, and there has never been a set of free speech protections “normally recognized by content platforms”, especially since 230 was adopted specifically to remove legal disincentives to active moderation.


I never stated or implied that Section 230 barred censorship. It does, however, protect service providers from the liability that a publisher would take on for publishing content that otherwise should be censored. As these companies voluntarily embrace more censorship, they are calling into question their status as "service providers" since they are effectively operating as publishers; i.e., not protected under 230.

> there has never been a set of free speech protections “normally recognized by content platforms”

I agree; legally there hasn't been anything like that, but in the past, those platforms were demonstrably more reluctant to censor political content (e.g., views that didn't align with the company's political views) because they knew that more active involvement might jeopardize their classification as neutral platforms (along with their protections under 230 as described above). In effect, they stayed out of politics not by law, but out of fear of being forced to censor all content if they became "publishers". Now that machine learning has made the censoring part easier, they're less concerned about that happening. However, at the moment they want to have their cake and eat it too – controlling content as they wish while also enjoying the protections of 230.


>I never stated or implied that Section 230 barred censorship. It does, however, protect service providers from the liability that a publisher would take on for publishing content that otherwise should be censored. As these companies voluntarily embrace more censorship, they are calling into question their status as "service providers" since they are effectively operating as publishers; i.e., not protected under 230.

No. That's not what section 230 says.

There is no distinction in section 230 between "platform" and "publisher."

This has been noted and detailed repeatedly in this discussion.

Please see this[0] which will explain, in explicit detail, why you are wrong about section 230.

[0] https://www.techdirt.com/articles/20200531/23325444617/hello...


The objections you're raising (and repeated on sites like the one you posted) are a matter of interpretation of the law, and people on both sides of the political spectrum are now realizing that the law needs clarification. It is not a settled matter by any means, and our lawmakers are still debating the issue.

When a company like Twitter censors the president of the United States, while also embedding their own editorial comments over the content he posted, those actions could easily be seen as falling outside 230 (even if courts haven't decided that in the past). No one denies the fact that the internet today is very different from when 230 was drafted, and from a moral standpoint, we absolutely need more clarification codified into the law.

If your town's public square were seized by one of the richest companies in the world, and they began exerting political control over who was allowed to speak in the town square, it would certainly raise some red flags and likely encourage legal changes (even if, for a time, it was perfectly legal).

The 230 debate isn't even the core of my argument (if you read my previous comments). The point is, whether through legal means or simply by way of market pressure, we should not be allowing these companies to control the political discussion in such heavy-handed ways. Diversity of opinion is diversity, and we need more of it - not less (it's ironic how some push so hard for diversity, yet seem to think we can't handle it when it comes to speech).

I'm sure it's hard to imagine, but if they started silencing liberal views, there's no doubt there would be an uproar among democrats. Apart from any legal changes that may come, we vote with our clicks and platform usage, and there's a growing number of people who are tired of these political censorship games, so they're leaving for other platforms with less political bias. As censorship increases, that will likely accelerate.

This is the last of my comments in this thread.


> The objections you're raising (and repeated on sites like the one you posted) are a matter of interpretation of the law,

No, they are a matter of clear and unambiguous historical fact.

> and people on both sides of the political spectrum are now realizing that the law needs clarification.

No, subsets within each major party are adopting preferences for regulation with purposes opposed to those for which CDA Section 230 was originally adopted. We could debate the merits of that, but it's simply factually wrong to claim that actions of exactly the type Section 230 was adopted to remove existing barriers to (as both its plain text and its legislative history clearly show) are somehow in conflict with Section 230's protections or purpose.


>I'm sure it's hard to imagine, but if they started silencing liberal views, there's no doubt there would be an uproar among democrats. Apart from any legal changes that may come, we vote with our clicks and platform usage, and there's a growing number of people who are tired of these political censorship games, so they're leaving for other platforms with less political bias. As censorship increases, that will likely accelerate.

Please remember that Section 230 doesn't just apply to the big players. It applies to any internet resource that allows third-party content. Including any site that you may host/own.

I suggest you actually read the sharp end of Section 230, subsection (c)(1), under which pretty much all litigation around it has been resolved. I present it here for your review[4]:

"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

That's it. Full stop.

I don't care about platforms like Twitter, Facebook, YouTube, etc. I don't use them (well, okay, sometimes I listen to songs on YT and once in a while I'll dig up an amusing or enjoyable clip from a movie or tv show) because I find their business models personally offensive.

But you attack Section 230 at your own (and everyone else's) peril. And there are a bunch of reasons for this.

The impetus for Section 230 came out of the court decision in Stratton Oakmont, Inc. v. Prodigy[0], where the court ruled that if Prodigy did any moderation at all, they were then liable to be sued for third-party content they hosted.

But that didn't just apply to Prodigy. It applied to any connected device that hosted any content, whether that content originated from the owner or a third-party.

Try to imagine what the world would look like under such a legal regime:

A site like HackerNews, if they (as they do now) allowed the upvote/downvote/flag moderation system, would be liable to be sued for just about any post that someone didn't like, or for a submission that wasn't sufficiently up or down voted.

In such an environment, HackerNews (and every single other website, mailing list, Usenet group, Mastodon instance, Github repo, etc., etc., etc.) would be liable to be sued for just about anything that anyone posted if they did any form of moderation (like blocking spam, porn or any content unrelated to the purpose of the site).

If there was no Section 230, you could be sued if you hosted a mirror of the lkml[1] list and someone didn't like a snarky response from Linus about a rejected patch merge request.

You could also be sued just for forwarding an email that contained statements that someone didn't like. In fact, Section 230 protections stopped just such a lawsuit[2] in 2006.

Companies with deep pockets like FB, Twitter, YT, Reddit, etc., have the resources to fight most such lawsuits, but what about sites like HackerNews?

Do you really think we'd be having this pleasant conversation right now if YC could be sued for any post or submission on this site?

YC would run for the hills, because they don't want that sort of liability. If they moderated anything (and that includes user up/downvotes/flags), they could be sued for any content hosted here. The only alternatives they would have would be to shut down or not make or use any moderation tools at all.

Which would quickly turn this site into a cesspit of spam, porn, irrelevant postings and other garbage (essentially, 4chan/8chan/8kun).

Do you have a github repo? If there were no Section 230, and you blocked even one PR that contained spam, porn, discussions about placentas and/or other irrelevant content, you are now liable to be sued for any statements made by others in that repo.

As such, the result of removing Section 230 protections would create two kinds of Internet resources:

1. Sites which do not allow any third-party content;

2. Sites which allow all third-party content without any limit (think gay, midget furry porn plastered all over a knitting website)

And so, no. I wouldn't mind at all if a particular site moderated in favor of a political (or any other) view with which I disagree. If I don't like it, I'll go elsewhere.

Because of all this, I say that Section 230 is essential to free speech, not a hindrance to it.

Feel free to disagree, but you'll be wrong[3].

[0] https://en.wikipedia.org/wiki/Stratton_Oakmont,_Inc._v._Prod....

[1] https://lkml.org/

[2] https://en.wikipedia.org/wiki/Barrett_v._Rosenthal

[3] https://www.techdirt.com/articles/20200531/23325444617/hello...

[4] https://www.law.cornell.edu/uscode/text/47/230

Edit: Fixed spacing issues.


> It applies to any internet resource that allows third-party content.

Also to users, on sites where user action can affect the visibility of other content. Were 230 not in place, users making use of such features (not just site operators) could face civil liability.


I didn't read further back than the parent comment, so apologies if this is out of context.

But I don't think the goal should be to attack 230; rather, the goal should be to get platforms like Twitter, Facebook, and YouTube to stop acting like publishers. They are simply abusing 230 privileges while still acting like a publisher with editorial muscle.


>But I don't think the goal should be to attack 230; rather, the goal should be to get platforms like Twitter, Facebook, and YouTube to stop acting like publishers. They are simply abusing 230 privileges while still acting like a publisher with editorial muscle.

The term "publisher" has no legal meaning in the context of section 230.

I (and at least a half-dozen other folks) have explained this repeatedly in this discussion.

I won't do so again, but in the interest of expanding knowledge, I'll point you over here[0] so you can understand the deal as it stands.

If you (or anyone else) would like to see changes to Section 230, that's perfectly fine with me. I suggest you write your congressperson/senators and demand the changes for which you advocate.

That said, what you are describing is not the law as it is now. Whether you (or I for that matter) agree or disagree, that's irrelevant to current jurisprudence.

But we have ways to change our laws and we should take advantage of them where we feel it appropriate.

[0] https://www.techdirt.com/articles/20200531/23325444617/hello...


never mind. i should have never commented. thanks.


> As these companies voluntarily embrace more censorship, they are calling into question their status as "service providers" since they are effectively operating as publishers; i.e., not protected under 230.

230 was expressly adopted to let service providers (and users!) of interactive computer services take actions that would otherwise make them publishers without the liability that goes with that, with regard to content that is created by someone else. That's its whole purpose. Key operative text: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” and ”No provider or user of an interactive computer service shall be held liable on account of [...] any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected”.

https://www.law.cornell.edu/uscode/text/47/230#


I wasn't asking you a legal question. We all know that QAnon isn't literally illegal. I was asking you a series of moral questions, many of which can't be answered with "imminent lawless action". For example, is it considered a "thoughtcrime" if the danger isn't imminent? If someone is working on detailed plans to kill the president, but the plan would take multiple years, should this person be stopped or should they be allowed to continue their plans until the danger is imminent?


We've got to this place because the people with wrong ideas were succeeding.


And if the wrong idea is censorship itself?


> We've got to this place because the people with wrong ideas were succeeding.

Hate to break it to you, but if those ideas are succeeding, maybe they weren't wrong after all...


The marketplace of ideas can remain irrational longer than free society can remain solvent.


History is full of people with very clearly wrong ideas who were succeeding at the time. Something something, Godwin's law.


History is more full of examples where censorship has led to tyranny and the downfall of civilizations.


Government censorship, yes. Fortunately, that is not happening here.


Godwin's law, but Hitler succeeded for a very long time in the decade before WW2. This isn't a useful metric for determining whether ideas are right or wrong.


Until their actions affect you. Like shooting up a pizza place, or voting in a region that also contains you.


All of recorded human history is evidence that this is false. Consider the Holocaust, the Salem witch trials, Lysenkoism, the Great Leap Forward, Aztec human sacrifice, and many others. Even just within the relatively modern United States, look at Prohibition, the War on Drugs, and Jim Crow laws.

I am against censorship, but not because I believe that bad ideas can't take hold and cause enormous damage. I just believe that the benefits of living in a society with free speech outweigh the costs.



