> So no site should be allowed to have moderation policies that include things like "no trolling," "no flaming," "no swearing," or anything else like that for things that don't cross into "illegal" abusive speech?
Any site can have any rules they want. But they shouldn't expect to be able to manipulate and control discussion and face no backlash from users.
> Without becoming a target for filling up with illegal shit to overwhelm the moderators and cause legal trouble for the owners?
'Trolling', 'flaming' and 'swearing' are not illegal, and honestly, most complaints about 'trolling' I see are just people who disagree with what they're reading.
> But they shouldn't expect to be able to manipulate and control discussion and face no backlash from users.
You're saying they should face backlash from government regulators, not from users.
If there's a big user backlash, those users are already free to move to whatever site they want, or make their own.
---
Re: your comment about trolling, etc., not being illegal: yes, exactly. So if a site wanted to enforce some sort of "family friendly" policy, you could simply bomb them with truly illegal shit instead and get them in trouble with the regulators for not being able to keep up, as they would be liable for that, since they aren't an "anything legal goes" platform.
> You're saying they should face backlash from government regulators, not from users
I never said that. But it's probably a good idea to have some sort of oversight for a massive platform that censors and manipulates discussion of millions of people. You may think their policies are totally justified and correct (they aren't), but the specific opinions they choose to promote may change to something you are against.
> If there's a big user backlash, those users are already free to move to whatever site they want, or make their own.
Yes, and they are also free to shit on the website and point out its flaws.
> So if a site wanted to enforce some sort of "family friendly" policy, you could simply bomb them with truly illegal shit instead and get them in trouble with the regulators for not being able to keep up, as they would be liable for that, since they aren't an "anything legal goes" platform.
Trolling is not illegal; it's not even a specific thing. It's a very ambiguous term used to describe 'comments that make me feel bad'. Trolling and posting illegal shit are completely different. Sure, ban illegal stuff - you kind of have to - but don't use 'trolling' as an excuse to ban stuff you disagree with.
Separate reply for the other side of this thread. I honestly don't know where you're going with your comments re: trolling.
Do you honestly believe that a discussion forum site that wanted a family friendly policy shouldn't be able to have any liability protections? That if someone started flooding them with child porn, say, at a rate they couldn't keep up with, they should have to shut down, because they'd otherwise be liable for all that content due to their having a content moderation policy that forbids certain types of legal speech (say, swearing)?
Dude, I honestly don't think you understand what 'trolling' means. You already agreed that trolling is not illegal, why do you keep trying to use examples of illegal content to demonstrate why 'trolling' should be banned?
And what does child pornography have to do with swearing?
> Do you honestly believe that a discussion forum site that wanted a family friendly policy shouldn't be able to have any liability protections?
You keep bringing up trolling as a term in general; I tried to make it specific.
The pitch being made in this subthread was "I don't think Reddit should be able to censor lawful content and also claim protections under Section 230 of the Communications Decency Act."
So you lose liability protection if you censor lawful content.
So if you want to ban swearing, you become liable for your users' content. Which makes you a WIDE OPEN target for bad actors who could post whatever illegal content they want faster than you could moderate it.
I'm trying to illustrate why I think `cwhiz took an absurd position, and am not using the term "trolling" in general now. I've given a specific example of "legal content some communities might want to prohibit" as well as "illegal content that bad actors could use to get the owner of that site in trouble in this proposed world."
Okay, to make it clear - I don't think sites should lose 230 protections for having a moderation policy. I do think that the excessive manipulation and censorship of dissenting opinions on sites such as reddit is a bad thing, however.
> But it's probably a good idea to have some sort of oversight for a massive platform that censors and manipulates discussion of millions of people. You may think their policies are totally justified and correct (they aren't), but the specific opinions they choose to promote may change to something you are against.
Think through the mechanics of how this oversight would work.
Someone will have to decide "is this site sufficiently neutral in how they moderate UGC?" So let's spin up a government department for that - maybe call it part of the FCC, maybe make it a new one, who knows.
Now let years, maybe even decades, go by. Now let's say some site you like gets essentially taken out - since massive sites aren't going to be able to do manual moderation of everything - by a regulator who leans to the opposite side of the political spectrum from you.
Now you don't have much recourse, since you aren't free to go off and try again - the regulator would just get that site too.
I have a hard time seeing how this is better.
And this is a weird stance for conservatives to be preaching, since the previous similar thing, the fairness doctrine, was no favorite thing of theirs.
This is one of those things where you can't say "only the government can solve this problem" unless the only speech you want to see protected is fairly-mainstream government-endorsed speech.
> Think through the mechanics of how this oversight would work.
There are many ways this can work, and it's not necessarily simple. I don't have a specific proposal.
> Now let years, maybe even decades, go by. Now let's say some site you like gets essentially taken out - since massive sites aren't going to be able to do manual moderation of everything - by a regulator who leans to the opposite side of the political spectrum from you.
1) Manual moderation of massive sites is already going away for the most part.
2) Sites like reddit have swarms of moderators and the ability to report content, and legal notices to take down specific content can also be sent.
3) We have laws about 'illegal content', and these websites are already complying with them on a massive scale.
4) My suggestion was an oversight of excessive moderation and manipulation of opinion, not what your example suggests, which is the opposite. You literally don't have to moderate anything to not be charged with 'excessive censorship' or whatever.
> Now you don't have much recourse, since you aren't free to go off and try again - the regulator would just get that site too.
We have a justice system for a reason.
> I have a hard time seeing how this is better.
We can all come up with 100 different imaginary scenarios where some very generic suggestion may fail. I don't see the point. Do you think there should be no laws about content? No libel laws? No copyright enforcement? Where do you draw the line?
> And this is a weird stance for conservatives to be preaching, since the previous similar thing, the fairness doctrine, was no favorite thing of theirs.
I'm not a conservative, but what I find more jarring is the people who want sites like facebook forced to be censored and moderated politically, while reddit can do whatever it pleases as long as it aligns with their political ideology.
> This is one of those things where you can't say "only the government can solve this problem" unless the only speech you want to see protected is fairly-mainstream government-endorsed speech.
Again, I did not suggest that there should be laws about what content is allowed. I actually think we already have too many such laws - I would personally do away with DMCA and other such bullshit. What we should have is a check on mass-scale censorship and manipulation of opinion by corporations.
> There are many ways this can work, and it's not necessarily simple. I don't have a specific proposal.
I was talking about the proposal from `cwhiz, in which the government would revoke protections under existing law. I think there are holes in that suggestion that are miles wide, and they extend to pretty much any other proposal that depends on the government deciding if a site is over- or under-moderated.
Without a specific proposal to talk about, or specific opinions on "should a site have to relinquish editorial control or rule-setting ability to enjoy liability protections from stuff done by users" (I say "no"), I don't see much more interesting stuff to debate. Those are real, active questions and proposals being made; other hypotheticals could go on forever but are less relevant now.
But if you aren't interested in those specific points, I don't know why you're participating in a thread that was specifically about the questions around that current law.
So, you're arguing with me about something I never even said, while ignoring my point about excessive censorship and manipulation of public opinion? 'Revoke 230' or 'keep 230' are not the only possibilities in this universe.
That's actually not the point of Section 230. The point is that if you remove Section 230 and then you accuse someone of trolling, flaming, or swearing, that could be considered libelous.
If you would like to learn more, and see exactly what happened when Section 230 was not in effect, check out this case:
Stratton Oakmont, Inc. and Daniel Porush,
v.
Prodigy Services Company, "John Doe", and "Mary Doe".
Stratton Oakmont (yep, the one founded by Jordan Belfort, the Wolf of Wall Street) won that case against Prodigy and, yes, John Doe and Mary Doe. You, the user, are John Doe or Mary Doe. Section 230 protects you.
What's not the point of Section 230? I literally didn't say anything about Section 230 or suggest that sites that moderate should have those protections revoked.
You are right, my apologies. You were responding to another person who was having a discussion about Section 230. I thought you were adding on to their thoughts about Section 230. If your point is entirely that users can backlash against a platform, then your point is correct.
I'll also add that the excessive censorship and manipulation of opinion on massive sites such as reddit, facebook, youtube, etc. is somewhat disturbing. People tend to think that only governments can censor excessively, but the fact is most public discussion in the West now happens on a few massive private platforms, and these platforms have a lot of power over public opinion.