Removal of speech is not a consequence of speech -- it's preventing speech in the first place. That's what happens when Facebook blocks or deletes "misinformation" -- they are removing the speech itself. That's not the same thing as "consequences" for speech.
Look at what HN mods do -- they ban trolls, but they don't delete what the trolls posted. It's there for everyone to see -- in fact, if you look at "dead" comments you can see flagged stuff too. In terms of free speech, that's very different from deleting the comments entirely, which is what people seem to want Facebook to do.
And for the sake of argument, even if we accept that "consequences" ought to include the right to free speech being taken away from bad actors -- who can be trusted to decide who ought to be punished? Again, surely not Facebook. Surely not the government either -- the winners of every election would punish their enemies by taking away their rights. So even if we could tell, 100% reliably, who were trolls and who were not, we still should not give any corporation or government the power to take away the right of free speech.
What should the consequences be? What are the consequences when you say something stupid to your family or friends? What should the consequences be when you knowingly lie to slander somebody?
> And for the sake of argument, even if we accept that "consequences" ought to include the right to free speech being taken away from bad actors
This already exists in law. Your right to free speech can be restricted in certain situations, just as your right to move freely can be taken away (for example, under a restraining order).
The usual consequences when you get up in front of a group and act like a jackass: social shame and ostracism. That's hard to replicate on internet platforms. Even when people aren't anonymous, they have plenty of ways to "hide", and it's easy to unfollow and block anyone who criticizes you for spreading misinformation.
So I don't know. Removal and deplatforming, IMO, are not the answer. You don't fix extremism through censorship; that just makes it worse and drives it underground.