> I'm almost a free speech absolutist, but I also really hate all the crazy, conspiratorial thinking that is so common these days. It puts me at odds with myself. I can't say I'm sad when crazy and dangerous ideas are removed from the internet. However, it's easy to see how this could be abused. A system where only the "right" ideas are allowed depends on the people in charge having the "right" ideas.
On the other hand, I'm not a free speech absolutist (obviously) because I don't trust other people to behave reasonably. But if everyone were perfectly rational, I probably would be; and maybe you and I are rational enough that we can handle "dangerous" ideas without succumbing to emotional manipulation (but then again, maybe not). However, even if we can watch a purposefully misinformative video, evaluate the merits of its claims, and dismiss them, do you trust everyone else to do the same?
That isn't to say you or I should be the arbiter of truth, but there should be some bare minimum standard. In my mind, there are two pillars for public discourse:
1. Facts, upon which arguments are based.
2. Deductive/inductive arguments, which appeal to values.
Obviously different people will have different values, so they will evaluate the weight of the arguments differently; however, facts are facts -- those should be the common ground. When someone denies reality by positing conspiracies without meaningful evidence, public discourse is impossible.
> What if instead, it's the company which is telling the truth, but the government which is malicious, and wants to censor them?
If the government is malicious, the correct response is voting. Unlimited free speech won't help: a malicious government could censor free speech; or a malicious government could promote un-truths with as much weight as truths, transforming "matters of fact" into "matters of opinion".
(As an aside, I'm nearly a democracy absolutist despite my distaste for free speech absolutism. While that might seem contradictory, democracy thrives on public discourse, and unlimited free speech can hamper that: think Citizens United, where money can give someone an outsized influence; or conspiracies, which deny facts.)
>On the other hand, I'm not a free speech absolutist (obviously) because I don't trust other people to behave reasonably.
I don't either, but I still believe free speech is important, even if that means it will certainly be abused and cause some damage on a long enough timeline.
As a personal anecdote, I used to spend time on 4chan (i.e., I only finally blocked it within the last few months). I never picked up any of the crazier conspiracy theories that people might associate with the site. But I was affected by reading all the extremist content. My intuitive sense of the likelihood of country-wide instability, as well as of how bad that instability could be, was thoroughly skewed. Having finally quit, I feel a bit silly now. I live in a very peaceful, suburban neighborhood, and it's the safest town in my whole state. I generally don't consider myself susceptible to misinformation, but it's clear to me now that anyone can be affected, and everyone has a different blind spot for it.
Where I DO believe in censorship is in the home. That is, I believe it's fine for me to decide what media and ideas I ingest, since my abstinence doesn't prevent anyone else from consuming them.
All of that said, I completely agree with you when it comes to facts and common ground. However, I believe that technology has fundamentally changed what it means to release and censor information. I haven't figured out a good way to articulate this yet, so I'll just say that pamphlets in the 1800s are obviously very different from YouTube or social media now. I don't believe these are the same issues all over again, and I'm not sure what the right solutions are.
>If the government is malicious, the correct response is voting.
No doubt. The recent brush with populism has made me a bit nervous in this regard. It can be debated whether the founders really produced enough bulwarks against populism; either way, I certainly agree with the intent of their efforts.