Perhaps I wasn't clear about one important point: while safe harbour schemes are one example of corporations like Google pushing back the frontiers of what is legal, I did not mean to imply that all such moves rely on the safe harbour principle. It is merely one example. Something like the proposed Google Books settlement would be another. A lack of effective privacy laws is a third.
As for the difficulty of running things without safe harbour provisions, people are quick to assume that there are no or only limited alternatives, and that without the current scheme we would somehow "kill the Internet", but I challenge that assumption, based on the argument that follows.
Firstly, if there's one thing the Internet has managed to do successfully for a long time, it is evolving creative solutions to large scale problems. The Web started as a small network of pages, with only a few carefully chosen links between them. Then we evolved web rings to help people find related content. After that, ever-improving search engines became the dominant way to find material. Today we have link aggregators, status updates/tweets, etc. There is no reason to believe that removing one form of expression would make things worse; on the contrary, if history is anything to go by, it might well catalyse the development of a better replacement.
On which note, I think we are going to move towards "Web 3.0" where user-generated content is balanced with editorial control by real people anyway. Web 2.0 was an interesting experiment, but the signal-to-noise ratio is terrible, the volume of information is overwhelming, and the simple, user-supported voting systems of today are widely abused and not really up to the job. Even the poster child, Wikipedia, now has policies that prevent open contributions to certain articles, internal politics among the volunteer editors about what should be allowed, etc. I think the kind of purely automated content hosting sites with unlimited user generated content are already dead, they just don't know it yet.
Next is a key point that I imagine some here may disagree with: I have never believed in absolute freedom of speech (of the kind where you can say whatever you like with no responsibility for any consequences). Words are very powerful things, and in the Internet era that power is magnified many times over. I believe that people should be held responsible for something they say that is unreasonably damaging to others, whether it be betraying a confidence, starting a malicious rumour, giving misleading "expert" advice in a field such as health or finance... This is just common decency and courtesy, and indeed almost nowhere has really had freedom of speech in law for many years, because of issues like defamation, national security, court-ordered anonymity for victims of certain crimes, etc.
These things have been part of the law in many jurisdictions since long before the Internet. The difference is that on the Internet, there is no such thing as an off-hand comment quickly forgotten. Society needs to adapt to this new reality, appreciating the benefits of giving everyone the power to speak to millions of others, but imposing suitably harsh penalties for those who use that power irresponsibly.
I just don't see how that is compatible with broad measures like absolute freedom of speech or complete safe harbour rules for those republishing material to potentially huge numbers of people. Those are merely convenient blanket laws designed for ease of implementation, not because they reflect the ethical thing to do.
So I'm not convinced that a law that imposed responsibility and as a side-effect made it difficult to run unlimited user-generated content sites would cause the loss of something truly valuable. If you want to write a dozen spam comments on a blog about how hot Hayden looked on Heroes last night, you are welcome to do it on your own personal web site, where your deeply-considered and original thoughts will no doubt receive the attention they deserve. And if you say something unfair in your next blog post that destroys someone's career, you can look forward to a personal letter telling you when to turn up in court and defend that action or be held accountable.
Meanwhile, I don't see the kinds of rules I'm talking about halting the development of the Web, because I think we're going to move toward more editorial control on the big, popular sites for other reasons anyway. And even if you do believe in absolute freedom of speech, you can still publish what you want on your personal web site; you would just have to take personal responsibility for it, instead of hiding semi-anonymously behind some giant organisation that will republish your material but shield you from being identified and thus held responsible for the consequences. I, for one, don't have a problem with that.
Unless you have a peering relationship with a backbone provider, someone is responsible in a safe-harbour sense for any content you put on the Internet. Period. That's what the parent post meant in saying that eliminating safe harbour would favour the big players over the small, and that the Internet would cease to exist as we know it.
This fact is indifferent to whether you use web-rings, NNTP, your own "personal website," or YouTube to post content.
Sure, but there is a difference between a pure communications medium (providing a unique point-to-point connection between identified hosts that temporarily transmits arbitrary data) and a content hosting service (which stores persistent data from one party and republishes it to arbitrary others). Treating an ISP analogously to a mail or telephone service provider is a reasonable comparison. Assuming that a hosting site like YouTube should be treated in the same way by default is a stretch, IMHO.
How about companies like RackSpace, which "stores persistent data from one party and republishes it to arbitrary others" - are they safe?
How about your local web hosting company, which "stores persistent data from one party and republishes it to arbitrary others"?
How about your ISP, running a caching proxy such as Squid, which "stores persistent data from one party and republishes it to arbitrary others"?
There are a lot of independent entities out there storing, publishing, and forwarding your content. Most make fractions of a cent on each "piece of data" - photo, video, blog, whatever that flows through their system. They cannot commercially afford to screen it all, or even sample it.
Can you read every single article on the new page of Hacker News? 24/7? How about on Digg?
Throwing up a few random examples doesn't further the debate very effectively. There are obvious differences between, for example, anonymous links in the infrastructure of the Internet and an identified end product hosting site. There are obvious differences between a local web hosting company, which has a specific commercial arrangement with an identified individual, and a generic hosting service that allows arbitrary, effectively anonymous individuals to post arbitrary material.
I'm not saying there should be no provisions in law to support the effective running of the Internet. I've never said that. I'm just saying that companies who want to establish a certain business model on the Internet shouldn't get a free pass just because it is difficult to run a business with that model while still complying with the same laws as everyone else. If that means some companies cannot continue, so be it: as I said before, I don't think anything of significant value will ultimately be lost, and I would rather that than effectively legislate certain businesses above the law just because they can't work out how to do things legally otherwise.
Your final comments are a straw man. Hacker News and Digg aren't republishing those articles, they're just linking to other sites that do.
Then there are the comments. It wouldn't be trivial to just link to them instead of actually hosting them.
Also, I am not certain it would be a good thing to always require that someone be accountable for content. If I want to express, say, a dissenting political opinion, I may want to be anonymous, and I may not want to pass the burden of accountability to someone else.
> If I want to express, say, a dissenting political opinion, I may want to be anonymous, and I may not want to pass the burden of accountability to someone else.
I am certainly not saying that no protected speech should exist. Indeed, I am all in favour of a law that protects certain classes of speech, with political views probably the most important class.
I just think that such laws should be crafted carefully, striking a balance between freedom of expression and protecting people from the harm when others abuse that freedom, and that once such laws have been made, they should apply on the Internet as much as anywhere else.
There are complications with jurisdiction in the on-line world, but there is no reason that most of the international community can't reach a consensus on these issues, just as they have on many others before.
I have always struggled with the idea of anonymity as a vehicle for free speech, for three reasons. Firstly, most people who think they are anonymous on-line really aren't, if someone tries hard enough to identify them, so it is often an illusion. Secondly, actively protecting anonymity automatically removes any responsibility from the speaker, whether or not what they are saying is within a protected area, creating a huge loophole in the laws. Finally, while anonymity may have a perceived value in protecting those opposing an undemocratic government, we don't have that situation anywhere in the West (a regime where political dissidents are routinely threatened or "disappeared"), and if we ever do reach that point, the correct response will be one of the three boxes that come after "soap".
Wikileaks is, as far as I am concerned, the textbook example of a site that should not exist.
Firstly, if you need Wikileaks in the first place, you have bigger problems.
Secondly, Wikileaks actively tries to place itself above the law. No-one should be above the law.
Thirdly, there is little that has been revealed via Wikileaks that could not have been revealed in the traditional way via a free press. Wikileaks may make things marginally easier, but if you're in the business of leaking private stuff only if it's easy, maybe you should reconsider your world view.
Finally, before anyone comes along and tells me how much good Wikileaks does, consider this: they also released the private membership list of an unpopular political party, causing very serious consequences for many members of that party. Whether or not you agree with their politics, that sort of action is way over the line. What about the anonymity of those party members?
Something like Wikileaks has advantages and disadvantages, the former being that it is unbiased, unlike the free press, which, at least in my country, is rife with political affiliations and business interests. And yes, the need for Wikileaks signals bigger problems. What bothers me about the loss of anonymity is the inherent loss of ways to fix those problems, but that may just be the necessary tradeoff for the evolution of the web.