The most recent, the oldest, the closest match? That doesn't make it a recommendation system. Maybe try reading my post and making an effort to understand it rather than responding with the first thing that comes to mind; it's as if you haven't understood my post at all and haven't made any effort to.
Do you not recognize how lousy a video sharing website this would be? Spammers are going to be constantly uploading marketing and other low-quality content with irrelevant keywords, while users who actually put work into making good-quality videos will see their videos pushed to the bottom of the results quickly. How will you deal with that without implementing a system that can identify and recommend non-spam videos? Even the oldest versions of YouTube were boosting videos that got lots of likes.
>the closest match
How is deciding the "closest match" not considered a recommendation? They all have the user's keyword, what other criteria will you use?
>Do you not recognize how lousy a video sharing website this would be? Spammers are going to be constantly uploading marketing and other low-quality content with irrelevant keywords, while users who actually put work into making good-quality videos will see their videos pushed to the bottom of the results quickly. How will you deal with that without implementing a system that can identify and recommend non-spam videos? Even the oldest versions of YouTube were boosting videos that got lots of likes.
Not sure why that's my problem, I'm not the one making money by promoting reactionary videos to reactionaries.
>How is deciding the "closest match" not considered a recommendation? They all have the user's keyword, what other criteria will you use?
Because it's not a recommendation; some are just better matches than others, that's all. Some match the entire keyword, some only parts of it, some in different places... I don't understand what's difficult about this for you.
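To be concrete, this is all I mean by "closest match": a toy sketch in Python, with made-up scoring weights, nothing like any real search engine's ranking, that orders results purely by how well the title matches the query.

    # Toy sketch only: order video titles by how well they match the query.
    # No engagement data, no personalization -- the weights are arbitrary.
    def match_score(query: str, title: str) -> float:
        q, t = query.lower(), title.lower()
        if q == t:
            return 3.0                            # whole title is the keyword
        if q in t:
            return 2.0 + 1.0 / (1 + t.index(q))   # full phrase present; earlier = better
        words = q.split()
        return sum(w in t for w in words) / len(words)  # fraction of query words present

    def search(query: str, titles: list[str]) -> list[str]:
        return sorted(titles, key=lambda t: match_score(query, t), reverse=True)

    print(search("cat video", ["Cat video", "My cat's first video", "Unrelated vlog"]))
    # ['Cat video', "My cat's first video", 'Unrelated vlog']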
And what do you do when there are 10,000 exact keyword matches, how do you sort them? If you sort by newest, the whole thing is just going to be spam accounts reposting the same video(s) on any major keyword.
"top", or anything notable is also likely to be gamed and abused too, especially if you fuzz "top" sorting because then its not really neutral, you're deciding the order and therefore making a recommendation.
Then there might be a circumstance where it is promoting something. Your point? That the law shouldn't make this illegal because then YouTube would have to have greater regard for what it surfaces? I'm not sure that's a bad thing; that's the entire point of the thread.
My point wasn't about how often it happens, but that some of YouTube's operations might indeed work that way... so what? Is YouTube's convenience the point of law? No. So why does it matter?
>Not sure why that's my problem, I'm not the one making money by promoting reactionary videos to reactionaries.
The reason I think we should see it as our problem is that the solution companies will arrive at is just to turn the internet into cable TV, where only approved media organizations are able to share content, because of liability concerns.
I'm not sure why YouTube should be able to operate the service it does with as little content filtering as it does. In what other industry would you be allowed to host child pornography because it's too difficult to make sure it doesn't get posted? No newspaper could take that excuse. Toys R Us couldn't say "oh jeez, we didn't realize that a corner of our store was being used by child pornographers to spread child pornography and also recruit children" and not be liable. I'm not sure why we think it's good to give YouTube and Facebook an excuse for this, and for anything else any other business would normally be liable for.
>No newspaper could take that excuse. Toys R Us couldn't say "oh jeez, we didn't realize that a corner of our store was being used by child pornographers to spread child pornography and also recruit children" and not be liable. I'm not sure why we think it's good to give YouTube and Facebook an excuse for this, and for anything else any other business would normally be liable for.
I'll admit, we may even be better off as a society if communication were less "democratized." There certainly would have been a lot less COVID and election misinformation out there if every rando weren't able to have their uninformed ideas broadcast by giant platforms.
Exactly. I understand why Section 230 is in place and what it achieved, but I do wonder what good it has actually done and whether we actually need it. Perhaps we don't need to break up the big tech companies, and could instead just make them as liable as any other business would be. In that case, I don't think they could afford the conglomeration they have right now.