
I'm personally MUCH more worried about facebook and twitter than Google. Our current political discourse has gone off the rails in the U.S., and I think a large part of it is due to those two entities. Google I think is only a mild player in that space, which, in its extremes, is a threat to democratic society.


YouTube is a major player in that space.

https://www.theguardian.com/technology/2018/feb/02/how-youtu... YouTube's algorithm recommended extremist/conspiracy content during the 2016 election.

Here's YouTube CEO Susan Wojcicki. http://www.telegraph.co.uk/news/2017/12/05/youtube-taking-ex... (behind a free registration wall)

"Since June we have removed over 150,000 videos for violent extremism."

"We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018."


> YouTube's algorithm recommended extremist/conspiracy content

Would you agree that the algorithm recommended this content because people believe in, or at least are interested in, conspiracies? Even if we feel that's a mistake, who are we (or Google) to be the arbiters of truth?


No.

Content recommendation algorithms reward engagement metrics. One of the metrics they reward is getting a user's attention, briefly. In the real world, someone can get my attention by screaming that there is a fire. Belief that there is a fire and interest in fire are not necessary for my attention to be grabbed by a warning of fire. All that is needed is a desire for self-preservation and a degree of trust in the source of the knowledge.

Compounding the problem, since attention-grabbing content boosts engagement and people make money off of videos, there is an incentive in place encouraging the proliferation of attention-grabbing false information.

In a better world, this behavior would not be incentivized. In a better world, reputation metrics would allow a person to realize that the boy who cried wolf was the one who had posted the attention-grabbing video. Humanity has known for a long time that there are consequences for repeated lying. We have fables about that, warning liars away from lying.

I don't think making that explicit, like it is in many real world cases of lying publicly in the most attention attracting way possible, would be unreasonable.
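To make the incentive concrete, here's a rough sketch (all names and weights are made up, not anything YouTube actually uses) of how a pure engagement score differs from one discounted by the uploader's track record:

    # Hypothetical illustration only -- made-up weights, not YouTube's actual ranking.
    def engagement_score(clicks, watch_minutes, shares):
        """Pure engagement: attention-grabbing content wins regardless of accuracy."""
        return 1.0 * clicks + 0.5 * watch_minutes + 2.0 * shares

    def reputation_adjusted_score(clicks, watch_minutes, shares,
                                  debunked_uploads, total_uploads):
        """Same signal, discounted by the uploader's record of debunked claims."""
        base = engagement_score(clicks, watch_minutes, shares)
        # A channel that has "cried wolf" repeatedly gets its reach cut accordingly.
        reputation = 1.0 - (debunked_uploads / total_uploads if total_uploads else 0.0)
        return base * reputation

    # A sensational but repeatedly debunked channel vs. a duller but reliable one:
    print(reputation_adjusted_score(10000, 40000, 900, debunked_uploads=8, total_uploads=10))
    print(reputation_adjusted_score(2000, 15000, 100, debunked_uploads=0, total_uploads=10))

With the reputation discount, the duller but reliable channel comes out ahead; without it, the sensational one wins every time.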


No one hears the lies once they're removed. That can be good, but it might also prevent more people from criticizing or exposing them and thus, even if only slightly, validate them in the eyes of the liars. This might yield nastier lies in the future.


Google recommends that stuff to me, and I don't believe in it or watch it. Watch math videos, get flat earth recommendations. Watch a few videos about the migration of Germanic tribes in Europe during the decline of the Roman Empire, get white supremacist recommendations.

My best guess? They want you to sit and watch YouTube for hours, so they recommend stuff watched by people who sit and watch YouTube for hours.

This stuff reminds me of the word "excitotoxins," which is based on a silly idea yet seems to capture the addictive effect of stimulation. People are stimulated by things that seem novel, controversial, and dangerous. People craving stimulation will prefer provocative junk over unsurprising truth.


>Watch a few videos about the migration of Germanic tribes in Europe during the decline of the Roman Empire, get white supremacist recommendations.

Ugh, that happened to me. I was halfway through a video making weird, interesting, yet really benign claims when suddenly they connected all the pieces in support of some bizarre white supremacist shit.

Just because I wanted to know about Proto-Indo-Europeans.

I looked over to the recommended videos, and it was already like 20% armchair white supremacist videos.


I binge-watch YouTube in a small window while doing certain kinds of work, and specifically don't want anything I'm too interested in. This leads to future "weird" recommendations.

I also have to say that there's a certain comic relief to videos by flat-earthers (no moon landing, ancient astronauts) and other conspiracy theorists. Some mix valid points with hyperbole, but some are so far out there that they're comical.


>Google recommends that stuff to me, and I don't believe in it or watch it.

So? Obviously people are watching them.

Or are you saying that YouTube has a strong hidden conservative bias?


A friend asked me to watch a clip of Ben Shapiro so we could sincerely talk about what the other side believes. I forgot that I was on a fresh OS install, not yet logged in. Well, right after that one video, every single video recommended to me became a "Ben Shapiro SLAMS liberal...".

I get that YouTube had no other metric to gauge me by, but there are 10 or 12 recommendation slots and there's no law that says YouTube can't ease into recommendations. Whether you're for or against Shapiro, this kind of behavior pushes people into bubbles and radicalizes them in a feedback loop.


Actually, I've noticed YouTube's recommendations seem _heavily_ biased towards what you watched recently. This is likely due to how people watch videos - much like people binge-watching TV series, they are very likely to continue watching videos from the same or similar channels.

In most cases this is harmless - play a song, and YouTube will automatically create a playlist of songs in the same genre. Watch a cooking video, and it'll give you a dozen other recipes by the same chef.

Unfortunately this doesn't work for political videos, as you've noticed. But there is absolutely no incentive for YouTube to insert videos from opposing viewpoints into the recommendations. YouTube is just giving people what they want.
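A crude sketch of that recency effect (invented for illustration, obviously not YouTube's actual system) - recently watched channels crowd out everything else:

    import time

    # Toy recency-weighted recommender: channels you watched most recently dominate,
    # which is exactly the binge-reinforcing behaviour described above.
    def channel_affinity(watch_history, half_life_hours=6.0, now=None):
        """watch_history: list of (channel, unix_timestamp) pairs."""
        now = now if now is not None else time.time()
        scores = {}
        for channel, ts in watch_history:
            age_hours = (now - ts) / 3600.0
            # Exponential decay: a video watched minutes ago counts far more
            # than one watched days ago.
            scores[channel] = scores.get(channel, 0.0) + 0.5 ** (age_hours / half_life_hours)
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    now = time.time()
    history = [
        ("cooking_channel", now - 3 * 24 * 3600),   # a few days ago
        ("politics_channel", now - 30 * 60),        # half an hour ago
        ("politics_channel", now - 5 * 60),         # five minutes ago
    ]
    print(channel_affinity(history, now=now))       # politics_channel dominates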


YouTube is sort of caught between a rock and a hard place here.

So you watch a video about the history of vaccines and how many lives have been saved by vaccination. Should YouTube recommend a counterpoint video about how vaccines are going to kill you and everyone you love?

If you are recommending "other sides of issues", either you start showing a lot of really crazy things to people who wouldn't otherwise see them, or you need to take an editorial position that some issues are "settled" and one side is just "wrong". The latter means one-way gating: you show a breath of sanity in the follow-up recommendations to "bad" videos, but you don't show "bad" recommendations on videos that get the subject "right".

Which one would you prefer?


It's funny how the problem is so _us_ this time. Turn a mirror on our souls and we lose nuance, mediums, neutrals, and common ground. "Ben Shapiro makes a good point" becomes "BEN SHAPIRO DESTROYS SUBHUMAN SCUM."

Who is that message for? It's not a discussion or debate, it's linguistic flashing lights to simulate a community around a simulated conflict. Youtube red, solving alienation in a more profitable way than the last red revolution.

The crime of social media is that they've created public solitary confinement. The internet is now a place where you go to be alone with your thoughts as simulated by companies. So boring.


> either you start showing a lot of really crazy things to people who wouldn't otherwise see it, or you need to start taking an editorial position that some issues are "settled" and one side is just "wrong"

Or you could attempt to take a neutral, quality-driven stance on recommendations: which videos do a good, _honest_ job of explaining the case for a given point of view? Plus of course any videos that present a reasonably balanced look at the pros/cons of multiple sides. You can even still factor in which ones are most engaging, as a secondary element (e.g. preferring Bill Nye over someone droning on boringly about climate change).

Maybe that has a bias toward moderate positions and away from extremes, but that doesn't seem terrible since (a) it's what compromise is built on, and (b) it seems more likely to succeed in winning over more newcomers to that viewpoint anyway (does David Brooks convince more liberals to question some of their beliefs than Alex Jones? I'd bet so).

I think the real problem with this approach is it's very hard to build an _automated_ recommendation engine for this. At scales that matter (YouTube, Facebook, etc.) you need it to be automated and hard to game. And ML doesn't quite seem up to the task of ranking things by demagoguery, let alone honesty, at this point...
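As a sketch (entirely made up - the honesty score is exactly the part nobody knows how to compute automatically), the ranking described above might weight quality first and engagement second:

    # Sketch of "quality first, engagement second" ranking. The hard part,
    # the honesty score, is just hand-typed here -- which is the whole problem:
    # no ML system can reliably fill it in today.
    def rank_candidates(videos, quality_weight=0.8, engagement_weight=0.2):
        """videos: dicts with hypothetical 'honesty' and 'engagement' scores in [0, 1]."""
        def score(v):
            return quality_weight * v["honesty"] + engagement_weight * v["engagement"]
        return sorted(videos, key=score, reverse=True)

    candidates = [
        {"title": "Balanced look at climate policy", "honesty": 0.9, "engagement": 0.4},
        {"title": "Bill Nye explains climate change", "honesty": 0.8, "engagement": 0.7},
        {"title": "SHOCKING climate HOAX exposed!!",  "honesty": 0.1, "engagement": 0.95},
    ]
    for v in rank_candidates(candidates):
        print(v["title"])

With those made-up numbers the clickbait video drops to the bottom even though it's the most "engaging" of the three.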


Well, I'd prefer annotated recommendations with reasons, with analysis of quality as a big feeder into the recommendations. That way both opposing and supporting viewpoints - and even ones that support in part and oppose in part - especially the better-supported ones, would be natural recommendations, with their relationships to the thing you just viewed spelled out. OTOH, that's probably a lot harder to do than what typical recommendation engines do, especially automatically; intuitively, it seems likely to be essentially an AI-complete problem, though I wouldn't be too surprised if someone found a way to do a "good enough" version for common classes of content without full AI. The annotated-related-items thing is found some places, but in my experience those are always manually curated, narrow-focus databases (e.g., commercial legal authority databases).
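Concretely (a made-up shape, nothing more), an annotated recommendation might carry its relationship to the seed video along with the reason it was surfaced:

    from dataclasses import dataclass

    # Made-up data shape for annotated recommendations: each item states how it
    # relates to the video you just watched and why it was recommended.
    @dataclass
    class AnnotatedRecommendation:
        title: str
        relation: str   # e.g. "supports", "opposes", "supports in part"
        reason: str     # human-readable justification

    recs = [
        AnnotatedRecommendation(
            title="Meta-analysis of the claims in the video",
            relation="supports",
            reason="Stronger sourcing for the same conclusion.",
        ),
        AnnotatedRecommendation(
            title="Historian disputes the video's timeline",
            relation="opposes in part",
            reason="Challenges two key dates but accepts the broader argument.",
        ),
    ]
    for r in recs:
        print(f"{r.title} [{r.relation}] -- {r.reason}")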


Just hit this on YouTube today, had to go down past hundreds of nearly-identical garbage recommendations to get to one of interest. At least they should toss in a tiny bit of RNG.
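Even something as dumb as epsilon-greedy exploration would break up the wall of near-duplicates - a sketch, with names of my own invention:

    import random

    # Minimal epsilon-greedy tweak: mostly show the top-ranked items, but with
    # probability epsilon swap in something from further down the list.
    # (Illustrative only -- not how YouTube actually diversifies, if it does at all.)
    def diversify(ranked_videos, slots=10, epsilon=0.2, rng=random):
        recommendations = []
        pool = list(ranked_videos)
        for _ in range(min(slots, len(pool))):
            if rng.random() < epsilon:
                pick = rng.choice(pool)      # occasional wildcard
            else:
                pick = pool[0]               # the usual top-ranked item
            recommendations.append(pick)
            pool.remove(pick)
        return recommendations

    print(diversify(["video_%d" % i for i in range(100)]))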


I mean, that style of title exists across the political spectrum on YouTube and it's beyond stupid. Whenever I see "X DESTROYS Y" titles I assume the video will be a waste of 3-10 minutes with no particular moment of linguistic destruction on either side, just two people bickering.

Just clickbait titles is all it is.

Regardless, I don't see how any of that can really count as "extremist" or "conspiratorial" content.


The crux seems to be that Google's algorithms are designed to maximize time spent watching. It's frankly absurd how successful they have been at this, especially in younger generations.

It's one thing when their team of PhDs funnels attention into cat videos, and another thing to funnel it into potentially unhealthy amounts of extremist material and conspiracies.

I don't know what the answer is, but I do miss the days before infinite scroll and autoplay.


One of my friends is the music director of a US freeform radio station. He’s a legend in the industry known for his encyclopedic knowledge of cool but highly obscure artists over the past 40 years. Over the past year, he has often been left mouth open when YouTube’s algorithm starts picking what’s next after we pick a starting point. “How the hell does it know about these guys?” That’s how good the robots have become.


Opposite here. Google, going forward, is more dangerous with its censorship imho. There is still a lot of trust in Google among average people, who think that it's verifiably right XX% of the time and so can be trusted, for many reasons - yet there is no transparency in how results are ranked. With Matt Cutts gone there is even less discussion of these things. We get broad, confusing statements like "we changed a couple things and it's only affecting 3% of results, so no worry, carry on" - and then there are the changes that aren't discussed at all. People don't know.

At least with Facebook and Twitter, people are figuring out that much of it can't be trusted, and that some "friends" are easily duped into sharing things that are not 100% right. I agree we still need to deal with repeated exposure affecting the subconscious as everyone scrolls past things, and we need ways to screen-capture and research other viewpoints on what comes into a feed. You can mute people on those networks; with Google you can't mute individuals, or tell it to stop prioritizing what "xx group thinks". It's not a mild player by any means - it's used the way most people use Wikipedia, but it's used much more. That is a threat to many minds and the future, not just democratic whatever.


Sometimes I wonder how much their impact is overblown though. It's a very loud presence in people's lives, with a lot of very loud people on it who think they are making some kind of impact. When is the last time you were swayed by a lefty or righty radical? I'm not even convinced it really pushes people further down an ideology. But then again I'm Gen X. I just don't take tweets that seriously, even when they come from Trump or Obama.


> When is the last time you were swayed by a lefty or righty radical? I'm not even convinced it really pushes people further down an ideology.

I haven't been swayed by such radicals, but I have been turned off and disgusted by them, feelings which bleed over to the people I associate with them. I can see that effect as one that reinforces political polarization.


>When is the last time you were swayed by a lefty or righty radical?

That's one of the arguments against them. People become siloed into filter bubbles by their algos and are insulated against conflicting viewpoints.


Well, for that to be true, you'd have to believe things like propaganda and advertising are not effective.

But, we know that they are.


That's not even what this article is about though. What does political discourse have to do with this?


Sounds like they're saying stifling new or fringe ideas is a worse problem than stifling new businesses. I'm not 100% sure I agree - I would actually say they're about the same: Equally important (contrary to Upvoter333's opinion), and intimately related (contrary to yours). There's a parallel between established, popular, dominant ideas on the one hand, and established, popular, dominant businesses on the other.


Sure, I can agree with that. OP's comment seemed pretty out of place though, almost like Google whataboutism. What about Facebook? Well, yeah, if that's the kind of counterargument, then we can ask "What about...?" about a million different things. It doesn't take away from what the article is about.





