Actually, I noticed YouTube's recommendations seem _heavily_ biased towards what you watched recently. This is likely due to how people watch videos - much like binge-watching TV serials, they are very likely to continue watching videos from the same or similar channels.
In most cases this is harmless - play a song, and YouTube will automatically create a playlist of songs in the same genre. Watch a cooking video, and it'll give you a dozen other recipes by the same chef.
Unfortunately this doesn't work for political videos, as you've noticed. But there is absolutely no incentive for YouTube to insert videos from opposing viewpoints into the recommendations. YouTube is just giving people what they want.
YouTube is sort of caught between a rock and a hard place here.
So you watch a video about the history of vaccines and how many lives have been saved by vaccination. Should YouTube recommend a counterpoint video about how vaccines are going to kill you and everyone you love?
If you are recommending "other sides of issues", either you start showing a lot of really crazy things to people who wouldn't otherwise see it, or you need to start taking an editorial position that some issues are "settled" and one side is just "wrong", and do one-way gating: show a breath of sanity in the follow-up recommendations to "bad" videos, but don't show "bad" recommendations on videos that get the subject "right".
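To spell out that asymmetry, here's a minimal sketch of the one-way gating; `SETTLED_TOPICS` and the consensus flag are hypothetical editorial calls YouTube would have to make somewhere, which is exactly the uncomfortable part:

```python
SETTLED_TOPICS = {"vaccine safety"}  # hypothetical editorial list

def is_consensus_side(video: dict) -> bool:
    # Hypothetical label; deciding this is the editorial position itself.
    return video.get("consensus", False)

def counterpoint_recommendations(video: dict, candidates: list[dict]) -> list[dict]:
    # One-way gating: inject "sane" counterpoints only after a video on the
    # wrong side of a settled topic; never surface the fringe the other way.
    if video["topic"] in SETTLED_TOPICS and not is_consensus_side(video):
        return [c for c in candidates
                if c["topic"] == video["topic"] and is_consensus_side(c)]
    return []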
It's funny how the problem is so _us_ this time. Turn a mirror on our souls and we lose nuance, mediums, neutrals, and common ground. "Ben Shapiro makes a good point" becomes "BEN SHAPIRO DESTROYS SUBHUMAN SCUM."
Who is that message for? It's not a discussion or debate, it's linguistic flashing lights to simulate a community around a simulated conflict. YouTube Red, solving alienation in a more profitable way than the last red revolution.
The crime of social media is that they've created public solitary confinement. The internet is now a place where you go to be alone with your thoughts as simulated by companies. So boring.
> either you start showing a lot of really crazy things to people who wouldn't otherwise see it, or you need to start taking an editorial position that some issues are "settled" and one side is just "wrong"
Or you could attempt to take a neutral, quality-driven stance on recommendations: which videos do a good, _honest_ job of explaining the case for a given point of view? Plus of course any videos that present a reasonably balanced look at the pros/cons of multiple sides. You can even still factor in which ones are most engaging, as a secondary element (e.g. preferring Bill Nye over someone droning on boringly about climate change).
Maybe that has a bias toward moderate positions and away from extremes, but that doesn't seem terrible since (a) it's what compromise is built on, and (b) it seems more likely to succeed in winning over more newcomers to that viewpoint anyway (does David Brooks convince more liberals to question some of their beliefs than Alex Jones? I'd bet so).
I think the real problem with this approach is that it's very hard to build an _automated_ recommendation engine for it. At scales that matter (YouTube, Facebook, etc.) you need it to be automated and hard to game. And ML doesn't quite seem up to the task of ranking things by demagoguery, let alone honesty, at this point...
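For what it's worth, the ranking itself is the easy part. Here's a minimal sketch of the "quality first, engagement as a secondary element" idea, assuming hypothetical quality and balance scores (`quality_score`, `balance_score`) that are exactly the thing nobody knows how to produce automatically and at scale:

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    quality_score: float   # hypothetical: how honestly it argues its case, 0..1
    balance_score: float   # hypothetical: how fairly it treats other sides, 0..1
    engagement: float      # watch-through / likes, 0..1

def rank_recommendations(candidates: list[Video], top_n: int = 10) -> list[Video]:
    # Quality and balance dominate; engagement only acts as a tiebreaker.
    def score(v: Video) -> float:
        return 0.6 * v.quality_score + 0.3 * v.balance_score + 0.1 * v.engagement
    return sorted(candidates, key=score, reverse=True)[:top_n]
```

The weights are arbitrary; the point is just that engagement stops being the objective and becomes a minor term.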
Well, I'd prefer annotated recommendations with reasons, and analysis of quality as a big feeder into the recommendations (so both opposing and supporting viewpoints - and even ones supporting in part and opposing in part - especially those with stronger support, would be natural recommendations, with their relationship to the thing you just viewed annotated). OTOH, that's probably a lot harder to do than what typical recommendation engines do (especially automatically; intuitively, it seems likely to be essentially an AI-complete problem, though I wouldn't be too surprised if someone found a way to do a "good enough" version for common classes of content without full AI). The annotated-related-items thing is found in some places, but in my experience they are always manually curated, narrow-focus databases (e.g., commercial legal authority databases).
Just hit this on YouTube today, had to go down past hundreds of nearly-identical garbage recommendations to get to one of interest. At least they should toss in a tiny bit of RNG.
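Even a crude epsilon-greedy style shuffle would help. A rough sketch, where the function names and the idea of a separate exploration pool are mine, purely for illustration:

```python
import random

def diversify(ranked: list[str], exploration_pool: list[str],
              epsilon: float = 0.1) -> list[str]:
    # With probability epsilon per slot, swap in a random video from outside
    # the usual "more of the same" candidates. Purely illustrative.
    out = []
    pool = list(exploration_pool)
    for video in ranked:
        if pool and random.random() < epsilon:
            out.append(pool.pop(random.randrange(len(pool))))
        else:
            out.append(video)
    return out
```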