> If their interests on Facebook (what they like, follow, etc) were mostly cat videos, Facebook wouldn't be recommending extremist groups.
The whole point of recommendation algorithms is to find missing edges in the graph, so they can easily lead you to misinformation in one or two hops.
Think of it this way: misinformation content is highly valuable to the platform because it generates a lot of engagement. There is always a “potential energy” (“people like you also liked...”) between low-value and high-value content that the platforms are trying to convert into “kinetic energy” (engagement: views, clicks, comments) in order to monetize it. The algorithm's goal is to find the shortest path to that high-value content.
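A minimal sketch of that dynamic, assuming a toy “people like you also liked” collaborative-filtering score over a hypothetical interest graph (the names and scoring below are made up for illustration, not any platform's real system): a user whose likes are mostly cat videos and local news is still only one or two accepted recommendations away from the fringe neighborhood.

```python
# Toy "people like you also liked" recommender over a user -> interest graph.
# Illustration only: all users, groups, and the scoring rule are hypothetical.
from collections import Counter

# Hypothetical likes: user -> set of pages/groups they follow.
likes = {
    "you":    {"cat_videos", "local_news"},
    "user_a": {"cat_videos", "local_news", "wellness_tips"},
    "user_b": {"local_news", "wellness_tips", "fringe_health_group"},
    "user_c": {"wellness_tips", "fringe_health_group", "conspiracy_page"},
}

def recommend(target, likes, k=3):
    """Score items the target hasn't seen by how many shared likes the
    users who follow them have with the target (a 'missing edge' guess)."""
    seen = likes[target]
    scores = Counter()
    for user, items in likes.items():
        if user == target:
            continue
        overlap = len(seen & items)      # similarity = number of shared likes
        if overlap == 0:
            continue
        for item in items - seen:        # candidate = an edge you don't have yet
            scores[item] += overlap      # weight candidates by similarity
    return scores.most_common(k)

print(recommend("you", likes))
# [('wellness_tips', 3), ('fringe_health_group', 1)]
# Accept the top suggestion and, on the next pass, the fringe group and the
# conspiracy page are each one hop away in the same neighborhood.
```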