
I think a lot of folks forget that Facebook wanted to come in and clean up some of the filth in social media. They felt that by attaching your _real_ name to your posts, instead of a handle as was the traditional practice, you would have something to lose (social standing, esteem, etc.) and so would be more thoughtful about your actions. The contrasts at the time were reddit, SomethingAwful, and 4chan. There was _definitely_ extant toxicity on the internet, and in the early days of GMail there were funny posts claiming you could stop it from displaying ads by inserting lots of expletives and bad words in your email (so some people had GMail signatures that just lumped bad words together and explained them as an ad-circumvention measure).

But I think there are a few key innovations that make FB worse for human psychology than previous iterations. Chief among them is the algorithmic newsfeed designed to drive engagement. Outrage certainly provokes responses, but with a chronological feed, threads would eventually grow so large that the original outrageous post was pushed far back and the outrage died down. Algorithmic newsfeeds bubble these posts back to the top and keep showing them as they accumulate comments/retweets/shares/etc. They reward engagement in a visceral way that creates perverse incentives.
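The difference between the two feed orderings can be sketched in a few lines. This is a toy illustration, not Facebook's actual ranking system; the posts, timestamps, and engagement counts are all made up, and real ranking models weigh many more signals.

```python
from datetime import datetime, timedelta

# Hypothetical posts: (timestamp, engagement count, label).
# All values are invented for illustration.
now = datetime(2021, 1, 1, 12, 0)
posts = [
    (now - timedelta(hours=8), 950, "outrage thread"),
    (now - timedelta(hours=2), 40, "vacation photos"),
    (now - timedelta(minutes=30), 5, "local news"),
]

# Chronological feed: newest first, so an hours-old outrage
# thread naturally sinks down the page.
chronological = sorted(posts, key=lambda p: p[0], reverse=True)

# Engagement-ranked feed: most-interacted-with first, so the
# outrage thread stays pinned to the top as comments accumulate.
ranked = sorted(posts, key=lambda p: p[1], reverse=True)

print([p[2] for p in chronological])  # outrage thread ends up last
print([p[2] for p in ranked])         # outrage thread ends up first
```

The perverse incentive falls out of the second sort key: any post that provokes replies, even angry ones, keeps earning placement at the top, which in turn provokes more replies.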

Secondly is the filter bubble. By showing you content hyper-relevant to your interests, these platforms make it easy to fall into echo chambers of outrage and extremism. Internet communities, like IRC channels, had huge discoverability issues, and each community usually had its own disparate way to join, adding another layer of friction. Even if you were an extremist, it took dedicated searching to find a community that would tolerate your extremism. Now mainstream platforms will lump you into filter bubbles with other people who are willing to engage with and amplify your extremist posts.

Combine horribly narrow echo chambers with engagement-oriented metrics and you'll have a simple tool for radicalization. That way when you're thinking of committing a violent act because of the disenfranchisement you feel in your life and your community, you'll be funneled to communicate with others who feel similarly and enter a game of broad brinkmanship that can quickly drive a group to the extreme. Balkanization and radicalization.




> I think a lot of folks forget that Facebook wanted to come in and clean up some of the filth in social media. They felt that by attaching your _real_ name to your posts, instead of a handle as was the traditional practice, you would have something to lose (social standing, esteem, etc.) and so would be more thoughtful about your actions. The contrasts at the time were reddit, SomethingAwful, and 4chan. There was _definitely_ extant toxicity on the internet, and in the early days of GMail there were funny posts claiming you could stop it from displaying ads by inserting lots of expletives and bad words in your email (so some people had GMail signatures that just lumped bad words together and explained them as an ad-circumvention measure).

This is such a great point. The pre-Facebook Internet was full of anonymous random garbage, but everyone knew it was inconsequential garbage. Adding real names and likes changed all that: today's garbage has gained legitimacy and is displacing what came before it.


"the outrage would go away"

If there's one thing I've learned it's that the outrage never goes away. The type of people who fixate on outrage in their Facebook feeds are the same type of people who decades prior would cruise around town picking fights in person. I'm unconvinced that Facebook is meaningfully changing this dynamic.

I'm also unconvinced that the filter bubble is meaningfully different from what's come before. Humans have been sorting themselves into like-minded communities since before we could read and write. Do you remember the hive-minds of the 80s and 90s? If anything they were far more extreme because of the difficulty of proving anything, back before google and wikipedia. There was a lot more extremism and hate-based violence back then. A LOT, LOT more, and no interventions like the ones Facebook is at least attempting to provide.

Facebook has some new angles on old patterns in human behavior, yes. But I think the people trying to show that it's made things worse still have a lot of work to do to make a compelling case. Facebook's biggest transgression is probably that it has chronicled this behavior and dragged it into the light.


Very well put. When people say "it's always been like this" or "it's no different than X", this is exactly the difference: while fundamental human behaviors and impulses haven't changed, the design of the platform changes how they are expressed.



