
He's right in some sense, but the context is important. The problem is that Facebook is giving the village idiot a megaphone. Facebook can't say:

- Amplify your commercial business message to billions of people worldwide.

AND at the same time

- Well, it's your individual choice whether or not to listen to the village idiot.

You guys gave them a megaphone, how do you expect society to behave?!



> The problem is that Facebook is giving the village idiot a megaphone

While you're not wrong that it's giving the idiot a megaphone, that misses the greater picture: it's giving _everyone_ a megaphone. The real question is: why can't people discern the difference between the idiot and the non-idiot?

I'd also note that a big issue now is trust -- trust in "elites" (technocrats, the wealthy, those in positions of power) has been declining for a long time. I think people are not so much seeking out the village idiot as massively discounting "experts".

A list of things that come to mind which have broken trust: the 60s saw hippies who wanted to break the norms of their parents and grandparents; the 70s, the Vietnam War and breaking the gold standard; the 80s, "greed is good", Iran-Contra, etc.; the 90s, tough-on-crime policies and Y2K fears; the 00s, Iraq/Afghanistan, the 9/11 attacks, the governmental data dragnet, Manning/Snowden/Assange; and more recently, Covid statements which did not pan out as planned...

People have good reasons to be skeptical of elites, but I think anti-corruption work is more important than trying to silence the idiot.


That's also missing the greater picture. It's giving _everyone_ a megaphone... but giving the loudest megaphones to the people who can get most people to listen to them.

You'll have noticed on the internet that there's a tendency to prioritise engaging with things you disagree with (hell, half of my HN comments are because I felt motivated to write something to disagree with some OP at some point - even this one).

What that means is the traditional small-c conservative 'village elders', 'parish priests', and 'elected officials', who hold authoritative positions not because they're controversial but because they historically represented positions of neutrality and consensus, end up with quiet megaphones, and the madmen claiming the world is flat and that there's a paedophile ring run out of a pizza shop end up with the loudest megaphones.

Half of the population is below average intelligence, and giving the wrong people the loudest megaphones has a devastating effect on society.


It's not everyone, though. FB's algorithm gives preference to the most controversial person. People who are reasonable are boring and don't cause engagement, so their posts aren't displayed either.

When you stay away from the site, FB will start bombarding you with messages from the people you're most likely to react to, because they want you back.


...and not just "the most controversial". In a lot of cases, on both the content creators' and the sockpuppets' sides, we're not even talking about "village idiot" type real humans anymore but, in contrast to the Meta CTO's words, about extremely skillful mass manipulators sitting somewhere in Russia and hiding behind international proxies.

Not only was that tolerated in the name of profit, these individuals were able to create official-looking, completely unverified "pages" with bogus attribution to run their "engagement" campaigns meant to poison and destroy Western society (and arguably they have been successful at that).

How this is legally any different from complicity in treason is hard for me to comprehend.


Yeah, I think this is the crux of it. Facebook is prioritizing the most "engagement", which means prioritizing the most "reactions", which means prioritizing the most divisive and enraging content on both sides. If Facebook instead prioritized the "ah, that's nice" kind of content, we wouldn't see the divisiveness we see today.


I feel this ignores the human inclination to engage in conflict. Or in other words: humans like to debate, so we will.


It's also common for people to accept defaults without substantial customization. That's why the algorithms matter. Some people will deliberately seek out rage-bait even if the default algorithm delivers just-the-facts news and heartwarming pictures from friends' families. Most won't. Also, most people won't customize their settings to eliminate rage-bait if that's what gets prioritized by algorithmic defaults.


It also doesn't matter how much you customize your settings if they're inherently useless, minimally functional, or never there in the first place. A lot of the content control settings that Facebook loves to tout are practically useless.

Sure, I can hide every post from the "Controversial News" page, but I can't stop viewing content from third parties entirely. I'm only interested in first-party content - what my contacts create. Unfortunately that goes against the monetization model of Facebook.

I want a more closed loop social network and think that's the model we should return to, but unfortunately that's not where the profit/engagement is.


Yeah, but it is also human nature to fuck as much as possible, yet we have rules and laws against things like rape to control those tendencies. Just because we are naturally inclined to do something does not necessarily mean that it is best for us.


What? This is certainly [citation needed]

I highly disagree that it's human nature to 'fuck as much as possible'.

Certainly it is the goal of some humans. I can say from personal experience that my nature isn't just to 'fuck as much as possible'. And neither is it for most people I know. And the thing that's stopping us is not just anti-rape laws.

Fucking is great, but if you have a family and young kids, you care a lot about taking care of you family and not just going to the club and fucking more people.


Yeah, you're right, not my best take. But I guess the point I wanted to get across is that we shouldn't let our natural desires dictate what is legal or not, as history has shown time and time again.


It's more interesting to debate subjects and preferences where there is not already an overwhelming amount of evidence that one side is wrong.


It's about surprise... content that is surprising has more information content (from the perspective of the surprised person), which drives engagement. The problem is that when you don't make a distinction between true surprising content, and false surprising content, it's a heck of a lot easier to generate false surprising content.
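The "surprise" point maps onto a standard information-theory quantity: the surprisal of an event with probability p is -log2(p) bits, so improbable-sounding claims carry more apparent information. A minimal sketch, purely illustrative (feed-ranking systems don't literally compute this):

```python
import math

def surprisal_bits(p: float) -> float:
    """Information content (surprisal) of an event with probability p, in bits."""
    return -math.log2(p)

# A mundane claim the reader already expects vs. a shocking, improbable one:
mundane = surprisal_bits(0.9)     # ~0.15 bits: barely any new information
shocking = surprisal_bits(0.001)  # ~9.97 bits: highly "informative" if believed
print(mundane, shocking)
```

The catch, as the comment notes, is that a fabricated claim can be made arbitrarily improbable, so false content can always out-"surprise" true content.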


Was going to say exactly this. Reasonable people with reasonable views have no reason to promote themselves or their views on Facebook. However non-reasonable people with non-reasonable views promote heavily for clicks/engagement to sell you something, or just to “idiot farm” to sell the idiots something later.

Facebook’s unregulated revenue model will keep ensuring this dynamic.


> Half of the population is below average intelligence, and giving the wrong people the loudest megaphones has a devastating effect on society.

I don't think humans need a lot of intelligence to not be gullible. The scientific method, for example, is a simple enough way of extracting truth from lies and does not require massive amounts of intelligence to apply to your observations of the world.

Generally I don't think people believe in a "paedophile ring run out of a pizza shop" because they are stupid. They just see how unreal politicians' behaviour is from their tall platform and extrapolate that behaviour to domestic questions.

Working as a software engineer from a poor family, I have seen so many times my CEO (or my upper-middle-class coworkers) being awkward or plain low-key abusive with cleaners/waiters/etc. that I can totally see why people who don't have the ability to speak directly to people in power assume things about them which are not entirely true. But let's be honest: they have a reason to think this way.


> You'll have noticed on the internet that there's a tendency to prioritise engaging with things you disagree with (hell, half of my HN comments are because I felt motivated to write something to disagree with some OP at some point - even this one).

It tends to be that platforms with more disagreement have healthier discourse. I think that actually the opposite is more harmful. Echo-chambers allow for extremist ideas to grow and encourage hostility towards those that don't go along with the echo chamber.


Intelligence is a tool that can be used for good or evil. Vladimir Lenin and Mao Zedong were quite intelligent by any objective measure, but giving them megaphones resulted in horrific disasters far worse than anything caused by Facebook users so far.


> Half of the population is below average intelligence

Maybe the real problem is that everyone assumes they're in the other half. Or possibly that intelligence and wisdom are the same thing.

I guess what I'm saying is that I would generally agree with your post if it weren't for this statement. I don't think intelligence really has anything to do with the problem as even a lot of otherwise 'intelligent' people have engaged with today's bullshit conspiracy theories and nonsense.


> You'll have noticed on the internet that there's a tendency to prioritise engaging with things you disagree with

I wonder if you're aware of any known effect/research on this topic? I'm open-minded to learning more.


> Half of the population is below average intelligence

By definition :)


Only if the distribution is symmetric. [0]

Extra pedantic mode: if the population is an odd number, the number of below average intelligence will not equal the number of above average intelligence.

[0] https://en.wikipedia.org/wiki/Symmetric_probability_distribu...
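A quick numeric sketch of why symmetry matters (hypothetical numbers): a single high outlier pulls the mean above the median, leaving more than half the population "below average":

```python
# Ten hypothetical scores: one outlier drags the mean up.
scores = [100] * 9 + [200]

mean = sum(scores) / len(scores)            # 110.0
below_mean = sum(s < mean for s in scores)  # 9 of the 10 scores
print(mean, below_mean)
```

IQ in particular is normed to a symmetric (normal) distribution, in which case mean and median coincide and the "half below average" quip does hold.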


>if the population is an odd number, the number of below average intelligence will not equal the number of above average intelligence

I'll leave counting as an exercise for the reader :D


Not if there's exactly one person of exactly average intelligence. :-)


Or any odd number :)


You're right that if the set is {99, 100, 101} then of course 100 falls into neither category and each category has cardinality 1, but we're talking about real-valued variables, so I don't think that's a worry. I imagine it happens 'almost never' [0] (except where n=1).

[0] https://en.wikipedia.org/wiki/Almost_never


Good point :-P


This is what I'm seeing also. Youtube, Facebook, etc. all prioritize engagement. It's not only the megaphone problem, it's a quicksand problem. As soon as you watch some misinformation to even try to understand what the hell anti-vaxxers are claiming, then you get a ton of related misinformation promoted to your homepage. How the hell are technically ignorant people supposed to keep up with this? Youtube and Facebook will lump you in a category and show you what similar viewers watched.


Those are all good points, and I agree. I think it's not that Facebook gives the wrong people the loudest megaphones, but that human nature and the nature of the population draw us to those megaphones held by the wrong people.

What could we do about this? How could we identify the wrong people so that we could take away their megaphone? Who decides which people are wrong? Some of them are obvious, but some of them are not.

Maybe we could say that madmen claiming the world is flat or that there's a paedophile ring run out of a pizza shop are obviously wrong. We might agree the Nazis are obviously wrong, but what about Antifa? What about "woke"? What about all those theories behind "group identity"? The most dangerous wrong people are the ones who hold "good intentions" (they could be self-deceiving or could be truly genuine) but bad ideas, and it's hard to discern.

History repeats itself. I suggest reading the history of China in 1930-1950, the rise of Communist China, and then the "Cultural Revolution" of 1966-1976. You can find how people with "good intentions" ended up being among the most evil in history.

How can we avoid that happening here? I don't have an answer.


> Half of the population is below average intelligence

This is the real problem


Half of the population will always have below average intelligence.


This is incorrect. At least half the population will always have median or below intelligence.


If we are referring to IQ, the median and average are supposed to be the same by definition.


I disagree. Half the population will always have above average intelligence.


But the majority is somewhere near center.


> why can't people discern the difference between the idiot and the non-idiot?

Because it's not idiots that are the problem, it's bad-faith actors, and they're very good at manipulating people. In the past they'd have to do that 1:1, now they can do it at scale.


I've been in several debates where I was written off as being in "bad faith", when in reality I just didn't agree with the popular opinion on a particular subject. It seems people are all too eager to justify their own position by labeling others as being in "bad faith".


Except in this case we know that there have been many organizations, FB pages, and fake individual FB accounts set up specifically to spread misinformation and FUD about COVID and vaccines. That's the definition of bad faith.

Certainly, from there, real regular people pass on and help spread this misinformation. Hard to say how many of those people are also acting in bad faith or have just been manipulated and scared into believing the bad information. But it seems certain that the source of much of this garbage is bad-faith actors.


Where can I read more on these organisers of misinformation?


>Just twelve anti-vaxxers are responsible for almost two-thirds of anti-vaccine content circulating on social media platforms. This new analysis of content posted or shared to social media over 812,000 times between February and March uncovers how a tiny group of determined anti-vaxxers is responsible for a tidal wave of disinformation - and shows how platforms can fix it by enforcing their standards.

https://www.counterhate.com/disinformationdozen

News story: https://www.npr.org/2021/05/13/996570855/disinformation-doze...


OK, so I read through the reports here and... there's nothing there. The report focuses on the reach of these people, which is legitimate, but it also operates on a tenuous (if not entirely false) premise that the information compiled is "false", "misinformation", or similar. The very few (and obviously cherry-picked) examples for each supposedly nefarious actor are hardly damning. The report's authors do absolutely nothing to "debunk" the claims (and I recognize that there are claims that cannot be debunked by virtue of their design); they simply say "This is wrong."

Forgive me for not trusting either side here, but simply saying "you're wrong" to someone else doesn't meet my criteria for believing that you're right.


If it's 'bad-faith' actors, then you're saying that the misinformation is intentional, which makes it disinformation.


I would say the most dangerous are not the bad-faith actors, but the self-deceived ones who have genuinely "good intentions" but act out "bad" consequences.

And every one of us could be one of those self-deceived ones, including you and me.


Given enough capability and knowledge of manipulation methods, bad actors can not only shape conversations and promote controversial / conspiratorial information, but also fan the flames of the backlash that makes collective reason impossible. So long as there are holes in these platforms, and the platforms take a combative stance toward resolving them, this power can be had or hired. You don't need nation-state resources to pull it off at this point.

It's pretty widely acknowledged that what happens or begins on social media is now shaping the behavior of politicians and the narratives of legacy media. So if you successfully seed something on social media you get to enjoy the ripple effects through the rest of society and the media. If I have enough resources and motivation, I'm fine with that success rate being 1%, even 0.1% if it gets significant traction. And once it's out there, the imprint on those exposed really can't be undone by a weakly issued correction that never gets the reach of the original false information.


They've been able to do that since at least radio and arguably since the printing press. At worst, Facebook is an evolutionary step along that spectrum.


Radio and newspapers were effectively local, we now have global reach so that it takes just a few to mess things up everywhere.


There's effectively no cost to do it now, though.


It's true there are bad-faith actors, but there are definitely lots of idiots who don't know they are wrong.


Is it possible that you're also a wrong idiot but are unaware of it?


Totally


> The real question is why can't people discern the difference between the idiot and the non-idiot?

And here is the real problem with FB, the algorithmic feed. Normal life is pretty boring day-to-day, and doesn't trigger 'engagement'. Conspiracies, etc... cause an enormous amount of engagement. When a person is fed conspiracies all day by the engagement algorithm, even the most critical thinkers will start to drift. It works for the same reason advertising works, familiarity and repetition. The solution is never use FB, but that ship has sailed for most.


I have had decent luck in reverting Facebook back to what it is for: sharing pictures of my kids with people who care to see pictures of my kids. That means every time someone shares something - no matter how funny - I permanently block the place it was shared from. Slowly Facebook is running out of things to show me that aren't pictures of my friends' kids.


Same, it didn't really take as much as I thought it would to get it to stop showing me reshares, but now I see either groups content or family/friends original content.

I still wish they had fine-grained controls though.


I dropped all groups too. Facebook's algorithm is a terrible interface for groups - it doesn't show me everything, just what it determines might be interesting, and it doesn't have any means to help with discussion of topics.


If only they could advertise your kids to you.

"I picked a free service and was annoyed they wanted to make money somehow"


> The real question is why can't people discern the difference between the idiot and the non-idiot?

Societally we solve this through trust organizations. Individually, I have no way to validate the information from every expert/idiot I might come across. So is "connect the frombulator to the octanizer but watch out for the ultra convexication that might form" gibberish, or just your ignorance of the terminology in use in that problem domain? Most people don't try to figure out how to navigate each field. Heck, even scientists use shortcuts like "astrology has no scientific basis, so it doesn't matter what so-called SMEs in those fields say". So you rely on trust in various organizations and peers to help guide you. These structures can often fail you for various reasons, but that's the best we've managed to do. That's why, for example, "trust the science" is a bad slogan - people aren't really trusting the science. They're trusting what other people (sometimes political leaders) tell them the science is. Add in bad-faith actors exploiting uncertainty and sowing chaos, and it's a mess.

Silencing the idiot is fine as long as you're 100% certain you're silencing someone who's wrong and not just someone espousing a countervailing opinion (e.g. Hinton's deep learning research was poo-pooed by establishment ML for a very long time).


Facebook is not just a hosting platform; through the Facebook feed it exercises a great deal of editorial control over what posts/information are surfaced to users. So while Facebook might be giving everyone a megaphone, it doesn't give everyone the same volume. It needs to own that.


Erosion of trust in elites just so happens to also be a long-term goal of polluters, quacks, scammers, and other powerful parasites of the common wealth when they run up against government or science.


I think in general Facebook has a bias towards inflammatory posts - and other platforms for that matter as well, including HN, actually. Also, it's easy to blame the village idiot for everything, but I don't think Donald Trump or Alex Jones are village idiots. They are surely idiots, but they left the village quite some time ago and gained popularity before Facebook (InfoWars was founded in 1999) - although FB surely was an accelerator stage.

That said, the village idiot is harmless, and I think aristocracy (rule of the elite) is definitely not the solution. But it is true that the normal filters of an offline community haven't been translated online yet.


> it's missing the greater picture. it's giving _everyone_ a megaphone

I think this can be argued against, because Facebook does recommendation and algorithmic curation.

Even if Facebook didn't purposely tweak things to propagate disinformation, you could say it is easy to manipulate their algorithms to disproportionately push the information.

So for me it's a case of Facebook not doing enough to fight potential abuse on their platform.

There's an element of responsibility here, because we are prone to some material more than others. There are primitive instincts in us, and content designed to take advantage of that is parasitic; it is manipulative and addictive in that sense.

Crazy theories, appeal to emotions, controversy, nudity, clan affiliation, and all that are ways to take advantage of our psyche.

Even a smart person is as smart as the data more readily available to them. If the only thing about gender psychology I ever heard about was Jordan Peterson because he's been recommended to me, even if I'm the smartest most reasonable person, this is now the starting point of my understanding and thoughts around gender psychology.

So I think a platform that is optimized to show information that is most designed to make people susceptible to it, and that also targets information to the most susceptible people to present it to is by design going to result in the outcomes we're seeing.


Facebook (like News) is entertainment. People don't select entertainment for accuracy.

The village idiot (that's successful on Facebook) has self-optimized for being catchy - that's why people are listening.


Not to mention vaccine hesitancy because "when was the last time you got healthcare for free?"


> it's giving _everyone_ a megaphone.

Are they, though? It seems like FB amplifies things that they think will generate more engagement and "stickiness". Sensational things that cause outrage tend to do that more than cold, hard facts. I would not at all be surprised if misinformation gets amplified orders of magnitude more than the truth.


They amplify things that cause you to see more ads, because you picked a free platform


Idiots engage.

Reason doesn't.


> You guys gave them a megaphone, how do you expect society to behave?!

Considering most of humanity is... challenged when it comes to thinking critically, this should have been an entirely foreseeable outcome. I agree it's society's fault, but Facebook is part of society. They watched how their tool was being used by these people, and ENHANCED the reach of those messages because it was good for Facebook. Facebook is a microcosm of the object of its blame. Idiocy writ large in recursion.


> most of humanity is... challenged

Most, no. Everybody is blind to one perspective or another. Also, time is limited and attention is limited. Do not think that others are just stupid because their focus or knowledge does not overlap with yours.

"Those people" does not exist. It's just an illusion of your own limited perspective. We are on this together and calling people stupid not it is true, not it helps.


You're pretending that there's no difference in intelligence/knowledge/skills between individuals or groups of people. There are differences. Can we stop pretending that there's no difference between a college-educated European, your average American who reads at a 7th-grade level, and a third-world farmer who has no perspective outside their small village?


> a 3rd world farmer who has no perspective outside their small village?

I think this is a disappearing breed. Ubiquitous cell coverage means everyone knows what is happening everywhere now.


> Ubiquitous cell coverage means everyone knows what is happening everywhere now.

I think this assumes facts not in evidence. Just because someone is "connected" doesn't mean they're automatically informed. As we see in first world countries, there's a lot of fucking morons that only listen to comfortable lies rather than uncomfortable truths.

There's also industrial production of bullshit peddled by disingenuous actors taking advantage of that fact. Fleecing rubes can be very profitable.

The very problem being discussed is Facebook trying to absolve themselves of bullshit peddling by blaming everyone else. They're blaming people for believing shit Facebook put in front of them under the guise of news. They're also fine taking money to promote bullshit as "news". Yet it's society's fault that they believed everything labeled news Facebook put in front of them.


To that effect, it's worth pointing out that in many developing nations, Facebook IS the internet. To say that this compounds all of the issues already discussed in this thread is a fairly drastic understatement.


Still a lot of people getting their crops burned because they knew about science and managed to grow a crop, and everyone else didn't and thinks they're a witch


"Stupid people don't exist" is a bold take.


That's not what the parent was claiming. The grandparent was claiming that most people are stupid. The parent was pointing out that most people are not stupid. Some people are, and many people have various biases and preconceptions that make it easier for them to be manipulated into believing misinformation.


> "Those people" does not exist. It's just an illusion of your own limited perspective. We are on this together and calling people stupid not it is true, not it helps.

I don't understand what's going on on this site. This is the second time recently I've come across someone claiming a comment didn't say something that's essentially copied verbatim mere centimeters higher on the monitor. It's basically in the same eyeful.

Hell, the commenter even added the word "stupid", which wasn't in the parent comment.


I was saying critical thinking is not as common a skill as you may think. Most people aren't stupid, but most people ACT stupid by acting without thinking.


I agree here. My observation is that most of humanity is rational but acts on limited or incorrect information. If you can provide truthful and complete information (in a digestible form), humanity will do just fine.


I stated the deficit is in critical thinking skills. Your editing the quote to make it sound as though I said all people are simply "challenged" is disingenuous.


> this should have been an entirely forseeable outcome

It was. It is. It always will be.

Could you imagine the outrage had the authorities even attempted to prevent Facebook et al?


Not saying you're wrong, but to take a slightly more charitable view on humanity: Facebook exploits well known human behaviour to amplify content.

It's (unfortunately?) human nature to share shocking things, it may have even been evolutionarily advantageous at some point. Using algorithms to exploit this behaviour at a scale never before possible is harmful to humanity. No idiocy required.


I completely agree with everything you said.


It's much worse than giving the village idiot a megaphone. Facebook (and most other socials) prioritize content to maximize engagement, and (big surprise) the village idiot maximizes engagement. Facebook is a machine tuned specifically to spread hate and bad ideas because that's what maximizes the time people spend on Facebook.

I thought of a good analogy a while back. Let's say someone walks past you and says "hi" and smiles. Let's say someone else then walks past you and punches you in the face. Which interaction maximizes engagement? Well, that's the interaction and content that social media is going to amplify.

Social media companies are the tobacco companies of technology. They make billions by lobotomizing the body politic.


True, but the financial incentives of many (most?) companies don't align with public benefits (see pollution, plastics, diet, etc.). Why is social media singled out in our demands for it to act morally good, instead of just profitably?


I don't believe FB's algos truly maximize profit. I think they are short-sighted.

Imagine a food company did metrics that showed the most popular food was hamburgers, so they kept optimizing and optimizing trying to serve nothing but hamburgers. It arguably wouldn't work and they'd be giving up tons of market share and $$$$$$

FB (and Twitter) seem to be optimizing as if everyone wants the same thing. Unfortunately for them, when lots of others and I try their services, we find they're shoveling stuff at us we don't want. Instead of driving our engagement it drives us away.

There is tons of content I'd like to see on Twitter, but every time I check it - now down to about once a month - they shovel so much shit I'm absolutely uninterested in at me that I just remember why I don't look, and I leave. If they'd let me opt out of their recommendations my engagement would go way up, and over time I'd follow more and more people. Instead they just drive me away.

The same is true for FB though in different ways. They try to fill my newsfeed with stuff I'm not interested in hoping to get me to spend more time. Instead it just prompted me to delete the mobile app and only use the website on desktop since some extension helps me filter their crap out. My usage without the mobile app is probably 25% what it was and falling. Entirely their fault for removing choice and assuming everyone wants the same thing.


> Why is social media singled out in our demands for them to act morally good, instead of just profitable?

It's not. We're so used to being totally unregulated in tech that even minor oversight can be spun as burdensome.


The question, as always, is who watches the watchmen? We saw what happened with broadcast and telecom regulation for phone and radio/tv in the USA. ABC, CBS, and NBC held a virtual monopoly until cable news 60 years later. AT&T/Ma Bell did likewise. It was horrible and arguably retarded innovation and diversity of viewpoints for many years.


Supposedly they were working on detuning this effect.


> Lets say someone walks past you and says "hi" and smiles. Lets say someone else then walks past you and punches you in the face. Which interaction maximizes engagement?

Likely the first one. Could also lead to a literal 'engagement'.


Not on a busy streetcorner. A fistfight tends to attract much more attention than a passing greeting. Facebook is a very busy streetcorner.


If you're optimizing for time spent in the interaction, which is what FB does - then probably the punch in the face will cause you to stay on the scene for longer, whether to yell at the person, fight back, when you call the cops, etc.

The "hi" is returned with a "hi" back and you both continue walking.


The likely scenario is that the interaction stops the moment you are floored by being punched in the face. You may stay on the scene longer, but the perpetrator would likely take off.

Also, answering a hi and a smile with a hi and a smile _could_ in fact be the right way to optimize for time spent in this interaction, because it has low-probability but very-high-impact outcomes as far as time spent goes (dating/marriage).


> The problem is that Facebook is giving the village idiot a megaphone

What's interesting is that before Facebook, the only people who could afford a megaphone were either state sponsored medias or billionaires who owned TV stations and newspapers.

For the ordinary citizens, the only way you could be heard was to write a letter to the editor of your local paper. If the state/billionaire/editor didn't like you, your views or anything really (your skin color perhaps?) it would simply not get published, period.

With Facebook, a lot of that gatekeeping simply disappeared. It's interesting to see who has an interest in regulating Facebook and bringing back the "good old days" of media.


I think a better analogy is Facebook gave society a window into each other's lives, and people can't look away.

Facebook prioritizes what people want to see, and people want to see train wrecks and inflammatory content.


> people want to see train wrecks and inflammatory content.

I'm starting to believe this more and more, but what I can't understand is why? We know it has no real "nutritional value", yet we crave it anyway.

Are we just bored and desire entertainment and drama?

What's the evolutionary drive for drama anyway?


>Are we just bored and desire entertainment and drama?

Essentially yes. We evolved so that we get a chemical hit when we engage with social drama. If you are bored, have nothing better to pay attention to, and/or have low self-control, it is a very easy way to get a quick fix.

>What's the evolutionary drive for drama anyway?

Humans evolved as pack animals, paying attention to pack drama was extremely important. Picking sides or paying attention could mean the difference between getting your next meal or being beaten to death.

Because we evolved in small packs, where information was usually relevant, we don't have a good filter, or on/off switch.

Today we are exposed to the latest and most exciting drama from around the world, as opposed to our tiny pack, and it is really hard to resist paying attention.

Paying attention to anything in the news or on social media is unlikely to make an impact on your life. Even the biggest topics have a very low risk of impacting you personally, but you will notice that most of them have an explicit or implicit hook that they could impact you.


People want inflammatory content like a moth wants a flame. Facebook amplifying a signal from the lunatic fringe preys upon the need of non-lunatics (or different-thinking lunatics) to argue against ideas they consider dangerous or just wrong. As a side-effect, it makes the ideas appear more mainstream, which has the effect of making the ideas more popular. This further increases the compulsion of non-lunatics to address the ideas.

I'm not sure if that qualifies as being what people 'want' or not, but it seems like it's profitable.


>I'm not sure if that qualifies as being what people 'want' or not, but it seems like it's profitable.

People want it like an alcoholic wants a drink. Facebook is the corner store that sells to the public.

I think we agree on the mechanism. but the question is what to do about it.

People generally say, silence people I don't like, and blind others from seeing what I don't agree with.

The problem is that this is not a workable standard, because everyone has a different opinion on what should be censored.


I have no problem with them saying both things at the same time. You're responsible for what you give your attention to, and so is everyone else.


Ultimately, yes, but that's a rather short-sighted position to take when there's a cadre of psychologists and other highly trained people whose entire job is to entrap you further, just so someone can make (more) money.

E.g. when you buy items at the grocery store, do you consciously examine all the options, including the items on the bottom shelf by your feet? Or do you just go for the items at eye level, tricked by a similar group of psychologists into buying the product you've been trained to want? And even if you, personally, do examine everything, there's a reason product companies pay supermarkets to place their products at eye/arm level: it works.


Sure, but also Facebook has a bunch of doctorate-having engineers and psychologists dedicating hundreds or thousands of hours to figuring out a system that gets me to give my attention to Facebook, whereas I’m one dude who doesn’t even have a graduate degree, who gets tired and bored and struggles to sleep sometimes.


I am not sure how it goes for the average person. Myself, I just do not go to places where village idiots tend to accumulate, like FB, or if I do (hard for me not to watch youtube), I just completely ignore all that crap.


And that might be the most reasonable thing to do.

It seems like a lot of folks here imply, though don't exactly say, that they should be in a position to decide who is an "idiot", "bad-faith", "anti-science" and so on.


Should our society have free speech, or free speech for everyone except idiots?

If you agree with the second formulation, who do you think ought to be in charge of deciding who the idiots are? Surely Mark Zuckerberg would not be your first choice.

Maybe there is a third option: no free speech for anyone, all speech must be moderated for lies and misinformation. Is that what you want? In that case, who gets to decide what is true and what is not? Surely Zuckerberg wouldn't be your first choice for that either, right? And what should happen when Facebook blocks "misinformation" that turns out to actually be truthful?

Those who want Facebook to regulate "misinformation" and gatekeep who (and what) is allowed on the site need to admit that they don't actually believe in free speech -- they believe in limited speech regulated by corporations.


Take any of these arguments about Facebook, replace "Facebook" with "printing press" and everything still makes sense, which tells you what this really is:

Cultural elites wanting to control what their perceived inferiors think, believe, and most importantly, vote for.

The same class of people who wanted to regulate the printing press in Europe during the 15th and 16th centuries are the ones who want to regulate the internet today.


To be fair there is also a large contingent of well-intentioned people who don't realize the full implications of what they are asking for.

Ironically many of those people would say they oppose the concentration of corporate power, yet they are asking a very large capitalist corporation to exercise power over one of the most fundamental freedoms.


I believe speech should be free but people should be responsible for their speech.

People behave completely differently when there are consequences to what they say.

Speech for "everybody but idiots" is not free speech.


What should the consequences be?

Removal of speech is not a consequence of speech -- it's preventing speech in the first place. That's what happens when Facebook blocks or deletes "misinformation" -- they are removing the speech itself. That's not the same thing as "consequences" for speech.

Look at what HN mods do -- they ban trolls, but they don't delete what the trolls posted. It's there for everyone to see -- in fact, if you look at "dead" comments you can see flagged stuff too. In terms of free speech, that's very different from deleting the comments entirely, which is what people seem to want Facebook to do.

And for the sake of argument, even if we accept that "consequences" ought to include the right to free speech being taken away from bad actors -- who can be trusted to decide who ought to be punished? Again, surely not Facebook. Surely not the government either -- the winners of every election would punish their enemies by taking away their rights. So even if we could tell, 100% reliably, who were trolls and who were not, we still should not give any corporation or government the power to take away the right of free speech.


Yes, removal is not a consequence.

What should the consequences be? What are the consequences when you say something stupid to your family or friends? What should the consequences be when you knowingly lie to slander somebody?

> And for the sake of argument, even if we accept that "consequences" ought to include the right to free speech being taken away from bad actors

This already exists in the law. Just as your right to move freely is taken away in certain situations (for example due to restraining order).


> What should the consequences be?

The usual when you get up in front of a group and act like a jackass: social shame and ostracization. Something that's hard to do on internet platforms. Even when people are not anonymous, they have plenty of ways to "hide", and it's easy to unfollow and block people who criticize you for spreading misinformation.

So I don't know. Removal and deplatforming, IMO, is not the answer. You don't fix extremism through censorship; that just makes it worse and drives it underground.


Facebook should ban or suspend accounts which spread objective untruths that will tend to be harmful if spread.

You can have your free speech on your own website.


Objective untruths like COVID being the result of a lab leak?


This.

People who support these kinds of activities are too youthful and arrogant to have any form of humility about things they once passionately believed to be true turning out to be incorrect.

In the 90s, eggs were thought to be as deadly as cigarettes. A bowl of Cheerios was considered vastly superior, nutrition-wise, to a plate of eggs. This is the opposite of what we know to be true today. If I had tried to argue against this on the current form of Facebook, I'd have been censored. (They also thought avocados were bad for you in the 90s.)

The elevation of a collectively determined "objective" truth over the freedom of individuals to exchange ideas is the first step towards creating an environment for authoritarianism to flourish. Subjugation of the individual to the collective is the norm in most of history, and it's not an accident that our current prosperity emerged in the times and places where it was lifted.


> People who support these kinds of activities are youthful and arrogant enough to have any form of humility about things they once passionately believed to be true turning out to be incorrect.

It's not just that. Usually the folks you'll see on forums like HN, or many other tech spaces, aren't the people who would have had trouble having their voice heard in a pre-internet age. They're usually the (generally not brown or black) children of wealthy families, many of them being 2nd generation scientists or engineers, and would have had privileged access to publishing houses, financial news, or high quality PSTN lines. This is why these spaces usually see so much more of the bad associated with these technologies than the good. The old status quo was very much a benefit to them, their families, and the milieu they grew up in.

Ask someone who grew up or had family in a country where PSTN over copper was a crapshoot and where sending exorbitantly expensive telegraphs and mail out of the country were really the only ways to get your voice heard, and you'll hear a very different story.

That's not to say Facebook, Twitter and other social media companies have society's best interest in mind (I doubt they do) nor that they are impartial operators (I really don't think they are), nor that they shouldn't be regulated as any other neutral communications carrier, but simply that this is one of the main reasons why tech forums tend to hate so much on social media.


If your basis for believing it's the result of a lab leak is that it would make a really juicy story, with no basis in reality, then it's an obvious untruth at the time of telling, even if it later turns out to be true.


I want free speech for everyone except idiots.

> who do you think ought to be in charge of deciding who the idiots are?

Think about it. Engineering disciplines have mostly solved this issue. Let's take structural/civil engineering and something that affects many people - bridges. Through a combination of law, codes, and government, not just any Joe Schmoe can build a bridge. Existing bridges generally work well and can be trusted. Sometimes bad things happen, like the FIU collapse, but generally that's very rare.

I don't understand why there can't be a group of people, large or small, educated and from diverse backgrounds, that can set basic standards on what is and is not misinformation, with due-process-like mechanisms such as appeals, etc. It's not an impossible task.

> Those who want Facebook to regulate "misinformation" and gatekeep who (and what) is allowed on the site need to admit that they don't actually believe in free speech -- they believe in limited speech regulated by corporations.

If you're going to use a third party for communication, and that third party is not owned by the people (i.e. a government entity), then it follows from the above statement that you don't believe in private property rights.


How do you ensure that this Ministry of Truth you are proposing will remain free of political pressure and corruption?

And how do you expect a panel of experts to escape groupthink and rule fairly in cases where the expert consensus turns out to be incorrect?

Both of those goals are impossible to achieve.


There is a difference between defining absolute truth and identifying obvious lies.

If a medication is approved and then a new risk factor is identified, the issue goes from unproven to validated; but as long as it wasn't based on nothing, it was never a lie.

The covid vaccine containing a chip to track you was always a lie.

We don't need a Ministry of Truth, we need a Ministry of Obvious Bullshit.


Under the -- very rocky -- assumption that a Ministry of Obvious Bullshit could still avoid scope creep into moderating truth, I still don't think this would fix things. People who actively want to spread bullshit will find a way. It's like spam vs. anti-spam, ads vs. ad-blockers.


People who want to commit murder might still find a way too but we still have laws, police, prosecutors, etc. Perfect should not be the enemy of good.


That's quite the leap from moderating information to a murder investigation.


Optimum number of murders is 0. So we take steps to prevent murder. This doesn't make the number of murders 0, but we take the steps because we want to reduce murder as much as possible, because we've agreed it's unacceptable.

Substitute "online misinformation postings" for "murders" above.

The point is just because something is difficult to eliminate is not a good reason to give up trying to eliminate it.


> I don't understand why there can't be a group of people, large or small educated and from diverse backgrounds, that can set basic standards on what is and is not misinformation, with due-process like things such as appeals, etc. It's not an impossible task.

This seems incredibly naive to me. It seems what would happen there is we'd give people a list of important figures to bribe to have their speech considered information (or a competitor's speech considered misinformation).

And even if these were incorruptible humans, there are several statements currently under heavy debate as fact or fiction: the validity of neopronouns, whether Spanish speakers anywhere use Latinx, who the bad art friend is, whether Kyle Rittenhouse should've been convicted of murder, whether trans children exist and, if so, whether they can undergo any transitioning or puberty delay, what critical race theory is, whether the Covid vaccine rollout speed is sinister, and whether Trump lost the 2020 presidential election legitimately. And that's all just in America.


> It's not an impossible task.

Ok, set it up and then maybe removing idiotic speech can be considered. Until then you have nothing but a desire to define what is undesirable.


Your comment can be loosely translated to the following:

"Authoritarianism can work. It's just the wrong people were in charge. If me and other people like me had the same power as Stalin/Hitler/Mao/Mussolini/Putin, everything would be better because they were dumb, but we know better. We can create Utopia when nobody else could. We are uniquely prescient and intelligent."

The amount of arrogance and utter lack of humility is shocking.


If you thought the world is bad with aggressive bullies in power, wait until you see how bad it gets with aggrieved nerds in power.


Twitter and Youtube, sure.

But the blast radius of a Facebook post doesn't have the same reach given the majority of posts go to your explicit network of connections. Unless you're specifically referring to Facebook Groups? But then are we certain it's different from Reddit or other forums?


Facebook Groups and Pages create ways for people to share content, triggering exponential growth (e.g. a user shares a meme to their page so that their friends see it; their friends choose to re-share; wash, rinse, repeat).
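That re-share cascade can be sketched as a toy geometric-growth model. All the numbers here are hypothetical (each share reaching 200 friends, 2% of viewers re-sharing); it's only meant to show why "wash, rinse, repeat" turns one poster into tens of thousands of views:

```python
def cascade_reach(fanout=200, reshare_rate=0.02, generations=5):
    """Cumulative views of a post re-shared in waves.

    Each sharer's copy is seen by `fanout` people, and a fraction
    `reshare_rate` of viewers share it onward. Purely illustrative;
    real sharing graphs are far messier.
    """
    sharers = 1        # the original poster
    total_views = 0
    for _ in range(generations):
        views = sharers * fanout        # this wave's audience
        total_views += views
        sharers = views * reshare_rate  # seeds for the next wave
    return round(total_views)

reach = cascade_reach()
```

With these toy numbers, each generation multiplies the number of sharers by fanout × reshare_rate = 4, so reach grows exponentially even though any single person's audience is small.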


That's too simplistic and naive: their algorithms amplify what will get the most clicks!


I think it's not so much Facebook alone but the entire Internet. The connectivity between humans is suddenly increased manyfold, and reaches much wider. Imagine using a graph layout tool on a giant graph with only a few localized connections. Likely the picture will show evenly distributed nodes without much movement. But then, as you dump all these new edges onto the graph, the nodes start to move into rigid clusters separated by weak boundaries. I think this is what's happening with the red/blue, vax/antivax, etc. groups.
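A minimal sketch of that intuition, under entirely hypothetical parameters: 200 nodes with random red/blue "opinions", a sparse ring of local ties formed regardless of opinion, then a flood of long-range edges biased toward agreement. Measuring the fraction of same-opinion edges shows the graph shifting toward the rigid clusters described above:

```python
import random

def homophily(edges, opinion):
    """Fraction of edges whose two endpoints share an opinion."""
    same = sum(1 for a, b in edges if opinion[a] == opinion[b])
    return same / len(edges)

random.seed(0)
n = 200
opinion = [random.choice("RB") for _ in range(n)]

# Pre-internet: sparse local ties on a ring, formed regardless of opinion.
local = [(i, (i + 1) % n) for i in range(n)]

# Internet era: dump many long-range edges onto the graph, but biased --
# a cross-opinion edge only survives 10% of the time.
extra = []
while len(extra) < 1000:
    a, b = random.randrange(n), random.randrange(n)
    if a != b and (opinion[a] == opinion[b] or random.random() < 0.1):
        extra.append((a, b))

h_before = homophily(local, opinion)         # near 0.5: mixed neighborhoods
h_after = homophily(local + extra, opinion)  # much higher: echo clusters
```

A force-directed layout run on `local + extra` would pull the two opinion groups apart into distinct clusters, since most edges now connect like-minded nodes.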


The internet alone doesn't connect people. Remove things like Facebook and Twitter, and how do you get this giant interconnected graph with few localized connections?


Given a network like the Internet, things like Facebook and Twitter naturally emerge.


Yeah, I always hear people talking about the great "global village" where everyone is 'connected', but I have to admit I am against it. I don't want to be prank called.


> the village idiot

One man's terrorist is another man's freedom fighter.


Right. Prior to social media, people were vetted in many ways and in every context in which they gained an audience (e.g. earned standing in social settings and community groups, promotions at work, editors of one sort or another when publishing to a group, etc). Audiences grew incrementally as people earned them. Social media removed all that vetting and inverted the criteria for growing an audience: sensationalism was rewarded over thoughtfulness. So one of the most important tools we've always relied on to judge information was removed. Hard to believe, as intelligent as these folks at Facebook/Meta are said to be, that they don't understand this. Feels disingenuous.


It is difficult to get a man to understand something when his salary depends upon his not understanding it.

- Upton Sinclair


The problem is that Facebook is giving people earplugs. Biases and minority opinions get clustered together in huge echo chambers, eliminating the averaging influence of wider society.

This has helped both valid and invalid minority opinions to be heard.

What wasn't there was critical thinking on behalf of people who were already overwhelmingly exposed to mass political marketing and had developed a pseudo-Asperger response. I will agree for once with the Facebook exec: political philosophy has pretty much come to the conclusion that, since there is no unique definition of good or bad, there is no algorithm that can decide it.


The problem is that Gutenberg is giving the village idiot a megaphone. Gutenberg can't say:

- Amplify your commercial business message to billions of people worldwide.

AND at the same time

- Well its your individual choice whether or not to listen to the village idiot.

You guys gave them a megaphone, how do you expect society to behave?!


So should there be a special tax on "megaphones" like Twitter, Facebook or YouTube? What exactly is the legal framework under which these companies could be scrutinized? Normally the manufacturer of megaphones does not get sued when a person uses one to promote hatred on a village square.


I think the megaphone is thus more of a metaphor than it is an analogy. Or at least, like most analogies, it breaks down under even the lightest pressure. For it to be an analogy, it'd have to be a megaphone manufacturer that also brings the audience together. Maybe Facebook is the megaphone AND the village square AND then some.


That’s what’s challenging about this situation. We’re experiencing a fairly new problem. It hasn’t before been possible for a member of society to communicate with all other members of society at the same time, nor has it been possible for a member of society to get addicted to a curated feed of random (sometimes anonymous) folks spreading their ideas globally.

All of these things seem new to me:

- Global, direct communication with all members of society.

- Addictive design patterns in software.

- AI-curated news feeds based on increasing engagement.

- Anonymous conversations.

Since it’s new, society doesn’t have frameworks to think about this kind of stuff yet.


>That’s what’s challenging about this situation. We’re experiencing a fairly new problem. It hasn’t before been possible for a member of society to communicate with all other members of society at the same time, nor has it been possible for a member of society to get addicted on a curated feed of random (sometimes anonymous) folks spreading their ideas globally.

This comment could have been taken more or less word for word from the diary of a monk who lived in the 1500s.

We've been through this before.


I think scale matters, though. In the 1500s (through much of the 1900s, even), most people were still mainly exposed to the viewpoints of people and groups who were physically local to them. Your local newspaper and (more recently) local TV news was a product of local attitudes and opinions. Certainly all of those people were not a member of your "tribe", but many were, and there were limits as to how far off the beaten path you could go.

If you had some wacky, non-mainstream ideas, you self-moderated, because you knew most of the people around you didn't have those ideas, and you'd suffer social consequences if you kept bringing them up and shouting them from the rooftops. Even if you decided you'd still like to do some rooftop-shouting, your reach was incredibly limited, and most people would just ignore you.

Today you can be exposed to viewpoints from every culture and every walk of life, usually with limited enough context that you'll never get the full picture of what these other people are about. If you have crazy ideas, no matter how crazy, you can find a scattered, distributed group of people who think like you do, and that will teach you that it's ok to believe -- and scream about -- things that are false, because other people in the world agree with you. And the dominant media platforms on the internet know that controversy drives page views more than anything else, so they amplify this sort of thing.


In a similar sense to how if you've experienced a low-pressure shower you've experienced Niagara Falls, sure, we've been through this before.


I don't understand how a targeted tax would help at all here.


[flagged]


The above is SCOTUS misinformation.

WASHINGTON — The Supreme Court ruled on Friday that abortion providers in Texas can challenge a state law banning most abortions after six weeks, allowing them to sue at least some state officials in federal court despite the procedural hurdles imposed by the law’s unusual structure.

But the Supreme Court refused to block the law in the meantime, saying that lower courts should consider the matter.

https://www.nytimes.com/2021/12/10/us/politics/texas-abortio...


Senate Bill 8 is still in effect. The above is SCOTUS disinformation.

https://www.kvue.com/article/news/politics/texas-this-week/t...

"So, essentially, the Supreme Court left the law in effect. We were expecting to possibly see them limit the enforcement of it because that was the biggest concern that Supreme Court, Supreme Court justices seemed to have. And that enforcement, of course, is allowing private citizens to sue, under the law, anyone who aids and abets an abortion for at least $10,000 damages, if won. And so, that's where it kind of was being targeted today. And the Supreme Court essentially put that back on the U.S. District Court, allowing that lawsuit to resume to determine the constitutionality of the law."

Or TLDR they left it to the states. Can't wait to see how the states run with that concept.


"ruled that it's okay" != "left the law in effect"

"U.S. District Court" != "the states"

Whatever a state statute may or may not say about a defense, applicable federal constitutional defenses always stand available when properly asserted. See U. S. Const., Art. VI. Many federal constitutional rights are as a practical matter asserted typically as defenses to state-law claims, not in federal pre-enforcement cases like this one.

https://www.supremecourt.gov/opinions/21pdf/21-463_3ebh.pdf


It's not a megaphone, the only people that can see it are literally the village idiot's friends and family. It's gossip within your social circle.


But village idiots can share content into their social circle from other more popularly known village idiots.


So you don't know a guy who spouts off Sean Hannity/Don Lemon talking points to you every time you have a beer with him?


I don't drink beer, so no. But if I did go to bars and drink beer, I'm sure I would meet someone like that even in deep liberal Seattle.


I wouldn't blame megaphones for the fact that "idiots" use them. Nor would I expect megaphone manufacturers to dictate what messages can be amplified using them. Nor would I expect megaphone retailers to determine somehow whether a person was an "idiot" before selling them a megaphone.

If someone uses a megaphone in an anti-social manner, that's a matter for the police to handle.


Analogies are nearly useless in making an argument. Facebook is an online platform with real-time access to users' communications, plus metrics and analysis of how it's used, which allow it to make reasonable predictions about how it's going to be used in the future.

Comparing it to dumb hardware is ridiculous.

Their ability to predict the negative effect of amplifying crazy creates a moral imperative to mitigate that harm. In case you don't understand: there is a difference between the platform allowing Bob to tell Sam a harmful lie, letting Bob tell 200 people who each tell 200 people, and algorithmically promoting Bob's lie to hundreds of thousands of people who are statistically vulnerable to it.


So I think this is a breakdown of our previous mindset on the matter. I don't know what future is "right", what the answer is, but I think it is important for us to at least recognize that in the past, a crazy person on the street corner was limited quite a bit in velocity.

This megaphone is a poor example, imo. A far better example would be broadcast television. We're now broadcasting everyone straight into not just American homes, but homes worldwide.

So I ask, because I don't know: how does broadcast television differ from a megaphone in its requirements? What responsibility is there on broadcast television that doesn't exist for a street corner?


Is it a problem, however, if the megaphone manufacturers specifically look for people who spread misinformation, and sell them the loudest megaphones with the most reach?

FB has not directly done that, but they have consistently refused to acknowledge that the people who create the most "engagement" (aka money for FB), and who therefore get the biggest megaphones, tend to be the types of people who generate false information and outrage.

Their publicized efforts to shut down fake accounts and pages set up specifically to spread misinformation are perfunctory, simply something for them to point at and say, "see, we're doing things to fix the problem", when they're merely playing whack-a-mole with symptoms. They know what the root of the problem is, but refuse to fix it because it's their cash cow.


Would you expect megaphone manufacturers to give souped up models capable of drowning out other megaphones to only the most controversial, destructive people?


It isn't the village idiot; it's the insidious manipulator who influences village idiots at industrial scale now.


Internally, Facebook works aggressively to combat covid misinformation (source: I work at FB). Literally most of the commonly used datasets are about it. It's easy to hate and hard to understand.



