I'm an outlier, I admit it. I don't think I'm Zuck's ideal customer, either.
I use FB to communicate with friends & family. That's it. Most of them post very, very seldom, and the really chatty ones, I just mute. Most of these people, I'd never hear from and they'd never hear from me, except for FB.
Whenever I read about FB's treatment of news, politics, and vax (mis)information, I think "why are you going to FB for that? There are a zillion better places on the Web."
Twitter is better for getting the political Zeitgeist. RSS is not dead, contrary to what you might have heard, and Feedly is thus better for almost every kind of news. This site, Stack Exchange, and Quora are better for more intelligent commentary. Why would I even try to use FB for any of that?
My cousin Ray had been (I thought) permanently estranged from his family, and he finally reconciled with them. I wouldn't have found out about that without FB. That's what it's good for. It also would not make Zuckerjerk nearly as rich.
Can Quora still be described as a locus for intelligent discussion? A few years ago it was the place to go for in-depth yet concise explanations from experts in all kinds of fields, and for some very interesting perspectives on different subjects.
Lately I've visited Quora only to find a series of inane replies and silly questions dominating the website.
> A few years ago it was the place to go for in-depth yet concise explanations from experts in all kinds of fields, and for some very interesting perspectives on different subjects.
If by "a few years" you mean "a decade", then yes. Quora went down the drain between 2012 and 2013.
There's certainly a lot of that. However, there are still people who really know what they're talking about, who aren't just repeating what they read elsewhere on the Web.
Should 'everyone', including cousin Ray, be on one single platform?
Platform size matters here. When 40% of the population of an underdeveloped country is on one social platform and uses it as a gateway to the Internet, it becomes a viable genocide-aiding tool. When 70% of the population of a developed country uses one platform, it becomes a viable election-interference tool.
When a private entity has accumulated such power, then it goes against the ethos of democracy.
People made noise about FB being used to commit genocide, and it is now proactively blocking friend lists in countries where there could be targeted violence. Agencies investigated FB's role in election interference, and now it's taking a proactive approach to vetting political ads.
Constant vigil over such powerful private entities and calling it out is the last option left to defend the well-being of the society at large.
I don't think Facebook is going to be able to pull this off. While Zuckerberg is clearly a talented individual, the area in which he has consistently shown himself to be most lacking is self-awareness. Unfortunately for him and Facebook, this is perhaps the most critical ability one needs when defending one's image.
Forgive my ignorance, but how is it overblown? The article discusses Facebook deciding to inject positive stories about itself into the news feed so that people think, "Oh, maybe they're not so bad."
That's what the linked image is designed to do. That it says "from Facebook" is beside the point, and the article even says they would say it was from Facebook: "People essentially see posts with a Facebook logo that link to stories and websites published by the company and from third-party local news sites."
The fact is that they're purposely injecting fluff pieces into news feeds to try to get people to forget negative news about the platform. The only thing the linked tweet does is confirm what's reported in the article.
Edit: Previous tweets in that chain from Joe also don't lend themselves to trust. First, his comment that they didn't quote him saying, "There is zero change to News Feed ranking." The article never claimed the ranking would be modified, so there's no reason to quote him denying a claim it wasn't making. It only said that more puff pieces would appear, not that their ranking relative to other content would be adjusted. All it said was that you'd see more of them, and according to Joe's confirmation of the program's existence, that's true.
Second, his comment about how the meeting never happened. The article cites six current and former employees as having discussed that meeting. All he offers as proof that it didn't happen is that the article doesn't name the attendees or the sources.
I'm sorry, but that entire thread holds almost no water.
You can see from comments here on hackernews that at least a few people interpreted this as a change to their ranking algorithm. Writing ambiguously to support a narrative is also misreporting.
I know anti-FB sentiment is strong on hackernews. But seriously, look at that screenshot and tell me what's wrong with it. How is that different from promotions by Apple on iOS/the App Store, Microsoft on Windows, Twitter, Google, YouTube, etc.? They are just promoting their company on their own platform. When did your Facebook newsfeed become this untouchable sacred place where nothing besides chronological posts from your friends is allowed?
I want to hate facebook as much as anyone else but I really don't see news here.
> Edit: Previous tweets in that chain from Joe also don't lend themselves to trust.
That tweet is from @joeosborne "a @Facebook spokesperson." Not someone who I'd trust to accurately and fully characterize Facebook's actions. Given their prior history, I'd expect something misleading that attempts to portray them in the absolute best light.
He also seems to think journalists should take statements from company spokesmen as the absolute truth. That's not true, and especially not true for a company as untrustworthy as Facebook.
If they were clear, open, and transparent about this, then the article probably doesn't get written.[1] But because they didn't announce what they're doing or the scope of it, they leave a lot of room for interpretation, and right now a whole lot of people, across the US and the world, are not interpreting Facebook's actions with any sort of benefit of the doubt. That is largely because of Facebook's well documented history.
This is why intangible things that don't directly show up on a profit and loss statement, like 'consumer trust', are so valuable. The difficulty that Facebook faces now is that when you lose trust with key stakeholders (politicians, regulators, the media, etc.), it is really hard to get it back. Microsoft did, eventually, but it took ~15 years and two changes of CEO. If Zuckerberg wants to stay the CEO it might prove impossible to rebuild trust, because people have state and it's really hard to reset that state.
[1]: One of the things that the Trump administration showed is that American journalists are very good at ferreting out secrets, but if you openly declare that you are doing something, they react with much less interest. But, as the article documents, Facebook has decided to be more controlling of their data, and curate what they release much more carefully. The fiasco where they decided to kill some reporting, then held off for a month on their own reports because it would have shown that Covid vaccine FUD was the most popular article on Facebook is exactly the sort of behavior that journalists salivate over and can make careers. Just releasing the data was a minor irritant every day, but Facebook managed to turn it into a much greater embarrassment that destroyed trust with the media and academics.
Even this article, which claims to be sourced to multiple fairly senior executives who disagreed enough to leak this to the NYTimes, is a clear sign of the massive problems facing Facebook.
Manipulating the News Feed to show users pro-Facebook articles is...so obviously and blatantly unethical that I'm surprised so many people were willing to authorize it.
As someone who works at a not-small tech company, I try to practice the whole "people in glass houses shouldn't throw stones". After all, most tech companies exhibit some behavior that is objectionable. But Facebook... I honestly don't know how their employees are comfortable working there.
> I honestly don't know how their employees are comfortable working there.
Probably most of the people who don’t feel comfortable with facebook’s role in society have already selected themselves out of the employee pool. If you’re the sort of engineer Facebook wants to hire, and you have a strong sense of ethics, there’s no shortage of well paid work you could do instead.
The people who end up at FB are the people who don’t see a problem or don’t care.
I understand the sentiment but don’t think that’s fair. Not everyone gets to choose from their pick of job offers, and some people have serious financial commitments and dependents. If FB is the only job offer you have that pays the bills, is it really wrong to take it?
I have FB on my resume. I make hiring decisions. I will delete any applicant who was at FB over the past three years without a moment's hesitation. You do not take an FB offer to 'pay the bills', you take it because you want to acquire a big pile of money.
There was a point pre-IPO where you could think that you were actually trying to do something cool, interesting, and overall were helping people stay connected with each other. Then there was a point where you could tell yourself that you were working on some big and interesting problems and if you didn't do your job well or nail some very tricky problems then it would have a negative impact on people's lives (or even get people killed if you messed up security/privacy and revealed info about some people.)
Then one day you wake up and realize you are a prostitute working in Zuck's brothel, but unlike a real sex worker your actions are ending up fucking millions of people a day. If you don't leave at this point you are as unethical as the people around you.
Wow. So do you have a list of objective criteria you use to decide which list of companies is morally acceptable for your candidates to have worked at, or are you just coming up with an arbitrary list based on your emotions and limited personal experience? Isn’t this also incredibly hypocritical given that you admit to working at FB yourself in the past? Shouldn’t you technically fire yourself, or do you of course consider your circumstances an exception?
> do you have a list of objective criteria you use to decide which list of companies is morally acceptable for your candidates to have worked at
As a matter of fact I do.
> Isn’t this also incredibly hypocritical given that you admit to working at FB yourself in the past
It is. I excuse myself by saying that it was creepy but not overtly evil at the point where I left. To be honest, I think I got out at just the right time to have a small shred of morality left, but of course I would say that and others may differ. I am sure lots of people do not care, and the fact that my work at FB in its early days gets me interviews shows this to be a widespread opinion, but I have been inside the factory and have seen how the sausage is made. I have a date I keep in my head, and if you were at FB after that point you would be a 'no hire' for me.
Yeah. So you worked at FB, made your dirty money, got your new role, and are now demonizing others who made the same decision as you. But it’s different for you, or course, and easy to do now that you’ve made your cash. You are just as immoral as the candidates you are rejecting, if not more.
Your childish belief that companies are static entities and that the moral calculus of a job doesn't change over time makes this entire conversation a fool's errand. Searching for simple black-and-white answers to complex problems never seems to work well. Sorry you were never good enough to make the cut and will never face these kinds of questions, but the simple fact is that at one point it was different, and the company grew into something else entirely.
> If FB is the only job offer you have that pays the bills, is it really wrong to take it?
At least on the engineering, product management, etc. side of things, if you can get an offer at FB, it's highly unlikely you can't get offers elsewhere.
Of course, there are only so many companies that can offer FB level compensation. Which, if we're being honest, is why FB continues to be an employer of choice for so many people despite its atrocious ethics track record.
They pay extremely well. I have a friend who moved to Facebook and complained all the time for the first couple of months, but by the third month or so he was saying that even with London living expenses, the money is really great.
Why are you using a throwaway to mischaracterize what the reporter wrote?
He literally wrote: "So to be clear: Project Amplify, in this iteration, does not involve any algorithm changes to news feed (which we don't say in the story). But it still involves Facebook pushing pro-FB content into News Feeds in a pre-determined fashion to boost its reputation."
The feed ranking wasn't changed, but the News Feed is being manipulated by adding pro-FB content to it, and he stands behind what he wrote. The article never said the News Feed algorithm was changed.
and don't forget shelving their own commissioned reports that didn't fall in line with the pro-Facebook narrative:
"Mr. Schultz argued that Facebook should publish its own information about the site’s most popular content rather than supply access to tools like CrowdTangle, two people said. So in June, the company compiled a report on Facebook’s most-viewed posts for the first three months of 2021.
...
A day before the report’s publication, Mr. Schultz was part of a group that voted to shelve the document, according to the emails."
But what company out there releases internal reports that would not show it in a good light? Have you ever worked at a company that just airs out its dirty laundry like that? This is an unreasonable expectation that is applied to no other company.
I don't agree with the factual claim: IT industry businesses publish their flaws and internal investigations into them all the time (about massive security failures, for example). If a plane crashes, the manufacturer and the airline are expected to be very open. Facebook has great power over our communities, and thus we have very high expectations.
The parent's 'expectation' that businesses (or individuals) only act in their own immediate interests, and not in the interests of their communities, reflects an odd contemporary idea that humanity's worst instincts are the 'true' and necessary ones. Humanity has many instincts, good and bad, and it's our free will to choose between them. It also ignores even 'enlightened self-interest': our society is falling apart, and social media plays a major role; you don't sh-t where you eat.
Not saying it's right, but all the tech companies do it. Blind, for instance, will remove a post within minutes if you create one saying 'Blind is censoring our posts', as they did with me. Surprised I didn't get banned, but maybe that's coming.
It certainly feels like a Hail Mary attempt to save their image, especially given the recent media barrage. Anecdotally at least, it feels like most of my friends who used the platform 5 years ago are no longer active on it.
What does that have to do with anything? That's not what's described in the article.
It's "an informational unit clearly marked as coming from Facebook". It's similar to the units that facebook has for vaccine information or voting information.
The traditional media is FB's competition, though. Why should a company be obligated to host links to negative publicity paid for by a competitor? I assume you don't think that the NYT should be required to print anti NYT articles written by FB, right?
The comment you responded to said "unethical"; there are many things that are unethical that are not illegal (though I am not claiming this should not be, as Facebook is supposedly providing a communications platform here... but it doesn't matter if it should or shouldn't be: ethics is not law).
You are right. But I don't think it's unethical either. Just like I don't think it's unethical that the NYT doesn't run anti NYT articles written by FB.
edit: When it comes to censoring content by political opinion, I lean more towards calling it unethical, but I think it's a step too far to expect any company to allow its platform to be used to spread attack pieces against itself.
The NYT and FB are two different kinds of entities, and I don't think you can judge similar (hypothetical) actions by each of them the same way.
The NYT is a news outlet. They publish what they believe is newsworthy, interpreted with whatever biases they might have. They don't just take random things that other people have published (or want to publish) and publish them.
FB is a social media site. They don't "publish" things; they provide a platform for their users to share things that interest them (and a platform for advertisers, but that's not really relevant here). If a user shares something that praises FB and another user shares something critical of FB (let's assume for the moment that both the praise and criticism are accurate, to preempt the "misinformation" argument), it is absolutely unethical for FB to over-promote the praise and bury the criticism. That's pretty much the textbook definition of unethical.
Facebook doesn't present itself as a media company to its users though. It presents itself as a social platform allowing people to communicate, and the expectation from the user is that you will see content that is popular and/or being shared by your friends without any editorial control/oversight such as downranking "inconvenient" truths.
It's also not done transparently, it would be one thing if FB turned around and outright said "this is inconvenient to our business, so we won't allow you to post it", but in this case they silently downrank the content behind the scenes.
> you don't think that the NYT should be required to print anti NYT articles written by FB
Facebook isn't journalism; it's not comparable. The NYT does publish results of internal investigations into flaws, as do many other businesses, including much of the IT industry (all those open after action reports, for example).
Openness and transparency are necessary antidotes to concentrated power.
considering that, as is known, publishing negative stories about facebook only makes them more popular, maybe they should start publishing positive ones again?
It's really gotta sting when, as a young man, you create this technological powerhouse only to be cursed to preside over its decline into a malevolent, warmed-over AOL.
I helped my mother fix her access to facebook and it's one of my biggest regrets in life. She leaned hard into the covid misinformation that facebook spreads, turned into a delusional anti-vaxxer, and I can't help thinking that my innocently helping her fix her account is ultimately going to lead to her death.
"But at least one of the decisions was driven by Mr. Zuckerberg, and all were approved by him, three of the people said."
"Mr. Zuckerberg, who had become intertwined with policy issues including the 2020 election, also wanted to recast himself as an innovator, the people said. In January, the communications team circulated a document with a strategy for distancing Mr. Zuckerberg from scandals, partly by focusing his Facebook posts and media appearances on new products, they said."
And beyond that, we all know the story of how and why Facebook was actually created. There was never any intent to create something that would have a positive influence on the world. I kind of doubt he's struggling much with his conscience.
To be fair everyone said stupid things when they were younger. The problem with Zuck is that he hasn't changed a bit considering his entire business is just getting "dumb fucks" (his words) to trust him at large scale despite a proven track record of abusing this trust.
Can't imagine the guilt that comes with that situation, though you definitely don't deserve it.
I think the amount of vaccine misinformation propagated by facebook will be the source of numerous reviews and studies in the future. It's overwhelmingly sad to see so many people die because they did not get vaccinated, presumably after being influenced by fb misinformation and cable news pundits.
As a side note, the content warnings on fb posts are almost laughable, as most include this blurb: "According to the World Health Organization" ... as though the WHO is reputable in the eyes of those brainwashed by anti-vaccine propaganda.
When people say "anti-vaxxer" I get so confused. They call people who are against vaccines "vaxxers", which is kind of stupid because that's like calling people who won't play ball "ballers". They just did the "-er" thing to dehumanize people, but it's created a lot of confusion, because now you hear "anti-vaxxer": are they against vaccines, or against people who are against vaccines? o_O
It's not always a term to dehumanize, but it is often used as a way to do so. Recognize nuance in the world. Characterizing someone is also a way to dehumanize them, FYI.
If you don't like a group of people and you want to get other people on board with hating them, just stick "-er" on some word and label them as such. It makes it much easier to garner hate for said group.
Frankly, having Facebook on your resume should disqualify you as a potential hire.
Because if you are willing to work for Facebook, the best case is that you are amoral. More likely, you are immoral yourself.
You are working on a platform that is directly responsible for propagating misinformation, often with deadly consequences. And don't get me even started on January 6th.
The founder of the company started Facebook as a site to just put up pictures of women so they could then 'rate' them.
The fish starts to rot at the head, and if that's the kind of 'moral compass' of the CEO, you know things are bad.
If you read this and you work for facebook. I urge you to quit today.
Facebook is not all bad. It is certainly concerning and some may hold the opinion that social media is tearing society apart, but there is at least as much positive about it as negative. People who work there go for the positive.
So Zuck rated girls in college. I’ve used such sites and behaved badly when I was younger as well. Nobody’s hands are spotless. That one thing certainly does not define him.
All of that said, I would not work at Facebook. They wouldn’t hire me anyway.
>So Zuck rated girls in college. I’ve used such sites and behaved badly when I was younger as well. Nobody’s hands are spotless. That one thing certainly does not define him.
Sure, one thing doesn't define someone. But in the instance of Zuckerberg, you've got "rating girls in college" followed by every other user-hostile thing his company has done since, so...
It is all bad. There is no redeeming quality about Facebook whatsoever. They have a net negative impact on the world.
The similarity towards religion in that regard doesn't go unnoticed, although I do admit that this is really out-of-scope.
The benefit that Facebook brings to its users does not offset the significant harm it does to others.
Facebook kills.
And that boys-will-be-boys argument is really telling about your own ethics/morality. Because I would not even dream of building a site on which I would upload pictures of women at my university to let other men rate them.
That is so profoundly unethical, if you don't understand that, you have a lot to learn about morality and ethics.
> Frankly, having Facebook on your resume should disqualify you as a potential hire
> And that boys-will-be-boys argument is really telling about your own ethics/morality
Yikes, you are incredibly judgemental about people you know nothing about. Not sure what sort of discussion you think this sort of behavior is going to spur, but I can pretty much guarantee it'll be unproductive.
What about those who have a change of heart and left facebook to try and do something better?
What about those in third-world countries who could only get in on a visa, and FB was the one that hired them? Could we fault them for choosing to leave a much riskier life to live the life of a FAANG developer?
Are people beyond redemption because of one bad decision, what about people that see the world differently?
You are not redeemed because you were poor and FB gave you a ticket out of poverty, because working for FB made you do things that made the lives of other people miserable.
I don't think we can hold it against the citizens that we, as a society in the US, are fortunate enough to have gained through Facebook's hiring. For all their myriad harms, Facebook engineers are no slouches. We definitely get talent from visa holders immigrating through their employment; further, I don't think we're familiar enough with their experience at home to say that they shouldn't take any opportunity they can get to make it to the US. For all our myriad problems, I do think we're a great nation.
However, excepting these employees alone, I do agree with you: Facebook employees are not good people.
Raising people out of poverty by building a platform that basically chips away at the fabric of society is beyond insane.
I can see why 'poor' people would jump at the opportunity, but that doesn't really excuse them from the basic morality/ethics.
Holding people from poorer countries to lower moral and ethical standards smells like a kind of latent racism to me too but maybe I should give you the benefit of the doubt.
I don't think it's racism, necessarily, because it holds true for our naturalized citizens of any race. You could charge me with some sort of dread nationalism. I still wouldn't agree, but it's closer to my espoused basis. Also, for context: I am the child of a naturalized citizen from a very poor country, which very much colors my opinion.
It's not so much that I hold these people to lower moral or ethical standards; it's moreso that my morals and ethics are entirely arbitrary, based on nothing more than what values I choose to hold. My standards are the same for everyone and everything, which is that my standard will be whatever it is at the moment I choose to query what my position is. Is this wildly inconsistent? Yes. Is it universalizable? No. But it's the best I've got, and it's what commits me the least.
Your employment decisions are significant as it pertains to your behavior and actions.
It's very reasonable to judge someone by who they choose to work for, specifically if there are repeating negative patterns.
If someone builds a career out of violating human rights, working on egregiously anti-privacy tech at several companies, there's a negative pattern there. It provides substantial insight into who they are and what immoral things they're willing to do to other people for the right price.
When you own the largest propaganda machine the human race has ever seen, might as well use it.
This follows the article on HN yesterday about Peter Thiel's take no prisoners, apologize for nothing approach he taught to his protege Mark Zuckerberg. It's also very Trumpy. We're seeing a new generation of trillion dollar companies flexing their muscles in broad daylight, rather than behind the scenes.
Everyone here seems to think FB is some horrible, unethical company, but what, exactly have they done? It seems to me that all FB is is a platform that allows anyone to connect with and talk to anyone else, post whatever they want, and it comes with something similar to a Netflix recommendation algorithm that guesses what other posts people will be most likely to want to see. And, yes, this can have negative effects, but how is it FB's responsibility if people want to see negative things and interact negatively with other people?
Facebook is an ad company that monetizes social interaction and manipulates the social sharing of information for profit. It isn’t just about people interacting negatively, it’s that negative interactions are directly profitable to facebook, they know this, and they act on it.
Then what makes Facebook different than any other social network? If this is what makes a company evil, then why aren't people up-in-arms about Reddit?
None of them (or none of the big ones, at least) are any good. Truth be told, I think a lot of the people up in arms over Facebook who are quiet about Reddit and Twitter use both Reddit and Twitter and don't want to stop using them, whereas they don't use Facebook. It's easier to stand on the soapbox when it has no effect on you personally.
That all being said, that still doesn't make Facebook good, or not evil. It's perhaps the worst offender of the lot.
The argument is usually that non-linear effects of the size of their network make the potential or actual harm Facebook do much greater than anyone else, therefore it makes sense to focus on them.
Calling them "evil" can be useful to mobilise people, but it clearly makes the conversation a lot less nuanced than it needs to be, and it's unfortunate that people often use that type of language.
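One way to see why size itself changes the calculus (a rough back-of-the-envelope sketch using a Metcalfe's-law style assumption, not any measured data): the number of possible pairwise connections, and hence the surface area for amplification, grows roughly with the square of the user count.

```python
# Rough sketch: potential pairwise connections grow quadratically
# with user count (an assumption in the spirit of Metcalfe's law,
# not a measurement of actual harm).
def potential_pairs(users: int) -> int:
    return users * (users - 1) // 2

for users in (1_000, 1_000_000, 1_000_000_000):
    print(f"{users:>13,} users -> {potential_pairs(users):,} possible pairs")
```

Under that assumption, a platform 1,000x bigger has roughly 1,000,000x the possible interactions, which is the usual intuition behind focusing scrutiny on the largest network.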
Because Reddit and Facebook really aren't all that similar.
Facebook is like the "book club" that is really a front for gossiping and drinking. You probably know the people you're talking to personally and it's more a conversation with them (that other people might be able to see).
Reddit is like a community message board. You don't know these people personally, so you get an inbuilt skepticism to what they say. It's mainly about sharing links to somewhere else and you can interact with it without reading anyone's comments.
Social media as an umbrella term really doesn't work because Twitter, Facebook, TikTok, Reddit, YouTube, etc all have different ways you interact with them. It would be like saying network TV, the newspaper, the community message board and a guy with a megaphone on the corner are all the same.
I'd disagree with the notion that people treat discussions on Reddit with more skepticism than Facebook. This is coming from someone who was a daily user for over a decade. Due to its low bar to entry and anonymous accounts it's probably astroturfed more than Facebook and such accounts are harder to distinguish.
You're probably right about astroturfing. It's part of the reason the Hail Corporate subreddit exists.
I'd be interested to see the ratio of lurkers to posters. Reddit is a link aggregator; its primary utility is unrelated to the comments. This isn't the case with Facebook.
Here are just a few examples of obviously unethical behavior by Facebook:
* Their own internal research shows that Instagram harms 1 in 3 users who are teenaged girls
* Lying to companies that paid to advertise on FB about the reach of their ads
* Letting advertisers discriminate based on FB users' race and disability status
* Letting Russian accounts pay for election ads in 2016 IN RUBLES
That's a lie. FB did not aid in anything. Google is not responsible for people who use Gmail to plan crimes, nor are ISPs responsible for the content of the packets that go over their wires.
Your comparisons are not even close: Email and ISPs are not amplifiers of content. They do not curate messages.
Facebook's algorithms absolutely were responsible: they amplified the propaganda that led to deaths. And their money-first-research-second approach at censorship is part of the reason.
Rather than turning off the service, they let it run wild.
The problem with this analogy is that Gmail and ISPs treat all traffic equally, whereas Facebook and other "engagement"-driven social media use algorithms to prioritize content that generates the most engagement (which often happens to be the most outrageous, offensive or divisive content).
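To make that distinction concrete, here's a toy sketch (the numbers are invented for illustration and have nothing to do with Facebook's actual systems): a "dumb pipe" keeps chronological order, while an engagement-driven feed re-sorts the same posts by how many reactions they provoke.

```python
# Toy model of the difference between a dumb pipe and an
# engagement-ranked feed. All numbers are made up for illustration.
posts = [
    {"id": 1, "age_hours": 1, "engagement": 5},    # recent, unremarkable
    {"id": 2, "age_hours": 8, "engagement": 900},  # old, outrage bait
    {"id": 3, "age_hours": 2, "engagement": 40},
]

# A dumb pipe (email, an ISP): newest first, no curation.
chronological = sorted(posts, key=lambda p: p["age_hours"])

# An engagement feed: whatever provokes the most reactions wins.
engagement_ranked = sorted(posts, key=lambda p: p["engagement"], reverse=True)

print([p["id"] for p in chronological])      # [1, 3, 2]
print([p["id"] for p in engagement_ranked])  # [2, 3, 1]
```

The outrage-bait post jumps from last to first without any change to what anyone posted, purely as a consequence of the ranking objective.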
So by that logic, if a TV station chooses to broadcast content that propagates racist misinformation and calls for violence, but the station only does it because that's what they algorithmically determine gives them the most viewership, then it's fine?
And yet they are all responsible for reporting CSAM found on their servers. These platforms are not dumb pipes, the ability is already there to decide what is allowed and what is not.
I do tend to agree that FB gets more flak than seems reasonable; it's hard to expect any company with a profit incentive in this space to do much better. Though IMO Twitter is an example of better handling of misinformation, propaganda, toxic interactions, and more.
> it's hard to expect any company with a profit incentive in this space to do much better.
That’s crap. Facebook could easily choose to prioritise the long-term health of society over their advertising profits today. Everyone with a seat at the table (and all the engineers on down) lives somewhere between comfortably wealthy and obnoxiously wealthy. Zuckerberg and all the engineers involved are making active choices, every day, to line their pockets instead of cleaning up their act.
This is the thing that makes me sad about capitalism as a whole. The only incentive is to make more and more money, and if you have to do unethical things to do that, you do them, and you're rewarded for it.
It's just not that common for executives at a company to forego revenue in service of the common good. Especially at companies like FB, where exploiting the commons is what makes them money in the first place.
Bryan Cantrill had a great rant about this a few years ago. He made the point that if you go back to the 60s, the corporate values of basically every company had "integrity" right near the top. Plenty of capitalist companies past and present choose to act in ethical ways.
This isn't an essential problem with capitalism, any more than food poisoning is an essential problem with restaurants. The problem comes from how some people run their companies. Money gives you the power to do what you want. Whether or not they realise it, people in high-up positions at FB, Uber and elsewhere are powerful. They have the capacity to choose what their companies do. Claiming their unethical behaviour is "the fault of capitalism" is like a plumber blaming weak building standards for their own shoddy work.
Do better.
And don't place the blame for unethical behaviour amongst our community on capitalism. People chose to build and maintain all of these system. And people continue to choose to keep those systems running long after knowing the harm they cause.
Twitter? That's the most toxic place on the internet. But I don't blame them for what their users want any more than I do FB.
As for the Wikipedia link, 90% of that stuff falls under my initial comment of people using the platform in ways other people wish they wouldn't, so I don't find it to be FB's fault. The privacy stuff I think is real, but it's so standard internet these days that it hardly justifies the vitriol.
Disagree with your second point. Just because there's an entire industry of people that profit from the erosion of privacy doesn't mean any of them is less deserving of their share of vitriol; if anything, I'd say the amount of vitriol each one of them ought to experience (personally, in their day-to-day lives) should rise exponentially as a function of the number of people doing the same.
Well, Facebook's own researchers wrote reports (cited last week in the WSJ's Facebook Papers articles) that said that Instagram was worse for teenage girls' mental health than either Snapchat or TikTok. So there are gradations of how bad a for-profit social media company can be, and Facebook seems to be worse along several axes.
> it's hard to expect any company [...] to do much better.
That is on you. The simple fact of the matter is that they can do better, they should do better and because they are not doing better they must be punished until they do. That is how you deal with a beast that is actively malicious (it does wrong and it knows it is doing wrong, it does not care).
It’s not clear to me whether Twitter has handled it better, or whether they have a self-selected audience that made their techniques more effective. I don’t see why attempting to flag misinformation would work on Twitter and fail on FB for reasons other than the people who use it.
I can think of several reasons why that would be the case.
The easiest one is just that Twitter might have more competent developers and product managers working on that problem than FB does.
The more sinister option is that FB knows that the kinds of things that end up being misinformation often drive engagement, so maybe they don't try as hard.
(Agreed, though, that it's not clear that Twitter handles this better than FB.)
You are of course encouraged to start reading the news if you have a voracious appetite for learning. In short though, they take everything they can from you and sell it to everyone willing to buy. This gives them an incentive to have you "engage" with content, of which some is actively harmful (but ostensibly encouraged because Facebook wants the effects that such content has).
> Everyone here seems to think FB is some horrible, unethical company
I don't think that's true. Most of us here see Facebook as a drunk 16-year-old with a set of car keys and Daddy's credit card. Not evil, just hopelessly reckless and lacking in self-awareness and self-control.
Their platform has toxic and negative effects, from teenagers to adults participating in elections. And they know they are bad, and cover up and lie about how much they know and when they knew it.
Facebook is designed to encourage and reward the kind of negative and harmful interactions and results people are noticing. There’s no reason a social network has to work the way Facebook does, other than maximizing revenue.
I agree it has become entirely irrational. I think it's the relentless media campaign starting with Cambridge Analytica, the narrative the intelligentsia left has constructed for itself re. the Trump election and now Covid, combined with a new culture of conformity in which no one speaks up when people like Miguel de Icaza casually call for the entire executive rank to be jailed.