Fair to say it's about incentives? I think it's possible to run a profitable social media company that does care. We're just stuck with too few options. I'm unaware of any that make transparency a core value by, say, openly publishing moderator actions.
> I think it's possible to run a profitable social media company that does care.
That's probably true, but not at Facebook's scale. Because everyone's bonuses and stock value are based on profit, getting employees to accept the argument "We could have increased profit but chose not to, in the interests of users" means hiring a lot of people who are happy to trade some amount of personal gain for user wellbeing. That would be difficult when you have tens of thousands of people.
Alternatively, maybe it'd be possible to run Facebook without giving the staff equity or bonuses. That would take a huge shift in the way tech remuneration works, though.
This comment treats employees (people) as if they were algorithms maximizing personal revenue.
That is not necessarily true. If FB were a place where people could feel proud of the good it does, a lot of them would be OK taking a pay cut.
Sure, maybe the ones that care the most about compensation wouldn't join, but that might be ok.
I previously worked for a for-profit company that donated a massive share of its profits to charity. It was something I was proud of, and part of how I evaluated how much I liked working there.
It seems increasingly likely that if FB keeps ignoring these concerns they're going to get hit with some sort of government regulation. So I still think it's in their long-term self-interest to try and avoid that happening, even if we assume profit is really all they care about.
If they can do this, and let everyone trust that they do this, their company will be beyond all organizations in human history. You could let them take care of your kids!
>I think it's possible to run a profitable social media company that does care.
No. Due to network effects, a successful social media company must be large, and a large company lives by different rules. FB, for example, is a trillion-dollar company. That is like the TeV energy level in physics: matter behaves differently. At those energy levels chemistry simply doesn't exist, and matter itself changes as protons break apart. Ethics in big business is like chemistry in high-energy physics: it just doesn't exist at those levels of money. At these levels they can be affected only by a comparable level of money or power, like government power.
I can't see it working. Instagram and WhatsApp should be spun off, absolutely no question about it. But Instagram and Facebook still remain way too big.
And how do you break them up? By geographic area? People would be pissed because they have contacts with, or maybe want to watch vacation videos from people from different areas.
IMHO the only options would be:
* non-profit run by the UN or similar
* forced to open all data and APIs to make it a platform, with rules to make it an even playing field
In both cases with much less "engagement" focus, and why not zero: just show the latest posts chronologically from the things you've liked or subscribed to. Could either work? No idea, but it seems to me they'd have a better chance than a Facebook per country or region.
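The zero-engagement feed described above is trivially simple to state in code. A minimal sketch in Python (the `Post` structure and sample data are made up for illustration): filter to subscriptions, sort by time, no ranking model anywhere.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int  # e.g. unix seconds
    text: str

def chronological_feed(posts, subscriptions):
    """Engagement-free feed: only subscribed authors, newest first."""
    return sorted(
        (p for p in posts if p.author in subscriptions),
        key=lambda p: p.timestamp,
        reverse=True,
    )

# Hypothetical sample data
posts = [
    Post("alice", 100, "hi"),
    Post("bob", 200, "yo"),
    Post("carol", 150, "hey"),
]
feed = chronological_feed(posts, {"alice", "carol"})
print([p.author for p in feed])
```

The point of the sketch is what's absent: there is no per-user engagement model to optimize, so there is nothing for "maximize time on site" incentives to latch onto.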
It's probably too late for any specific publicly traded company that already exists, but social media as a protocol, with forced interoperability if necessary, is the way to solve this. Many regional phone carriers don't prevent people from communicating across regions.
Of course, it does cost more, so the question becomes who is willing and able to pay for a socially less harmful means of networking individuals without having to put them all on a single data hoovering ad platform? Whether it's direct charge to consumers or subsidized by government, the money still ultimately has to come from people.
Implement ActivityPub (an existing protocol recommended by the W3C), and offer their underlying social networking services as hosted and managed software for big orgs to operate on their own domain. With interoperability, they'll work across domains. Target customer is anyone with at least 100K followers (so gov, institutions, media, et cetera).
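To illustrate why ActivityPub enables cross-domain interoperability: every actor publishes a small JSON document advertising its `inbox` and `outbox` endpoints, so any compliant server can deliver activities to any other. A minimal sketch of such an actor document (the domain and username are invented; a real server would also include public keys, followers collections, etc. per the W3C spec):

```python
import json

def make_actor(domain: str, username: str) -> dict:
    """Build a minimal ActivityPub-style Actor document
    (illustrative subset of the W3C ActivityPub spec)."""
    base = f"https://{domain}/users/{username}"
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Person",
        "id": base,
        "preferredUsername": username,
        "inbox": f"{base}/inbox",    # other servers POST activities here
        "outbox": f"{base}/outbox",  # this actor's published activities
    }

# Hypothetical big-org customer running the hosted service on its own domain
actor = make_actor("social.example.gov", "press-office")
print(json.dumps(actor, indent=2))
```

Because the `id`, `inbox`, and `outbox` are plain HTTPS URLs on the org's own domain, a follower on any other ActivityPub server can address them without both parties being on one platform.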
And the US would ban any discussion on Guantanamo and CIA torture. What's your point?
It should be run independently, like the WHO, not directly under the power of the security council. And before you say Taiwan, the Republic of China is not a UN member, and there's nothing the UN or WHO can do about it, it's between China and them.
I am unfamiliar with cases where the US prevented CNN (or others) from discussing Guantanamo.
I am familiar, though, with the extreme lengths China will go to to prevent discussion of topics it hates (e.g. if HN were based in China, both of our comments would have been deleted immediately).
And who is going to break up TikTok or whatever comes next/instead?
Social media is like tobacco, and FB increasing engagement at all costs is like Big Tobacco increasing the addictiveness of cigarettes. As with tobacco, the way to deal with the issue is to wean the population off, in particular by educating people about the damage it does to them.
Btw, "Statement from Mark Zuckerberg": that reminded me of a foundational tenet of Facebook:
Zuck: yea so if you ever need info about anyone at harvard
Zuck: just ask
Zuck: i have over 4000 emails, pictures, addresses, sns
Also, he was right. People shouldn't have sent Zuck their private info even if Zuck were a saint, because there was no way they could have known Zuckerberg was a saint.
How many stupid things you said at 19 made you $100B? I'm pretty sure that if a stupid thing you said at 19 had made you $100B by the age of 30, and continued to make many billions after that, you'd pretty much continue to believe and follow that stupid thing.
I honestly don't know if it's possible. Common sense seems to suggest that social networks become more harmful as they become bigger and more profitable. That's just my surface-level take though. I guess the meta-question is what to do about highly profitable lines of business that cause demonstrable social harm. Clearly regulation and government intervention are one answer, as are reforms to corporate governance structures (e.g. German and Nordic rules around worker representation on corporate boards).
While I'm sympathetic to your anti-FB bias, I think it's pretty clear that by any reasonable definition Facebook is indeed a social network (which also happens to make gobs of money through advertising, and covets engagement because that increases ad revenue as well as a general sense of "platform health"). The question posed was whether "being bad for society" is an emergent property of social networks as they grow.
It’s quite hard to see, while you’re still using it, that it’s just a business that doesn’t give one shit about you no matter how many followers you have or how many likes you got.
An actual social network would care how you feel today.
A social network is just a kind of business, like a chain of supermarkets or a steel mill. You could, of course, have a non-profit social network (or a business with a different corporate structure - as I allude to in a parent post), but that's not what Facebook is or what its peers are. In the United States, businesses are beholden first and foremost to their shareholders, which means that there's an inherent tension when "giving a shit about you" and "making money" come into conflict. What makes you think that a social network with a normal corporate structure and in the absence of countervailing regulation would have more empathy than any other business?
As an aside, I'd like to make the broader point that boiling everything down to "Facebook=evil" doesn't seem like a productive way to get the changes that I think both of us would like to see. It walks right into the strawman arguments that Zuckerberg is responding to in his post ("Why would we invest so much in trust and research if we didn't care?"). And it doesn't capture the fact that Facebook is a huge entity comprised of a lot of people, many of whom have different incentives and some of whom are even trying to do the right thing (see: Frances Haugen).
You’re trying to appeal to “my better nature”, to help me believe that FB can be changed. It won’t work.
In 2007 or 2008 I promoted the idea amongst my friends to “poison the well”, to feed bad data into FB’s algorithms. If they enjoy Coke, talk about Pepsi, I said. If they vote Green, share far right news. I called it Falsebook.
It didn’t work, because the problems inherent in the advertising auctions are not very visible to users. They mesh with the echo chambers of our friend circles and fizzle into the background. We seek out echo chambers in order to feel safe and validated. It’s mostly fine when it’s just humans relaxing with friends. But when that echo is robotically generated from an accurate model of all individuals involved, it can be leveraged for all sorts of big-scale shady crap. Advertising is the village idiot of this town, and corporate-backed political propaganda is the warring gang lord.
And Zuck does what any market owner does.. sits back and rakes in profit, cleaning up the mess when it suits him. And mostly it doesn’t.
I’ve never felt it could be fixed from the inside. I’ve never thought it was a good idea to begin with but I let peer pressure and my magpie nature suck me in. I regret signing up for FB and GMail back in the day because corporate surveillance has fucked our society hard and let sociopaths run rampant.
You will not change my mind. I wasn’t contributing to the conversation in good faith. I shall assume you were. My original glib comment about FB abusing social networks wasn’t an invitation to learn a new perspective, it was a war cry, a flag raise, a call for comrades. I was hoping someone would respond with links to a new Scuttlebutt implementation or tell me about a cool Masto instance. Or try to explain why I should bother with Matrix. Or something more interesting than all of those.
The web is sick. Personalised advertising has made it ill. We need to fix it and I don’t believe FB or GOOG are interested in trying.
An actual social network would let you choose how you feel: you choose your friends, who might all be downers feeling horrible, and the network will let them share their despair with you.
> Common sense seems to suggest that social networks become more harmful as they become bigger and more profitable.
Yeah I don't know that anyone could have predicted these networks would become too massive to displace. Intervention should be on the table, no doubt about it. I hope they bring up net neutrality and Facebook's violation of it via internet.org. I thought "Free Basics" had died but apparently it's still going.