I honestly don't know if it's possible. Common sense seems to suggest that social networks become more harmful as they become bigger and more profitable. That's just my surface-level take though. I guess the meta-question is what to do about highly-profitable lines of business that cause demonstrable social harm. Clearly regulation and government intervention are one answer, as are reforms to corporate governance structures (e.g. German and Nordic rules around worker representation on corporate boards).
While I'm sympathetic to your anti-FB bias, I think it's pretty clear that by any reasonable definition Facebook is indeed a social network (which also happens to make gobs of money through advertising, and covets engagement because that increases ad revenue as well as a general sense of "platform health"). The question posed was whether "being bad for society" is an emergent property of social networks as they grow.
It’s quite hard to see how it’s anything but a business that doesn’t give one shit about you, no matter how many followers you have or how many likes you got, even while you’re still using it.
An actual social network would care how you feel today.
A social network is just a kind of business, like a chain of supermarkets or a steel mill. You could, of course, have a non-profit social network (or a business with a different corporate structure - as I allude to in a parent post), but that's not what Facebook is or what its peers are. In the United States, businesses are beholden first and foremost to their shareholders, which means that there's an inherent tension when "giving a shit about you" and "making money" come into conflict. What makes you think that a social network with a normal corporate structure and in the absence of countervailing regulation would have more empathy than any other business?
As an aside, I'd like to make the broader point that boiling everything down to "Facebook=evil" doesn't seem like a productive way to get the changes that I think both of us would like to see. It walks right into the strawman arguments that Zuckerberg is responding to in his post ("Why would we invest so much in trust and research if we didn't care?"). And it doesn't capture the fact that Facebook is a huge entity made up of a lot of people, many of whom have different incentives and some of whom are even trying to do the right thing (see: Frances Haugen).
You’re trying to appeal to “my better nature”, to help me believe that FB can be changed. It won’t work.
In 2007 or 2008 I promoted the idea amongst my friends to “poison the well”, to feed bad data into FB’s algorithms. If they enjoy Coke, talk about Pepsi, I said. If they vote Green, share far right news. I called it Falsebook.
It didn’t work, because the problems inherent in the advertising auctions are not very visible to users. They mesh with the echo chambers of our friend circles and fizzle into the background. We seek out echo chambers in order to feel safe and validated. It’s mostly fine when it’s just humans relaxing with friends. But when that echo is robotically generated from an accurate model of all individuals involved, it can be leveraged for all sorts of big-scale shady crap. Advertising is the village idiot of this town, and corporate-backed political propaganda is the warring gang lord.
And Zuck does what any market owner does: sits back and rakes in the profit, cleaning up the mess when it suits him. And mostly it doesn’t.
I’ve never felt it could be fixed from the inside. I’ve never thought it was a good idea to begin with but I let peer pressure and my magpie nature suck me in. I regret signing up for FB and GMail back in the day because corporate surveillance has fucked our society hard and let sociopaths run rampant.
You will not change my mind. I wasn’t contributing to the conversation in good faith. I shall assume you were. My original glib comment about FB abusing social networks wasn’t an invitation to learn a new perspective, it was a war cry, a flag raise, a call for comrades. I was hoping someone would respond with links to a new Scuttlebutt implementation or tell me about a cool Masto instance. Or try to explain why I should bother with Matrix. Or something more interesting than all of those.
The web is sick. Personalised advertising has made it ill. We need to fix it and I don’t believe FB or GOOG are interested in trying.
An actual social network would let you choose how you feel; that’s why you can choose your friends, who might all be downers feeling horrible, and the network will let them share their despair with you.
> Common sense seems to suggest that social networks become more harmful as they become bigger and more profitable.
Yeah I don't know that anyone could have predicted these networks would become too massive to displace. Intervention should be on the table, no doubt about it. I hope they bring up net neutrality and Facebook's violation of it via internet.org. I thought "Free Basics" had died but apparently it's still going.