This is exactly the point Frances Haugen is making, and it's why this is so different and so much more significant than the other Facebook scandals and leaks in the past.
Haugen repeated over and over again in her testimony today that Facebook is full of smart, thoughtful, kind, well-intentioned people, and that she has great empathy for them, even Mark Zuckerberg. Her point is that they have created a system of incentives that are inexorably leading to harmful outcomes. It is not about good and evil people; it is about the incentives. It's exactly as you are saying.
That's why she is not advocating to punish Facebook for being evil, but rather to force Facebook to reveal and permit research so we can understand the system and fix it, because Facebook is too deeply trapped in its own tangle of incentives to fix itself. In this I think she is absolutely correct.
"Facebook has created a system of incentives that are inexorably leading to harmful outcomes" Exactly right. The solution baffles me. "Force Facebook to reveal and permit research so we understand the system and fix it" Basically keep the harmful system in place, but pass the reins to an unspecified cabal hiding under the innocuous word "we". Hard pass.
The "lean in" people and incentives have made society suffer for profit. Perhaps we can define a better set of incentives that reward companies of people building products.
We can. It's called paying directly for the services you use. It is a time-honored system where you give providers money in exchange for goods and services. In response, their incentive is to keep you happy and healthy and prosperous so you can continue to give them money.
No, their incentive is to get your money and get other people like you in case you die on 'em. They don't need you specifically, and they extra much don't need you to be prosperous.
The message 'you can save all this money, using us!' always means 'you can spend all this money with us'. I'm not faulting the general system or even your point here: I am, however, suggesting that while the system is fine, it does NOT in any way imply that such people have or feel ANY incentive toward your well-being.
You could maybe make a case that such a company might feel an incentive toward the POPULATION it depends on… but even then, I feel like that might be mythical. In theory you don't want to eat your own seed corn, but such incentives toward good behavior are so easily ignored… and even if they are honored, it's a collective concern, NOT personal.
They don't care about you, and you are damn lucky if they care even a little about your wellbeing as a class or demographic… most likely they do not. And that's where the system tends to break down.
> In response, their incentive is to keep you happy and healthy and prosperous so you can continue to give them money.
Their incentive is to find a way to get your money; we can see in the world around us that many of them have no problem if you're insecure, addicted, and indebted.
Which will never happen. That takes customer impetus, and it's not there. People don't understand the cost of the free products they use, so they are unlikely to switch.
So what would it take for your outcome to actually happen? I think it would be the right way to run software platforms as well; I just don't see a pathway there that isn't heavy-handed.
I would be for regulating the advertising industry, since I feel it is the root of all this. None of the unethical software magnates would exist if not for the advertising dollars pouring through the door thanks to the ad-tech apparatuses they have built, and the poor incentives that creates. But that regulation is challenging and unlikely too.
I think a freemium model would be better. You should have to pay for having a large number of followers/friends past a certain point.
For example, maybe an account with up to 1,000 friends is free; up to 10,000: $5/month; up to 100,000: $50/month; and so on.
If you're Kim Kardashian with 250 million followers and you're making millions of dollars hawking skin cream or whatever, you can afford to pay a few thousand dollars a month to reach your large, valuable audience.
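As a rough sketch, that tier schedule might look something like this in code. The thresholds and prices are just the example numbers above; how the schedule continues past 100,000 ("and so so on") is my assumption that the 10x pattern keeps going, not part of the proposal.

```python
# Sketch of the tiered pricing idea above. The thresholds and prices are the
# example numbers from the comment (free up to 1,000 friends; $5/month up to
# 10,000; $50/month up to 100,000). The continuation past the stated tiers is
# an assumption: the fee multiplies by 10 for every additional 10x in audience.

TIERS = [
    (1_000, 0),      # small accounts stay free
    (10_000, 5),     # $5/month
    (100_000, 50),   # $50/month
]

def monthly_fee(followers: int) -> int:
    """Return the monthly fee in dollars for a given follower count."""
    for threshold, fee in TIERS:
        if followers <= threshold:
            return fee
    # Beyond the stated tiers, keep the assumed 10x pattern going.
    threshold, fee = TIERS[-1]
    while followers > threshold:
        threshold *= 10
        fee *= 10
    return fee

print(monthly_fee(800))       # 0  -- free tier
print(monthly_fee(5_000))     # 5
print(monthly_fee(50_000))    # 50
```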
This way, the content creators can sell ads if they want. The platform doesn't sell ads. Users only see ads if they follow a creator who has sponsors. It's up to that creator to make their content worthwhile enough for people to choose to follow them in spite of the ads.
A platform should be like a company that sells TV broadcast towers. They give people a way to reach an audience. What that content creator does with their audience is up to them. Maybe they could charge a subscription. Maybe they get sponsors. If it's a large non-profit or government organization, maybe they pay at a lower rate or get to use it for free.
Granularity matters. Social feed granularity is small enough that an algorithm, even a primitive one, can sketch an arbitrary narrative on the spot by juxtaposing unrelated items, akin to a ransom note built from letters cut out of different publications. Pandora and especially GoodReads have much larger granularity, making them difficult to employ in the same manner.
Very different outcomes, I'd say. Are friends and family getting torn apart on those platforms? Do they need armies of moderators to remove abuse material or fact-check posts? (I'm sure there's some, but not on the same scale as Facebook.) This is the first time I've ever heard such a thing suggested, and I certainly haven't observed it personally.
It may be that Facebook can't fix itself, but what makes anyone think an even larger and more powerful organization is the answer and won't itself succumb to its own system of incentives? She is pushing for the equivalent of The Ministry of Truth.
Remember, this is the system of incentives that had us spend 20 disastrous years in Afghanistan, across both parties. And has failed to deal with climate change. And healthcare. And education. And wealth inequality. And housing. And... Siri, what's the definition of insane?
By the way, let's give a name to that system: it's called "PSC". Google it. It's the most absurd and ineffective performance management system I've ever witnessed.
It creates a Hunger Games mentality within teams and makes doing anything that actually matters virtually impossible, generating an infinite sequence of half-assed six-month projects that get systematically abandoned as soon as the people responsible manage to get promoted or switch teams.
> It creates a Hunger Games mentality within teams and makes doing anything that actually matters virtually impossible, generating an infinite sequence of half-assed six-month projects that get systematically abandoned as soon as the people responsible manage to get promoted or switch teams.
That's a bit of an overdramatization. PSC is just peer feedback, and is very similar to Perf reviews at Google as well as other large SV tech companies. Having done both, I didn't experience this "Hunger Games mentality" you described.
> they have created a system of incentives that are inexorably leading to harmful outcomes
If the people inside were "smart, thoughtful, kind, well-intentioned people", they would have tried to work around the incentives, influence them, denounce them, or quit.
That rarely happens. Most of the time, they just take the money and go with the flow.
A long time ago, I worked in a startup full of smart, thoughtful, kind, well-intentioned people. Of course, there was also a CEO who was a ruthless manipulator and managed to make everyone believe that they were working for the greater good. In truth, in the course of lining his pockets, he was in the process of destroying several employees and former employees.
Getting past the illusion was hard.
Taking a stand against said CEO while nobody else was aware of the problem? Really, really hard.
Now, instead of a few dozen employees, Facebook has tens of thousands. I assume that all of them are subject to constant internal propaganda, as in many tech companies, and that the semi-official line is that they are being misunderstood by the rest of the world, because of course they are doing the right thing and the problem is simply harder than people think (well, that last part is true, at least). I suspect that it's even harder there to go against the flow.
How is giving access to user data for "research" better than that whole data privacy scandal with Cambridge Analytica?
These days research comes with a set of politically charged assumptions; for example, the definitions of "hate speech" and "misinformation" differ depending on which political camp you ask.
So giving access to Cambridge Analytica is bad, but giving it to some other partisan "think tank" is fine? Who would make those decisions?
The government has systems in place for doing this. For example, the SBA has tons of data about small businesses across the country. You can get access to it... IF you are part of a research institution and go directly to their dedicated research facility so that you can't exfiltrate the data. Such a model is open, just not free. It would probably be the right model for this issue.