I think the level of complexity is the problem. A bad actor can be embedded in any of the above contexts: corp, non-profit, FOSS hobbyist. It doesn’t matter. The question is: when software is so complex that no one knows what 99.9% of the code in their own stack, on their own machine, does (which is the truth for everyone here including me), how do you detect ‘bad action’?
The level of complexity involved in making sure that electrical plants work, that water gets to your home, that planes don't crash into each other, that food gets from the ground to a supermarket shelf, etc., is unfathomable, and no single person knows how all of it works. Code is not some unique part of human infrastructure in this aspect. We specialize, and we rely on the fact that by and large, people want things to work, and that as long as the incentives align, people won't do destructive things. There are millions of people acting in concert to keep the modern world working, every second of every day. It is amazing that more crap isn't constantly going disastrously wrong, and that when it does, we are surprised.
> Code is not some unique part of human infrastructure in this aspect
It kinda is. The fact that code costs fractions of a penny to copy and scales endlessly changes everything.
There are hard limits on power plants: you need staff to run them, and the work is well understood.
But with software, you can make a startup with 3 people and a venture capitalist who's willing to gamble a couple million on the bet that one really good idea will make hundreds of millions.
Software actually is different. It's the only non-scarce resource. Look at the GPL - software is the only space where communism / anarchy kinda succeeded, because you really can give away a product with _nearly_ no variable costs.
And it's really just the next step on the scale of "We need things that are dangerous" throughout all history. Observe:
- Fire is needed to cook food, but fire can burn down a whole city if it's not controlled
- Gunpowder is needed for weapons, but it can kill instantly if mishandled
- Nuclear reactors are needed for electricity, but there is no way to generate gigawatts of power such that those gigawatts can't cause a disaster if they escape containment
- Lithium-ion batteries are the most energy-dense batteries yet, but again they have no moral compass between "The user needs 10 amps, I'm giving 10 amps" and "This random short circuit needs 10 amps, I'm giving 10 amps"
- Software has resulted in outrageous growth and change, but just like nuclear power, it doesn't have its own morality, someone must contain it.
Even more so than lithium batteries and nuke plants, software is a bigger lever that allows us to do more with less. Doing more with less simply means that a smaller sabotage causes more damage. It's the price of civilization.
So the genie ain't going back in the bottle. And for private industry, maintaining shared infrastructure is always going to be a tragedy of the commons.
I'm not sure what government regulation can do, but there may come a point where we say: even if it means our political rivals freeload off of us, it's better for the USA to bear the cost of auditing and maintaining FOSS than to ask private corporations to bear that cost while duplicating each other's work and keeping the results secret.
Is that a handout to Big Tech? 100%. Balance it with UBI and a CO2 tax that coincidentally incentivizes data centers to be efficient. We'll deal with it.