
I've worked in these enterprise organizations for a long time. They don't run on common sense, or even what one might consider "business sense". Their existing incentives create bizarre behavior.

For example, you might think "if a big security exploit happens, the stock price might tank". So if they value the stock price, they'll focus on security, right? In reality, what they do is focus on burying the evidence of security exploits. Because if nobody finds out, the stock price won't tank. Much easier than doing the work of actually securing things. And apparently it's often legal.

And when it's not a bizarre incentive, often people just ignore risks, or even low-level failures, until it's too late. Four-way intersections can pile up accidents for years until a school bus full of kids gets T-boned by a dump truck. We can't expect people to do the right thing even if they notice a problem. Something has to force the right thing.

The only thing I have ever seen force an executive to do the right thing is a law that says they will be held liable if they don't. That's still not a guarantee it will actually happen correctly, of course. But they will put pressure on their underlings to at least try to make it happen.

On top of that, I would have standards that they are required to follow, the way building codes specify the tolerances, sizes, engineering diagrams, etc. that need to be followed and inspected before anyone is allowed into the building. This would enforce the quality control (and the impartial party to check it) that was lacking recently.

This would have similar results to building codes - increased bureaucracy, cost, complexity, time... but also more safety. I think for critical things, we really do need it. Industrial controls, like those used for water, power (nuclear...), gas, etc., need it. Tanker and container ships, trains/subways, airlines, elevators, fire suppression, military/defense, etc. The few, but very, very important, systems.

If somebody else has better ideas, believe me, I am happy to hear them....




While good, those ideas will all increase costs.

Would you pay 10x (or more, even) for these systems? That means 10x the price of water, utilities, transport, etc., which then accumulates up the chain into other things that aren't critical themselves but depend on the ones that are.

The thing is, what exists today exists because it's the path of least resistance.


Consumer costs would not go up 10x to put more care into ensuring the continuous operation of critical IT infrastructure. Things like "an update to the software or configuration of critical systems must first be performed on a test system".


Cars without seat belts were the path of least resistance for a long time. I wonder how that changed.


You're right (not sure about the exact factor though) - and there are also additional costs when those systems fail. Someone, somewhere lost money when all those planes were grounded and services were suspended.

At some point - maybe it already happened, I don't know - spending more on preventive measures and maintenance will be the path of least resistance.


No, it exists because all must bow to the deity of increasing shareholder value. Remember that a good product is not necessarily the same as, or even a subset of, an easy-to-sell product. Only once the incentives are aligned towards building quality software that lasts will we see change.


> Would you pay 10x (or more, even) for these systems?

If it's critical to your business, then yes; but you quickly find out whether it's actually critical to your business or something you can do without.


Probably there should be an independent body that oversees postmortems on tech issues, with the ability to suggest changes. This is what airlines face during crash investigations, and often new rules are put in place (e.g., don’t let the shift manager self-certify his own work, as in the incident where the pilot’s window popped off). What this would look like for software companies, and what the bar is for being subject to this rigor, I don’t know (I suspect not a Candy Crush outage, though).

The biggest problem I see with late-stage capitalism, and with a lack of accountability in general, is that given the right incentives people will “fuck things up” faster than you can stop them. For example, say CrowdStrike was skirting QA - what’s my incentive as an individual employee versus the incentive of an executive at the company? If the exec can’t tell the difference between good QA and bad QA, but can see the accounting numbers go up when QA is underfunded, he’s going to optimize for stock price. And as an IC there’s not much you can do unless you’re willing to fight the good fight day in and day out. But when management repeatedly communicates that they do not reward that behavior, and indeed may not care at all about software quality over a 5-year time horizon, what do you do? The key lies in finding ways to convince executives, or, short of that, holding them to account like you say.


I've commented on this before, but in this case I think it starts to fall into the laps of the individual employees themselves, by way of licensing or at least some sort of certification system. Sure, you could skirt a test here or there, but then you'd only be shorting yourself when shit hits the fan. It'd be your license, and essentially your livelihood, on the line.

"Proper" engineering disciplines have similar systems like the Professional Engineer cert via the NSPE that requires designs be signed off. If you had the requirement that all software engineers (now with the certification actually bestowing them the proper title of 'engineer') sign off on their design, you could prevent the company from just finding someone else more unscrupulous to push that update or whatever through. If the entirety of the department or company is employing properly certificated people, they'd be stuck actually doing it the right way.

That's their incentive to do it correctly: sign your name to it or lose your license, and, just for drama's sake, go directly to jail, do not collect $200. For the companies: employ properly licensed engineers, or risk unlimited downside liability when shit goes sideways, similar to what might happen if an engineering firm built a shoddy bridge.

Would a firm that peddles some sort of CRUD app need to go through all of this? If it handles toxic data like payments, health data, or other PII, sure. Otherwise, probably not - just as small contracting outfits that build garden sheds are a bit different from those that maintain, say, cooling systems for nuclear plants. Perhaps a law could be written to compel companies that work in certain industries or business lines to do this.





