I don't think this necessarily has anything to do with engineering competence.
From a business perspective, security isn't a marketable feature until it becomes a problem—you don't install seat belts, airbags, or malware protection in a vehicle until after people start suffering from their absence.
Why? Because while you're busy building a well-secured system, your competitors are busy implementing new features that give them an actual advantage in the marketplace. As unfortunate as it might be, consumers tend to understand things like “remotely start your car with your phone” better than “your ability to brake won't be taken away from you while you're barrelling down the highway at 70 mph.”
It's sad and more than a little scary, but it's also nothing really new. Computer security, at least in the consumer sector, wasn't really a feature until viruses started showing up in the Eighties, and Internet security wasn't really a feature until the average Windows user's PC was getting taken over remotely the moment it was connected to the Net. Even Apple has only been able to tout security and privacy as features of its products by contrasting them with Google's business model—had the latter not existed and its data grab not become part of public discourse, I doubt that Cupertino would have been able to make so much noise about it.
So, it's perfectly possible that every engineer and manager who worked on these systems is really quite competent and perfectly aware of the potential for security flaws (indeed, I doubt that they would have been able to make something so complex work otherwise), and still the sum of all the decisions made and market pressures applied caused the resulting product to be so vulnerable despite everyone's best intentions. It's not because people don't care or don't know, but rather because there are only so many resources available, and the market has pushed them all in a specific direction that happens to be away from security.
But this is also why we need this kind of research. Now that these problems are out in the open, and politicians are starting to take notice, security will become a feature that the public will care about, and, hopefully, car manufacturers will start adopting (or be forced to adopt) better standards.
Even if you disagree, in the law's view avoiding corporate liability is a component of competence. That is, if the company is found liable, the law is saying the employees responsible did something wrong, even if it doesn't hold them individually accountable.
IANAL, but from previous fallout over security issues, I'd assume legal liability stops well short of requiring actual competency.
Parent is 100% correct. It's market adaptation. Same reason Samsung ships known-vulnerable extensions to Android: features >> security.
> So, it's perfectly possible that every engineer and manager who worked on these systems is really quite competent and perfectly aware of the potential for security flaws [...], and still the sum of all the decisions made and market pressures applied caused the resulting product to be so vulnerable despite everyone's best intentions.
I think this is key, although I'd put more of the blame on management, given that they allocate technical resources. When management lacks technical knowledge, it loses the ability to make technically informed decisions.
Sometimes the nuances of a situation can't be summed up in a PowerPoint slide. Especially when it's a slide that someone created to summarize a slide deck from an engineer that they saw.
Do you think at least some of the OPM vulnerabilities were internally unknown? Even allowing for incompetence, there had to have been actual engineers who looked at the settings and/or the lack of feedback and went "Hunh..."
In addition to your third paragraph, automakers are disincentivized from adding security features because they CAN'T advertise them. If Toyota started advertising the all-new Prius with the feature "your ability to brake won't be taken away from you while you're barrelling down the highway at 70 mph", consumers wouldn't just not care, they would question why the fuck that wasn't there in the first place. Consumers expect this sort of thing to have been there from the start, so at this point it's nothing but a cost sink to add it.
This will change as awareness of these attacks reaches the general public.
According to the article, US politicians are looking at introducing legislation to enforce cybersecurity measures. At that point, it will just be another safety rating that manufacturers can and do use to promote their vehicles.