There are standards; Toyota simply chose to ignore all of them. Standards such as MISRA [1] and DO-178C [2] exist precisely to ensure software quality in safety-critical systems, and most embedded development environments even include static-analysis tooling to help verify that you aren't doing the unsafe things. The real problem is that automakers aren't required to follow any such standard, unlike avionics, where the FAA has mandated it for decades.
Great comment! I didn't know these standards existed, but it's good to see that they do. Toyota should have followed them; did they think they knew better? I suspect that with self-driving cars the standards will become actual requirements, though probably not before a series of deaths results from software failure.
Given that a lot of computer vision work is based on randomised algorithms, do you think these standards would be enough? You could demonstrate 100% MC/DC coverage of a neural-net implementation, for example, yet the faults most likely live in the weights, not the code.
[1] http://en.wikipedia.org/wiki/Motor_Industry_Software_Reliabi... [2] http://en.wikipedia.org/wiki/DO-178C