> But it does give us information. This incident, along with all the other incidents and non-incidents that do or do not occur, contributes to the measure of incidents per car-mile. One cannot simply wish away this one incident and make it disappear. It is now forever part of the statistics which will either support or undermine the hypothesis that autonomous cars are safer.
Can you point out the part of the article or the post that I was responding to which mentions how many car-miles were traveled?
This is exactly what I'm pointing out.
> The same way that I expect the same version of Notepad to open my text file exactly the same on this computer as on another computer.
Notepad doesn't have to read your text file through a lens with slightly different focus, viewing area, and patterns of dust on it each time.
> The alternative being that there is some undeterministic behavior in the driver that is not tied to input...
The non-deterministic behavior is that the hardware which collects the input will never be the same. I don't know whether the software is non-deterministic (it wouldn't surprise me), but I know the hardware is never going to be identical: hardware is always made to tolerances and always has some degree of variability.
Your claim is tantamount to saying that if we put the same person in the same situation but with two different sets of eyes, the eyes would have no effect on the results.
> However, we cannot make that same argument by replacing human drivers. Because each human is, in fact, different.
Autonomous cars are, in fact, different. Just because they're running the same software doesn't mean they're the same; even if the software is completely deterministic, software is only a component of the autonomous driver.
> Can you point out the part of the article or the post that I was responding to which mentions how many car-miles were traveled?
> This is exactly what I'm pointing out.
I think, if your intent is to show that this is incomplete information, that...
1) No one is arguing that.
2) You have not done a great job of attempting to relay that, given phrases like, "Simply saying that a car made a mistake in this situation doesn't give us any information at all."
3) Sometimes that doesn't matter. For instance, Florida law mandates a six-month to one-year license revocation for DUI, regardless of the circumstances or information.
> Notepad doesn't have to read your text file through a lens with slightly different focus, viewing area, and patterns of dust on it each time.
Difficulty of the task is unrelated to expected outcomes of the task given the same inputs. And we already covered the topic of duplicating the exact situation... Not sure what you're trying to gain through this line of argument.
> Your claim is tantamount to saying that if we put the same person in the same situation but with two different sets of eyes, the eyes would have no effect on the results.
I am making no such claim, and I cannot believe that you are so adamant about not understanding my actual claim. This is a hypothetical situation. There is no mention in this scenario of changing the car, including any of the sensor hardware. I am interested in swapping only the driver (or driver software) into the exact same circumstance.
(EDIT: OK, reading back, I did say "any Uber vehicle". While your point stands, I think it's a very uncharitable reading. If hardware sensor tolerances and specks of dust on the camera are going to determine whether a life is lost or not, either those tolerances need to be driven down or this entire idea needs to be rethought. After all, we don't allow those who are legally blind to drive unless they have corrective lenses...)
Assuming the software is deterministic [1], by definition the same inputs will produce the same output. Therefore, "replacing" the autonomous driver with another would have resulted in the same incident. You cannot say that with any measure of confidence for any pair of human drivers.
[1] Which seems like it would be a good thing to assume, since I don't think one would get much traction by arguing that we should be putting vehicles with non-deterministic behavior on the road...
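To make the determinism point concrete, here is a minimal sketch (a toy braking policy with made-up numbers, not any real vehicle's software): a deterministic function maps identical inputs to identical outputs, so swapping in an identical copy of the software changes nothing. The hardware-tolerance objection is that near a decision boundary, two sensors reading the same scene within tolerance can produce different inputs, flipping the output.

```python
def brake_decision(distance_m: float, speed_mps: float) -> bool:
    """Deterministic toy policy: brake if time-to-collision is under 2 seconds."""
    return distance_m / max(speed_mps, 0.001) < 2.0

# Identical inputs, identical output: "replacing" the software changes nothing.
assert brake_decision(30.0, 20.0) == brake_decision(30.0, 20.0)

# But near the threshold, a 0.2 m difference in the sensed distance
# (well within typical sensor tolerance) flips the decision:
brake_decision(39.9, 20.0)  # TTC = 1.995 s -> brake
brake_decision(40.1, 20.0)  # TTC = 2.005 s -> don't brake
```

The disagreement in this thread, then, is not about whether the function is deterministic, but about whether two vehicles ever receive bit-identical inputs in the first place.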