That's not how it works. Imagine a medical device with a high rate of operator error resulting in serious injury or death. Now imagine someone built a fully automatic device performing the same function, one that kills or injures a person half as often as a manual operator does. Would such a device be allowed on the market? Of course not. The first time the machine harmed a human, it would be recalled and disappear from the market, no matter how much safer it is than the manually operated version.
If an automatic car kills a person through software error (and I would argue that has already happened with Tesla), the public backlash will be severe, even if such errors happen far less often than human mistakes in manually driven cars.