I suspect that the corporate entities involved have such deep pockets and/or so many lawyers and lobbyists that this won't work.
Instead, how about:
All new autonomous vehicle configurations (let's call a configuration the algos + sensors + vehicle) have to take some kind of actual driving test, just as we humans do.
Maybe the public could even help design a good test? "Not driving at speed into a stationary fire truck which is parked on the highway right in front of you" would be one element I'd want to see tested.
If an autonomous vehicle is involved in an accident, and the algo/sensors/vehicle are found to be (partially) at fault, the configuration earns penalty points.
If that configuration earns enough penalty points over a given period, it loses its certification, incurs a fine, and must pass a mandatory re-test before returning to the road.
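Just to make the bookkeeping concrete, here is a minimal sketch of how such a points scheme could be tracked, assuming a 12-point threshold and a rolling one-year window (both numbers, and the class and function names, are purely hypothetical):

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Illustrative numbers only: the threshold and window are policy choices.
POINT_THRESHOLD = 12          # points at which certification is pulled
WINDOW = timedelta(days=365)  # rolling period over which points count

@dataclass
class Configuration:
    """One certified combination of algos + sensors + vehicle."""
    name: str
    certified: bool = True
    # Each at-fault incident is stored as (timestamp, points awarded).
    incidents: list[tuple[datetime, int]] = field(default_factory=list)

    def points_in_window(self, now: datetime | None = None) -> int:
        """Sum only the points earned within the rolling window."""
        now = now or datetime.now()
        return sum(p for t, p in self.incidents if now - t <= WINDOW)

    def record_fault(self, points: int, when: datetime | None = None) -> None:
        """Record an at-fault (or partially at-fault) incident; decertify over the threshold."""
        self.incidents.append((when or datetime.now(), points))
        if self.points_in_window() >= POINT_THRESHOLD:
            # Losing certification would also trigger the fine and mandatory re-test.
            self.certified = False
```

The rolling window mirrors how human penalty points expire over time: old incidents stop counting toward the threshold, so only a sustained pattern of faults decertifies a configuration.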
This method appears to work reasonably well in dealing with us not-always-perfect human drivers, and ought to concentrate the minds of the designers/developers/managers behind autonomous vehicles.
Could this/should this also apply to autonomous vehicles? If so, how?