On the surface, this looks correct. We should have autonomous cars. Manufacturers should pay damages when their cars kill people, since fault shifts from drivers to manufacturers. That should be part of the cost of doing business, and should be factored into the price of both the car and the insurance.
Tesla's market cap is nearing a trillion dollars. A typical wrongful death lawsuit runs around $1 million in damages. If autonomous cars reach 100% market share and kill 40,000 people each year (the same as human-driven cars), that adds $40B per year to the cost of doing business, or roughly 4% (about 1/25th) of Tesla's market cap.
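A quick back-of-envelope sketch of that arithmetic in Python, using only the assumed figures above (not actuals):

    deaths_per_year = 40_000            # assumed: same toll as human-driven cars
    damages_per_death = 1_000_000       # assumed: typical wrongful-death payout, USD
    market_cap = 1_000_000_000_000      # assumed: Tesla market cap, ~$1T

    annual_liability = deaths_per_year * damages_per_death
    print(f"${annual_liability / 1e9:.0f}B per year")             # $40B per year
    print(f"{annual_liability / market_cap:.0%} of market cap")   # 4%, about 1/25th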
If Tesla pays, this would also translate into lower insurance premiums. Car prices go up slightly, but if safety is similar to human driving, the total cost of car ownership stays the same. If Autopilot is safer, costs go down; if humans are safer, costs go up. There's a clear business upside to making safer cars, so Tesla invests there.
In any case, that's not enough to drive Tesla out of business, but it is enough to align incentives toward making safe self-driving cars.
40k people a year in the US die in auto accidents already, and we collectively pay for that. Now Tesla pays for it when humans make poor risk choices.
If you're watching a movie on Autopilot and end up dead, how is that Tesla's fault? Yes, Tesla should not be able to lie about driver assist capabilities ("full self driving"), but there is no perfect safeguard against humans abusing assist systems (Tesla now puts interior cameras in the cabin to detect attentiveness and alertness, and people complain about that). Will we simply pay out to the next of kin of those who abuse safety systems and end up dead? If so, I guess that is the cost of progress in a system with human participation. If you exceed the capabilities of systems where life safety is involved, you'll die, and we'll all move on.
> A Tesla Model S on autopilot crashed into an articulated tractor-trailer on Highway US 27A in Williston, Florida, killing the driver, Joshua Brown. The trailer was turning left in front of the oncoming Tesla, and the Tesla autopilot system was unable to detect the white trailer against the bright sky. Cruise control was set at 74 mph and did not slow before the collision. The driver had his hands on the wheel for 25 seconds of the 37-minute trip and was watching a Harry Potter movie when the collision occurred. Before the collision, the driver received 6 audible warnings that his hands had been off the wheel for too long.
Regulators aren't going to make it illegal, but you are still entitled to your opinion. You can't stop a determined human from ignoring or bypassing safety systems and getting themselves killed. It is an unfortunate reality. I am not without empathy, just pragmatic: you try to build systems to protect humans, and they do what they do. "Make something idiot-proof, and they will build a better idiot."
Until regulators pull it from vehicles, we're going to carry on with it, minus the folks who get themselves killed (and with some economic drag from these legal proceedings).
Is the problem lane keeping, traffic adaptive cruise control, and automatic emergency braking? Or is it the marketing around it and how people use it? Maybe it's the lack of a controlled stop when the driver attention check fails?
Plenty of other cars have driver assistance features, but we don't hear about people not paying attention in their well-equipped Pacifica and crashing into things. Maybe because there are no videos from Chrysler saying the car is driving itself and the driver is there for legal purposes only. Maybe because fewer driver-assistance-equipped units are sold. Maybe the collisions are happening but don't make the news.
Driver assistance features seem like a good idea if used properly: drive normally, but the computer is always supervising and will help if it sees a problem you don't. They don't work well when the easily distractible human driver is supposed to be supervising the computer.
Honestly, I don't know what causes it. Driver assist stuff is great.
My Hyundai actually has great lane keep assist, emergency braking, cross traffic alert and braking, plus radar cruise. It will disengage lane keep and radar cruise if you don’t put your hands back on the wheel after alerts.
Maybe it's because many buyers are younger? Or Tesla appears more techy and futuristic? Or maybe it's the weird approach to marketing Musk is taking. It's a bit reckless, IMO.
I don’t really care if someone dies being an idiot and ignoring warnings. I do care if they slam into another driver that has no choice but to share the road with a Tesla.
> Model Y also received a leading score of 98 percent in Euro NCAP's Safety Assist category. This result was achieved with Model Y vehicles equipped with Tesla Vision, our camera vision and neural net processing system that now comes standard in all Tesla vehicles delivered in North America and Europe. This score was a result that many did not believe was possible without using radar.
(own four Teslas of varying vintages, from 2018 to 2021, and have driven over 120k miles on Autopilot)
What you are referring to is the safety rating, which nobody is disputing; however, it did not prove "vision" superior. The car is certainly safe, but that's thanks to its structure, not to vision.
Vision replaced the front-facing radar for AEB and TACC (traffic-aware cruise control). The safety tests included automatic emergency braking, for which Vision provides the forward-travel signal. Automated lane changes were always vision-based.
That, and user-initiated lane change automation (fully automatic lane changes for navigation or speed management are apparently in beta, per Tesla's online manual [1]).
IMHO, that's kind of the root of the problem. As individual features, they're not super exciting, but when you call it Autopilot, people think it does more than it does (which is fairly true of airplane autopilot as well).
I mean, the lane changes and navigation use are a pretty big jump from the Hyundai system, which is comparatively dumb.
A Hyundai would not be able to attempt to drive me home, while you can get into a Tesla, enter an address, and hope for the best.
Calling it Autopilot is for sure part of the problem. The average individual is dumb and won't ever open a manual or read any prompts. But then again, the prompts are there for legal reasons.
> 40k people a year in the US die in auto accidents already, and we collectively pay for that. Now Tesla pays for it when humans make poor risk choices.
No, we individually pay for that. I pay an insurance premium every year to cover the cost if I do significant damage.
It's not about fault. It's about aligning free-market incentives to allow self-driving cars, and to allow them to become safer. It's a model that works well across many industries: you set liability at actual cost, so companies can't externalize costs, but you also don't impose punitive liability. The market then works things out.
1) The denominator and numerator need to match up.
$40B is the total cost assuming self-driving cars had 100% market penetration and were self-driving 100% of the time. It's not an estimate of damages this year.
If you look at a snapshot today, Autopilot is responsible for O(300) accidents per year, leading to around $300M in potential damages. That's on the order of 5% of their profit, and under 1% of their revenue.
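The same sketch for the snapshot, with the assumed figures above (O(300) accidents, the same $1M per incident):

    autopilot_accidents_per_year = 300     # assumed order-of-magnitude estimate
    damages_per_accident = 1_000_000       # assumed: same $1M figure as above, USD
    potential_damages = autopilot_accidents_per_year * damages_per_accident
    print(f"${potential_damages / 1e6:.0f}M per year")   # $300M in potential damages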
2) Market cap is quite literally a bank. I can always borrow from the market by releasing more shares, or do the reverse by buying them back.
> 2) Market cap is quite literally a bank. I can always borrow from the market by releasing more shares, or do the reverse by buying them back.
Here's an idea: become the CEO of a large-cap company and try that. If you do, please let us know which one; I'd love to short your stock just before you do it, because I would make a literal killing.
The moment you do that, your stock will tank by significantly more than the amount you just diluted, and you would be fired by your board the very next day.
Modern stock prices are nowhere near the book value of the company. This is especially true for Tesla, which by many people's estimation is quite overvalued; a move like that would just tank the stock.
No one is proposing Tesla be liable for every accident on the road... By the time they have 100% market penetration, their market cap will be much higher.
Nevertheless, the impact on Tesla's market cap isn't clear. If a self-driving car costs $1,000 more and saves me $1,000 on insurance, it's a wash for the buyer on net. Tesla can quite literally pass the cost on to the buyer in the purchase price, and the buyer gets it back in insurance premiums.
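As a sketch of that pass-through from the buyer's side (both $1,000 figures are the hypothetical ones above):

    car_price_increase = 1_000    # hypothetical: liability cost baked into the sticker
    insurance_savings = 1_000     # hypothetical: premium drop, since liability is now Tesla's
    net_change_for_buyer = car_price_increase - insurance_savings
    print(net_change_for_buyer)   # 0: total cost of ownership is unchanged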
Personally, it'd be an upside to me. My costs would be the same, but my stress, risk, liability, etc. would go down. I don't want to kill someone, and if I do cause an accident, I don't want the headache. If other people are like me, sales (and market cap) might go up.