This is an area where it's easy to be too credulous. If you want "iPhone 1 of self-driving cars", Google has been tooling around in them for a while now. They do a great job under ideal conditions, on a sunny day. The future already happened!
It simply isn't enough to post a video of a car driving in light traffic on a beautiful, California afternoon. That isn't what makes the problem hard. Mess up with your video-based system in the dark, or in bad weather, or...lots of people will die.
This isn't web development. You don't risk lives with your beta release. (Which, not incidentally, is why they haven't released anything other than a video.)
Google's cars are not iPhone 1. They still have just a few dozen, or a few hundred, who knows, and they have only driven about 2 million miles so far. They are not even for sale and won't be any time soon. Or ever, if they don't bring their own electric cars to market very soon or hook up with a car manufacturer.
Tesla can collect data from 5 billion miles of driving within a year of the Gigafactory reaching full operation, if production hits the planned 500,000 cars in 2018 and each car drives 10,000 miles annually. And the miles keep compounding as the fleet grows.
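To make the arithmetic explicit, here is a back-of-the-envelope sketch; the production and per-car mileage figures are the assumptions stated above, not official Tesla numbers:

    # Back-of-the-envelope fleet-mileage estimate.
    # Assumptions from the comment above, not official Tesla figures:
    cars_produced = 500_000          # planned 2018 production
    miles_per_car_per_year = 10_000  # assumed annual mileage per car

    fleet_miles_per_year = cars_produced * miles_per_car_per_year
    print(f"{fleet_miles_per_year:,} miles/year")  # 5,000,000,000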
Here is my guess:
After a few million miles of shadow driving, i.e. a few weeks after the first few thousand new Teslas have been sold, Tesla is likely ready for the real thing under perfect driving conditions like those Google enjoys in California. So Tesla will probably enable the self-driving capabilities in such geographic areas, during the day, when the weather is nice.
After a few hundred million miles of shadow driving, i.e. within half a year or so, once Tesla has sold 100,000-200,000 cars that have each driven for a couple of months, Tesla will have collected enough data to handle more difficult traffic conditions, like other American urban areas.
After a billion miles, around a year after Tesla has gotten the Gigafactory up to full speed and the first 200,000-300,000 cars have been on the streets for half a year, Tesla is probably ready for certain bad-weather conditions, like rain, light snow, light fog, and for driving in the dark.
Give it another few billion miles, so still before 2020, and the streets of Istanbul or the snowy roads of Nebraska will have been tamed.
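To put rough numbers on that ramp, here is an illustrative sketch of how cumulative fleet miles could grow; the monthly production figure and per-car mileage are my own illustrative assumptions, not Tesla's plans:

    # Rough cumulative fleet-mileage model.
    # All figures below are illustrative assumptions, not Tesla data.
    monthly_production = 10_000            # cars added to the fleet each month
    miles_per_car_per_month = 10_000 / 12  # ~833 miles, from 10,000 miles/year

    fleet_size = 0
    cumulative_miles = 0
    for month in range(1, 25):             # two years of production
        fleet_size += monthly_production
        cumulative_miles += fleet_size * miles_per_car_per_month
        if month % 6 == 0:
            print(f"month {month}: {fleet_size:,} cars, "
                  f"{cumulative_miles:,.0f} cumulative miles")

Note the growth is quadratic while production is flat: each new batch of cars keeps contributing miles every month after it ships.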
Not sure why you keep talking about the Gigafactory as if it's relevant to self-driving cars. Your only point can be summarized in one sentence: they've released a sensor suite into the field, and through some combination of Big Data and magical AI, they will win. This, as much as anything else, is a religious belief.
Ultimately, this game is about the sensors. If your sensor suite is inadequate to detect every conceivable bad situation a car can be in, then it doesn't matter how much data you accumulate. One million miles of good LIDAR data is worth more than a billion miles of video, if the video can't see white objects during the daytime, or figure out on what side of the parking lot the car should be driving (ahem).
I have no idea how good or bad their sensors are, but instinct tells me they're not up to the task of all-condition driving, and this video provides no useful information either way.
Production capacity is the most important factor for data collection.
The Teslas sold in October 2018 don't need to have the same sensors as the ones sold in January 2018. As you can see, the car is designed modularly. No other mass-marketed car on the planet has a self-driving sensor suite integrated yet.
It's all about big numbers. Google has spent 5 years or more driving a few cars around California. It's never going to match billions of miles driven around the globe.
EDIT: Imagine the engineers working on Google's autonomous cars today. I bet many of them are preparing their resumes for Tesla. At Google, they can launch an algorithm or a new sensor type, wait a few months, and then get the feedback on how it worked. At Tesla, they would get that feedback in days. In which lab would you want to work if you want to take part in shaping the future? In the Google lab, where you have a Lexus or two assigned to you, driving up to 25 miles per hour in California, or in the Tesla lab, where you have 2,000 Teslas assigned, driving in 50 different countries?
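A rough rate comparison, using only the figures already in this thread (the per-car daily mileage is my own assumption):

    # Illustrative data-collection-rate comparison; approximate figures
    # from this thread, not official numbers.
    google_miles = 2_000_000
    google_years = 5
    google_miles_per_day = google_miles / (google_years * 365)  # ~1,100

    tesla_fleet = 2_000                # the 2,000-car example above
    tesla_miles_per_car_per_day = 30   # assumption: ~10,000 miles/year
    tesla_miles_per_day = tesla_fleet * tesla_miles_per_car_per_day  # 60,000

    print(f"Google: ~{google_miles_per_day:,.0f} miles/day")
    print(f"Tesla (2,000-car example): ~{tesla_miles_per_day:,} miles/day")

Under those assumptions, even a small assigned fleet collects data roughly 50x faster than Google's entire program to date.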
Once again: if the sensors don't work for the problem domain, the quantity of data gathered is irrelevant.
If you mounted a forward-facing video camera on every car in the world and gathered the data for decades, you'd still be nowhere: you're missing the side and rear views. This is a thought exercise, but it demonstrates the point. If your robot car has a blind spot, all the data in the world won't fix it.
Nobody knows how good these cameras are, but every camera-based system so far has had the same critical limitations: they don't work well at night or in poor visibility.
But we know that the sensor suite covers at least what a human has: more vision, plus other sensors. Therefore we know the sensor suite is sufficient to be as good as (and likely better than) a human.
It's not. For example, the eye has much higher dynamic range than any camera sensor available today. Try taking a picture at night that looks half as decent as it does in your head.
> but instinct tells me they're not up to the task of all-condition driving
From a sensor perspective, what does a human have that is missing from the new Tesla suite?
The sensors cover the whole spectrum of light humans can see, plus a lot more. There are also ultrasonic sensors.
Human eyes have significantly higher dynamic range than cameras though, which is important when you have scenes with both dark and brightly illuminated areas.
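For a rough sense of the gap, here is a small sketch comparing dynamic range in stops and decibels; the stop counts are commonly cited ballpark values, not measurements of Tesla's cameras:

    import math

    # Ballpark dynamic-range figures (commonly cited, not Tesla specs):
    # a good camera sensor covers roughly 12-14 stops in a single exposure,
    # while the adapted human eye reaches roughly 20+ stops.
    def stops_to_db(stops):
        # Each stop doubles the light ratio; dB = 20 * log10(ratio).
        return 20 * math.log10(2 ** stops)

    for name, stops in [("camera sensor", 13), ("human eye (adapted)", 20)]:
        print(f"{name}: ~{stops} stops = ~{stops_to_db(stops):.0f} dB")

Seven extra stops is not a small difference: it's a 2^7 = 128x wider ratio between the darkest and brightest things you can resolve at once.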
Once again, I'm far from being an expert in this field.
That was my understanding for the general case too, but I was assuming that cameras were available that could at least approach human dynamic range.
This isn't web development. You don't risk lives with your beta release
First: why do you always need to assume web dev is stupid, or less impactful? Imagine a bank website showing 10,000 <your currency> less in your account. I myself have paid several customers a penalty/reimbursement when they suffered because of our website. And I am sure there are websites/apps that are even more critical.
Second: you are assuming they have done a half-assed job and are hurrying to release. If you look at the earlier thread[1], several people have explained how Tesla has hundreds of millions of kilometers of their customers' driving footage, which they have used to train their AI/ML algos. So I won't be surprised if these cars can do well enough in the dark.
Sebastian Thrun, one of the pioneers of self-driving cars, has recently turned toward self-driving cars that use just cameras and AI. In his own words[2], he says that we humans also have just our two eyes, so cameras should do the job well.
Well, for starters, the only thing Tesla has "released" here is a video. So there's that. I don't know if they hurried the release of the video. The real product launch is the sensor suite...and those haven't shipped in any quantity, yet.
As for AI, there's no magical "algo" that solves the problem of an all-black video frame. Having more black video frames than anyone else doesn't exactly solve that problem. See also: rain, snow and fog.
Finally, nobody said that web developers are stupid. Calm down. But it's completely fair to note that however much money you lose for your customers with half-baked beta releases, you're still not killing them. The stakes are higher here. The expectations of proof are higher as well.
there's no magical "algo" that solves the problem of an all-black video frame
Neither can human eyes. I perhaps should have used the word 'night' rather than 'dark'. So my understanding is that it should basically work at night as well.
And I'm totally calm, not angered :-). Just trying to make my point, because I don't like arguments like 'this is not rocket science' or 'this is not web development'.
> First: Why do you always need to assume web dev is stupid. Or less impactful. Imagine a bank website showing 10,000 <your currency> less in your account.
The impact of that, terrifying as it may be, is still significantly less than the impact of a self-driving-but-buggy vehicle into the side of a semi.
I was arguing against the trivialization of web dev ("this isn't web development"), while I agree that a self-driving car is among the most safety-critical systems there are. But that said, people can suffer just as much, if less visibly, from financial app bugs. I shudder to think of the state of people who lost a lot of money in the hacks of MtGox and other Bitcoin exchanges, for example.
There was no net impact in that accident. It replaced three deaths which would have occurred without autopilot, so it was part of a net gain in lives.
Perhaps that particular human was special and worth saving over the other two who were saved instead, but I am not aware of any reason to believe that.
Only after a particularly tortured interpretation of highway safety statistics.
If you exclude pedestrian deaths (few jaywalkers on the interstate), motorcycle deaths, deaths in conditions where autopilot cannot be used, and deaths due to collisions of lighter, less safe, older, or poorly-maintained automobiles, then regular highway death statistics start looking better than autopilot-enabled ones. [1] [2]
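To see why the choice of baseline matters, here is a minimal sketch of the rate comparison; the numbers are illustrative placeholders, not the actual figures from [1] or [2]:

    # Illustrative fatality-rate comparison; numbers are placeholders,
    # NOT the actual figures from the cited sources.
    autopilot_miles = 130e6   # miles with autopilot engaged (illustrative)
    autopilot_deaths = 1

    # The baseline you compare against flips the conclusion:
    baselines = {
        "all US road deaths": 1.25,          # deaths per 100M miles (approx.)
        "comparable cars/roads only": 0.5,   # hypothetical adjusted rate
    }

    for name, rate_per_100m in baselines.items():
        expected = autopilot_miles / 100e6 * rate_per_100m
        print(f"{name}: expected {expected:.2f} deaths "
              f"vs {autopilot_deaths} actual")

Against the all-roads rate, one death in that many miles looks like a win; against an adjusted rate for comparable cars on comparable roads, it looks like a loss.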
Again, there's a great gap between the hype of Tesla, and the reality.
AI/ML algos cannot magically guarantee safety; for example, deep nets are known to be very non-robust: https://arxiv.org/abs/1312.6199. We are not yet at the point where AI/ML-based algos can be guaranteed to be safe.
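As a tiny illustration of that non-robustness, here is a minimal sketch of the fast gradient sign method (from follow-up work by Goodfellow et al., not the linked paper itself), assuming a trained PyTorch image classifier; it's a toy demonstration, not anything specific to driving systems:

    # Minimal FGSM adversarial-example sketch (assumes PyTorch and a
    # trained classifier `model`; toy illustration only).
    import torch
    import torch.nn.functional as F

    def fgsm_attack(model, image, label, epsilon=0.01):
        image = image.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(image), label)
        loss.backward()
        # A tiny, human-imperceptible perturbation in the direction of the
        # gradient's sign can flip the network's prediction.
        perturbed = image + epsilon * image.grad.sign()
        return perturbed.clamp(0, 1).detach()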
I bet there is a clause that it's only usable in perfect conditions. I don't doubt there is a ton of information being sent back to Tesla for machine learning, and they have the capability to send software updates to the cars. When you have hundreds of thousands of these cars driving around, they should be able to gather enough information and tweak the design to perfect the whole thing.
If I had one of these cars, I would not put 100% faith in this technology yet. But after 5 years, who knows? I may finally be able to drink more than one beer at a bar with friends and have my car drive me safely home.
Which brings up an interesting point: traffic violations. What will local police do once self-driving cars drive safely?
If we assume self-driving cars will be an on-demand service, they could be offered intermittently: only in good weather, during the day, in cities (or even on routes) already mapped. Even with that limitation, it could be a pretty valuable service, if complemented by the right ride-sharing services.
And even when purchasing a car, one that "only" drives itself when it's safe would be great for people in the right city, even if that's only 40% of the time.