Human vision is simply the conversion of visible light, sensed by our light sensors (aka eyes), into electrical signals, which our brain then assembles into an image and processes.
Given that, perceiving the environment with radar, lidar, visible light, infrared, and so on is equivalent to human vision.
As far as I'm aware, Tesla uses more than just visible light sensors. Am I wrong in my understanding?
Humans aren't capable of emitting an electromagnetic pulse, sensing the reflection, and using the time of flight to calculate the distance to an object. So, no, lidar and radar aren't equivalent to human vision even if you extend the idea of human vision over a wider range of the EM spectrum.
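For intuition, here's a minimal sketch of the time-of-flight arithmetic described above (the example numbers are illustrative; real units add pulse shaping, filtering, multipath rejection, etc.):

```python
# Time-of-flight ranging: emit a pulse, time the echo, halve the round trip.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_echo(round_trip_s: float) -> float:
    """Distance to the reflector, given the round-trip echo time in seconds."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2

print(distance_from_echo(200e-9))  # an echo after 200 ns -> object ~30 m away
```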
> Given that, perceiving the environment with radar, lidar, visible light, infrared, and so on is equivalent to human vision.
When paired with a general intelligence evolved over a couple of million years, and with visual sensors that have far more dynamic range than commercially available cameras, yeah.
The problem is that without those two things, Teslas crash into fire trucks.
And different bands of the EM spectrum have different properties, so eyes aren't equivalent to radar. Otherwise we would be able to see around corners.
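To put rough numbers on the "around corners" point: diffraction spread scales with wavelength. A back-of-the-envelope sketch, with illustrative numbers:

```python
import math

# Rough single-aperture diffraction spread: theta ≈ wavelength / aperture.
# Meter-scale radio waves bend visibly around building-sized gaps;
# visible light effectively doesn't, so it casts sharp shadows.
def spread_degrees(wavelength_m: float, gap_m: float) -> float:
    return math.degrees(wavelength_m / gap_m)

print(spread_degrees(3.0, 10.0))     # ~3 m FM radio wave, 10 m gap: ~17 degrees
print(spread_degrees(550e-9, 10.0))  # green light, same gap: ~3e-6 degrees
```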
And if humans had more sensory data, we would definitely be integrating it into our driving. Otherwise, ADAS tech wouldn't be so commonplace in 2023.
Park sensors, for example, typically use ultrasonic (sometimes radar) ranging, but they are not equivalent to human vision: they cover the back of the car, which the driver can't see, and they measure distance more accurately than a human can estimate it.
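Same echo-timing trick as the lidar/radar sketch above, but with sound, which moves about six orders of magnitude slower than light, so the timing is far less demanding. Illustrative constant (air, ~20 °C):

```python
# Ultrasonic park sensor ranging: same time-of-flight arithmetic, slower medium.
SPEED_OF_SOUND_M_PER_S = 343.0

def ultrasonic_distance(round_trip_s: float) -> float:
    """Distance to the obstacle, given the round-trip echo time in seconds."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_s / 2

print(ultrasonic_distance(0.003))  # a 3 ms echo -> obstacle ~0.5 m behind you
```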
I meant that they are equivalent in a very fundamental sense.
You have input captured by sensors and converted into a usable form by a processor. There is no fundamental difference between visible light and radio waves, per se.
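In that fundamental sense, every modality fits the same pipeline shape. A hypothetical sketch (the names here are made up purely for illustration):

```python
from typing import Callable, Protocol

# Hypothetical interface illustrating the "sensor -> processor" framing:
# the band (visible light, radar, ultrasound) doesn't change the shape.
class Sensor(Protocol):
    def read(self) -> float:
        """Return one raw measurement (e.g. a round-trip echo time)."""
        ...

def perceive(sensor: Sensor, process: Callable[[float], float]) -> float:
    # Input taken by a sensor, converted into usable form by a processor.
    return process(sensor.read())
```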
Thanks. I don't care to follow car news, so I wasn't aware of that.
Though it seems Tesla reincorporated radar in its post-2021 models, so the premise of this sub-thread is at best outdated and at worst a half-truth. Oh well.