I'm pointing out that calling anything "AI" is both pointless and meaningless. It's a buzzword for board members and shareholders to throw around: they use it to mean the latest LLM technology, while the phrase can describe any complex business logic generated by a program.
It’s generally accepted to mean the use of neural networks which Tesla is obviously using. Good luck even identifying a stop sign with “complex business logic” or “if/else”
Most important road signs have rather distinct shapes, standardized sizes and are angled towards oncoming traffic. Having an object with known shape aligned almost perfectly towards the camera is basically the best case for many primitive object detection algorithms.
True, but it’s equally important that a self-driving car be able to recognize a stop sign that is bent from a previous accident and facing an arbitrary angle (as well as one that is angled towards the car’s lane but applies to a different road).
And stop signs that have been altered in some way. For example, rural stop signs that are peppered with holes from pot shots must still be recognized. Snowy stop signs with the bottom half obscured by accumulated drift. Signs with a non-red sticker reading “WAR” placed below the word “STOP”.
And that’s not even getting into cases where you conditionally act like there’s a stop sign. The city of Houghton, MI has major streets along the side of a hill, and minor streets going up and down the hill. Every winter, sand is put down for traction, and every spring it is cleaned away. If there’s a late-season snowstorm after the spring cleaning, cars going downhill on the minor streets physically cannot stop, so everybody on the major streets looks uphill before crossing.
Short of location-dependent fine-tuned models, I’m not sure how machine learning could replicate the logic of “if snowy in late spring, grant right-of-way to cars headed downhill”.
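The unwritten rule in that Houghton example is trivial to hand-code and very hard to learn from data. A sketch of it as explicit logic (the function name, month encoding, and parameters are all hypothetical, just to illustrate the point):

```python
def grant_downhill_right_of_way(month, fresh_snow, sand_cleaned):
    """Hypothetical local rule from the Houghton, MI example: after the
    spring sand cleanup, a late-season snowstorm means downhill traffic
    on the minor streets cannot stop, so it should get right-of-way.

    month: 1-12; fresh_snow / sand_cleaned: booleans."""
    late_spring = month in (4, 5)  # assumption: April-May counts as "late spring"
    return late_spring and fresh_snow and sand_cleaned
```

Three lines of `if`-logic for a human, but a learned model would need training examples of this exact local situation to ever reproduce it.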
They're "artificial neural networks", and it would seem to me they recognize stop signs by comparing them to stored images of stop signs. So I tend to lean toward "AI" being the latest buzzword. In truth I think it's more akin to a search engine reacting to inputs, but from sensor data, than anything close to real "intelligence" of any kind.
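The "search engine reacting to inputs" view can be sketched as nearest-template lookup: compare an incoming feature vector against stored examples and return the closest match. The template names and three-number features here are invented for illustration, not how any real system encodes signs.

```python
import math

# Toy "templates": feature vectors standing in for stored sign images.
# Hypothetical features: [redness, blueness, octagon-ness].
TEMPLATES = {
    "stop":        [1.0, 0.0, 0.9],
    "yield":       [0.8, 0.0, 0.1],
    "speed_limit": [0.1, 0.1, 0.0],
}

def classify(features):
    """Return the stored template closest to the input features -
    recognition as retrieval, not reasoning."""
    return min(TEMPLATES, key=lambda name: math.dist(TEMPLATES[name], features))
```

Whether this counts as "intelligence" or just lookup-at-scale is, of course, the whole argument.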
I can see how it appears to be intelligent, but it lacks reasoning, creativity, and critical thinking.
“Creative” doesn’t necessarily mean “generating new behavior”, but can also mean “generating new hypotheses”. Suppose you see a group of young kids playing in a yard. One tosses a ball up into the air, and the rest run towards it. The first to reach the ball throws it back into the air, and the rest run toward it again.
It requires creativity to recognize the rules of the game as “try to be the first to reach the ball”, to recognize that the thrower may not have time to carefully aim, and that the others might chase the ball regardless of its location. Only once all three of those creative leaps are made can logical deduction take over to conclude “if the ball goes in front of me, stop before a kid does the same”.