I've certainly seen cases where self-driving cars behaved in a way I would consider reckless if a human were at the wheel, but that did not involve a collision, did not result in injury, and that I expect will never appear in any statistics reported to a government body.
A few weeks ago I saw one blow a red light. Nobody hurt or killed, so it wasn't reported to any responsible agency. It might not have even been tallied by the company, if the car thought there wasn't a problem. But its action was clearly unsafe.
It's hard to get a grip on the problem when the data is so faulty.
Someone posted an example of a self-driving Cruise car that appeared to run a red light: https://www.reddit.com/r/sanfrancisco/comments/14wyyzw/just_.... Although in that example, technically the self-driving car was guilty of entering the intersection without sufficient space on the other side, rather than of crossing the limit line while facing a red light.