Hacker News

If your system is objectively right, but also objectively causing accidents with humans... well, you won't fix the humans.



> If your system is objectively right, but also objectively causing accidents with humans.

It's not causing accidents in these cases. The humans are.


I mean "cause" in the causal sense, not the blame sense: accidents that would not have happened without the system.





