
Here's the thing: you don't need AGI to have an alignment problem. An LLM is just predicting text, which turns out to be surprisingly well aligned in most cases, but it can still fail spectacularly. As the model grows more capable and people hand it more real-world responsibility, the blast radius of its mistakes grows with it. That may not be an AGI-level problem, but it's still a big problem.


