I've called this out numerous times (and gotten downvoted regularly) as what I call the "Cult of Optimization",

aka optimization-for-its-own-sake, aka pathological optimization.

It's basically meatspace internalizing the paperclip problem and adopting it as a "good thing" to pursue; screw externalities and consequences.

And, lo and behold, my read on why it gets downvoted here is that a lot of folks on HN subscribe to this mentality; it is part of the HN ethos to optimize, often pathologically.

Love your point. "Lack of alignment" affects more than just AIs.