
> Basically he underestimated the value of his experiences.

How can anyone here confirm that's true, though?

This reads to me like just another AI story where the user is already lost in sycophancy-induced psychosis and genuinely believes they're getting relevant feedback out of it.

For all I know, the AI was just overly confirming as usual.



He actually got the job he didn't think he could get.


Yeah, with an AI resume.

Are you missing the point, or do you genuinely consider LLM output a proof of merit?


I don't think I'm missing the point. Getting the job is real-world validation that cannot be explained by LLM sycophancy-inspired delusions.



