Hacker News

I'm desperately looking forward to, like, 5-10 years from now, when all the "LLMs are going to change everything!!1!" comments have all but abated (not unlike the blockchain hype of ~10 years ago).

No, LLMs are not going to replace compiler engineers. Compilers are probably one of the areas least likely to profit from extensive LLM usage in the way you are thinking, because they are principally concerned with correctness, and LLMs cannot reason about whether something is correct — they can only predict whether their training data would be likely to claim that it is correct.

Additionally, each compiler differs significantly in the minute details. I simply wouldn't trust the output of an LLM to be correct, and the time wasted on determining whether it's correct is just not worth it.

Stop eating pre-chewed food. Think for yourself, and write your own code.



I bet you could use LLMs to turn stupid comments about LLMs into insightful comments that people want to read. I wonder if there’s a startup working on that?


I'm screenshotting this; let's see who's right.

Actually, your whole point about LLMs not being able to detect correctness is just demonstrably false if you play around with LLM agents a bit.


A system outputting correct facts tells you nothing about that system's ability to prove the correctness of facts. You cannot assert that property of a system by treating it as a black box. If you are able to treat LLMs as a white box and prove correctness about their internal states, you should tell some very important people; that is an insight worth a lot of money.


As usual, my argument brought all the people out of the woodwork who have some obsession about an argument that's tangential. Sorry to touch your tangent, bud.


> LLMs not being able to detect correctness is just demonstrably false if you play around with LLM agents a bit.

How is pointing out that this method of determining correctness is incapable of doing so only tangential?


Correctness and proven correctness are different things. I suspect you're a big Rocq Prover fan.


