
For a human, it takes human reasoning. But a xerox machine can also output the correct answers given the right inputs, which is exactly what you can say about an LLM.

The "attribute of importance" I'm referring to is "rationality". You keep talking about it like it means something but you can't define it beyond "I'm pretty sure this text was made using it".

Does a tape recording of a bird song "know" how to sing like a bird?



Those aren't good analogies. An LLM isn't like a xerox machine or a tape recorder. Again, the answers to the bar exam it passed weren't in its training data. Nor was the code it wrote for me.

I'm using the common, colloquial definition of reasoning. I don't think we need an academic treatise to say that passing the bar exam (without copying the answers) or writing code for a novel task requires reasoning.

You're right that we don't fully understand how the LLM is doing this, but that doesn't mean it isn't happening.


Thank you, yes, for acknowledging that I'm right that the evidence is lacking, which was precisely my original point.


The evidence isn’t lacking :) We have lots of evidence. What we lack is a coherent theory that explains the evidence.



