
The probability that a Stochastic Parrot returns coherent reasoning seems vanishingly small.



Are you saying that GPT is not a stochastic parrot, or that GPT is not returning coherent reasoning?

Because if it's the latter, the evidence is rather against you. People like to cherry-pick examples where GPT gets reasoning wrong, but it gets it right often enough, millions of times a day, that people keep using it.

And it's not as if humans don't get reasoning wrong. In fact the humans who say GPT can't reason are demonstrating that.


Why do you say that? You don't think stochastic pattern matching can feature reasoning as an emergent property? I do.

A stochastic parrot doesn't just mimic things totally randomly. It reinforces what it's seen.
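As a toy illustration of that point (a sketch only, not how GPT actually works), here's a bigram "parrot" that picks the next word in proportion to how often it has seen that word follow the current one, i.e. it reinforces observed patterns rather than choosing uniformly at random:

    # Toy bigram "parrot": samples the next word weighted by how often
    # it followed the current word in the training text.
    import random
    from collections import defaultdict

    text = "the cat sat on the mat and the cat sat still".split()

    follows = defaultdict(list)
    for prev, nxt in zip(text, text[1:]):
        follows[prev].append(nxt)

    word, out = "the", ["the"]
    for _ in range(6):
        options = follows.get(word)
        if not options:
            break
        word = random.choice(options)  # duplicates weight the choice
        out.append(word)
    print(" ".join(out))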


I keep getting surprised at how a large chunk of HN's demographic seemingly struggles with the simple notion that a black box's interface reveals surprisingly little about its contents.

I'm not saying that GPT-4 is reasoning or not, just that discounting the possibility solely based on it interfacing to the world via a stochastic parrot makes no sense to me.


Isn't "reasoning" a functional property though? If from the outside it performs all the functions of reasoning, it doesn't matter what is happening inside of the black box.

Here's a silly example I thought of. We can ask whether a certain bird is capable of "sorting". We can place objects of different sizes in front of the bird, and we observe that the bird can rearrange them in order of increasing size. Does it matter what internal heuristics or processes the bird is using? If it sorts the objects, it is "sorting".
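To make that test concrete (the names here are just illustrative), you could decide whether something "sorts" purely from its input/output behavior, never looking inside the box:

    # Behavioral test for "sorting": only inputs and outputs matter.
    # black_box stands in for the bird, a model, or anything else.
    import random

    def black_box(items):
        return sorted(items)  # internals are irrelevant to the test

    def behaves_like_sorting(fn, trials=1000):
        for _ in range(trials):
            xs = [random.randint(0, 100) for _ in range(random.randint(0, 10))]
            if fn(list(xs)) != sorted(xs):  # same items, ascending order
                return False
        return True

    print(behaves_like_sorting(black_box))  # True

If the box passes on every input we throw at it, calling it a "sorter" is just describing its behavior.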

To me, it seems perfectly obvious that GPT-4 is reasoning. It's not very good at it and it frequently makes mistakes. But it's also frequently able to make correct logical deductions. To me this is all stupid semantic games and goalpost-moving.


> Isn't "reasoning" a functional property though? If from the outside it performs all the functions of reasoning, it doesn't matter what is happening inside of the black box.

Yes, that's my point exactly.


Replace forb and tworby.

How common is that pattern? I would expect it to be quite common. So if the model can do some replacement, it could solve the problem just by swapping in the right words.



