>That GPT-3 is just vomiting stuff up out of its training set and not producing any new knowledge?

Hmm. Seems obvious to me that it's producing new output, but that output isn't knowledge and can't be.

Sometimes ChatGPT tells me something that turns out to be correct and relevant. And I get excited, and then I Google it and what it told me is the first hit on Stack Overflow.

There's a subtle point here. Other people might say "well, ChatGPT is OK, but no better than Google," or something like that, but I differ. The key is that I don't know the answer came from Stack Overflow until I check independently. So it's giving ChatGPT too much credit to say it's as good as Google: the amount of verified information it can output is not lower-bounded by its training set, but is actually zero, because every correct answer sits alongside an unbounded amount of BS that, by its nature, always requires an external mechanism to separate out.
