
I wouldn't bet on that study. ChatGPT hallucinates. It can beat doctors, but at the same time it will confidently present wrong information.

Doctors at least make a best effort.




I don't care about best effort. I care about getting the correct diagnosis. GPT-4 can already offer insightful information, even if it's just by augmenting the doctor's diagnostic efforts. Not using it is just dumb.


I hate doctors, but at the same time you have to look at reality. Not every decision needs to be data-driven; the qualitative aspects of hallucinations are very real and should not be ignored. I'm sure you know the hallucinations that come out of ChatGPT can get wild. Definitely use it, but do so with caution.


> Doctors at least make a best effort.

lol. Just skim the CFS thread for counterexamples.


Eh, best effort in terms of not killing you or harming you too much... that's what I meant.

With ChatGPT hallucinations, anything goes; you know that machine has no boundaries, so that has to be taken into account.



