Hacker News

Nice - I've used the product before, but noticed it sometimes gives hallucinated answers if I ask something for which there's no good Google result. Is this something you plan on addressing soon?



I think that's kind of the problem with these tools, lol. There's no obvious solution to this. Automatically fact-checking an AI model would probably require a bigger and more sophisticated AI model.

E: That said this does look sick


We've tried to mitigate this recently. Does it still happen with Expert mode? If you have any examples, please send them my way and I'll take a look at how we can address them.


I've found this happening with Expert mode, especially when using a chatting-style prompt. For example:

https://www.phind.com/search?cache=f017634d-e354-4795-ae6e-d...

I've had similar results when thanking Phind after a chat thread.


Phind isn't designed to do small talk. It's very results-oriented. Saying "Hello" and "Thank you" doesn't really do anything.





