
Medical was just one example, replace with anything you like.

As another example, you can give the AI a photo of something and have it name what that thing is. Then you can look the thing up by its name on Google to see if it matches. Much easier than describing the thing (plant, tool, etc.) to Google.




Having the wrong information can be more detrimental than having no information at all. In the former case, confident actions will be taken. In the latter case, the person will be tentative, which can limit the blast radius of bad decisions.

Imagine the average person confronted with this:

  sudo rm -rf /
Which is the better situation: having no idea what it does, or believing it does something else entirely?
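For context, here is a sketch of what each piece of that command means, plus the one safeguard modern systems add (the breakdown is mine, not from the thread; the `--no-preserve-root` guard is specific to GNU coreutils and should not be relied on elsewhere):

```shell
# Breakdown of "sudo rm -rf /" -- do NOT run the real thing:
#   sudo : run the command as root, so no permission check stops it
#   rm   : remove files
#   -r   : recurse into directories
#   -f   : force; never prompt, ignore missing files
#   /    : starting at the filesystem root, i.e. everything
#
# GNU rm refuses the literal "/" unless --no-preserve-root is given.
# Checking that the guard exists is safe to run:
rm --help | grep -- --no-preserve-root
```

So on a GNU/Linux box the bare command usually fails with "it is dangerous to operate recursively on '/'", but on other systems (or with the extra flag) it really does delete everything.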


The process I'm suggesting is:

1. You have a complex or vague question that you can't search easily via Google etc

2. You ask the AI and it converts that to concrete searchable suggestions (in this case "sudo rm -rf /")

3. You search "sudo rm -rf /" to check the answer.

Step 3 is designed to (hopefully) catch this kind of problem.
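The three steps above can be sketched as a tiny script: step 2's output (the concrete phrase the AI suggested) is URL-encoded into a search you can open yourself for step 3. The helper name and the use of Google's `q=` parameter are my illustration, not anything from the thread:

```shell
#!/bin/sh
# Step 2 produced a concrete, searchable phrase; step 3 checks it
# against independent sources instead of trusting the AI directly.
suggestion='sudo rm -rf /'

# Percent-encode the phrase with jq's @uri filter (assumes jq is installed).
encoded=$(printf '%s' "$suggestion" | jq -sRr '@uri')

# Print the verification URL to open in a browser.
printf 'https://www.google.com/search?q=%s\n' "$encoded"
```

The point is that the AI is used only to turn a vague question into a checkable string; the answer itself still comes from the search results.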



