I'm honestly so confused how people use LLMs as a replacement for search. All chatbots ever find is data tangential to what I want (e.g. I ask for a source, it gives me a quote). Maybe I was just holding search wrong?
It should give you both - the quote should be attributed to where it was found. That's generally what people mean when they ask or search for "a source" of some claim.
As for the general point - using LLMs as "better search" doesn't really look like those Google quick AI answers. It looks like what Perplexity does, or what o3 in ChatGPT does when asked a question or given a problem to solve. I recommend checking out the latter; it's not perfect, but it's good enough to be my default for nontrivial searches, and more importantly, it shows how "LLMs for search" should work to be useful.
Some people expect LLMs to become part of a better "search".
LLMs should be integrated into search as a natural application: search results can depend heavily on lucky phrasing, search engines work through sparse keywords, and LLMs let you use structured natural language (not "foo bar baz" but "Which foo did a bar baz?") - which should be robust to variation in terms and exclude unrelated senses of those otherwise sparse keywords.
But it has to be done properly - understand the question, find material, verify the material, produce a draft reply, verify the draft against the material, maybe iterate...
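That loop is easy to picture in code. A minimal Python sketch, assuming hypothetical call_llm() and web_search() helpers (stand-ins for whatever model and search APIs you'd actually wire in - none of this is any specific product's implementation):

    # Sketch of the "LLM as better search" loop described above.
    # call_llm() and web_search() are hypothetical stand-ins for
    # whatever model and search APIs you actually use.

    def call_llm(prompt: str) -> str:
        raise NotImplementedError("plug in your model API here")

    def web_search(query: str) -> list[str]:
        raise NotImplementedError("plug in your search API here")

    def answer(question: str, max_iterations: int = 3) -> str:
        # 1. Understand the question: rewrite it into search-friendly queries.
        queries = call_llm(
            f"Rewrite as 3 search queries, one per line:\n{question}"
        ).splitlines()

        # 2. Find material.
        material = [hit for q in queries for hit in web_search(q)]

        # 3. Verify the material: keep only sources relevant to the question.
        relevant = [m for m in material
                    if "yes" in call_llm(f"Is this relevant to {question!r}? {m}").lower()]

        draft = ""
        for _ in range(max_iterations):
            # 4. Produce a draft reply grounded in the material, with citations.
            draft = call_llm(f"Answer {question!r} using only these sources, with citations:\n"
                             + "\n".join(relevant))

            # 5. Verify the draft against the material; iterate if unsupported.
            verdict = call_llm("Is every claim in this draft supported by the sources? "
                               f"Answer yes or no.\nDraft: {draft}\nSources: {relevant}")
            if verdict.lower().startswith("yes"):
                break
        return draft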
DuckDuckGo AI Assist is going in the right direction, imo. It pulls info from Wikipedia, uses math and map tools plus other web sources, and has been mostly accurate for me on the search page.
The chat option uses GPT-4o with web search and was able to provide links to colonial map resources I was curious about after falling down that rabbit hole. It also gave me general (and correct) present-day map links to the places I was looking for, on the map sites I asked for.
It did get confused a few times when I was trying to get present-day names of old places I had forgotten; like the Charles River in VA, for which it kept trying to send me to Boston or to Charles City Co. on the James River, telling me to look for it around there...
The York River Wikipedia page clearly says it was once the Charles River. Maybe I wasn't asking the right questions. For more unique things it was pretty helpful though, and saved me the endless searching-with-100-tabs adventure.
Some chatbots plan a query and summarize what a search returns instead of trying to produce an answer on their own. I use Perplexity a lot, which always performs a search; I think ChatGPT et al. have some kind of classifier to decide whether a web search is necessary. I especially use it when I want a suggestion without sifting through pages of top-ten affiliate listicles (why is there a list of the top 10 microwaves? I only need one microwave!)
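For illustration, a rough sketch of that routing, again with hypothetical call_llm() and web_search() stand-ins; using another LLM call as the classifier is my assumption here, not how any particular product is known to work:

    # Sketch of the "search or answer directly?" routing described above.
    # call_llm() and web_search() are hypothetical stand-ins, not real APIs.

    def call_llm(prompt: str) -> str: ...
    def web_search(query: str) -> list[str]: ...

    def respond(question: str) -> str:
        # A Perplexity-style assistant would skip this check and always
        # search; a ChatGPT-style one first decides if fresh data is needed.
        needs_search = "yes" in call_llm(
            f"Does answering this need current web information? yes/no: {question}"
        ).lower()

        if not needs_search:
            return call_llm(question)

        # Summarize what the search returns instead of answering from
        # memory, keeping links so each claim stays attributed to a source.
        results = web_search(question)
        return call_llm(
            "Summarize an answer from these results, citing each link used:\n"
            + "\n".join(results)
            + f"\n\nQuestion: {question}"
        )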
It's good to be shown direction. When I only have a vague idea of what I want, AI usually helps me frame it into searchable terms I had no clue existed.
I find LLMs are often better for X vs. Y questions, where search results are already choked by content-farm chaff. Or at least LLMs present more concise answers, surrounded by fewer ads and less padding. You still have to double-check the claims, of course.
Maybe that's because we're conditioned by the UX of search.
But another thing I find even more surprising is that, at least initially, many expected LLMs to give them access to some form of higher truth.