
> I really hope so. LLM for search seems like a big leap right now. It isn't even really search but just a UI change for interacting with the underlying search engine.

To question the underlying premise: what makes you so sure that an improved interface for searching _isn't_ meaningful? From what I can tell, we're a lot closer to optimal collection/categorization of data in search engines than we are to optimal interfaces for searching that data. We've all seen stories like the grandmother typing questions into Google with "please" and "thank you" (https://www.theguardian.com/uk-news/2016/jun/16/grandmother-...), and the concept of having good "google-fu" shows that right now, being able to find the answer to your question depends not just on whether the answer exists but on whether you're skilled at _asking_ the question.

I don't think this improvement is limited to non-technical folks either. As an example from just the past couple of days: I've recently been running into issues with Linux gaming on my laptop due to abysmal power management, and from doing some research, it's a known problem with my laptop's brand and model. So I decided to research which laptops are known to be good for Linux gaming and also fit my specific preferences (at least 1440p, 16 GB or more RAM, and an AMD CPU/GPU for good measure, since my issues trace back to Nvidia's weirdness on Linux).

I spent a good hour or two finding specific models that seemed promising, searching for mentions of them in places like /r/linux_gaming, and looking up availability and prices. While I found a few potentially decent options, I didn't have much confidence that I was finding all of them. I found some laptop-specific search sites that purportedly let me filter on whatever criteria I wanted (e.g. noteb.com, notebookcheck.net), but none of them let me pick the _exact_ criteria I wanted. Some were too granular (e.g. making me search for and check off exact GPU models instead of letting me say something like "discrete AMD GPU from 2021 or later", or giving me a list of 30 or so resolutions to check off one by one without even supporting shift-click bulk selection), and some were not granular enough (e.g. only letting me select a single resolution at a time, or letting me require a discrete GPU but not specify the vendor).

On a whim, I opened a session with ChatGPT and presented it with these criteria to see what it came up with. I needed to nudge it to prune a bit (occasionally it would give me a clearly incorrect option, e.g. one with an Nvidia GPU or only 1080p), but within a few messages it had generated dozens of options. Unfortunately, its knowledge only runs through 2021, and despite trying the various roleplaying tricks people used to circumvent the "I can't search the internet" policy back when it first became available, I wasn't able to get it to finish the job completely, so I could only use those options as a guide for looking up newer models and then finding reviews from people who had used them for Linux gaming. If/when a language model like that, with access to current search data, is made generally available, it genuinely seems like it would be a game-changer.
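
For what it's worth, the exact filter I wanted is trivial to express as a predicate. Here's a rough Python sketch, assuming a hypothetical catalog of laptop spec dicts with made-up field names; the problem on every one of those sites was the interface and data model, not the logic:

    # Rough sketch of the exact filter I wanted, assuming a hypothetical
    # catalog of laptop spec dicts (no real site exposes its data this way).
    laptops = [
        {"model": "Example 14", "resolution": (2560, 1440), "ram_gb": 16,
         "cpu_vendor": "AMD", "gpu_vendor": "AMD", "gpu_discrete": True,
         "gpu_year": 2022},
        # ... rest of the catalog
    ]

    def matches(spec):
        return (
            spec["resolution"][1] >= 1440    # at least 1440p
            and spec["ram_gb"] >= 16         # 16 GB or more RAM
            and spec["cpu_vendor"] == "AMD"
            and spec["gpu_vendor"] == "AMD"  # dodge Nvidia-on-Linux weirdness
            and spec["gpu_discrete"]
            and spec["gpu_year"] >= 2021     # discrete GPU from 2021 or later
        )

    candidates = [s["model"] for s in laptops if matches(s)]
    print(candidates)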



I'm not downplaying the improved interface at all. I think it's an extremely impactful change. But it isn't completely sorted out right now, and the Google search box cannot be replaced by an LLM chatbot. I totally relate to your laptop example since I've experienced something similar myself. The main advantage of an LLM/GPT is distilling something complex into a much simpler form, with the ability to ask follow-up questions and maintain context.

In fact, your laptop example is a perfect illustration of my point. Finding a laptop for Linux gaming is extremely complex. Let's not kid ourselves. The number of things that can go wrong (especially with an Nvidia GPU) is bonkers - my machine completely nukes the display manager every time I update Debian, forcing me to reinstall SDDM. But the problem here isn't fundamentally search. We know what we're looking for and the exact criteria. Like you said, the problem is collecting and categorizing the data and presenting it to the user in a helpful manner. This is a digital version of a computer salesman who actually knows their job, except LLMs are salesmen who know about a lot of things. I'm just extending this to teaching and the knowledge industries and saying, "Look, if you can present information about computers this well, you can tell me about heat death a lot better too."



