It wasn't immediately clear what "relevant" meant, but I don't doubt they have thought deeply about that.
My real comment, though, is that to some extent, the more accurate the results are in terms of "relevance", the worse the outliers get, and the worse the user's understanding of how the query went wrong.
For better or worse, the nice thing about keyword matching and boolean queries is you understand exactly why something did or didn't match. If your results are coming from a neural network optimized for relevance, it's much tougher to form a mental model of how the search responds and to adjust it.
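For what it's worth, here's a toy sketch of what I mean by "you understand exactly why something did or didn't match" (Python, hypothetical helper names, not any real search engine's code):

```python
# A minimal sketch of why keyword/boolean matching is easy to reason about:
# a document matches "recipe AND borgelnuski" iff both terms literally appear,
# so a missing or unexpected result always has a one-line explanation.
def matches(doc: str, required_terms: list[str]) -> bool:
    words = set(doc.lower().split())
    return all(term.lower() in words for term in required_terms)

def explain(doc: str, required_terms: list[str]) -> str:
    words = set(doc.lower().split())
    missing = [t for t in required_terms if t.lower() not in words]
    return "matched" if not missing else f"missing terms: {missing}"

doc = "a very good recipe for borgelnuski"
print(matches(doc, ["recipe", "borgelnuski"]))  # True
print(explain(doc, ["recipe", "kangaroo"]))     # missing terms: ['kangaroo']
```

With a learned relevance score there's no equivalent of that `explain` step you can hold in your head, which is the mental-model problem.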
(This is not a new idea, nor is it specific to Google, but I think it does offer an explanation of how Google has reduced irrelevant results even as there are more and more complaints about search getting worse. Nobody will complain that a keyword search didn't do what they wanted...)
It seems to me that they achieve this by increasing the relevance of common, frequently made searches at the expense of rare, specialized ones. Google used to be very good for narrow searches on technical topics.
Lately when I search for something like "recipe for borgelnuski" I get a page of links to sites with names like "Molly and Audrey" that first tell you a long story about their grandmother's pet kangaroo, then go through a very long and tedious explanation of the ingredients, then vaguely go through the recipe step by step with a lot of pictures, but if you don't mind scrolling for 20 or 30 minutes, you get a very good recipe for borgelnuski.
It seems that when I do a narrow technical search lately, I get a page of links about funeral homes and insurance companies in Haskell, Texas.
> Lately when I search for something like "recipe for borgelnuski" I get a page of links to sites with names like "Molly and Audrey" that first tell you a long story about their grandmother's pet kangaroo, then go through a very long and tedious explanation of the ingredients, then vaguely go through the recipe step by step with a lot of pictures, but if you don't mind scrolling for 20 or 30 minutes, you get a very good recipe for borgelnuski.
So what you're saying is Google took you to a page that had a very good version of the thing you were searching for?