I suspect this is because Google's search engine rarely returns more than 40 results for any query these days. Sure, it says "X million results", but if you actually start to scroll, you soon run out.
Yes, but they include a box for news on the first page. I assume 90% of people who search for Google/Facebook/YouTube intend to go to the site itself. And if they want specifics they just type in the additional keywords.
So this would be:
Facebook trustworthiness
Facebook stock
Facebook app
Facebook account setup help
I guess Google is no longer a "search engine", as in doing a keyword search across multiple webpages and returning the matches, but an AI-powered answer engine that guesses what you want based on your keywords and only returns those results.
While you could potentially justify it by invoking some ill-defined query intent detection, X returning fewer results than X+Y will always be surprising to me.
But why? That is the naïve assumption from the context of primitive search techniques, but in the space of actual answers I think it's doing the right thing. Just give the user what they asked for.
I think this approach is probably in line with Google's, or at least it seems like it. Trying to intuit what I really want instead of paying attention to what I actually searched for is also one of the big reasons why I find Google search to be terrible.
Often I'm surprised at how I just seem to be wired differently from others - if I wanted any of the above, I'd probably search for them instead of just "facebook".
facebook stock (though I of course know the ticker symbol is FB, so I'd just type that into Yahoo Finance and it would know how to handle that)
edit: I guess it's a reflection of my distrust of Google/most (all?) large companies - I really don't like it when they guide me in any direction other than what I've specified because I'm pretty sure it's some dark pattern designed to relieve me of my money.
I dislike needing to be this verbose with search engines, and I hate it with voice assistants too. I'd rather have a little inference than have to be explicit every time.
I don't really understand why that distrust doesn't seem to extend to wanting them to display a single result for something, as if you can trust that single result to be what you want?
I think search should also output links to some Facebook scandals: Cambridge Analytica, Clearview, research papers on how Facebook affects children, the speech-rules dramas, the Facebook Myanmar scandal.
I think one would have to explicitly search for these terms for anything like it to appear.
If not, then search should not be used to discover new topics.
On that note: "nobody" (nobody commercial; no providers) wants "organic" results anymore. They want "filtered", algorithmic, "suggested" results and the like. Besides, if their "secret sauce" algos are as good as they indeed are at determining what will hit you best, with the best likelihood of "engagement" (whatever that means for the system in question), any effort put into anything beyond the very first few results is an inefficiency and adds cost, so...
The sad part is, with them down-ranking HTTP-only sites, and with all the spam farms, going 10+ "pages" in, e.g. 100+ links, used to be an easy way to get to real results.
Then they added infinite scrolling, and oh well!
They should add a "this site hasn't changed in a decade" filter, or an "only show HTTP results" option, for people searching for older stuff.
Google takes these weird stances, like making it impossible to find legacy sites (HTTP sites) because they want to push encryption (fine, but don't hurt users!).
Or refusing to add zoom + reflow to Chrome (super simple to do, just use a virtual DPI), showing how little they care about those with vision issues, or the aged. Why? Repeatedly they've stated it is to "punish" sites not updating for mobile.
Well, even mobile sites are hard to see for some. What a ridiculous, absurd, stupid response.
The only people they are punishing are the sight-impaired and the aged. Thanks, Google.
It's not extremely weird when they have stated both of those reasons, multiple times, in press releases and bug reports.
I am well aware of where their profits come from, but when going after them in the public sphere, you must address and refute the reasons they publicly cite.
Also note, often there are multiple reasons. By ripping down their stated reasons, showing the absurdity of them, and also demonstrating how it literally hurts accessibility for those with visual disabilities, you doubly show them to be an organization filled with uncaring individuals who care not for accessibility.
Personally, I wouldn't want to work in a place that wouldn't put in a wheelchair ramp. I wouldn't want to work with people that accept that. Yet that's Google.
Google wants you to be only focused on a small set of results, mostly the ads at the top of the results. They'd rather you refine your search giving them more data and more chances to show you top ad spots than let you scroll/next through results.
Yes, Google has made the Internet boring for years now.
How many results would you expect for terms like "war", "peace", "elon musk", "bill gates"? I would expect billions. Google has the data. So why does it not want to share it with you?
- One reason might be that you will not be interested in articles from 2011... but that is quite sinister to me. Why should Google decide whether I want to see results from around the Bush administration?
- A second reason is that some links are not safe for Google. Google will not show you results for emulation, because it might violate some patents or IP. It plays nice to the point that it destroys the Internet.
- The other thing is that someone mentioned to me it is easier to maintain a small list of links, which you can blocklist. For political reasons it is difficult to show the world as it is.
- Google is a Potemkin village that showed you a really good result as the first link while trimming all the remaining links, redirecting traffic from small sites to corporate overlords, since it made good deals with them.
- In an era where the Internet is full of walled gardens, I think even Google might struggle to scrape everything from Facebook or Amazon.
It is sinister. The internet is just not a good resource for reliable information. There are some databases that are good resources, but they are not on the public internet. The few good public databases like Archive.org are being litigated into oblivion. The powerful people in the US have decided that they want the internet to be like the old network news.
For a brief moment in time, the internet was pretty good. Not anymore. It is a place to go to be censored, monitored, exploited, and to receive your maintenance level of corpo-gov propaganda. The "social" features are just there to create illusions of real activity and for various "let 1000 flowers bloom" type propaganda operations in which lonely people are baited into stating extreme positions before they become victims of a 1984 five minute hate session.
Yeah, you have to click "Tools" to see the number of search results and the time it took for Google to retrieve the information. Again, idk why they hid that.
Those counts have always been wildly inaccurate, to the point that engineers on the team were embarrassed to be displaying them, but product people felt it was important to the user experience anyway so kept it. Nice to see the engineers finally getting a win there.
(I worked on Google search like 15 years ago. I'm assuming they haven't found a way to make the estimate any more accurate, since it was seen as intractable at the time.)
Previously, in recent years, when not "logged in", total results were capped at around 250. With the "AI" hype, it has now dropped even further. Perhaps more people will notice. With some searches, today I get less than 20 results. This is highly amusing given the claims about the size and scope of Google's index. These are not obscure queries. I am 100% certain the terms appear in hundreds if not thousands of web pages. Then I try the same query on a Bing-based search engine and I see the results Google is hiding.
In the early days web search engines had no such filtering; I used to browse page after page of results to discover nuggets of useful information. Back then, I wonder how many people would have believed the claims about how many pages were allegedly searched unless they could actually browse deep into the results.
If you hit the last page on the pagination bar, they just keep concatenating new page indexes at the end, so that it effectively grows infinitely large...
If by "pretty apt" you meant "really effective into keeping their users scrolling mindlessly and contributing to the 'doomscrolling' effect", yep. Seems to be pretty great for them.
The funny thing is that for us who don't use Instagram or Twitter they won't let us scroll past a screen height without asking to be logged in...
If users want to scroll endlessly, they should be able to, and they're going to, regardless of what scrolling implementation a platform uses - I scroll through many pages of HN just fine without infinite scroll.
But infinite scrolling never worked properly on browsers. With infinite scrolling, it just forgets everything once you navigate out of the page. To resume from where you were, you have to scroll and scroll and scroll and load the results all over again, even if you were a single page away when you hit the back button. I've now been conditioned to open almost every link in a new tab because of this, and it drives me crazy. This, and the fact that going back a page is so slow nowadays for whatever reason.
You're just using bad tools for that. With a custom AutoPagerize UserJS mod that utilizes pushState, endless scrolling has worked like a charm for me for years. I can scroll to thousand-numbered items on HN seamlessly. It remembers the exact page you were on after you close the tab, and it overrides the "target" property of the added pages' links to open everything in new tabs regardless, preventing accidental page reloads. IDK why developers don't reproduce this.
Kinda sad that more and more websites have switched to a pointless AJAX app model recently, and it stops working for those.
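Roughly, the idea looks like this (just a sketch of the approach, not my actual userscript; the selectors and scroll threshold are assumptions based on HN's markup):

```typescript
// Sketch of an AutoPagerize-style mod: fetch the next page, splice its rows in,
// force its links to open in new tabs, and record progress with pushState so the
// URL survives reloads. Selectors ('.athing', 'a.morelink') are assumptions.
let loading = false;

async function appendNextPage(): Promise<void> {
  const moreLink = document.querySelector<HTMLAnchorElement>('a.morelink');
  if (!moreLink || loading) return;
  loading = true;

  const html = await (await fetch(moreLink.href)).text();
  const nextDoc = new DOMParser().parseFromString(html, 'text/html');

  // Open appended links in new tabs so a click never discards the scrolled page.
  nextDoc.querySelectorAll<HTMLAnchorElement>('a[href]').forEach(a => { a.target = '_blank'; });

  // Splice the next page's rows into the current list.
  const list = document.querySelector('.athing')?.parentElement;
  nextDoc.querySelectorAll('.athing').forEach(row => list?.appendChild(document.importNode(row, true)));

  // Record the page we have scrolled into, so reload/back restores roughly the same spot.
  history.pushState({ page: moreLink.href }, '', moreLink.href);

  // Point the "More" link at the page after the one just loaded.
  const nextMore = nextDoc.querySelector<HTMLAnchorElement>('a.morelink');
  if (nextMore) moreLink.href = nextMore.href; else moreLink.remove();
  loading = false;
}

// Trigger near the bottom of the page.
window.addEventListener('scroll', () => {
  if (window.innerHeight + window.scrollY >= document.body.scrollHeight - 800) {
    void appendNextPage();
  }
});
```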
I don't think Twitter even lets you scroll at all, these days.
Anyway, "HN yells at cloud" was a fairly predictable low value comment. No, that's not what I mean. I mean "pretty apt" insofar as pagination would add no value. The order and count are changing so rapidly that infinite scroll is the obvious choice.
The order should not be changing - that’s another user-hostile invention designed to encourage doomscrolling. Increasing count is a non-issue because once you reach a content piece you saw previously you know where you are (same as any bulletin board).
People use these apps because of their personalized feeds. Saying the apps shouldn't have that, or calling it anti-user, seems pretty misaligned to what the users actually do on them. The apps are not bulletin boards.
Anyway, again - infinite scrolling is apt for their use cases. We can say that and also all frown at the concept of doomscrolling afterwards.
No, it's just plain shit. When you have to interrupt your work you have to start all over again, with zero indication of where you were. Idiots come up with things and more idiots copy them and laud it as some kind of secret sauce. While it's just plain idiocy.
Remember how they used to have links at the bottom of pages? If you scroll fast enough many implementations still retain them. That was one bandwagon people were quick to thoughtlessly jump on and even YouTube was half-assing it last I checked.
I love infinite scroll, and hate pagination. All lists should be some combination of these: infinite scroll, filterable, sortable. Pagination should never be involved. If I want to get to something on page 2, I scroll. For anything else, I’m going to sort and filter.
If the thing I want is on page 17, and I see 1, 2, 3…79…159, 160, 161, I’m still just going to want to filter. The absolute best pagination is set up in a way that lets you binary search drill down for what you’re looking for, but even that is something I’d almost never prefer over filtering
hard disagree. i had an experience just this morning looking for some pictures from an event i participated in, and the infinite scrolling was absolutely infuriating. they didn't have an index to filter on, and when i clicked on a picture to download it and navigated back, it took me to the top of the page. i had to scroll through about a dozen loading indicators to get back to where i was. sure, this was a bad implementation, but adding it to every single list of results on the web is asinine trend chasing and bad UX.
The Google search page faux-infinite scroll is not just clumsy, it also breaks "back" navigation if you click through, then back, after loading more results.
(edit: to clarify, I have the continuous scroll setting off, so it's "incremental infinite scroll" where More Results loads more results inline)
> Google said this change is to allow the search company to serve the search results faster on more searches
I'm surprised that this makes enough of an impact: in order to find the top 10 results the engine needs to retrieve and rank a much larger number anyway (I'd guess at least 100 make it to the final stages of the funnel), and that is where virtually all the cost is. That initial set is probably held in some cache so that subsequent page loads don't re-do the search from scratch.
So either this is a small win in frontend efficiency, or continuous scrolling is fast enough that a large fraction of users goes past the initial set?
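To illustrate the funnel shape I mean (a toy sketch with made-up names and numbers, not Google's actual pipeline):

```typescript
// Toy sketch of a two-stage retrieval funnel with a cached candidate set.
// Every function, number, and structure here is hypothetical.
type Doc = { id: string; score: number };

// Hypothetical stand-in for index retrieval; in reality this would hit an inverted index.
function retrieveCandidates(query: string, k: number): Doc[] {
  return Array.from({ length: k }, (_, i) => ({ id: `${query}-${i}`, score: Math.random() }));
}

const candidateCache = new Map<string, Doc[]>(); // query -> fully ranked candidate set

function search(query: string, page: number, pageSize = 10): Doc[] {
  let ranked = candidateCache.get(query);
  if (!ranked) {
    // Stage 1: cheap retrieval of a few hundred candidates.
    // Stage 2: expensive ranking of that set -- this is where virtually all the cost is.
    ranked = retrieveCandidates(query, 300).sort((a, b) => b.score - a.score);
    candidateCache.set(query, ranked);
  }
  // If the ranked set is cached, serving page 2+ is just a cheap slice of it.
  return ranked.slice(page * pageSize, (page + 1) * pageSize);
}
```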
I suspect the problem is that the most expensive to serve users go past the initial set.
Yet I believe most of those users are the lowest value users - many of them will be bots and web scrapers, whilst others might be real humans doing long tail queries that have no relevant ads to serve.
I'd pay extra both to serve and to receive incredibly niche ads that would only show up for long tail queries.
That was the original promise of all the ad tracking, wasn't it? Outrageously accurate ad targeting? Reach the weirdos in your tribe wherever they are?
Now it just feels like drowning in a sea of things a bunch of giant corporations have decided everyone ought to want, and have been programming people to want en masse, rather than helping people find the unique, interesting little things they would actually want were it not for all the programming.
Caching the retrieved results for future follow-up page 2 queries is nontrivial. It's also not clear if it's worth it; the fraction of requests going to page 2 is in the low percentages at best, maybe even <1%.
So what you typically do instead is to just issue a new query and request 20 instead of 10 results, drop the first 10, and voila, there is your page 2.
I do not know if this is how Google web search does it, though.
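In pseudo-code, that stateless approach is just the following (runQuery is a made-up stand-in for the backend call, not a real API):

```typescript
// Sketch of the stateless "request more and drop the prefix" approach described above.
type Result = { url: string; title: string };

// Hypothetical backend call, stubbed so the sketch is self-contained.
async function runQuery(query: string, limit: number): Promise<Result[]> {
  return Array.from({ length: limit }, (_, i) => ({
    url: `https://example.com/${encodeURIComponent(query)}/${i}`,
    title: `Result ${i + 1} for "${query}"`,
  }));
}

async function getPage(query: string, page: number, pageSize = 10): Promise<Result[]> {
  // No per-query cache: ask for enough results to cover the requested page...
  const results = await runQuery(query, (page + 1) * pageSize);
  // ...and discard everything before it. Wasteful per request, but simple and
  // stateless -- cheap overall if only a tiny fraction of users go past page 1.
  return results.slice(page * pageSize);
}
```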
> So either this is a small win in frontend efficiency, or continuous scrolling is fast enough that a large fraction of users goes past the initial set?
Yeah, it doesn't really make sense. Infinite scroll is pages tacked onto each other in the frontend - the backend should be doing the same amount of work for page X, except less overall because it's not returning the header/footer. I don't think frontend efficiency works either because the first page of results should be the same - or, possibly, the tiniest bit more work because now they have a footer to return as well.
They have infinite scroll for the YouTube search results, but the irony is that the top search results are often not the ones you are looking for, so you have to scroll somewhere to the middle to find useful results (or try a different query or filters), and the bottom ones are something like related/similar videos from that topic.
YouTube's search is the worst. It's basically "here's 3 videos potentially related to what you're searching for and here's another 50 completely unrelated videos".
Search engines are an engineering problem. Tuning ranking and relevance to be “good” on a mass-market content site is an unholy black art.
The expectation is that results will accomplish numerous mutually contradictory goals, and be deeply personally relevant to a user who has used the search bar twice ever.
Expecting to see the original video by the creator when you search for the exact title of that video, verbatim, is not a self-nullifying mysterious goal.
Sure, but that’s a subset of what a search bar does. For YouTube, they are very likely getting queries like “funny cat,” “Taylor swift,” “prank,” etc. Those are not factual spearfishing queries, and ranking the results is a pitched battle between the people who want to drive general engagement, the people who want to promote specific categories, the people who are targeting demographic cohorts and want to push those results, the monetization people, etc etc.
YouTube takes thousands of data points and signals to rank search results, and there are millions of videos to search and rank, so it's quite a hard problem to solve. Initially I wasn't a fan of vertical search engines, but I think a vertical video search engine for particular niches would be a good idea because YouTube is slacking.
I would assume the reason is money, barring some counter evidence. Some videos must lead to longer sessions with higher engagement on average. Even for a single view session some videos will have more ads and/or more profitable ads in them.
I actually liked continuous scroll and thought it was a good change, but I guess Google figured out that SEO people got upset because they could no longer brag: "We can make your website show up on the first page of Google search results."
Bring back the plain old google search without any of the AI! All I want is regular keyword search! I don't get any relevant results these days without adding additional qualifiers like site:reddit.com!
This doesn't just hide the AI panels, but also the big knowledge graph panels that appear at the top when e.g. some movie happens to have a name similar to our search query.
When it was introduced, it necessitated an additional step (having to scroll) when wishing to ctrl+f the search results page (I presume most of us have the number of results set to 100, the maximum).
It’s not the same as infinite scroll - apparently, it’s quite recent:
"Google launched continuous scroll on desktop search in December 2022 and first in mobile search in October 2021. So continuous scroll only lasted about a year and a half for desktop search."
> Google also told us that it found that loading more results automatically didn’t lead to significantly higher satisfaction with Google Search.
I suspect that the low satisfaction with Google Search has less to do with UI/UX issues such as this and more with the quality of search results. But I guess bringing back pagination is easier...
Counterpoint: the flexible approach makes things more surprising and difficult to reason about, both for technical people and laymen. The principle of least surprise is more important than marginal improvements in edge case handling.
That's okay. I'm using duckduckgo anyways.
Either it just gives me better results or I got so used to it over the years that I can't make good google-compatible queries anymore.
I find DDG better than Google in general, but it's annoying getting local results for unrelated things inserted. For example, if I search for "protonvpn opensuse", then on page 2 I get results for a local charity, a state political party, and a local cannabis dispensary.
Edit to add: I do not get those results when using Bing directly.
This is something that I’ve pondered from time to time. I used to have fantastic Google-fu (or alternately, Google used to be fantastic, and it was them that had the fu.) I switched to DDG in 2017 or something, and initially my search results (for whatever reason) weren’t as good. After a few months I found that they were pretty good, and more so, any time I dropped back to Google when I got stuck, the results were hot garbage. I put this down to losing my Google-fu and simultaneously acquiring DDG-fu, but I do wonder…
From an SEO/visibility perspective, I feel like there will be a race to be on the first page yet again. I am sure it got switched to "first n results" or something, but fun to be back I suppose.
Yeah, the cynic in me wonders if this is to drive more traffic to page-1 links (which, incidentally, are often mostly 'sponsored' links). Maybe too many people had learned that hitting pgdn a few times yielded better search results.
E.g. a search for "facebook" only has 68 results!