
This might just be a detail, but the author seems to think their phone's CPU is doing some heavy lifting when they search something on Google (in this case, why erasers are pink). If I'm not mistaken, even the voice recognition happens on a server: the phone just records the audio and uploads it. Interpreting the query and returning a result have nothing to do with the device itself either.

"Google has to be smart enough to figure out in context that I said pink and that I’m asking about the historical reason for the color of erasers, not their health or the way they’re shaped. And it did. In less than a second. With nothing more than a cheap little microprocessor and a slow link to the internet."
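To make the division of labor concrete, here's a toy sketch (not Google's actual pipeline; the function names, formats, and the recognized query are all invented for illustration). The point is how little the phone-side code does:

```python
# Illustrative sketch of the client/server split in voice search.
# Everything here is hypothetical: a real backend runs large acoustic
# and language models; this stub only shows where the work happens.

def phone_side(audio_bytes: bytes) -> dict:
    """All the 'cheap little microprocessor' does: package raw audio
    and some metadata for upload over the slow link."""
    return {"audio": audio_bytes, "encoding": "pcm16", "sample_rate_hz": 16000}

def server_side(request: dict) -> str:
    """Stand-in for the server fleet: speech-to-text, query parsing,
    and ranking all happen on this side of the wire."""
    assert request["encoding"] == "pcm16"
    # A real recognizer would decode request["audio"]; we hard-code
    # the example query from the article.
    return "why are erasers pink"

query = server_side(phone_side(b"\x00\x01" * 8000))
print(query)
```

The phone's contribution fits in one function; everything the author calls "smart" lives on the other end of the request.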



Not a detail at all: the author seems to mistake "the whole Google cloud system works as expected, roughly in line with the 80/20 adage" for "the phone in my pocket is independently capable of intelligent, contextful, humanlike reasoning."

Google just needs to see where the previous million people went after asking this particular question; odds are good that the million-and-first person will want that same answer. It's a clever trick all right, and a marvel of engineering, but no robots: this is mostly human behavior collected, averaged, and parroted back. (Yes, there is machine search at the base of it, but unlike AltaVista before it, the relevance of the results is shaped by its human users.)
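The "million-and-first person" idea can be sketched in a few lines. This is a deliberately naive illustration with made-up click data and URLs, not how Google actually ranks; it just shows relevance emerging from aggregated human choices:

```python
from collections import Counter

# Hypothetical click log: (query, result the user actually clicked).
# All data here is invented for illustration.
click_log = [
    ("why are erasers pink", "pink-pearl-history.example"),
    ("why are erasers pink", "pink-pearl-history.example"),
    ("why are erasers pink", "eraser-health-myths.example"),
]

def rank_by_clicks(query: str, log: list) -> list:
    """Order results purely by how often earlier users chose them."""
    counts = Counter(url for q, url in log if q == query)
    return [url for url, _ in counts.most_common()]

print(rank_by_clicks("why are erasers pink", click_log))
```

No model of erasers, pinkness, or history anywhere: the top result is simply whatever most previous askers settled on.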





