It just means you're not doing anything interesting in your job.
If you have constraints, e.g. not enough compute, not enough memory, not enough storage, then you'll need a solution, and odds are somebody has had the same problem before and it's already been written up in Knuth's TAOCP. Of course, the problem won't be in exactly the form it was written up, but you need enough experience and knowledge to recognize the abstract form of your problem in order to figure out which parts of TAOCP would be useful to you.
Want a concrete example? Well, I'm from the generation that got to assume storage had basically constant access time, regardless of whether you wanted to access the 1st byte or the Nth one. Newer high-performance memory like NAND flash is weird because it's fundamentally unreliable. To make NAND reliable you have to do stuff, and some of the stuff you can do makes NAND look like tape, and accessing tape is best done sequentially. Guess what TAOCP has? All these great ideas about maximizing random access to sequential media.
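To make the tape analogy concrete: one standard trick for getting random-access semantics out of a sequential medium is the log-structured store, where every write is an append and an in-memory index remembers where the latest copy of each key lives. Here's a toy in-memory sketch of that idea (my own illustration, not code from TAOCP or a real flash translation layer):

```python
import io

class LogStructuredStore:
    """Random-access writes turned into sequential appends.

    NAND flash, like tape, strongly prefers sequential writes, so
    log-structured designs append every update to the end of a log
    and keep an index mapping each key to its latest log offset.
    This is a toy sketch, not a real flash translation layer.
    """

    def __init__(self):
        self.log = io.BytesIO()   # stands in for the sequential medium
        self.index = {}           # key -> (offset, length) of newest value

    def put(self, key, value: bytes):
        offset = self.log.seek(0, io.SEEK_END)   # always append at the end
        self.log.write(value)
        self.index[key] = (offset, len(value))   # old copy becomes garbage

    def get(self, key) -> bytes:
        offset, length = self.index[key]
        self.log.seek(offset)
        return self.log.read(length)

store = LogStructuredStore()
store.put("a", b"hello")
store.put("a", b"world")   # overwrite: appends, never rewrites in place
print(store.get("a"))      # b'world'
```

The stale `b"hello"` bytes stay in the log until a garbage-collection pass reclaims them, which is exactly the kind of tape-era bookkeeping problem the old literature covers.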
Serious question - why wouldn’t you search for the algorithm you need (or the problem statement) on Google Scholar these days? I have the books, but if I ever found myself reaching for them at work, I would consult the most recent literature.
I read a lot of papers I find on Google Scholar, and it's very common that they are wrong, or don't accurately survey related work, or pointless. For an overview of a field or a subfield, you want a review paper, and TAOCP is the best review of algorithms I've found. It's true that some of the volumes are outdated, but in many areas the new work is incremental.
Consider, as an example, Fascicle 5b, Introduction to Backtracking. His citations in the first 25 pages are from 1899, 1918, 2017, 1957, 1967, 1963, 1900, 1947, 1882, 1848, 1849, 1850, and 1960. Had he somehow managed to write this chapter in 01968 instead of 02019 he would only have been missing one of them, though surely the illustrations and experimental results of his own he reports would not have been as excellent, and surely the overall framing of the section benefits from the additional 61 years of hindsight.
In fact, although enormous progress has been made in backtracking in recent years, it hasn't affected the introductory aspects of the question. But clearly you can learn an enormous amount about backtracking without straying into the literature of the current millennium.
Nowadays, most papers in CS are filler (or even worse, sometimes they're just fraudulent), written to meet arbitrary bureaucratic requirements. Sifting through that noise is really tedious.