I think it very much depends on the language and platform/environment.
I've been programming professionally for nearly 20 years, in languages ranging from Perl to C/C++ to JavaScript, and the only times I've used GDB in the last 15 years have been when C/C++ code is segfaulting and I want a stack trace. The rest of the time I use print/printf statements. I've built some pretty successful software this way, some of which you've probably heard of or used.
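For what it's worth, the segfault case really is that quick; here's a contrived sketch (the file and function names are made up for illustration):

    /* crash.c -- a contrived example that segfaults by dereferencing NULL. */
    #include <stdio.h>

    static int length_of(const char *s) {
        int n = 0;
        while (s[n] != '\0')   /* crashes here when s is NULL */
            n++;
        return n;
    }

    int main(void) {
        printf("%d\n", length_of(NULL));
        return 0;
    }

    /*
     * Typical session:
     *   gcc -g crash.c -o crash
     *   gdb ./crash
     *   (gdb) run
     *   (gdb) bt     <- prints the stack trace: length_of() called from main()
     */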
On the other hand if I were coding exclusively in C/C++, or Java, or using an IDE for my work, I might use a debugger more.
I think the use of the word printf makes it pretty clear which languages we're discussing.
I'm getting up there too. And I know lots of older programmers with very bad habits, many of whom have also worked on household names. Age+popularity != skill/quality.
Doing that in anything but interpreted languages or micro applications seems like a massive waste of time and effort (placing the printfs, recompiling, removing them, and recompiling again).
We've all done it. That doesn't mean it's a good way to do it.
And like I said above, it can introduce bugs into your code. I remember one case where a printf added for debugging changed the memory layout in such a way that an uninitialized variable happened to work. After testing, when the printfs were removed, the application stopped working.
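A minimal sketch of that failure mode, with invented names; since it relies on undefined behavior, whether you actually see it depends on the compiler, optimization level, and platform:

    /* heisenbug.c -- how a debug printf can change what an uninitialized
     * variable appears to hold. This is undefined behavior by definition. */
    #include <stdio.h>

    static void leave_value_on_stack(void) {
        int x = 42;          /* written to a stack slot, never cleaned up */
        (void)x;
    }

    static int read_uninitialized(void) {
        int y;               /* BUG: never initialized */
        return y;            /* may happen to reuse the slot that held 42 */
    }

    int main(void) {
        leave_value_on_stack();
        /* printf("debug\n"); */  /* uncommenting this call can overwrite that
                                     stack slot, changing what the buggy read
                                     returns -- adding or removing the printf
                                     changes the program's observable behavior */
        printf("%d\n", read_uninitialized());
        return 0;
    }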
Introducing code into your code base with the intention of removing it later, when there is an easy way to avoid doing so, is asking for trouble.
Once you learn how to use a debugger properly, it is infinitely better than printfs.
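For example, instead of adding a printf and recompiling, something like this (hypothetical file, line number, and variable name) does the same job without touching the code:

    gcc -g app.c -o app
    gdb ./app
    (gdb) break app.c:123      # stop where the printf would have gone
    (gdb) run
    (gdb) print some_counter   # inspect state without recompiling
    (gdb) watch some_counter   # break again whenever it changes
    (gdb) continue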