Adding printf statements means recompiling or at least rerunning the code. A good debugger lets you set breakpoints against your app while it runs and have the breakpoint log a message instead. So even if all you want is to log messages at certain spots, a good debugger works better (to say nothing of stepping through program execution across long sequences of source files, which is even handier when the source is some open source component you didn't write).
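For example, here's a minimal lldb sketch of that idea (the file name Widget.m, the line number, and the count variable are hypothetical, and the session is assumed to already be attached to the running app). The breakpoint prints a value and continues, so it acts like a printf you can add, move, or remove without touching the source:

    (lldb) breakpoint set --file Widget.m --line 42
    (lldb) breakpoint command add 1
    > p count
    > continue
    > DONE

In Xcode the same thing is a breakpoint with a "Log Message" action and "Automatically continue" checked.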
I agree that printf works better than "blindly running around setting breakpoints", but so does pounding a 12-pack of beer and then hitting yourself in the face with a mallet until the solution comes to you.
On the other hand, learning to use a modern, advanced debugger and then applying that skill along with some of that logical thinking can be extremely effective in finding the cause of bugs as quickly as possible.
For languages like Objective-C or Java, I think the 'real men don't need a debugger, bro' attitude is nuts. The situation is different for a lot of newer and/or more dynamic languages, which simply don't have debuggers anywhere near as powerful as IntelliJ's or Xcode/lldb's.