The logging advice is really, really dumb. Having logs in a real database lets you do incredible things (e.g. correlating user feedback with random problems we've noticed). We use our db-driven logs many times a day to debug issues and find patterns.
Don't be afraid of putting logs in a database and then figuring out how to scale it later. It's not that hard, and it's very worth it.
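To make that concrete, here's a minimal sketch in Python with SQLite (the table and column names are made up for illustration; any RDBMS works the same way) of what a structured log table buys you: "correlating feedback with problems" becomes a plain join.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE logs     (ts TEXT, level TEXT, user_id INTEGER, event TEXT, detail TEXT);
        CREATE TABLE feedback (ts TEXT, user_id INTEGER, message TEXT);
    """)

    def log(level, user_id, event, detail=""):
        # structured fields instead of a single syslog-style text line
        conn.execute("INSERT INTO logs VALUES (datetime('now'), ?, ?, ?, ?)",
                     (level, user_id, event, detail))

    log("ERROR", 42, "checkout_failed", "card declined")
    conn.execute("INSERT INTO feedback VALUES (datetime('now'), 42, 'checkout is broken')")

    # errors logged in the hour before a user complained -- just a join
    rows = conn.execute("""
        SELECT f.user_id, f.message, l.event, l.detail
        FROM feedback AS f
        JOIN logs AS l ON l.user_id = f.user_id
        WHERE l.level = 'ERROR'
          AND l.ts BETWEEN datetime(f.ts, '-1 hour') AND f.ts
    """).fetchall()
    print(rows)  # [(42, 'checkout is broken', 'checkout_failed', 'card declined')]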
Well, he advised using tools like Splunk, which can do all that for you. But I'm pretty sure those use a database internally, so it's conflicting advice, I'd say.
Splunk, last I checked, used PostgreSQL for its backend. Maybe I worded it badly, but my point was aimed more at users who have a two-column log table with a timestamp and a syslog-style line of text. The problem with "advice" posts like these is that you can't cover every possible scenario without confusing less experienced readers. Do I think Loggly or Splunk Cloud should store their logs as flat files? Hell no, their app IS logging. But in the general case of application architecture I don't think it's appropriate or terribly useful for most. I know they say "never say never", but I felt it was justified when I believe it's a bad idea in 99% of apps.
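For contrast, a quick sketch of the two-column table described above (again SQLite, with hypothetical names): with only a timestamp and an unstructured line, every question degenerates into string matching, which a flat file and grep handle just as well.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE logs (ts TEXT, line TEXT)")
    conn.execute("INSERT INTO logs VALUES (datetime('now'), "
                 "'ERROR checkout_failed user=42 card declined')")

    # with no structured fields, every query is a LIKE over text --
    # i.e. grep with extra steps
    rows = conn.execute(
        "SELECT ts, line FROM logs WHERE line LIKE '%user=42%'").fetchall()
    print(rows)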
It's really more about using the right tool for the job: Splunk, Loggly, NoSQL solutions, etc. are a MUCH better fit than an RDBMS for this sort of work.
Makes sense; it's been a long time since I installed or had shell access to a Splunk box. I just remember it needing PG installed. Likely that's changed over the years.