Access logs that are written continually from different processes or threads (as with PHP) can cause system IO to become a serious bottleneck. In any non-trivial enterprise application it's completely reasonable to have a stat-logic layer for recording events... The access log can still do its own thing - this is even more true if you have more than one server.
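As a rough sketch of what such a layer might look like in Node (the recordEvent name and the one-second flush interval are just assumptions, not anything prescribed): events get queued in memory and flushed to disk in one batched append, instead of one synchronous write per request.

    // Minimal sketch of an in-process stat-logic layer (hypothetical names).
    const fs = require('fs');

    const buffer = [];

    function recordEvent(name, data) {
      // Queue the event in memory instead of writing immediately.
      buffer.push(JSON.stringify({ name: name, data: data, ts: Date.now() }));
    }

    // Flush the batch once per second with a single append,
    // rather than touching the log file on every request.
    setInterval(function () {
      if (buffer.length === 0) return;
      fs.appendFile('events.log', buffer.join('\n') + '\n', function () {});
      buffer.length = 0;
    }, 1000);

With more than one server you'd point the flush at whatever central store you use, but the buffering idea is the same.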
It does seem potentially problematic if you don't want to expose server logic, especially since the developer may not know what software will ultimately be serving the static files over HTTP.
As a newbie webdev, what would you suggest? Making an API-based web application (where the backend serves and receives JSON and the front end simply plays around with it)? Or am I completely misunderstanding this?
That's a reasonable way to build an application, and one that I personally prefer, but there isn't a silver bullet. That said, if you're building a Node app, your server JS and your client JS don't need to commingle: make a src/js/client directory and stick your package.json in there (if you want to use this tool).
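Purely as an illustrative sketch (the names and layout are assumptions, not something the tool requires), the split might look like:

    my-app/
      package.json          # server-side dependencies only
      server.js
      src/
        js/
          client/
            package.json    # client-side dependencies for the browser build
            index.js        # entry point for the client bundle

The top-level package.json stays purely server-side, so nothing meant for the browser leaks into it.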
Agreed; this is why our solution was to use Node.js and achieve parallelism through eventually consistent, independent data stores on each physical web-server application instance. I'll gladly switch to a distributed, centrally managed system when one becomes available.