Firstly, web app vs web site isn't a binary distinction - it's a gradient. Is Flickr an application or a site? It provides extensive tools for uploading and organising photos... but the bulk of the site is flat pages people use for navigating vast amounts of content.
Secondly, URLs really, really matter. Twitter now have a big engineering task on their hands cleaning up the mess made when they switched to broken-URL #!s. The ability to link to things is the most important factor in the Web's success. An application built as a fat JavaScript client that talks to an API is opting out of the URL-driven web.
Even if something uses #! or pushState to emulate linkable URLs, if I can't hit a URL with an HTTP client library (not a full-blown JS+DOM implementation) and get back a content representation, it's not a full-blown member of the web - it's no better than content that's been wrapped up in a binary Flash container.
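To make that concrete, here's a minimal sketch of what a plain HTTP client actually sees when it's handed a hashbang URL (the URL is made up for illustration):

    // The fragment (everything from "#" onwards) is stripped by the client
    // before the request is made, so the server only ever sees "/".
    console.log(new URL('https://example.com/#!/jack/status/20').pathname); // "/"

    // Run in an ES module or modern Node: the response is just the empty
    // application shell, not the content the #! route points at.
    const res = await fetch('https://example.com/#!/jack/status/20');
    const html = await res.text();

The content behind the #! only exists once a full JS+DOM environment has executed the client-side code, which is exactly the Flash-container problem again.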
Don't think I'm not pro-JavaScript - I'm a big fan of using JavaScript to improve user experience etc (heck, I have code in the first ever release of jQuery). I'm just anti JavaScript being required to consume and manipulate the Web.
I'll concede that there are plenty of applications for which a fat client does make sense (image manipulators, data vis tools, possibly interfaces like Gmail, although I'm continually infuriated by my inability to open Gmail messages in new tabs). But the thinking embraced by the original article - that Rails-style frameworks are on the way out because EVERY site should be built as a fat client - is dangerously short-sighted in my opinion.
Hashbangs are just a hack while we wait for all the world's browsers to support pushState. The hack is easy to use, well supported, and works really well. What Twitter was saying is that in a public, content-based app like theirs, the trade-off of using that hack is not worth it, so they are moving to pushState and degrading to a worse experience for browsers that don't support it. Twitter isn't an argument against JS apps; it's an argument against JS hacks that provide fancy functionality to old browsers.
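As a rough sketch of what that looks like (loadContentInto and the X-PJAX header are assumptions about the app, not real library APIs): feature-detect pushState, use it where it exists, and let older browsers fall back to ordinary full page loads:

    // Hypothetical helper: fetch a partial HTML fragment and swap it in.
    function loadContentInto(container, url) {
      fetch(url, { headers: { 'X-PJAX': 'true' } })
        .then(function (r) { return r.text(); })
        .then(function (html) { container.innerHTML = html; });
    }

    document.addEventListener('click', function (e) {
      var link = e.target.closest('a[data-pjax]');
      if (!link) return;
      // No pushState (IE9 and older): let the click through and do a
      // normal full page load - the degraded but working experience.
      if (!(window.history && history.pushState)) return;
      e.preventDefault();
      history.pushState(null, '', link.href); // a real URL, no #!
      loadContentInto(document.getElementById('main'), link.href);
    });

    // Back/forward buttons in pushState-capable browsers.
    window.addEventListener('popstate', function () {
      loadContentInto(document.getElementById('main'), location.href);
    });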
But then don't we need to use the same set of templates on the server side (for full page loads) and on the client side (when updating via JSON)? Or do we do like Quora and generate the HTML on the server side?
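One common answer is a logic-less template language that runs in both places. A minimal sketch with mustache.js (the template string, data shape, and /timeline.json endpoint are all made up for illustration):

    var Mustache = require('mustache');

    // One template, shared verbatim between server and client.
    var tweetTemplate = '<li class="tweet"><b>{{user}}</b> {{text}}</li>';

    function renderTweet(tweet) {
      return Mustache.render(tweetTemplate, tweet);
    }

    // Server side: inline renderTweet(...) output into the full-page HTML.
    // Client side: ship the same template string to the browser and feed it
    // the JSON response instead:
    //   fetch('/timeline.json')
    //     .then(function (r) { return r.json(); })
    //     .then(function (tweets) {
    //       list.innerHTML = tweets.map(renderTweet).join('');
    //     });

The Quora-style alternative - rendering the HTML fragments on the server and sending them over the wire - avoids the duplication entirely, at the cost of shipping markup instead of raw data.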
I'm not saying I agree with Twitter, I was just trying to explain their argument :) I think it is better to go fully in one direction or the other: either don't support IE9 and older, or do full page reloads until they feel comfortable dropping IE9 and older (or stick with the hashbangs).
More generally, I use Backbone to make data-driven components. Those components are always rendered client side, and the layout/static content is rendered server side. I think duplicating templates would be the path to madness. Generally, it's fine to bootstrap the initial data on page load and render everything. But in cases where that takes too long, I have rendered a "dead" version on the server (greyed out with a spinner, say) and then replaced it on the client.
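A minimal sketch of that bootstrap-then-replace pattern (window.BOOTSTRAP, #timeline, and renderTweet are made-up names; assumes Backbone and jQuery are loaded):

    // Server side, in the page template: embed the initial data so the
    // client can render without an extra round trip.
    //   <script>window.BOOTSTRAP = { tweets: [ /* ... */ ] };</script>

    // Hypothetical per-item renderer; model.escape() HTML-escapes the value.
    function renderTweet(model) {
      return '<li>' + model.escape('text') + '</li>';
    }

    var Timeline = Backbone.View.extend({
      initialize: function () {
        this.listenTo(this.collection, 'reset add remove', this.render);
      },
      render: function () {
        // Swap the server-rendered "dead" placeholder (greyed out, with a
        // spinner) for the live, data-driven component.
        this.$el.removeClass('loading')
                .html(this.collection.map(renderTweet).join(''));
        return this;
      }
    });

    new Timeline({
      el: '#timeline',
      collection: new Backbone.Collection(window.BOOTSTRAP.tweets)
    }).render();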