The power of HTML5 and javascript for creating applications is vastly over-hyped. I get that people are excited by the possibilities, but it still sort of sucks in terms of allowing you to actually take advantage of even a fraction of the power of the machine. (But we can have gradients and bouncy animations now! Weeeeeeeeeeeeee!)
Here's what you can't do in a web app yet: anything that requires serious CPU horsepower or memory efficiency. That's a lot of things! And the people that think that clever interpreters for javascript are the solution to this are nuts -- everything comes at a cost. Even the most clever JIT is probably going to double your memory usage, which sucks for cache coherency, and, well, for memory usage in general.
I see all these news items here that can be paraphrased as "check out this thing we did that was cutting edge on desktops in 1996! Now it's on the web!"
And on one hand: yeah, cool hack; but I mean, you're running the equivalent of what people would have thought of as a super computer 15 years ago, but web standards are essentially confining you to making toy apps.
Performance is one thing, but to me it isn't the big one. I can even get over the crapfest that is HTML/CSS/Javascript if I really have to. The real problem with the web is that it doesn't work for anything serious. It's not really scriptable from the user-end and there are huge problems with data interchange.
On the web I can't grab all of my Facebook photos, zip them up and send them somewhere else unless I write a monstrous pile of hacky code and break Facebook's TOS, but on a real computer it's absolutely trivial.
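Roughly what I mean by trivial, assuming the photos are already sitting in a local folder (the path here is made up) and the system zip tool is installed, shelling out from Node:

    // Sketch only: photoDir is a hypothetical path, and this leans on the system
    // `zip` binary being installed. The point is that once the files are on your
    // disk, "grab, bundle, send somewhere else" is a couple of lines.
    import { execSync } from "child_process";

    const photoDir = "/home/me/Pictures/facebook"; // made-up local copy of the photos
    execSync(`zip -r photos.zip "${photoDir}"`);   // bundle everything up
    // from here, scp/rsync/email the archive wherever you want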
If I want to slap together a disposable playlist of videos on Youtube I have no idea how. Maybe I have to log in? What if I want videos from a bunch of other websites?
And even watching individual videos is terrible on the web. I either get a crappy website-dependent Flash player or a crappy browser player, seemingly having lost the freedom to embed the video-playing application I want. The best user experience for watching online videos seems to be:
1. Start buffering the video.
2. Browse to /proc/`pgrep plugin-container`/fd on my hard-drive.
3. Open the video in the player of my choice.
That's insane, but I do it regularly.
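Step 2 can even be scripted. Here's a rough Node sketch of the same trick (Linux only; the "Flash"/"(deleted)" markers are guesses about how the plugin's unlinked temp file shows up, so adjust for your setup):

    // Rough sketch, Linux only: the plugin buffers the video to a temp file and
    // unlinks it, but the open file descriptor keeps the data reachable under
    // /proc/<pid>/fd. The "Flash"/"(deleted)" markers are assumptions about how
    // that temp file is named.
    import { execSync } from "child_process";
    import { readdirSync, readlinkSync } from "fs";

    const pid = execSync("pgrep plugin-container").toString().trim().split("\n")[0];
    const fdDir = `/proc/${pid}/fd`;

    for (const fd of readdirSync(fdDir)) {
      let target = "";
      try {
        target = readlinkSync(`${fdDir}/${fd}`);
      } catch (err) {
        continue; // the fd may have closed while we were looking
      }
      if (target.includes("Flash") && target.includes("(deleted)")) {
        console.log(`${fdDir}/${fd}`); // open this path in the player of your choice
      }
    }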
When the web lets me grab a text stream or a video or some music and lets me do what I want with it then it'll be ready for serious use. Until then it's just the "iPad" of technologies, a playground instead of a set of LEGO.
But that's what this shift to cloud computing is about. Web applications simply shift all the difficult computation to the server and deliver the results straight to the client. As for client-side JS code, your point about the increased resource usage is correct, but moot for the next 5-10 years IMO. Hardware is going to follow Moore's law for at least a few more years, and as long as that happens, efficiency simply isn't an important factor for most applications. History has shown that (at least as long as Moore's law holds) the tradeoff between developer time and running time keeps shifting in favor of developers.
Well, the only place where Moore's law is still working is in doubling the number of transistors, not doubling the clock speed. So you can get multiprocessing, but that's very hard to take advantage of, and as far as I know there's basically no concurrency in javascript regardless.
Anything that takes a lot more time to process than the round-trip time might be worth running on the cloud. CAD operations, for instance, are very optimisable this way: have a local and a remote version of a function such as a 3DBoolean, where easy operations run locally and more complex ones leverage the network and finish quicker than they would natively on the host machine.
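Something like this, say (the cost estimate, the local fallback and the /api/boolean endpoint are all made-up names, just to show the shape of the local-vs-remote split):

    // Hypothetical local/remote split for an expensive geometry operation.
    // `Mesh`, `estimateCost`, `localBoolean`, and the `/api/boolean` endpoint are
    // illustrative names, not a real API; the idea is just: cheap cases run in the
    // browser, heavy ones go to the server when the work dwarfs the round trip.
    interface Mesh { vertices: number[]; faces: number[]; }

    declare function estimateCost(a: Mesh, b: Mesh): number;  // rough op-count estimate
    declare function localBoolean(a: Mesh, b: Mesh): Mesh;    // pure-JS fallback

    const REMOTE_THRESHOLD = 1000000; // tune so remote only wins when work >> latency

    async function booleanUnion(a: Mesh, b: Mesh): Promise<Mesh> {
      if (estimateCost(a, b) < REMOTE_THRESHOLD) {
        return localBoolean(a, b);              // easy case: stay on the client
      }
      const res = await fetch("/api/boolean", { // heavy case: let the server grind on it
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ op: "union", a, b }),
      });
      return res.json();
    }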
No, but it can remove a lot of load, which gives more horses for the realtime stuff. And when you factor in WebGL for shifting load to the graphics hardware, there is a hell of a lot that can be done in the browser.
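Getting per-pixel work onto the GPU from a page really is only a few lines; a minimal sketch (the shader is a trivial stand-in for real work):

    // Minimal sketch: a fragment shader runs once per pixel on the GPU, so any
    // per-pixel work expressed this way never touches the JS main thread.
    const canvas = document.createElement("canvas");
    const gl = canvas.getContext("webgl");
    if (!gl) throw new Error("WebGL not available");

    const fragSrc = `
      precision mediump float;
      void main() {
        // stand-in for real per-pixel work (filters, image processing, etc.)
        gl_FragColor = vec4(gl_FragCoord.x / 640.0, gl_FragCoord.y / 480.0, 0.5, 1.0);
      }`;

    const shader = gl.createShader(gl.FRAGMENT_SHADER)!;
    gl.shaderSource(shader, fragSrc);
    gl.compileShader(shader); // compiled and run by the graphics driver, not the JS VM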