
You can't make a JavaScript engine run anywhere near fast enough in PNaCl, so you need at least all of V8 in your "kernel". You could run WebKit on top of PNaCl with some magic V8 bindings, but I don't think this has been done before, and considering how big WebKit and NaCl are, it would probably be a lot of work. Useful work to do, but a lot of work...

(And of course, Chromium already supports a process sandbox, so while this could theoretically be a really cool win for architecture, having less platform specific code to maintain, it wouldn't necessarily be much of a security win - the analogy of the kernel only applies so far.)



Chances are, if you want your browser to run fast, you won't have one built in JavaScript :)

JavaScript is nowhere near the performance of C++. asm.js is getting there, but this obviously isn't built with that.


I'm aware of all this. I'm not saying it would be easy - and the belief that high-speed Javascript is necessary is one reason we've been held back for so long. Nobody should be writing a browser in Javascript (when I mentioned this possibility I was simply humoring the poster's ideas). An "Exobrowser" (or perhaps "Microbrowser") is what we should have by now, but the linked project is not it. NaCl and asm.js are only necessary because operating systems have failed to successfully implement process separation (more crappy, monolithic design at work), so now it's being reinvented in userland with the resulting performance overheads. Javascript doesn't need to be fast! The whole idea of high-performance programming done in Javascript is stupid, especially if it means saying "no" to a better architecture designed around security and speed for better systems languages. Even Mozilla's penny has dropped on low-level code execution - hence asm.js. If operating systems had been done right in the first place then you could have the following arrangement:

- Each browser tab is a separate OS process w/o any access to system calls except for calling browser services

- Browser provides services (e.g. graphics primitives, input)

- Processes from the same domain can talk to each other

- Browser comes preloaded with some preferred, but optional portability layers for the processes

- Everything else is libraries, with one domain being able to provide services to another. So mozilla.org could provide its rendering engine either as a library or as a background process (to reduce memory overheads).

And so on. This way there's no more waiting 10 years for Mozilla to implement whatever it thinks you need, with their "640K ought to be enough" attitude. Their rendering engine has to compete with others. The most popular engines are the most likely to already be in cache when someone visits your site, so there is room for lots of vigorous competition. This is all so painfully BASIC, but it will likely be decades before people get it right, if they ever get it right.
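To make the arrangement above a bit more concrete, here's a rough sketch of what a tab process might do under it. Every name here (browser.services, browser.domain, etc.) is invented purely for illustration - no such API exists:

    // Invented API, only to illustrate the arrangement described above.
    // A tab is an OS process with no real syscalls; its only "system calls"
    // are requests to services the browser provides.
    const gfx   = browser.services.get('graphics');   // graphics primitives
    const input = browser.services.get('input');      // input events

    // Processes from the same domain can talk to each other directly.
    const sibling = browser.domain.connect('example.com/worker');
    sibling.send({ hello: 'from another example.com process' });

    // One domain can provide services to another, e.g. a rendering engine
    // offered by mozilla.org as a library or as a shared background process.
    browser.domain.registerService('renderer', (request, reply) => {
      reply({ drawList: layout(request.markup) });     // layout(): hypothetical helper
    });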

Chrome is going in the right direction; the poster's linked project is not.


On NaCl I partially agree and partially disagree. While it's true that the sandboxing is only necessary due to insufficiently flexible kernels, imo the most interesting part of NaCl and asm.js is their use of a portable bytecode that compiles into native code. Portability really is necessary - CPU architectures don't change that often, but if people had started distributing websites as native code a decade ago, none of them would have envisioned that a large portion of web browsing is now done on ARM-based devices. Yet you cannot make a fast portable JIT. You say that JavaScript doesn't need to be fast, but it's really nice to have a high-level, dynamic language that still runs fast - in fact, it was compelling enough to be one factor in the success of JavaScript on the server, despite the language's weaknesses.

However, there's no reason a JavaScript (or some suitably compatible dynamic-language bytecode) JIT couldn't be provided as a fundamental API in addition to the static compiler. Yeah, it doesn't feel like a clean architecture when you want to use Python or Lisp and the language almost translates neatly to that bytecode, but small runtime differences end up adding a lot of overhead... still, it's better than nothing.
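As a sketch of that suggestion (the browser.jit name and its options are made up here; nothing like this exists today), the JIT would simply be one more service a page can ask for:

    // Invented API: the browser exposes its dynamic-language JIT as a service,
    // alongside the static (PNaCl/asm.js-style) compiler, instead of reserving
    // it for the built-in JavaScript engine.
    const program = await browser.jit.compile({
      bytecode: fetchedBytecode,       // hypothetical: emitted by a Python- or Lisp-to-bytecode frontend
      hints: { taggedIntegers: true }  // where runtime semantics differ, hint the JIT rather than eat the overhead
    });
    program.run();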

I think that your hypothetical arrangement would be very cool. I'm not sure that it would actually be better than what we have - for example, writing a screen reader would likely be a nightmare if some random webpage might be using a browser library that didn't support it; good luck implementing anything like user scripts/custom CSS, scrapers, Readability, magic text reflow for iPhones, smooth zooming, etc. Good luck doing something like the transition to hardware-accelerated rendering browsers did a few years ago (sure, you could only support it for new sites, but as it is I get smooth scrolling for all sites). And since different engines would now be fundamentally different from one another, rather than the relatively thin layers over HTML that are popular today, developers would have to spend more time learning different APIs. If some engine stopped being maintained, then it would be very difficult to retrofit websites that use it to support the newest features. Et cetera. Meanwhile, these days browsers move pretty damn fast, lessening the advantage of non-standardized development - and many new APIs are hooks to the OS anyway, not things that UI layers could implement on their own.

But it would be cool. I don't mean to be too negative: there would be a lot of advantages, and it would be interesting to try out.

I suppose it might happen. PNaCl and asm.js are soon going to be supported in two of the most popular browsers; alternatively, if JS engines get good enough that specific support for asm.js isn't required to achieve performance for low-level code (https://bugzilla.mozilla.org/show_bug.cgi?id=860923), then with the competitiveness all major browsers already have on JS speed, the latter will be "supported" everywhere on short notice. It might not be that long until the first serious attempt to make an alternative UI stack for browsers...


NaCl is not a portability layer. It is a security layer. PNaCl is a portability layer, built inside NaCl more-or-less in the manner I just described, but with all the overheads and limitations of NaCl (which are real). So NaCl is in total agreement with me. asm.js is basically a joke, rolling portability and security into one layer, but when you're dealing with the web you take what you can get sometimes.

>Yet you cannot make a fast portable JIT.

So? What difference does sticking this in the browser as a privileged component make? There's no reason google.com can't provide a DOZEN compilations of V8 in the setup I described. The difference is I can write my own portability layer. Maybe some authority can control which portability layers are valid to prevent too much native code. Mozilla is the perfect candidate with their police-the-web attitude.

>However, there's no reason JavaScript (or some suitably compatible dynamic-language bytecode) JIT couldn't be provided as a fundamental API in addition to the static compiler.

Did you even read my list of points? You don't need this! You just give the user access to properly sandboxed native code (not NaCl, which has limitations and overhead) and provide portability layers, plus the ability to add new portability layers. There is NO reason Javascript needs to be privileged in the manner you're suggesting.

>for example, writing a screen reader would likely be a nightmare if some random webpage might be using a browser library that didn't support it

How is it any different if people start building all their stuff with WebGL? What about when people use tonnes of images without alt tags? Accessibility never works automatically! And it can be provided properly as a browser service, which different renderers hook into. Hell, it could probably even be in userland. Mozilla could even provide disincentives to non-compliant renderers. They love playing the policeman, so why not do it properly instead of doing it by holding technology back as much as possible?

>good luck implementing anything like user scripts/custom CSS, scrapers, Readability, magic text reflow for iPhones, smooth zooming, etc.

Firstly, HTML would still most likely be the standard for most web pages. So there's no need for "luck"; it would be done the same way it always has been. You're trying to set up an opposition between my ideas and HTML. My ideas are opposed to HTML, the DOM, and Javascript as privileged entities. They would have to compete with other markups, document models, and languages - just as C++ and C# dominate on the desktop but have to compete with more specialized languages, to everyone's benefit. And aside from the most basic, unstyled HTML, it has always taken some forethought on the part of the webpage author to get things like accessibility and compatibility with different window sizes to work. I can tell you this because I have terrible eyesight and view many pages zoomed a long way in.

>Good luck doing something like the transition to hardware-accelerated rendering browsers did a few years ago (sure, you could only support it for new sites, but as it is I get smooth scrolling for all sites).

Why on Earth would this be a problem? Even though the renderer is in user mode, it's not baked in statically, or even necessarily linked in at all. It could be spoken to via message passing. First ask the system to give you a shared rectangle inside your tab, then send the handle to mozilla.org along with some web content, saying "please draw this". Similarly for input events etc. And of course, you can have preloads that do all this for you, so on the server side you just send down the HTML in the usual way. These kinds of arguments are always such rubbish, just like when Mozilla says binary code can't evolve as easily as source code. What does that even mean? Source code IS binary code!
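A rough sketch of that flow - the names (browser.graphics, browser.ipc, the "please-draw" message) are invented for illustration only:

    // 1. Ask the system for a shared rectangle inside this tab.
    const rect = browser.graphics.createSharedRect({ width: 640, height: 480 });

    // 2. Send the handle to mozilla.org's renderer (library or background
    //    process) along with some web content: "please draw this".
    const gecko = browser.ipc.connect('mozilla.org/renderer');
    gecko.send({ cmd: 'please-draw', surface: rect.handle, html: '<p>Hello</p>' });

    // 3. Input events go over the same channel.
    browser.input.on('click', (ev) => gecko.send({ cmd: 'input', event: ev }));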

>If some engine stopped being maintained, then it would be very difficult to retrofit websites that use it to support the newest features.

Which is why most people would use HTML, and people who are trying to do things that HTML is totally unsuitable for would not, paying the resulting costs.

>Meanwhile, these days browsers move pretty damn fast, lessening the advantage of non-standardized development - and many new APIs are hooks to the OS anyway, not things that UI layers could implement on their own.

The browser is a technological slug. V8, Flash, NaCl and Unity are the only reasons we have had any real advancement, and it's an advancement back to where we were decades ago. Web developers just have extremely low expectations and are always trying to resist the approach of superior technologies. I can remember telling people years ago that sockets were needed (there's this wonderful thing called interrupt-driven programming, you see) and getting much the same sort of criticisms you outlined above from all the "web experts". Of course, it has since been implemented.

>many new APIs are hooks to the OS anyway, not things that UI layers could implement on their own.

I already said this! Perhaps you missed the point of the post, which is that such primitives exist precisely so that applications (e.g. UI layers) can be implemented on top of them. It is a post against the monoliths.

>JS speed

I'm sorry: "JS speed" doesn't exist on current hardware. The reason asm.js was so fast with minimal additions to the optimizer is that the JS optimizers all work best on statically typed code (in other words, not Javascript), which is of zero surprise to anyone who knows anything about compilers or optimization. Essentially, the people working on "Javascript" engines have really been writing optimizers for a small subset of the language that discards everything dynamic. Whether this was intentional or not is irrelevant; that is what they have done. That's how bad Javascript is for this task, and how GOOD the old, statically typed ideas are: so good they couldn't help but do it, even when they were trying to optimize their "dynamic" language.
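For a concrete illustration (mine, not from the thread) of what that statically typed subset looks like, here is a fragment in asm.js style - the |0 and unary + coercions act as type annotations, pinning every value to an int or a double before the optimizer ever sees anything dynamic:

    function AsmModule() {
      "use asm";
      function add(a, b) {
        a = a | 0;            // a: 32-bit integer
        b = b | 0;            // b: 32-bit integer
        return (a + b) | 0;   // result is an integer, never a string or an object
      }
      function scale(x) {
        x = +x;               // x: double
        return +(x * 2.5);    // result is a double
      }
      return { add: add, scale: scale };
    }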



