Web App Speed (ignorethecode.net)
124 points by tbassetto on May 31, 2015 | 37 comments



Flash sites, apps and games also earned a bad reputation for eating up CPU, taking too long to load, doing too much and running poorly. And it was mostly due to the culture of the developers.

In the Flash vs HTML5 iPhone wars, people asked: if Flash was hated because of the developers who abused it, what's going to stop the same thing from happening to HTML5/JS/canvas?

It's happening. Usually not on some interactive visualization or game, but on simple articles. And the response is things like Facebook Instant, which introduce a babysitter against ambitious developers and dysfunctional organizations.

But it's really something that should be built into the web. Sites that eat up a lot of bandwidth and CPU can be marked. Sites that legitimately require more resources, or are experimental in some way, could be marked by the developer as such, warning users. An article would rarely be one of those.

If no centralized measures are taken against poor UX, and we rely only on culture to maintain it across the board, the web stack will meet the same fate as Flash. Competitors like Facebook Instant will route around it.


This was posted the other day:

https://news.ycombinator.com/item?id=9599905

It's an animated company logo in HTML5. The standout "feature" is that the page pegs a CPU core at 100%. Turns out that this is because the developer wanted to update the page favicon at 60fps.

In the post, the developer explained this decision:

"I got this idea in my head that it'd be really slick to animate the FavIcon to stay in sync with the logo. I love little moments like that. Others might feel that's frivolous or unneccessary but I disagree - details like that are the little moments that make the difference."

Nothing (short of taking away APIs) will stop developers from coming up with these "little moments" that will burn CPU time and battery to oblivion.
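
For context, a naive version of that technique (a rough sketch, not the actual code from the linked page) looks something like this: every frame re-encodes a tiny canvas as a data: URL and swaps the favicon, so the page never idles.

    // Hypothetical sketch of a 60fps favicon animation; names are illustrative.
    // Assumes the page has a <link rel="icon"> element.
    var canvas = document.createElement('canvas');
    canvas.width = canvas.height = 32;
    var ctx = canvas.getContext('2d');
    var link = document.querySelector('link[rel="icon"]');

    function drawFrame(t) {
      ctx.clearRect(0, 0, 32, 32);
      ctx.save();
      ctx.translate(16, 16);
      ctx.rotate(t / 1000);                        // spin the "logo"
      ctx.fillRect(-8, -8, 16, 16);
      ctx.restore();
      link.href = canvas.toDataURL('image/png');   // PNG re-encode, every frame
      requestAnimationFrame(drawFrame);            // never stops, never idles
    }
    requestAnimationFrame(drawFrame);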


Glad I'm not the only one who found that feature gratuitous!


> Flash sites, apps and games also earned a bad reputation for eating up CPU, taking too long to load, doing too much and running poorly. And it was mostly due to the culture of the developers.

That's one reason. The other reason is that all Flash runtimes other than the heavily-optimized primary implementation for Windows universally suck. That ends up being a rather large pain point when developers are more likely than average users to be running Mac or Linux.


To be fair, on Linux, X Window GUIs suck in general. On the Mac, everything besides QuickTime sucks.

Also NPAPI sucks compared with ActiveX.

Flash's original popularity came from its vector animation streaming capability: think SVG with a timeline and streaming. Add a decent ECMA scripting environment, and later hardware-accelerated 3D APIs and C++ compilation support. It's a good platform past its prime, caught in a marketing controversy in an unfortunate mobile age.


Flash was always very buggy. Perhaps you didn't see the bugs if you kept to a strict web-developer subset of Flash. But the problem was that it had a different set of bugs on every platform. I guess that codebase was very old and could have been shared better between platforms. So at least for non-Windows users, that meant the plugin crashed daily on whatever input you happened upon on the web.

It also had (and still has) an enormous set of language features, a billion codecs, a handful of protocols and several different scripting languages, a vanishingly small subset of which was still maintained. The result was probably the most insecure software ever to grace the face of the Earth. Running by default. Taking input on every web page you ever visited.

So it was never just a "marketing controversy". Sure, Apple and Adobe could never agree on the terms for flash on non-Mac devices. But that alone would never have been enough for the demise of Flash. Google, and to a lesser extent Microsoft, did large parts of the work. They were rightfully concerned not just with Flash as a vector for malware, but also the power it brought to Adobe who could single handedly decide what Flash would run and on which platforms.


> "caught in a marketing controversy in an unfortunate mobile age."

I understand this argument, but I don't believe that this was primarily a marketing war. Browser plugins are dirty, they always have been. It is not surprising that the latest web standards are attempting to plug this gaping security hole.

Furthermore, all of the features you have listed were only implemented because Adobe had a huge financial monopoly on the technology.


No, abused and over-powered browser plugins are dirty.

QuickTime is a plugin; do you feel its presence? Yet WWDC video pages were not HTML5 compatible and used the QuickTime plugin.

The same applies to Canvas or SVG: yes, they are standards, but their performance will suck worse than Flash if advertisers start abusing animations with canvas. Best part: you cannot disable canvas like you can with Flash.


On a related note, Google penalises websites with longer load times. http://googlewebmastercentral.blogspot.nl/2010/04/using-site...


I understand the reasoning but I don't agree with their implementation, because it essentially favours large sites run by those who can afford the faster infrastructure, while penalising the small ones that may actually have more relevant and detailed content but not very fast servers. Sites which load much faster get a ranking advantage even if they offer only superficial lower-quality content.

Maybe if they didn't look at server response time, and only counted things like JS execution time, it would be a fairer ranking for those looking for substantive content. I'm quite willing to wait a lot longer for a lengthy page I'll actually spend time reading than to get a fast-loading page with little content.

That article was written 5 years ago, and while speed might not have been a huge factor back then, they could've changed it since. Certainly there is no evidence to suggest otherwise, and my experience with how Google's results have changed over time agrees.


Interesting take.

I'd wager that 95-99% of Alexa top 1000 sites could be delivered in under 30kb, just content + CSS (assuming AJAX/video/fonts are asynchronous in Google's algorithm and not counted towards load time).

In which case it's easy even for the cheapest AWS server to host and serve instantaneously. I think the bigger sites are at a distinct disadvantage: they have so many tracking and ad resources that it kills their load time.


Yup. For my company website I made sure that most pages require just a single request to render: CSS is inlined and (small) scripts are loaded at the bottom of the page. This means that even when hosting it on a crappy host, without a CDN, you get stellar page load performance.
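
Roughly what that looks like, as a simplified sketch of the idea (not my actual markup):

    <!doctype html>
    <html>
    <head>
      <meta charset="utf-8">
      <title>Example page</title>
      <!-- Critical CSS inlined: no extra request before first paint -->
      <style>
        body { font: 16px/1.5 sans-serif; max-width: 40em; margin: 0 auto; }
      </style>
    </head>
    <body>
      <h1>Hello</h1>
      <p>The content renders after a single HTML request.</p>
      <!-- Small script at the bottom so it doesn't block rendering -->
      <script>
        document.body.classList.add('js-loaded');
      </script>
    </body>
    </html>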


Infrastructure is incredibly cheap these days. Hell, there's even a free, high-quality CDN small organizations can take advantage of (Cloudflare).

Software architecture has a lot more to do with app speed these days than expensive infrastructure does.


> Hell, there's even a free, high-quality CDN small organizations can take advantage of (Cloudflare).

Cost is only part of the equation. What about those who don't want to use a CDN for, among other things, privacy reasons?


Ironic, since blogspot is a prime offender for showing a blank page while the js gears turn and grind until the article text finally appears.


It also breaks scrolling and zooming. I was hoping someone would mention that.

"Do as I say, not as I do".


That isn't going to catch sites that load reasonably but behave slowly afterwards (eating CPU time so potentially slowing other apps down too).

There are a couple of sites I visit that do this. If I leave a few tabs of imgur open in Chrome they'll consume a pile of CPU time (this doesn't always happen, so perhaps it depends which ads are on the page?). IFLS pages eat CPU and memory too (again, almost certainly due to the ad content). In both cases it can be noticeable on a powerful desktop machine, so it must be a significant drain on low powered laptops and such.


The root of all evil, as you commented: "People don’t want to learn JavaScript".

However, I think it's because the DOM/BOM APIs suck. JavaScript, the language itself, doesn't even specify any event-driven or asynchronous behaviour. Not to mention that a NodeList horribly isn't an array, or the need to write the long and annoying 'getElementsByClassName' rather than a simple $( ).

(Yes, I know things like 'querySelectorAll' are much better, but it's too late for most of the prejudice.)
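
To illustrate the kind of friction I mean, a small sketch:

    // getElementsByClassName returns a live HTMLCollection, querySelectorAll a
    // NodeList; in 2015-era browsers neither has the Array methods you expect.
    var items = document.getElementsByClassName('item');
    // items.forEach(...) -> TypeError, so you convert first:
    Array.prototype.slice.call(items).forEach(function (el) {
      el.classList.add('highlight');
    });

    // querySelectorAll is closer to $( ), but still not an Array:
    var links = document.querySelectorAll('a.external');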


Someone at Google or Mozilla should already have been writing an API similar to jQuery's that just "compiles down" to the DOM methods, so that loading jQuery isn't necessary. I'm surprised no one's pushed for using jQuery's API as an additional DOM API. In old browsers you load jQuery, in new browsers you don't, and your code doesn't change. It would cut down on a few seconds of load time for sure, and it'd be better than using a CDN.
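
Something like this hypothetical shim (names made up, purely illustrative) already covers a lot of the simple cases by compiling down to querySelectorAll and friends:

    // Hypothetical jQuery-ish wrapper over native DOM methods.
    function $(selector) {
      var nodes = Array.prototype.slice.call(document.querySelectorAll(selector));
      return {
        each: function (fn) { nodes.forEach(fn); return this; },
        addClass: function (cls) {
          nodes.forEach(function (el) { el.classList.add(cls); });
          return this;
        },
        on: function (event, handler) {
          nodes.forEach(function (el) { el.addEventListener(event, handler); });
          return this;
        }
      };
    }

    // Usage: $('.menu a').addClass('active').on('click', handleClick);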


People are using canvas now to achieve fast speeds on the mobile web. See Flipboard's post on this [1]. The big issue people brought up was that using canvas instead of the DOM meant losing accessibility. By the way, Flipboard released an interesting framework called React-Canvas [2] based on their efforts.

1. http://engineering.flipboard.com/2015/02/mobile-web/
2. https://github.com/Flipboard/react-canvas
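
The core of the canvas-instead-of-DOM approach is roughly this (a bare-bones sketch, nothing Flipboard-specific):

    // Render a scrolling list straight to a canvas instead of building DOM nodes.
    // Note: none of this text exists in the DOM, which is the accessibility problem.
    var canvas = document.getElementById('list');
    var ctx = canvas.getContext('2d');
    var items = ['First story', 'Second story', 'Third story'];
    var scrollY = 0;
    var lastTouchY = null;

    function render() {
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      ctx.font = '16px sans-serif';
      items.forEach(function (text, i) {
        ctx.fillText(text, 10, 30 + i * 40 - scrollY);
      });
      requestAnimationFrame(render);
    }

    canvas.addEventListener('touchmove', function (e) {
      var y = e.touches[0].clientY;
      if (lastTouchY !== null) scrollY += lastTouchY - y;
      lastTouchY = y;
      e.preventDefault();
    });
    canvas.addEventListener('touchend', function () { lastTouchY = null; });

    requestAnimationFrame(render);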


Flipboard assumed the DOM is slow. I don't even think they created a prototype to see how slow the DOM actually is.

I implemented this in a few hours. It works on iOS, Android 4+ and WP.

- Demo for mobile: http://premii.com/play/flipboard-style-news/

- I am using it in a real app here: http://reddit.premii.com/#/r/news/


Your demo works, but I get an average of 10fps on my LG G3 and 20fps on my iPad, with the occasional freeze. The Flipboard demos are incredibly smooth with no lag or hanging at all.


The iPad was released 5 years ago. I have an iPad 2 and an iPad Air 2, and both work fine.

I don't have an LG. I have tested on Asus, Samsung S3, and Nexus 5. Try it in the Chrome browser and see how it works.

Some images (Washington Post, LA Times, etc.) are bigger than 20 MB each. Every browser freezes when you render those images. My demo specifically includes those to see how badly it performs on different devices.

One of the benefits of being a big company is that you can control everything. Flipboard scales those images down to 300kb before sending them to the browser. I use the original images.


An iPad Mini 3, if you really want me to be specific. I was trying it in the latest version of Chrome on my LG; that's where I get 10fps. Using the stock browser, it freezes every other slide (and won't progress past slide 4).

> Some images (Washington post, latimes, etc) are bigger than 20 MB each

Why not use resized ones for your test? Sort of invalidates it if you can't tell if it's being janky because of the DOM or because of image size.


The DOM is slow. And your example on iPhone 6… well, it "works", but it certainly does not work well.


What doesn't work well?

Does it freeze on the 5th and 18th items? That is on purpose.


1. Regarding native complexity: surely, using CoreGraphics is just as easy as canvas?

2. That game looks like it's running at about 15fps. I'm not sure that constitutes very good proof of the argument. The best apps and games run at 60.

3. I'm not sure who's using web MVC frameworks en masse and to what end, but I would NOT want to write a data-driven web app like Discourse from scratch.


Agreed, similar arguments came to mind when I read the article. Two other relevant points:

4. Even if a game like the one in the article performs reasonably well using HTML5 Canvas + JavaScript, that still doesn't mean it's 'efficient'. A native application could be much smoother still, use less battery, etc.

5. The article is titled 'Web App speed', but it's only (somewhat) relevant in the context of sprite-based games. With WebGL it's possible to do quite complex 3D-based graphics in the browser as well, but anything non-game related is a whole different matter. Surely 'boring' apps with lots of buttons, scroll bars, other widgets etc. will not be easier and more efficient running inside a web browser, compared to using native controls.


Absolutely agree with respect to ads. While the steady-state performance of most banner ads is fine, loading new ads often forces large JavaScript files to be fetched and parsed, as well as needless reflows. On our game-building site [1], we try to refresh ads only during natural breaks in gameplay (level changes). While this doesn't improve the loading time, it does help shift the disruption to a time when it is less likely to be noticed.

1. https://www.1dash1.com
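
The gist of it, in sketch form (function names are made up, not our actual code):

    // Queue ad refreshes and only flush them between levels.
    var adRefreshPending = false;

    function requestAdRefresh() {
      adRefreshPending = true;     // never touch the ad slot mid-gameplay
    }

    function onLevelComplete() {
      if (adRefreshPending) {
        adRefreshPending = false;
        refreshAdSlots();          // hypothetical call into the ad library
      }
    }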


Well, it's not surprising web apps are slow. As an example, 1dash1.com mentioned above: 57,480 lines of JS across 8 separate domains. And the result? The site loads slowly (at least on my 500 Mbit/s fiber connection) and the browser is executing a lot of stuff, most of which is not really efficient or time-sensitive (or required). This is how most sites do it; there have to be better ways.


SunSpider does next to no processing; it's really not a good benchmark for judging the computation speed of a device. Arguably, it's not a good benchmark, period. Kraken or Octane are more interesting benchmarks for compute-heavy workloads such as games.


Reading this article was a breath of fresh air.

Canvas is very easy to work with. You can forget about jQuery and everything else concerning the DOM and just write pure JavaScript. I've even started working on a canvas-based text editor.
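
The basics really are pleasant. A toy sketch of drawing editor text and a caret (nothing like a real editor, just to show the flavour):

    var canvas = document.getElementById('editor');
    var ctx = canvas.getContext('2d');
    var lines = ['function hello() {', '  return "world";', '}'];
    var caret = { line: 1, col: 9 };
    var lineHeight = 20;

    function draw() {
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      ctx.font = '14px monospace';
      lines.forEach(function (line, i) {
        ctx.fillText(line, 5, (i + 1) * lineHeight);
      });
      // measureText gives the caret's x position for the current column
      var x = 5 + ctx.measureText(lines[caret.line].slice(0, caret.col)).width;
      ctx.fillRect(x, caret.line * lineHeight + 4, 1, lineHeight - 4);
    }
    draw();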


> I've even started working on a canvas-based text editor.

Careful! Mozilla tried that as far back as early 2009 (the project was called Bespin, you can google for "mozilla bespin canvas" to find out more).

The canvas-based approach was eventually abandoned and the project merged into Ace, a more traditional DOM-based editor.


I did some research before I started and read about Bespin. I don't think they merged into Ace because of canvas. It was probably because another team at the same company also worked on an editor, which had the same goals and aimed for the same market.


At some point, the text editor in chrome's devtools was based on canvas. I don't know if that's still true, though.


There are issues however with the way that some browsers render canvas, particularly stock browsers. I've had wildly different fps on different browsers.


But how would one implement e.g. progressive rendering in the background?



