Hacker News

There's some research suggesting that, with so many versions of jQuery in use, the chances of finding the version your site uses already in cache are quite small. (Can't find the article at the moment.)

Components don't seem to stay in cache for very long these days: browser caches max out at around 50MB (much smaller on phones), and with a bit of surfing it's easy to reach the point where components get evicted.

Also, there's no guarantee that retrieving it from Google's CDN is faster than retrieving it from your own servers: a third-party request pays for DNS resolution, TCP connection setup, etc., some of which will already have been done for the main site's origin.




I've just run my own analysis on the HTTP Archive data, and the fragmentation issues are very real. The most popular URL used to load jQuery was:

http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min...

...and it was used by just 2.7% (945) of the 35,204 pages in the dataset. Note that it's not just version fragmentation - you have to take the protocol into account too, as browsers cache HTTP and HTTPS resources separately.
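To illustrate that last point, here's a minimal sketch (assuming the truncated URLs above end in jquery.min.js): the browser's cache key includes the scheme, so the same jQuery build fetched over HTTP and over HTTPS occupies two separate cache entries and can't share a hit.

```python
from urllib.parse import urlsplit

# Same host, same file, different scheme -> two distinct cache keys.
a = urlsplit("http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js")
b = urlsplit("https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js")

print(a.netloc == b.netloc and a.path == b.path)  # True: identical resource
print(a.scheme == b.scheme)                       # False: cached separately
```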

The next most popular was:

http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min...

...used by 1.3% (460) of pages, followed by:

http://ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min...

...used by 0.8% (285) of pages.
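As a back-of-envelope check, the shares above follow directly from the raw page counts in the dataset:

```python
# Recomputing the quoted shares from the HTTP Archive counts above.
total_pages = 35204
top_jquery_urls = {  # jQuery version on Google's CDN -> pages using it
    "1.4.2": 945,
    "1.3.2": 460,
    "1.6.2": 285,
}

for version, pages in top_jquery_urls.items():
    print(f"jQuery {version}: {pages / total_pages:.1%}")
# -> 2.7%, 1.3%, 0.8%

# Even the three most popular versions combined cover under 5% of pages.
print(f"combined: {sum(top_jquery_urls.values()) / total_pages:.1%}")  # 4.8%
```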

At this point there really isn't much of a debate: unless you have evidence to the contrary (e.g. all of your visitors come from Facebook, and Facebook uses the same version of jQuery as you do), using Google's CDN to load jQuery isn't likely to benefit the majority of your first-time visitors.


For those interested, I delved a little further and wrote up my findings here:

http://statichtml.com/2011/google-ajax-libraries-caching.htm...



