There isn't a per-site cache in Railgun; cached page versions live in a single large shared in-memory cache across our infrastructure.
Currently, cookies are not part of the hash.
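To make that concrete, here's a hypothetical sketch (in Go; not Railgun's actual code) of what a cookie-agnostic key into that shared cache could look like. Everything here is illustrative: the key is derived from the host and request URI only, so two visitors with different cookies resolve to the same cached page version.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"net/http"
)

// cacheKey derives a cache key from the request line only. The Cookie
// header is deliberately ignored, so every visitor to a given URL maps
// to the same cached page version regardless of their session.
// (Illustrative only; a real key might also include method, scheme, etc.)
func cacheKey(r *http.Request) string {
	h := sha256.New()
	h.Write([]byte(r.URL.Host))
	h.Write([]byte(r.URL.RequestURI()))
	return hex.EncodeToString(h.Sum(nil))
}

func main() {
	a, _ := http.NewRequest("GET", "http://example.com/page", nil)
	b, _ := http.NewRequest("GET", "http://example.com/page", nil)
	a.Header.Set("Cookie", "session=alice")
	b.Header.Set("Cookie", "session=bob")
	fmt.Println(cacheKey(a) == cacheKey(b)) // true: cookies don't affect the key
}
```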
We have customers of all types using Railgun. As an example, there's a British luggage manufacturer who launched a US e-commerce site last month. They are using it to alleviate cross-Atlantic latency, and at the same time they see high compression ratios because the site's boilerplate doesn't change from one visitor to the next.
What sort of sites do you think it doesn't apply to?
> What sort of sites do you think it doesn't apply to?
Single-page webapps. In those cases the HTML/JS is normally static and already CDN'ed, and the data comes from a JSON API that varies per user.
There would be some gain, as the dictionary would learn the JSON keys, but I doubt it would be very dramatic versus deflate compared to the content sites referenced in the article.
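To give a feel for why shared boilerplate matters so much, here's a rough sketch of the delta idea using Go's standard compress/flate preset-dictionary support. To be clear, this is not Railgun's actual algorithm, just an approximation of the principle: if the compressor is primed with the previous version of a page, only the bytes that changed need new encoding.

```go
package main

import (
	"bytes"
	"compress/flate"
	"fmt"
	"math/rand"
)

// deflateLen compresses data with an optional preset dictionary and
// returns the compressed size in bytes.
func deflateLen(data, dict []byte) int {
	var buf bytes.Buffer
	w, _ := flate.NewWriterDict(&buf, flate.BestCompression, dict)
	w.Write(data)
	w.Close()
	return buf.Len()
}

func main() {
	// Stand-in for page boilerplate that doesn't self-compress well
	// (deterministic pseudo-random bytes).
	boilerplate := make([]byte, 16<<10)
	rand.New(rand.NewSource(1)).Read(boilerplate)

	prev := append(append([]byte{}, boilerplate...), "Hello, Alice"...)
	next := append(append([]byte{}, boilerplate...), "Hello, Bob"...)

	fmt.Println("plain deflate:       ", deflateLen(next, nil), "bytes")
	// Priming the compressor with the previous page version approximates
	// the delta idea: the shared boilerplate collapses to back-references.
	fmt.Println("deflate w/ prev dict:", deflateLen(next, prev), "bytes")
}
```

The stand-in boilerplate is deliberately incompressible: with highly self-similar HTML, plain deflate already does well on its own, and the per-user JSON case above is the opposite extreme, where little beyond the keys is shared between responses.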
Yes, that's up to the particular configuration of the site. It varies from site to site, but for optimal results you want the cache big enough to hold the common pages of your site.