Hacker News

That's not necessarily true. My own site is a Jekyll site. To host that on S3, I'd need to generate it first and upload the generated files as opposed to my source files. Now that's not really a big deal, but I do enjoy the convenience of only having to do a `git push` to deploy my site on Pages.

That being said, I see load times similar to what another commenter above reported, usually around 1-2s. I don't think I've ever seen a five-second load time.



This sounds like the sort of thing that could easily be automated using a five-line Bash (Ruby, Python, etc) script.
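For instance, a sketch of such a script (the bucket name is a placeholder, and it assumes Jekyll and the AWS CLI are installed and credentials are configured):

```shell
#!/usr/bin/env bash
# Sketch of a "five-line" deploy script: build the site with Jekyll,
# then sync the generated _site/ directory up to an S3 bucket.
# "example-bucket" is a placeholder; adjust to your own setup.
set -euo pipefail

BUCKET="s3://example-bucket"

deploy() {
  local run="$1"   # pass "echo" for a dry run, "" to actually execute
  $run jekyll build
  $run aws s3 sync _site/ "$BUCKET" --delete
}

deploy echo   # dry run: prints the commands instead of running them
```

The dry-run trick just prints the two commands, which makes it easy to eyeball before pointing it at a real bucket.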


I do something very similar to this, using Wintersmith and shell scripts. It essentially boils down to using two repositories for my site: the first being the raw/ungenerated files including the shell scripts, the second being the generated files that are served by GitHub pages.
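A hypothetical sketch of what that two-repo publish step could look like (repository paths and branch name are placeholders, and it assumes Wintersmith writes its output straight into the second repo's working tree):

```shell
#!/usr/bin/env bash
# Hypothetical publish script for the two-repo setup: build from the
# source repo, then commit and push the generated files from the
# repo that GitHub Pages serves. Paths and branch are placeholders.
set -euo pipefail

SRC="$HOME/sites/source"              # raw/ungenerated files + scripts
OUT="$HOME/sites/username.github.io"  # generated files served by Pages

publish() {
  local run="$1"   # pass "echo" for a dry run, "" to actually execute
  $run wintersmith build --output "$OUT"   # invoke from inside $SRC
  $run git -C "$OUT" add --all
  $run git -C "$OUT" commit -m "Regenerate site"
  $run git -C "$OUT" push origin master
}

publish echo   # dry run: print the commands instead of running them
```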


s3_website[0] is a very neat solution to this. It integrates automatically with Jekyll: a simple `jekyll build && s3_website push` uploads all your changes to S3. I'm using it to power all my static sites. It'll even automatically invalidate your CloudFront distributions, if you like.

[0] https://github.com/laurilehmijoki/s3_website
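For reference, s3_website reads its settings from an `s3_website.yml` file in the site root; a minimal example might look something like this (bucket name and CloudFront distribution ID are placeholders):

```yaml
s3_id: <your AWS access key id>
s3_secret: <your AWS secret access key>
s3_bucket: example-bucket            # placeholder bucket name
cloudfront_distribution_id: EXAMPLE  # optional; enables invalidation
```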



