Reading Hacker News today, I came across a very cool post: [cached]Make your website fast. (And why you should). Naturally, I wanted to try this out myself - after all, one of the major reasons I moved to S3 for hosting was increased speed.
Now by default, Octopress doesn't include anything to help you here - S3 requires that you gzip your static content files manually. Additionally, you have to set the correct `Content-Encoding` header yourself! A bit of googling quickly led me to an [cached]excellent post by Frank Fusion. (This was after some trial and error with Ruby's aws/s3 gem, which I scrapped because it didn't work with the European S3 servers.)
He has a very nice bash script to compress and upload all the files, which I adapted a bit for Octopress (his version is for Jekyll). It's actually very simple:
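The original script didn't survive on this page, but a minimal sketch of the idea looks like this. It assumes `s3cmd` is installed and configured as the upload tool, that the generated site lives in Octopress's `public` directory, and that the `gzip_if_not_gzipped` helper (shown further down) sits next to it - the function name and file layout here are my assumptions, not necessarily the original:

```shell
#!/bin/bash
# Sketch of a gziped_sync.sh: gzip text assets in place, then sync
# the site to an S3 bucket with the Content-Encoding header set so
# browsers decompress the files transparently.
gziped_sync() {
  bucket="$1"
  site_dir="public"   # Octopress writes the generated site here

  # Compress HTML/CSS/JS in place, keeping the original file names.
  # gzip_if_not_gzipped skips files that are already compressed.
  find "$site_dir" \( -name '*.html' -o -name '*.css' -o -name '*.js' \) \
    -exec ./gzip_if_not_gzipped {} \;

  # Upload everything, marking the content as gzip-encoded.
  s3cmd sync --acl-public --add-header='Content-Encoding: gzip' \
    "$site_dir/" "s3://$bucket/"
}

# Only run when a bucket name was actually passed in.
if [ -n "$1" ]; then
  gziped_sync "$1"
fi
```

Note that binary assets (images, fonts) are deliberately left out of the `find` expression - they are already compressed, and gzipping them again gains nothing.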
Call it from the Rakefile like this: `ok_failed system("./gziped_sync.sh #{s3_bucket}")`. If you are wondering about the find command: `gzip_if_not_gzipped` is a little script I wrote that only compresses a file if it isn't already compressed - otherwise, we would always have to completely regenerate the whole site.
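That script is also missing from this page, but its job is easy to reconstruct in sketch form: gzip files begin with the magic bytes `0x1f 0x8b`, so checking the first two bytes tells us whether a file is already compressed. This is my own hedged reimplementation of the idea, not the original:

```shell
#!/bin/bash
# Sketch of gzip_if_not_gzipped: compress the file given as $1 in
# place, but only if it isn't already gzipped, so re-running the
# sync never double-compresses anything.
gzip_if_not_gzipped() {
  f="$1"
  # Gzip files start with the two magic bytes 0x1f 0x8b.
  if [ "$(head -c 2 "$f" | od -An -tx1 | tr -d ' \n')" != "1f8b" ]; then
    gzip -9 "$f"      # produces $f.gz and removes $f
    mv "$f.gz" "$f"   # restore the original file name
  fi
}

# Demo: the first call compresses, the second is a no-op.
echo '<html>hello</html>' > /tmp/demo.html
gzip_if_not_gzipped /tmp/demo.html
gzip_if_not_gzipped /tmp/demo.html
```

Keeping the original file name (rather than `.html.gz`) matters here: S3 serves the file under its normal URL, and the `Content-Encoding: gzip` header tells the browser what to do with it.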
With this little trick, uploading to S3 is as fast as before, while we still have the advantages of gzipped content. Nice :)
Tags: web