Website Now Hosted on S3


My website has been hosted on my home server since its Hello World! inception, about two years ago. It has been very stable: Uptime Robot, a free (for up to 50 monitors) service that tracks uptime, reports that my site was up 99.81% of the time over the past month.

When my website becomes unreachable, this service pings my phone via Pushbullet. Sometimes, my IP changes and I need to update the DNS record in GoDaddy (my domain provider). Other times, there is a more serious issue with my home server.

In the interest of my sanity (no more holding my breath every time my phone buzzes), and for the benefit of home network security (I no longer have any reason to leave ports 80 and 443 open on my router), I began to pursue other hosting options.

Over the past year, I have become well-versed in AWS's suite of services after working with it daily (funny how that happens…). Cloud services such as AWS have greatly eased the pain of development by taking hardware concerns off the table. Cloud computing also gets cheaper every year, and competition helps drive prices down even further. As a consumer, you have to love a free market!

Of course, Amazon is not infallible. Just recently, in late February, Amazon inadvertently took down much of the internet when S3 suffered downtime in us-east-1. If the layman wasn't aware of how much of the internet depends on Amazon's services before, they certainly were on February 28th.

Amazon promises 99.99% availability for S3, which is certainly enough for my use; I don't make any money through this website, so downtime doesn't really cost me anything. You may already know that this site is generated by Jekyll. I found s3_website, a gem that ties Jekyll and S3 together.
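
For anyone curious about the mechanics: s3_website reads its settings from an s3_website.yml file at the root of the site. A minimal sketch looks something like the following (the bucket name here is a placeholder, and pulling the credentials from environment variables is just one way to keep them out of the repository):

s3_id: <%= ENV['S3_ID'] %>
s3_secret: <%= ENV['S3_SECRET'] %>
s3_bucket: www.example.com

Running s3_website cfg create generates a template of this file, and s3_website cfg apply configures the bucket for static website hosting.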

The gem's usage was pretty self-explanatory, and I was able to migrate my existing configuration to use S3 with little trouble. I use two cronjobs to publish changes I make to my website to S3. The first is invoked on reboot and watches a directory in my Dropbox:

@reboot /usr/local/bin/jekyll build -s ~/Dropbox/website/ -d ~/Documents/website/_site/ --incremental --watch

The second calls s3_website push every minute:

* * * * * cd ~/Documents/website && /usr/local/bin/s3_website push

If no changes have been made, the program exits quickly and consumes negligible CPU. If it performed a more expensive check, I would definitely reconsider invoking it every minute. I also change into my website directory first because s3_website operates on the current directory by default, and its --site=[PATH TO WEBSITE] option didn't work for me.
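
If you want to sanity-check what a push would do before handing it over to cron, s3_website also supports a dry run (the --dry-run flag), which should report what it would upload or delete without actually touching the bucket:

cd ~/Documents/website && /usr/local/bin/s3_website push --dry-run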

After a week of being on S3, Uptime Robot reports that my website has been reachable 100% of the time. I’d call that a successful transition from self-hosting to a cloud-based solution!