At first, I hosted my website on a small hosting package using static HTML pages. The drawback of this approach is that adding new content is tedious, since every new page has to be written by hand. Therefore I decided to set up Hugo.
Hugo is a (fast) Go-based static site generator which makes it easier to manage the website and to add new content via Markdown. At build time, Hugo generates HTML pages from predefined layouts. This makes it possible, for instance, to expand these notes (blog) dynamically without adding new HTML pages by hand. In addition, the content stays up to date without PHP or other server-side languages. Since Hugo's output is just plain HTML, I decided to deploy the website via an S3 bucket on AWS. By enabling the static website hosting feature and setting the permissions to public, the S3 bucket is reachable over HTTP.
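To illustrate what "adding content via Markdown" means in practice, a new post is just a Markdown file with a small front matter block. The file name, title, and date below are hypothetical placeholders, not this site's actual content:

```markdown
---
title: "Hosting Hugo on S3"
date: 2021-05-01
draft: false
---

The body of the post is plain Markdown; at build time Hugo turns it
into an HTML page using the theme's predefined layout.
```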
Routing requests to S3
S3 static website hosting does not support HTTPS, therefore I placed AWS CloudFront in front of the S3 bucket. For routing the www. address a simple CNAME record is enough. However, this approach does not work for the root domain, since CNAME only works for subdomains. Some providers (e.g. AWS, Porkbun) solve this by supporting an ALIAS record; unfortunately, this is a non-standard DNS record type. If ALIAS is not supported, Route53 from AWS may be a suitable option. In that case the domain either needs to be registered with AWS, or the AWS name servers have to be configured at the domain registrar. Besides SSL termination and redirecting HTTP to HTTPS, CloudFront also handles edge caching, which results in faster delivery of the pages.
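As a sketch in zone-file notation (domain and CloudFront hostname are placeholders), the two records described above could look like this, with ALIAS being the non-standard type some providers offer for the apex:

```
; subdomain: a standard CNAME works
www.example.com.    CNAME   d1234abcd.cloudfront.net.

; root domain: CNAME is not allowed at the apex, so a
; non-standard ALIAS (or a Route53 alias record) is needed
example.com.        ALIAS   d1234abcd.cloudfront.net.
```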
The following sketch shows the interaction of the individual components:
GitHub Action for automatic deployment to S3
I always wanted to try out GitHub Actions, so the main branch now synchronizes itself with the S3 bucket. Amazing how easy the setup is 😄. All you have to do is create an S3 access key and secret key (with the needed permissions), add them as secrets in your GitHub repo, and you are good to go. Finally, this GitHub Action configuration does the magic:
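The original workflow file is not reproduced here, but a minimal sketch of the three steps (build, sync, invalidate) could look like the following. The action choices (peaceiris/actions-hugo, aws-actions/configure-aws-credentials), bucket name, region, and distribution ID are assumptions, not this site's actual configuration:

```yaml
name: deploy

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # install Hugo (assumed community action)
      - uses: peaceiris/actions-hugo@v3
        with:
          hugo-version: 'latest'

      # compile the site in a minified version
      - run: hugo --minify

      # credentials come from the repo secrets mentioned above
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: eu-central-1

      # push the resulting public dir to the S3 bucket (placeholder name)
      - run: aws s3 sync public/ s3://example-bucket --delete

      # invalidate the CloudFront cache so the new pages are served
      - run: aws cloudfront create-invalidation --distribution-id EXAMPLE_ID --paths "/*"
```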
First, Hugo compiles the site in a minified version; afterwards the resulting public directory (containing the HTML/CSS files) is pushed to the S3 bucket. Since CloudFront caches the pages at the edge, we also have to send an invalidation request to the distribution to be sure the new pages are served.
Of course the overall setup is a bit of overkill for a simple website, but your own website is also a nice playground for a small pipeline 😉.