Oct 06, 2020

My Website on AWS S3 using Hugo and GitHub Actions

At first, I hosted my website as static HTML pages on a small hosting package. The drawback of this approach is that adding new content is tedious, since every page has to be written as HTML by hand. Therefore I decided to set up Hugo.

Hugo is a fast, Go-based static site generator which makes it easier to manage the website and to add new content via Markdown. At build time, Hugo generates HTML pages from predefined layouts. This allows, for instance, these notes (a blog) to grow without new HTML pages being added by hand. In addition, up-to-date content is possible without PHP or other server-side languages. Since the output of Hugo is just plain HTML, I decided to deploy the website via an S3 bucket on AWS. By enabling the static website hosting feature and setting the permissions to public, the S3 bucket is reachable over HTTP.
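The day-to-day Hugo workflow boils down to a few commands; a sketch (site and post names are placeholders):

```shell
# create a new site skeleton (placeholder name)
hugo new site mysite
cd mysite

# add a new note as a Markdown file; Hugo pre-fills the
# front matter from an archetype template
hugo new posts/hugo-on-s3.md

# preview locally with live reload, including drafts
hugo server -D

# build the static site into the public/ directory
hugo --minify
```

The `public/` directory produced by the last command is what ends up in the S3 bucket.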

Routing requests to S3

S3 website endpoints do not support HTTPS, therefore I placed AWS CloudFront in front of the S3 bucket. For routing the www. address a simple CNAME record is set. However, this approach is not possible for the root domain, since CNAME only works for subdomains. Some providers (e.g. AWS, Porkbun) support an ALIAS record to solve this. Unfortunately, ALIAS is a non-standard DNS record type. If ALIAS is not supported, Route53 from AWS may be a suitable option, since its alias records can point the root domain at a CloudFront distribution. For that, the domain either needs to be registered with AWS or the AWS name servers have to be configured at the domain registrar. Besides SSL termination and redirection from HTTP to HTTPS, CloudFront also handles edge caching, which results in faster delivery of the pages.
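In zone-file notation, the resulting records might look roughly like this (domain and CloudFront hostname are placeholders; the ALIAS syntax varies by provider):

```
; www can point at CloudFront via a standard CNAME
www.example.com.  300  IN  CNAME  d1234abcdef.cloudfront.net.

; the root domain cannot use CNAME; an ALIAS record (or a
; Route53 alias record) resolves the target to A records
example.com.      300  IN  ALIAS  d1234abcdef.cloudfront.net.
```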

The following sketch shows the interaction of the individual components:

AWS S3 - website setup

GitHub Action for automatic deployment to S3

I always wanted to try out GitHub Actions, so the main branch now synchronizes itself with the S3 bucket. Amazing how easy the setup is 😄. All you have to do is create an S3 access key and secret key (for a user with the needed permissions), add them as secrets in your GitHub repo, and you are good to go. Finally, this GitHub Actions workflow does the magic:
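The access keys should belong to a dedicated IAM user that is limited to what the pipeline actually needs; a minimal policy sketch (bucket name, account ID, and distribution ID are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "SyncBucket",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-website-bucket",
        "arn:aws:s3:::my-website-bucket/*"
      ]
    },
    {
      "Sid": "InvalidateDistribution",
      "Effect": "Allow",
      "Action": "cloudfront:CreateInvalidation",
      "Resource": "arn:aws:cloudfront::123456789012:distribution/EXAMPLEID"
    }
  ]
}
```

This way a leaked key can at worst touch the website bucket and the cache, not the rest of the AWS account.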

name: Hugo S3 deployment
on: push
jobs:
  deploy:
    runs-on: ubuntu-18.04
    steps:
      - name: Git checkout
        uses: actions/checkout@v2

      # Hugo build
      - name: Setup hugo
        uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: "0.75.1"
      - name: Build
        run: hugo --minify

      # Sync to s3
      - uses: jakejarvis/s3-sync-action@master
        with:
          args: --acl public-read --follow-symlinks --delete
        env:
          SOURCE_DIR: 'public'
          AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_REGION: 'eu-central-1'

      # Invalidate CloudFront
      - name: invalidate
        uses: chetan/invalidate-cloudfront-action@master
        env:
          DISTRIBUTION: ${{ secrets.AWS_CF_DISTRIBUTION_ID }}
          PATHS: '/*'
          AWS_REGION: 'eu-central-1'
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

First, Hugo builds a minified version of the site; afterwards the resulting public directory (containing the HTML/CSS files) is pushed to the S3 bucket. Since CloudFront caches the pages, an invalidation request is added for the distribution to make sure the new pages are served.
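The same cache invalidation can also be triggered manually with the AWS CLI, e.g. after an out-of-band change (the distribution ID is a placeholder):

```shell
# flush all cached paths of the distribution
aws cloudfront create-invalidation \
  --distribution-id EXAMPLEID \
  --paths "/*"
```

Note that invalidating `/*` counts as a single path towards the CloudFront invalidation quota, which keeps this simple approach within the free tier for a personal site.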

Of course the overall setup is a bit of overkill for a simple website, but one's own website is also a nice playground for a small pipeline 😉.
