New Website

My old site content is still up at http://catid.mechafetus.com.

Today I decided to replace the site I built back in college with some of the latest and greatest tooling. I was a complete novice in most of this technology as of this morning, so there is probably a lot more to learn.

To register the domains, I switched from GoDaddy to Namecheap. Namecheap cut the domain registration cost in half, and I had a 20%-off promo code handy for a further discount. I registered erasurecoder.com, mrcatid.com, catid.io, and a bunch of others today.

The new site is served from Cloudflare’s CDN via their SSL hosting solution. That is not particularly important for a blog, but it is something I wanted to learn more about.

Cloudflare pulls the content it mirrors from an Amazon S3 bucket. This is also overkill; GitLab's hosting is probably a better choice for most blogs.

The web content is authored using the GatsbyJS toolkit, which allows a mix of markdown content and Facebook’s ReactJS for customization. GatsbyJS produces a static website; the build is pushed directly to S3 and quickly picked up by Cloudflare.
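
As an example of the ReactJS side, a custom page is just a React component dropped into the starter's src/pages/ directory. This is a minimal sketch, not a file from my site; the name and contents are made up:

// src/pages/about.js -- hypothetical example page
import React from 'react'

// Any React component exported from src/pages/ becomes its own page
// in the generated static site, alongside the markdown blog posts.
const AboutPage = () => (
  <div>
    <h1>About</h1>
    <p>Plain React, rendered to static HTML at build time.</p>
  </div>
)

export default AboutPage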

The rough steps I followed to get it up and running on a Windows laptop:

(1) Install Git Bash

(2) Download Node.js: https://nodejs.org/en/download/

(3) Follow the Gatsby setup guide: https://www.gatsbyjs.org/docs/

> npm install -g gatsby
> git clone https://github.com/gatsbyjs/gatsby-starter-blog/
> cd gatsby-starter-blog
> npm install
> gatsby develop
[[ Allow firewall access here via UAC ]]

Then I was able to point my browser at http://localhost:8000 to preview the site. To generate the deployable static files (written to the public/ directory):

> gatsby build

Via the Amazon IAM console I set up a policy allowing an auto-uploader account to upload the website content. The custom policy looks like this:

{
    "Version": "2017-06-11",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::BUCKETNAME"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::BUCKETNAME/*"
            ]
        }
    ]
}

Then I added a group and user for it. My BUCKETNAME is “catid.io”. I also set up the www bucket to redirect to “catid.io”.
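
For reference, that www redirect can also be configured programmatically with the aws-sdk for Node. This is a rough sketch rather than what I actually ran; it assumes local AWS credentials are already set up and that the www bucket is named www.catid.io:

// Hypothetical sketch: point www.catid.io at catid.io using an
// S3 bucket website redirect. Assumes local AWS credentials.
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

s3.putBucketWebsite({
    Bucket: 'www.catid.io',
    WebsiteConfiguration: {
        RedirectAllRequestsTo: { HostName: 'catid.io' }
    }
}, function (err) {
    if (err) console.error('Failed to set redirect:', err);
    else console.log('www redirect configured.');
});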

Following instructions from LoFi’s blog, I set up the post-build upload to S3:

> npm install --save-dev git://github.com/andrewrk/node-s3-client.git

Then I added a gatsby-node.js script to install a post-build hook as described, and a secrets.json with my account details.

It seems that using the current HEAD now works better than the specific version he mentions.
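
For anyone following along, a gatsby-node.js hook along these lines looks roughly like the sketch below. It is only a sketch, not a copy of my file: the secrets.json field names and the upload options are assumptions.

// gatsby-node.js -- rough sketch of a post-build S3 upload hook.
// Assumes secrets.json contains { "accessKeyId": "...", "secretAccessKey": "..." }.
var s3 = require('s3'); // andrewrk/node-s3-client
var secrets = require('./secrets.json');

exports.onPostBuild = function () {
    var client = s3.createClient({
        s3Options: {
            accessKeyId: secrets.accessKeyId,
            secretAccessKey: secrets.secretAccessKey
        }
    });

    // Sync the generated public/ directory to the website bucket,
    // deleting remote files that no longer exist locally.
    var uploader = client.uploadDir({
        localDir: 'public',
        deleteRemoved: true,
        s3Params: { Bucket: 'catid.io' }
    });

    return new Promise(function (resolve, reject) {
        uploader.on('error', reject);
        uploader.on('end', resolve);
    });
};

Returning the promise makes the Gatsby build wait for the upload to finish before it reports success.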

Some other useful static website links:

A list of services for static websites
A somewhat ranked list of static website generators

Posted June 11, 2017