Running a static website from Amazon S3 (or not)!

(Very technical article)

STOP PRESS: Here’s an update. I have removed all my files from Amazon S3 and gone back to hosting them at a regular old shared web host (Bluehost).

Amazon was so complicated that I ended up paying US$100–$300 per month!

I created an S3 bucket, but then needed to use the Transfer facility to upload the files. Then I could not unzip them, so I started an EC2 something-or-other to try to unzip the files. Then I needed to set up CloudFront to get https:// working, then Route 53 for the DNS. Then I got my bill for US$300, followed by US$100 for the next month! So I’ve shut the whole thing down. Amazon were nice and gave me a refund on the US$300, but not on the US$100.

Anyway, you can read my journey on how to set up Amazon below. But beware: every single step has a charge associated with it!

Check this out: EC2, WAF, VPC etc. I have no idea what all these things are, but hopefully I’ve turned them all off now!


Amazon S3 is a place to store files. It’s meant to be simple and basic. Simple is not the right word!

Here’s how to upload files to Amazon S3 from a Mac, in a way that they can be served to the world as a webpage.

Create your S3 bucket

Go to the AWS console, search for S3, and go to Create bucket.

Under bucket name, put your domain name.

Leave all the other settings as default and select ‘Create Bucket’ at the bottom of the page.

You’ve made a bucket! Now we need to make it serve the files to the world.

Select your new bucket, and go to Properties.

Down the bottom, select ‘Static Website Hosting’, then enable it and leave everything else as default. You’ll need to enter ‘index.html’ as the index document here.

Click Save.

Take note of the URL. This is where your files will be accessed publicly, and later you can point your domain name here.
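That public URL follows a predictable pattern. As a rough sketch (the exact format varies: some regions use a dot rather than a dash after ‘s3-website’, so check what the console shows you), the endpoint is built from your bucket name and region:

```python
def website_endpoint(bucket: str, region: str) -> str:
    # Most regions use the dash form shown here; a few older ones
    # use "s3-website.<region>" instead -- trust the console's URL.
    return f"http://{bucket}.s3-website-{region}.amazonaws.com"

# Example with a placeholder bucket name and region:
print(website_endpoint("example.com", "us-east-1"))
```

If your bucket name matches your domain name (as set up above), this endpoint is what you will later point your DNS at.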

Set Permissions of S3 bucket

Now go to permissions

Under ‘Bucket policy’, select Edit.

Give it a policy like this, replacing the bucket name in the Resource line with your own:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
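If you prefer to generate the policy rather than hand-edit JSON, a small sketch like this builds the same public-read policy for any bucket name (the `your-bucket-name` placeholder above is just an example):

```python
import json

def public_read_policy(bucket: str) -> str:
    # Standard public-read policy for a static-website bucket:
    # anyone may GetObject on any key under the bucket.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadGetObject",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": f"arn:aws:s3:::{bucket}/*",
            }
        ],
    }
    return json.dumps(policy, indent=2)

print(public_read_policy("example.com"))
```

Paste the output straight into the bucket policy editor.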

Create a User

You’ll need to create a user and give them permission to upload. This is done in the IAM console. (Search for IAM).

Go to Users, then Create User. Use all the defaults; all you need is a name.

Now create a Group to put your user in.

Give this group permission for S3 Full Access.

Get the User login details

Select your user (IAM > Users > username), then go to ‘Security Credentials’.

Here, do ‘Create Access Key’ and you will get the two keys you need to enter into the FTP app: an Access Key ID and a Secret Key. Be sure to save the Secret Key, as there’s no way to get it again.

Create an IAM role

Later, when you make a server, you will give your user this role on the server.

Go to IAM > Roles

Under IAM > Roles, create a new role (e.g. ‘sftpaccess’).
Go to permissions:

Give it this policy:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:*",
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ]
    }
  ]
}

Go to Trust relationship:

Paste in this:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "transfer.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
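As with the bucket policy, this trust relationship can be generated rather than typed. A minimal sketch (assuming, as above, that the role is for an AWS Transfer Family server, hence the `transfer.amazonaws.com` service principal):

```python
import json

def transfer_trust_policy() -> str:
    # Trust policy allowing the AWS Transfer Family service
    # to assume this role on behalf of your SFTP user.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"Service": "transfer.amazonaws.com"},
                "Action": "sts:AssumeRole",
            }
        ],
    }
    return json.dumps(policy, indent=2)

print(transfer_trust_policy())
```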

Create a Server

STOP PRESS: I think this is where things started getting expensive.

This is the server that you SFTP into to upload files into your bucket.

Go back to AWS console,

Find AWS Transfer Family,

Click ‘Create Server’

Use all the defaults and you’ll have a new server.

Select your server and choose ‘Add User’

For the user, choose a home directory and select the IAM role that you made above.

Upload your files using Transmit

To connect to your server, for the address put the server endpoint shown in the Transfer Family console.

Access Key ID is your IAM Access Key ID.

Secret is your IAM Secret Key.

They are both from the access key you created in IAM earlier.

Now you can upload all your files to the s3 bucket.


  • The Amazon Transfer Family endpoint does not accept rsync. You can use SFTP, but mine died multiple times halfway through the 40GB file transfer; I just got a ‘terminated’ from AWS. You cannot archive to a gzip and transfer that one file, because once it’s on the S3 bucket there’s no (simple) way to unarchive it. This is what drove me to use Transmit and its ‘sync’ facility to upload the files.
  • The S3 bucket is also limited: you cannot unzip a zip file there. It’s possible to create an EC2 instance to unzip files that you load via SFTP. Way too complicated. Forget trying to upload a zip, as there’s no way to unzip it easily on the S3 bucket.
  • I set up an Amazon Transfer Family endpoint and used Cyberduck to SFTP, but there seems to be a well-known issue on the S3 bucket with filetypes. If you upload the files to S3 using Cyberduck, the filetypes (content types) are lost. You can read about it everywhere. You could manually go into the S3 bucket and set the filetypes (to html, txt, css etc). Transmit does it automatically for you as it uploads. I forked out US$45 just for this feature, even though I have been using Cyberduck for years. Transmit has the capability to sync from a local folder to the S3 bucket.
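The filetype issue above is about the Content-Type metadata S3 stores with each object. What Transmit does automatically amounts to guessing the MIME type from the file extension, which Python’s standard library can illustrate (a sketch of the idea, not what Transmit literally runs):

```python
import mimetypes

def guess_content_type(filename: str) -> str:
    # Guess the MIME type from the file extension; fall back to a
    # generic binary type when the extension is not recognised.
    ctype, _encoding = mimetypes.guess_type(filename)
    return ctype or "application/octet-stream"

print(guess_content_type("index.html"))  # text/html
print(guess_content_type("style.css"))   # text/css
```

Without the right Content-Type, browsers may download your pages instead of rendering them, which is why this matters for a static website.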

Redirect your DNS to the Amazon site

Go to your DNS and change your records to point to the new site.

In your DNS record create a new CNAME

NAME is your domain name.

HOST is the Amazon endpoint.

Note you need to remove the http:// from the front, and add a ‘.’ at the end.
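That little transformation (strip the scheme, append the trailing dot that DNS records expect) is easy to get wrong by hand; as a sketch:

```python
def endpoint_to_cname_host(endpoint: str) -> str:
    # Strip the http:// or https:// scheme, then make sure the
    # hostname ends with the trailing dot DNS zone files expect.
    host = endpoint.removeprefix("http://").removeprefix("https://")
    return host.rstrip(".") + "."

# Example with a placeholder S3 website endpoint:
print(endpoint_to_cname_host("http://example.com.s3-website-us-east-1.amazonaws.com"))
```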

Get https://

Now you need to make it secure, using CloudFront.

Create a CloudFront distribution.

Select the site you already made in S3. For all the rest, just use the default settings.

Under SSL, select ‘Request Certificate’.

Now, in the new Certificate Manager page, select ‘Request certificate’.

You can choose DNS (add a CNAME record) or email for verification. I used email as it was easier.

Now go back to the CloudFront page and select the certificate you just made, all other settings as default, then click ‘Create Distribution’ down the bottom.

Wait for the distribution to deploy…


In Route 53, this looks like the following: set up an A record as an alias, using the domain from your CloudFront distribution.


Oops: you need to set this so that it knows to go to index.html, as described in the first step above.


One response to “Running a static website from Amazon S3 (or not)!”

  1. Wayne

    OK, important comment.
    Don’t use Amazon Transfer thingy as described above.
    I just got the bill for our first month.
