Amazon Simple Storage Service (S3) is great for hosting static websites. My current site was created using Jekyll and is hosted on Amazon S3. Instead of having to manually upload all of my files to S3 each time I want to deploy a new version of my site, I created a simple Gulp script to do it for me.
My current site has two separate environments, staging and production. The production environment is the live version of the site, while the staging environment is used to test new features, preview future blog posts, etc. Each environment is associated with a separate S3 bucket. For example, my production bucket is `tusharghate.com` and my staging bucket is `staging.tusharghate.com`.
By using a few Gulp plugins, I was able to create a simple script that deploys my website to the bucket for whichever environment I specify. For example, `gulp deploy --env production` will deploy my site to `tusharghate.com`, and `gulp deploy --env staging` will deploy my site to `staging.tusharghate.com`. Let's dive into the code:
Parsing arguments to Gulp tasks
The `yargs` module makes it easy to parse arguments passed to Gulp tasks. We will use `yargs` to figure out which bucket we want to deploy to based on the `env` argument that the user passes in.
```javascript
var gulp = require('gulp'),
    argv = require('yargs').argv;

const ENV = {
    production: 'production',
    staging: 'staging'
};

const BUCKETS = {
    production: 'tusharghate.com',
    staging: 'staging.tusharghate.com'
};

gulp.task('deploy', function() {
    // Figure out which bucket we are deploying to based on the environment
    // specified by the user.
    let validEnvironments = [ENV.production, ENV.staging];

    var bucket;
    if (argv.env &&
        validEnvironments.indexOf(argv.env) !== -1) {
        bucket = BUCKETS[argv.env];
    } else {
        console.error('Error! Please specify a valid environment!');
        return;
    }

    // TODO: Do something with our bucket!
});
```
As you can see above, we simply parse the `env` argument passed in by the user. If it doesn't exist or isn't valid, we log an error and return. Otherwise, we keep a reference to the bucket that we want to deploy to. More information on how to use `yargs` can be found on the project's GitHub page.
Publishing to S3
Publishing to an S3 bucket is made easy with the `gulp-awspublish` module.
```javascript
var gulp = require('gulp'),
    awspublish = require('gulp-awspublish'),
    AWS = require('aws-sdk');

gulp.task('deploy', function() {
    // Figure out which bucket we are deploying to
    var bucket;
    // ... See above

    // Create our publisher
    let publisher = awspublish.create({
        credentials: new AWS.SharedIniFileCredentials({
            profile: '<YOUR-AWS-PROFILE>'
        }),
        region: '<YOUR-REGION>',
        params: {
            Bucket: bucket
        }
    });

    const DIR_DIST = './dist';

    let headers = {},
        options = {};

    return gulp.src([DIR_DIST + '/**/*'], {base: DIR_DIST})
        // Gzip all files
        .pipe(awspublish.gzip())

        // Publish files to S3
        .pipe(publisher.publish(headers, options))

        // Keep our bucket in sync by deleting remote files
        // that no longer exist locally
        .pipe(publisher.sync())

        // Cache file hashes so unchanged files are skipped
        // on the next upload
        .pipe(publisher.cache())

        // Output a log of the upload to the terminal
        .pipe(awspublish.reporter());
});
```
You should be able to get a general idea of how `gulp-awspublish` works from the code above. We simply configure our publisher, and then push our assets to the S3 bucket that was specified by the user. More information on how to configure and use your publisher can be found on the project's GitHub page.
And that's all there is to it! Running `gulp deploy --env <YOUR-ENV>` will deploy our site to whichever bucket is associated with the specified environment. This script will save you the hassle of manually uploading assets to your S3 bucket, and makes it easy to add additional environments and buckets to deploy to as well.