We at H+S are dedicated to one simple cause: creating posts about oddly specific programming scenarios. Somewhere in the world, a sad soul is looking to programmatically access files in S3 while keeping their bucket private. To that person: we heard you.

There are plenty of reasons you'd want to access files in S3. For example, let's say you read that post about using Pandas in a Lambda function. Since you're already familiar with PyMySQL, you may hypothetically be in a position to export data from a DB query to a CSV saved in S3. I bet you can guess what I've been doing lately.

Configure the AWS CLI on your VPS

The easiest and safest way to interact with other AWS services from your EC2 instance (or VPS of choice) is via the AWS CLI, which installs as a global Python 3 package:

$ pip3 install awscli

With the CLI installed we'll be able to do something truly magical: set our AWS configuration globally. This means that any time we interact with an AWS service (such as S3), the SDK in play, whether that's boto3 in Python or aws-sdk in Node, will look to the files stored in ~/.aws/ for our keys and secrets without us specifying them. This is critical from a security perspective, as it removes all mention of credentials from our codebase, including the location of said secrets.

Use $ aws configure to kickstart the process:

$ aws configure
AWS Access Key ID [None]: YOURACCESSKEY
AWS Secret Access Key [None]: YOURSECRETKEY
Default region name [None]: us-east-2
Default output format [None]: json

This creates a couple of config files for us. If we ever need to modify these files, they can be found here:

$ vim ~/.aws/credentials
$ vim ~/.aws/config
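
For reference, both files are plain INI. Assuming the default profile, ~/.aws/credentials holds the keys:

[default]
aws_access_key_id = YOURACCESSKEY
aws_secret_access_key = YOURSECRETKEY

...while ~/.aws/config holds the region and output format:

[default]
region = us-east-2
output = json

A quick sanity check never hurts, either; assuming your keys can read the bucket, this should list its contents:

$ aws s3 ls s3://your-bucket-name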

Node Time

We'll assume you have an app set up with some basic routing, such as a barebones ExpressJS setup.
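
If you're starting from scratch, a minimal skeleton might look something like this; the routes/index.js path and port 3000 are just illustrative assumptions:

var express = require('express');
var app = express();

// Hypothetical router module; we'll build the /export route in it below.
var router = require('./routes/index');

app.use('/', router);

app.listen(3000, function() {
    console.log('Listening on port 3000');
});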

In your app we'll need to add two dependencies:

$ npm install --save aws-sdk
$ npm install --save aws-config

Now we'll create a route (this lives in the hypothetical routes/index.js from the skeleton above):

var express = require('express');
var router = express.Router();
var awsConfig = require('aws-config'); // optional credential helper; unused here since the SDK reads ~/.aws/ itself
var AWS = require('aws-sdk');

router.get('/export', function(req, res, next) {
    var file = 'df.csv';
    console.log('Trying to download file', file);

    // Empty options object: the SDK falls back to the shared credentials in ~/.aws/
    var s3 = new AWS.S3({});

    var options = {
        Bucket: 'your-bucket-name',
        Key: file
    };

    s3.getObject(options, function(err, data) {
        if (err) {
            return next(err);
        }
        res.attachment(file);
        res.send(data.Body);
    });
});

module.exports = router;
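
With the server running, a quick curl makes sure the download actually happens (this assumes the router is mounted at the app root on port 3000, as in the skeleton above):

$ curl -OJ http://localhost:3000/export
$ head df.csv

The -J flag tells curl to honor the Content-Disposition header set by res.attachment(), so the file lands on disk as df.csv.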

Back in the route, notice the empty curly brackets in new AWS.S3({}). If we had decided to barbarically hardcode our credentials into our source code, those values would normally live between those brackets as an object. When the brackets are empty, the AWS library automagically knows to look to our AWS credentials file for our access and secret keys.
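
If you ever need to be explicit about which profile gets picked up (say, on a box juggling several accounts), the v2 SDK can do the same lookup by hand via SharedIniFileCredentials; the 'default' profile name here is just the out-of-the-box assumption:

var AWS = require('aws-sdk');

// Explicitly load a named profile from ~/.aws/credentials
var credentials = new AWS.SharedIniFileCredentials({ profile: 'default' });
AWS.config.credentials = credentials;

var s3 = new AWS.S3({});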

This is how you'd do things the wrong way, just in case you wanted to be entertained:

var s3 = new AWS.S3({
    accessKeyId: 'YOURACCESSKEY',
    secretAccessKey: 'YOURSECRETACCESSKEY',
    region: 'us-east-2'
});

Yeah, that totally won't get committed somewhere by accident. Shake-my-head fam.

That's pretty much it: hitting this route will prompt a download of the target file. As much as I'm sure we'd all love to sit here and go through more complicated use cases, let's just avoid Callback Hell altogether and enjoy the rest of our day.
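
One parting trick, though: the v2 SDK can trade the callback for a promise, which keeps Hell at arm's length. A minimal sketch of the same handler, assuming Node 8+ for async/await:

router.get('/export', async function(req, res, next) {
    var file = 'df.csv';
    var s3 = new AWS.S3({});

    try {
        // .promise() swaps the Node callback for a native promise
        var data = await s3.getObject({ Bucket: 'your-bucket-name', Key: file }).promise();
        res.attachment(file);
        res.send(data.Body);
    } catch (err) {
        next(err);
    }
});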

Hell will have to wait until next time.