
Ubuntu - Getting Started With The AWS CLI


Steps

Ubuntu 20.04 Installation

Install the tools with the following commands:

sudo apt update \
  && sudo apt install python3-pip -y \
  && sudo pip3 install awscli

Pre Ubuntu 20.04 Installation

Install the tools with the following commands:

sudo apt-get update && sudo apt-get install python-pip -y
sudo pip install awscli
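
Whichever route you took, you can check that the CLI installed correctly by asking for its version (the exact version string will depend on what pip pulled down):

aws --version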

Personal Configuration

If you just want to use this for a single account with manual commands (i.e. not scripts), then perform the following steps. If you want to configure this for multiple accounts and/or scripts, then it is probably best to follow the "Setup For Scripts" section instead.

Run the following command and answer the questions:

aws configure

When it asks for the default output format, your choices are json, table, or text. Personally, I use json.
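
For reference, the interactive session looks something like the following (the key values shown here are placeholders, not real credentials):

AWS Access Key ID [None]: AKIAEXAMPLEKEYID
AWS Secret Access Key [None]: exampleSecretAccessKey
Default region name [None]: eu-west-1
Default output format [None]: json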

Setup For Scripts

If you followed the previous section, then you do not need to run this one.

Now we need to set up our AWS credentials for automatic authentication:

mkdir $HOME/.aws
vim $HOME/.aws/config

File contents

[default]
aws_access_key_id = [ID HERE]
aws_secret_access_key = [KEY HERE]
region = eu-west-1

It won't work if you put quotation marks around the values.

The config file needs to be at that default location; otherwise, you need to point the AWS_CONFIG_FILE environment variable at wherever you put it, e.g. export AWS_CONFIG_FILE="$HOME/.aws/config".

Protect the file from other users who have access to the same machine:

chmod 600 $HOME/.aws/config
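
With the file in place, a quick sanity check that the credentials are being picked up (assuming the IAM user is allowed to call STS) is:

aws sts get-caller-identity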

Testing

You can test that it's working with a simple command to fetch the regions from AWS:

aws ec2 describe-regions
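
With json output, you should get back a "Regions" list along these lines (truncated here):

{
    "Regions": [
        {
            "Endpoint": "ec2.eu-west-1.amazonaws.com",
            "RegionName": "eu-west-1"
        },
        ...
    ]
}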

You can send an email like so:

aws ses send-email \
  --text="this is a test" \
  --reply-to-addresses="admin@my.domain.com" \
  --from="admin@my.domain.com" \
  --to="to@gmail.com" \
  --subject="test"
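
Note that SES will only accept the message if the sending address has been verified (and, while your account is still in the SES sandbox, the recipient address too). If you have not done that yet, verification can be triggered from the CLI as well:

aws ses verify-email-identity \
  --email-address admin@my.domain.com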

Now you can use the CLI to transfer files to and from S3 like so:

aws s3 cp \
  /path/to/local/file.txt \
  s3://my-bucket/sub-folder/file.txt

or the other way around:

aws s3 cp \
  s3://my-bucket/sub-folder/file.txt \
  /path/to/local/file.txt

The S3 file-movement commands are pretty much the same as their Linux counterparts, so they should feel natural, e.g. cp, mv, and the --recursive switch.
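
For example, a recursive copy of a whole local directory up to a bucket (the paths and bucket name here are just placeholders) looks like:

aws s3 cp \
  /path/to/local/dir \
  s3://my-bucket/sub-folder/ \
  --recursive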

Syncing Two Buckets

You can sync two buckets with:

aws s3 sync \
   s3://BucketFrom \
   s3://BucketTo

If the buckets are across two different accounts, you will need one of the two buckets to be public, and have your credentials configured for the private one.
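
By default, sync only copies new and changed objects. If you want the destination to be an exact mirror, removing objects that no longer exist in the source, add the --delete flag:

aws s3 sync \
   s3://BucketFrom \
   s3://BucketTo \
   --delete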

Get The Size Of An S3 Bucket

aws s3 ls \
  --summarize \
  --human-readable \
  --recursive \
  s3://bucket-name/

Get The Size Of An S3 Bucket Folder

aws s3 ls \
  --summarize \
  --human-readable \
  --recursive \
  s3://bucket-name/directory

Get Size Of S3 Bucket (Faster)

The above commands get the size of a bucket by looping through every file, which takes a long time if you have a large bucket.

Instead, do the following:

sudo apt-get install s4cmd
s4cmd du s3://my-bucket-name

You will need to sum up the integers and plug "total x 512 bytes" into Google to get a human-readable number.
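
If you would rather not do the arithmetic by hand, something like the following one-liner sums the first column and applies the 512-byte multiplier mentioned above (this assumes s4cmd prints the size as the first field on each line):

s4cmd du s3://my-bucket-name \
  | awk '{total += $1} END {print total * 512 " bytes"}'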

Get Canonical ID

If you ever need your canonical ID, just run the following:

aws s3api list-buckets | grep ID
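
Alternatively, the CLI's --query option can extract just the value without any grepping:

aws s3api list-buckets \
  --query Owner.ID \
  --output text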

You can use this to grant another account access to your bucket, allowing you to sync two buckets between different accounts without one of the buckets needing to be public.


Last updated: 26th August 2020
First published: 4th May 2020