Amazon S3 is a good place to back up and store your files cheaply in the cloud. While some people use it to store their personal data, others use it to host images and scripts for their websites, or even as a CDN. On your desktop, you can easily access and manage your Amazon S3 account with apps like CloudBerry and DragonDisk, but what if you need to access it from the command line? This is where S3cmd comes in useful.
S3cmd is a free command-line tool for uploading, retrieving and managing data in Amazon S3. Beyond S3 itself, it also supports other cloud storage providers that use the S3 protocol, such as Google Cloud Storage and DreamHost DreamObjects. The tool is most useful on a remote server where everything is done on the command line. Alternatively, you can use it in batch scripts and automated backups to S3.
S3cmd is written in Python, so it is supported on just about every operating system, as long as Python is installed.
S3cmd is found in most Linux repositories. In Debian/Ubuntu, or any other apt-based distro, you can install S3cmd with the command:
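For example:

```shell
sudo apt-get install s3cmd
```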
For Fedora, Centos, or any other yum-based distro:
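For example:

```shell
sudo yum install s3cmd
```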
Alternatively, you can download the source code, unzip the package and run the installer:
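A sketch of the source install, assuming you have already downloaded and unzipped the release archive (the directory name is an example; substitute the actual version):

```shell
# From inside the unpacked source directory:
cd s3cmd-x.y.z
sudo python setup.py install
```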
Before you get started, you will need to have your Amazon S3’s Access and Secret keys ready. You can locate the Access and Secret keys in the Amazon Management Console.
To get started, open the terminal and type:
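```shell
s3cmd --configure
```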
It will first prompt you to enter your Access and Secret keys.
Next, you have to enter an encryption key and the path to the GPG program.
After the encryption key comes the prompt for using the HTTPS protocol. The default option is “No”, but you can set it to “Yes” for a more secure connection to Amazon S3. Do note that using the HTTPS protocol will slow down the transfer speed and can’t be used if you are behind a proxy.
Once you have configured and tested the settings, you can proceed to use S3cmd. If you need to change the settings later, you can either edit the .s3cfg file in your Home directory or run the s3cmd --configure command again.
First, to view the buckets in your S3 account, use the ls command:
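```shell
s3cmd ls
```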
To create a bucket, use the mb (make bucket) command. Note that you will need to prefix the bucket name with “s3://”.
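For example, to create a bucket (the bucket name below is a placeholder; S3 bucket names must be globally unique):

```shell
s3cmd mb s3://my-new-bucket
```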
To list the contents of a bucket, use the ls command together with the bucket name. For example:
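```shell
s3cmd ls s3://my-new-bucket
```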
Uploading files to bucket
The easiest way to upload a file via S3cmd is with the put command. For example:
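The file and bucket names here are examples:

```shell
s3cmd put file.txt s3://my-new-bucket/
```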
In addition, you can use the --encrypt parameter to encrypt the file before uploading to S3.
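For example (again with placeholder names):

```shell
s3cmd put --encrypt file.txt s3://my-new-bucket/
```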
The default permission for the put command is private, which means the file can only be viewed by you. If you need the file to be publicly accessible, you can add the --acl-public parameter.
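For example, to upload a file with public read access:

```shell
s3cmd put --acl-public file.txt s3://my-new-bucket/
```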
For more granular control, use the --acl-grant=PERMISSION:EMAIL or --acl-grant=PERMISSION:USER_CANONICAL_ID parameter. For example:
The “PERMISSION” can be “read,” “write,” “read_acp,” “write_acp,” “full_control,” or “all.”
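A hedged example, granting read access to a user identified by email address (the address, file and bucket names are placeholders):

```shell
s3cmd put --acl-grant=read:user@example.com file.txt s3://my-new-bucket/
```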
Retrieving files from bucket
To retrieve a file, use the get command. For example:
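```shell
s3cmd get s3://my-new-bucket/file.txt
```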
To download all files in the bucket, simply append the --recursive parameter.
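For example:

```shell
s3cmd get --recursive s3://my-new-bucket/
```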
Deleting files from bucket
The delete command for S3cmd is simply del. For example:
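```shell
s3cmd del s3://my-new-bucket/file.txt
```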
You can also use the --recursive parameter to delete all files in the bucket.
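For example:

```shell
s3cmd del --recursive s3://my-new-bucket/
```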
For more S3cmd commands, check out its usage guide.
Advanced Usage: Synchronize a folder to S3
Let’s say you store all your important files in a folder and you want it to be synced to S3. S3cmd comes with a sync command that can synchronize a local folder to a remote destination.
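For example, to sync a local folder named “secret” to a bucket (both names are placeholders; the --delete-removed flag also removes files from S3 that you have deleted locally):

```shell
s3cmd sync --delete-removed ~/secret s3://my-backup-bucket/
```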
All you have to do is to create a cronjob to run the sync command regularly.
1. Open the crontab.
2. Add the following line to the end of the crontab. Save and exit the crontab.
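The two steps above can be sketched as follows; the folder, bucket name and five-minute interval are examples you can adjust:

```shell
# Step 1: open the crontab editor
crontab -e

# Step 2: add this line at the end, then save and exit.
# It syncs ~/secret to S3 every 5 minutes, deleting remote
# copies of any files removed locally:
*/5 * * * * s3cmd sync --delete-removed ~/secret s3://my-backup-bucket/ > /dev/null 2>&1
```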
That’s it. Your system will now sync the secret folder to S3 every five minutes. You can change the value to run the sync command at your preferred interval. Every file you remove from the secret folder will be removed from S3 too.
For those who need to work in a command-line environment, S3cmd is a great tool for accessing and managing Amazon S3. Not only is it easy to use, it also comes with plenty of options for advanced usage and scripting needs.