How to transfer files between AWS S3 and AWS EC2

Solution 1:

Using the most recent AWS CLI (http://aws.amazon.com/cli/), you can use the following command to copy files from your EC2 instance, or even your local machine, to S3 storage.

aws s3 cp myfolder s3://mybucket/myfolder --recursive

You'll then get something like:

upload: myfolder/file1.txt to s3://mybucket/myfolder/file1.txt 
upload: myfolder/subfolder/file1.txt to s3://mybucket/myfolder/subfolder/file1.txt

If this is your first time using the AWS CLI, you'll first need to run:

aws configure

This will prompt you for your access key ID and secret access key, along with a default region.
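
To go the other way (from S3 back down to your instance), the same command works with the source and destination swapped; the bucket and folder names below are the same placeholders used above:

aws s3 cp s3://mybucket/myfolder myfolder --recursive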

Solution 2:

There are a number of ways to send files to S3. I've listed them below along with installation and documentation where relevant.

  • S3CMD: (http://s3tools.org/s3cmd) You can install this easily on Debian/Ubuntu via apt-get install s3cmd, then run it from the command line. You could also incorporate it into a bash script or your own program.

  • S3FS: (http://www.pophams.com/blog/howto-setups3fsonubuntu1104x64 and https://code.google.com/p/s3fs/wiki/InstallationNotes) ... This mounts an S3 bucket so that it looks just like a local disk. It takes a little more effort to set up, but once the bucket is mounted you don't need to do anything special to get files into it (see the mount sketch after this list).

  • If you use a CMS (let's use Drupal as an example) you may have the option of using a module to handle access to your bucket, e.g. http://drupal.org/project/storage_api

  • Finally, you can use a programming-language implementation to handle all the logic yourself; for PHP you can start with this http://undesigned.org.za/2007/10/22/amazon-s3-php-class and see the documentation here http://undesigned.org.za/2007/10/22/amazon-s3-php-class/documentation
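
For the S3FS option above, here is a minimal sketch of what mounting a bucket looks like with s3fs; the bucket name, mount point, and credentials are placeholders:

# Put your access key and secret in the s3fs password file (placeholder values)
echo "ACCESS_KEY_ID:SECRET_ACCESS_KEY" > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

# Mount the bucket so it behaves like a local directory
mkdir -p /mnt/mybucket
s3fs mybucket /mnt/mybucket -o passwd_file=~/.passwd-s3fs

# After that, an ordinary copy puts the file in the bucket
cp /path/to/my.file /mnt/mybucket/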

An example of the PHP implementation:

<?php

    // Include the S3 class linked above and set your AWS credentials
    // (placeholder values shown)
    require_once 'S3.php';
    S3::setAuth('ACCESS_KEY_ID', 'SECRET_ACCESS_KEY');

    $bucket = 'mybucket';          // destination bucket
    $file   = '/path/to/my.file';  // local file to upload
    $uri    = 'myfolder/my.file';  // object key inside the bucket

    // Simple PUT:
    if (S3::putObject(S3::inputFile($file), $bucket, $uri, S3::ACL_PRIVATE)) {
        echo "File uploaded.";
    } else {
        echo "Failed to upload file.";
    }

?>

An example of s3cmd:

s3cmd put my.file s3://bucket-url/my.file

Edit

Another option worth mentioning is the AWS CLI (http://aws.amazon.com/cli/). It is widely available; for example, it's already included on Amazon Linux, and because it's a Python package it can be installed via pip on many other systems, including Linux and Windows.

http://docs.aws.amazon.com/cli/latest/reference/s3/index.html

Available commands: cp, ls, mb, mv, rb, rm, sync, website

http://docs.aws.amazon.com/cli/latest/reference/s3api/index.html for lower-level interaction with the S3 API
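
For example, a minimal sketch of installing the CLI via pip and keeping a local directory and a bucket in sync; the paths and bucket name are placeholders:

# Install the AWS CLI via pip (one common route on systems that already have Python)
pip install awscli

# One-time credential and region setup
aws configure

# Push a local directory up to the bucket; only changed files are transferred
aws s3 sync /home/ec2-user/data s3://mybucket/data

# Or pull it back down from S3 to the instance
aws s3 sync s3://mybucket/data /home/ec2-user/data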

Solution 3:

Install the s3cmd package with:

yum install s3cmd

or

sudo apt-get install s3cmd

depending on your OS. Then copy data from your bucket with:

s3cmd get s3://tecadmin/file.txt

ls can also be used to list the files.
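
For example, using the same placeholder bucket as above:

# List your buckets
s3cmd ls

# List the objects in a bucket
s3cmd ls s3://tecadmin/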

For more details see this.

Solution 4:

I'm using s3cmd to store nightly exported database backup files from my EC2 instance. After configuring s3cmd, which you can read about on their site, you can run a command like:

s3cmd put ./myfile s3://mybucket
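
As a rough sketch of that kind of nightly job (the database name, dump path, and bucket are placeholders, and mysqldump stands in for whatever export you actually run; it assumes s3cmd is already configured and mysqldump can read credentials from ~/.my.cnf):

#!/bin/bash
# nightly_backup.sh - export a database and push the dump to S3 (placeholder names)
DATE=$(date +%F)
DUMP="/tmp/mydb-$DATE.sql.gz"

# Export and compress the database
mysqldump mydb | gzip > "$DUMP"

# Upload the dump to the bucket, then remove the local copy
s3cmd put "$DUMP" s3://mybucket/backups/
rm -f "$DUMP"

Scheduled from cron (for example a nightly entry like 0 2 * * * /home/ubuntu/nightly_backup.sh), this keeps a dated copy of each export in the bucket.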

Solution 5:

Use s3cmd for that:

s3cmd get s3://AWS_S3_Bucket/dir/file

See how to install s3cmd here: http://s3tools.org/s3cmd

This works for me...
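
If s3cmd isn't configured on the instance yet, the one-time interactive setup is all you need before the get above will work; it prompts for your access key and secret:

# One-time interactive credential setup for s3cmd
s3cmd --configure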