Exporting data from Google Cloud Storage to Amazon S3

I would like to transfer data from a table in BigQuery into another table in Redshift. My planned data flow is as follows:

BigQuery -> Google Cloud Storage -> Amazon S3 -> Redshift
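
For the first leg, I plan to use bq extract to write the table to Cloud Storage (the dataset, table, and bucket names below are placeholders):

bq extract --destination_format=CSV 'mydataset.mytable' gs://your-gcs-bucket/export/table-*.csv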

I know about Google Cloud Storage Transfer Service, but I'm not sure it can help me. From Google Cloud documentation:

Cloud Storage Transfer Service

This page describes Cloud Storage Transfer Service, which you can use to quickly import online data into Google Cloud Storage.

I understand that this service can be used to import data into Google Cloud Storage and not to export from it.

Is there a way I can export data from Google Cloud Storage to Amazon S3?


You can use gsutil to copy data from a Google Cloud Storage bucket to an Amazon S3 bucket, using a command such as:

gsutil -m rsync -rd gs://your-gcs-bucket s3://your-s3-bucket

Note that the -d option above will cause gsutil rsync to delete objects from your S3 bucket that aren't present in your GCS bucket (in addition to adding new objects). You can leave off that option if you just want to add new objects from your GCS to your S3 bucket.
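
Note that gsutil needs your AWS credentials configured (for example in your .boto file) to access S3. If you want to preview what would be copied or deleted before touching anything, rsync supports a dry run via the -n flag (bucket names are placeholders):

# Dry run: show what would be copied or deleted without making any changes
gsutil -m rsync -rdn gs://your-gcs-bucket s3://your-s3-bucket

Then drop the -n flag to perform the actual sync.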


Go to any instance or Cloud Shell in GCP.

First of all, configure your AWS credentials on your GCP machine:

aws configure

If the aws command is not recognized, install the AWS CLI by following this guide: https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-install.html

To configure the AWS CLI, follow this guide: https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html
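
Running aws configure prompts for the key pair and defaults interactively; you can then verify access to your S3 bucket with a simple listing (the key values and bucket name below are placeholders):

aws configure
# AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
# AWS Secret Access Key [None]: ****************************************
# Default region name [None]: us-east-1
# Default output format [None]: json

aws s3 ls s3://bucketname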


Then run gsutil:

gsutil -m rsync -rd gs://storagename s3://bucketname


16 GB of data was transferred in a few minutes.


You can use Rclone (https://rclone.org/); a minimal example follows the list below.

Rclone is a command line program to sync files and directories to and from:

Google Drive
Amazon S3
Openstack Swift / Rackspace cloud files / Memset Memstore
Dropbox
Google Cloud Storage
Amazon Drive
Microsoft OneDrive
Hubic
Backblaze B2
Yandex Disk
SFTP
The local filesystem
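
A minimal sketch of the Rclone approach, assuming you have already created two remotes named gcs and s3 with rclone config (remote and bucket names are placeholders):

# Interactively set up the Google Cloud Storage and Amazon S3 remotes
rclone config

# Sync the GCS bucket to the S3 bucket with parallel transfers and progress output
rclone sync gcs:your-gcs-bucket s3:your-s3-bucket --transfers 32 --progress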

I needed to transfer 2 TB of data from a Google Cloud Storage bucket to an Amazon S3 bucket. For the task, I created a Google Compute Engine instance with 8 vCPUs (30 GB of memory).

Allow login using SSH on the Compute Engine instance. Once logged in, create an empty .boto configuration file and add the AWS credential information, following the AWS configuration link mentioned above.
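
A minimal sketch of the relevant .boto section, assuming all you need to add is the AWS key pair (the values are placeholders):

# ~/.boto - gsutil reads the AWS key pair from the [Credentials] section
[Credentials]
aws_access_key_id = YOUR_AWS_ACCESS_KEY_ID
aws_secret_access_key = YOUR_AWS_SECRET_ACCESS_KEY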

Then run the command:

gsutil -m rsync -rd gs://your-gcs-bucket s3://your-s3-bucket

The data transfer rate was ~1 GB/s.

Hope this helps. (Do not forget to terminate the Compute Engine instance once the job is done.)