How to pipe a MySQL dump to s3cmd
I want to transfer a compressed MySQL dump to S3.
I tried:
mysqldump -u root -ppassword --all-databases | gzip -9 | s3cmd put s3://bucket/sql/databases.sql.gz
but then I get:
ERROR: Not enough paramters for command 'put'
How can I do this (in one line)?
Solution 1:
This is possible with s3cmd 1.5+ (link):

mysqldump ... | s3cmd put - s3://bucket/file-name.sql
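Combined with compression, as in the question (this assumes s3cmd is already configured with your credentials), the one-liner becomes:

mysqldump -u root -ppassword --all-databases | gzip -9 | s3cmd put - s3://bucket/sql/databases.sql.gz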
Solution 2:
To start with, you are missing the actual file you want to back up:
s3cmd put /backup_dir/somefile.sql.gz s3://bucket/sql/
s3cmd put takes two basic arguments: the local file and the bucket to back up to.
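So the simplest fix for the command in the question is to write the compressed dump to a local file first and upload it in a second step (the file name here is just an example):

mysqldump -u root -ppassword --all-databases | gzip -9 > /backup_dir/databases.sql.gz
s3cmd put /backup_dir/databases.sql.gz s3://bucket/sql/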
Secondly, I can't take credit for the following, but it's basically doing what you want with an intermediate script. Create a bak.sh file with the contents below, and that shell script will be runnable via bash. (Credit: http://www.wong101.com/tech-cloud/configure-s3cmd-cron-automated-mysql-backup)
#!/bin/bash

S3BUCKET="<bucketname>"
# Array of databases to back up
DBS=("<db1>" "<db2>" "<db3>" "<db4>")

for DBNAME in "${DBS[@]}"
do
    # Timestamped file name, e.g. db1-20240101-0230.sql.gz
    FILE=$DBNAME-$(date "+%Y%m%d-%H%M").sql.gz
    # Dump and compress the database
    mysqldump "$DBNAME" -u[uname] -p[password] | gzip -9 > "/home/$FILE"
    # Location of s3cmd may vary, modify if needed
    /usr/bin/s3cmd --config /root/.s3cfg put "/home/$FILE" s3://$S3BUCKET/DBBackup-$DBNAME/ >> /var/log/mysqlback.log
    sleep 5
    rm "/home/$FILE"
done
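To run this on a schedule, make the script executable and add a cron entry; the path and schedule below are only examples:

chmod +x /root/bak.sh
# in root's crontab: run the backup every night at 02:30
30 2 * * * /bin/bash /root/bak.sh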
Solution 3:
This now appears to be possible. With s3cmd v1.6.1:
curl -v "http://remote-server/file.mp4" |
s3cmd [-c .s3cfg-aws] put - [-P] s3://my-bucket/[folder/]filename.mp4
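Whichever variant you use, you can check that the object actually landed in the bucket with s3cmd ls, e.g. for the path from the question:

s3cmd ls s3://bucket/sql/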