Running a tar command eats all resources
I am running a Rails application on my Ubuntu 12.04 server. I have set up a cron job to back up all files uploaded to my application every morning at 2 am. It is a lot of files (around 900 MB). But when users try to use the app later in the morning, they cannot access it. So I logged in with SSH (which went incredibly slowly), and once I was finally in, I ran a top command and saw gzip processes filling up the entire top 10. Each one is quite small, but I suspect there are even more of them.
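To get a rough idea of how many are running, something like this should count them (assuming pgrep from procps is installed):

pgrep -c gzip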
This is the tar command that gets executed:
tar -cvzf $BASEBACKUP/uploads-$DATE.tar.gz /var/www/bptrial/current/public/uploads/* --exclude=tmp
My crontab:
* 2 * * * cd /home/user/backup && sh mysql_backup.sh && sh files_backup.sh >> /tmp/cron.log
Should it really take this many hours to gzip 900 MB of files? And why does it have to eat all the resources?
Solution 1:
I think fkraiem is correct.
To elaborate:
According to the crontab man page:
user@host $ man 5 crontab
The time and date fields are:
field          allowed values
-----          --------------
minute         0-59
hour           0-23
day of month   1-31
month          1-12 (or names, see below)
day of week    0-7 (0 or 7 is Sun, or use names)
A field may be an asterisk (*), which always stands for ``first-last''.
In your command:
#m h dom mon dow command
* 2 * * * cd /home/user/backup...
You're saying: run the command at EVERY minute of hour 2, of every day of the month, of every month, of every day of the week. In other words, every minute of 2 am starts another instance of your command, so by 3 am roughly 60 overlapping tar/gzip runs are competing for CPU and disk.
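The fix is to pin the minute field to 0 so the job runs once, at 2:00 am exactly (same commands as yours, only the schedule changes):

0 2 * * * cd /home/user/backup && sh mysql_backup.sh && sh files_backup.sh >> /tmp/cron.log

As an extra safeguard against overlapping runs, one option (a sketch, assuming flock from util-linux is available and /tmp/backup.lock is a free lock-file path) is:

0 2 * * * cd /home/user/backup && flock -n /tmp/backup.lock sh -c 'sh mysql_backup.sh && sh files_backup.sh' >> /tmp/cron.log

With -n, flock exits immediately instead of queuing if a previous backup is still holding the lock.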