Generic solution to prevent a long cron job from running in parallel?
I am looking for a simple and generic solution that would allow you to execute any script or application in crontab and prevent a second instance from starting while one is already running.
The solution should be independent of the executed command.
I assume it should look something like lock && (command ; unlock)
where lock returns false if another lock is already held.
The second part means: if the lock was acquired, run the command and release the lock after the command finishes, even if it exits with an error.
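In crontab that would look something like this (mylock and myunlock are just placeholders for whatever locking mechanism the answer provides):

*/10 * * * * mylock && (/path/to/long-job.sh ; myunlock)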
Take a look at the run-one package. From the manpage for the run-one command:
run-one is a wrapper script that runs no more than one unique instance of some command with a unique set of arguments.
This is often useful with cronjobs, when you want no more than one copy running at a time.
Like time or sudo, you just prepend it to the command. So a cronjob could look like:
*/60 * * * * run-one rsync -azP $HOME example.com:/srv/backup
For more information and background, check out Dustin Kirkland's blog post introducing it.
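A quick way to see the effect from an interactive shell, assuming the run-one package is installed; the second invocation should simply exit instead of starting a duplicate:

# Start a long-running command guarded by run-one
run-one sleep 300 &
# Trying to start the same command again fails while the first is still running
run-one sleep 300 || echo "another instance is already running"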
A very simple way of setting up a lock:
if mkdir /var/lock/mylock; then
echo "Locking succeeded" >&2
else
echo "Lock failed - exit" >&2
exit 1
fi
A script that wants to run first needs to create the lock. If the lock already exists, another script is busy, so the current script must not run. If the directory doesn't exist, no script holds the lock, so the current script acquires it by creating the directory; since mkdir checks and creates in a single atomic step, only one script can succeed. When the script has finished, the lock needs to be released by removing the directory.
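Putting it together, a wrapper script along these lines releases the lock even when the command fails (the lock path and long_running_job are placeholders):

#!/bin/bash
LOCKDIR=/var/lock/mylock

if mkdir "$LOCKDIR" 2>/dev/null; then
    # Remove the lock when the script exits, even if the command returned an error
    trap 'rmdir "$LOCKDIR"' EXIT
else
    echo "Lock failed - another instance is running" >&2
    exit 1
fi

long_running_job --with --its --arguments

The trap makes sure the cleanup also runs when the job exits with an error, so a failed run does not leave a stale lock behind.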
For more information about bash locks, check this page.