How can I cause a script to log in a separate file the number of times it has been executed?
I need to write a script which writes to a file how many times this script has been executed.
How can I do that?
Solution 1:
I assume you want to have a single file countfile
that contains just a single number representing the execution counter.
You can read this counter into a shell variable $counter,
e.g. using one of these lines:
read counter < countfile
counter=$(cat countfile)
Simple integer additions can be done in Bash itself using the $(( EXPRESSION )) syntax.
Then simply write the result back to our countfile:
echo "$(( counter + 1 ))" > countfile
You should probably also safeguard your script for the case that countfile
doesn't exist yet and, in that case, create it initialized with the value 1.
The whole thing might look like this:
#!/bin/bash
if [[ -f countfile ]] ; then
read counter < countfile
else
counter=0
fi
echo "$(( counter + 1 ))" > countfile
Solution 2:
Just let the script create a log file; for example, add a line like this at the end of your script:
echo "Script has been executed at $(date +\%Y-\%m-\%d) $(date +\%H-\%M-\%S)" >> ~/script.log
This way you can format the date and time however you like, but if you simply want a full date and time (and HH:MM:SS
is an acceptable time format for you) you can just use:
echo "Script has been executed at $(date +\%F-\%T)" >> ~/script.log
Then you could do:
wc -l ~/script.log
This counts the newline characters and gives you the number of lines inside the log file. On top of that, you can see within the log file when it was executed. To adapt it to your needs, you can change the paths and names used for logging; this example simply saves the logfile in ~.
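Note that wc -l also prints the file name after the count; if you just want the bare number, you can read the file from standard input instead:
wc -l < ~/script.log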
If, for example, you want the script to add this count to the line appended at the end, you could do something like this at the start of your script:
count=$(( $(wc -l ~/script.log | awk '{print $1}') + 1 ))
# the next line can simply be skipped if you do not want any output on stdout
echo "Script execution number: $count"
Then change the line at the end of your script to include that information as well:
echo "Script has been executed $count times at $(date +\%F-\%T)" >> ~/script.log
Solution 3:
This solution uses the same approach as Byte Commander’s answer but it doesn't rely on shell arithmetic or other Bashisms.
exec 3>&2 2> /dev/null
read counter < counter.txt || counter=0
exec 2>&3 3>&-
expr "$counter" + 1 > counter.txt
The stream redirections
- duplicate the standard error stream (2) to a different file descriptor (3),
- replace it (2) with a redirection to /dev/null (to suppress the error message in the subsequent redirection of the input of the read command if the counter file is missing, as expected on the first run),
- later duplicate the original standard error stream (now at 3) back into place (2) and
- close the copy of the standard error stream (3).
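If you would rather avoid the file descriptor juggling altogether, testing for the counter file first achieves the same effect; here is a minimal POSIX sketch using the same counter.txt as above:
#!/bin/sh
if [ -f counter.txt ]; then
    # an empty counter file also counts as 0
    read counter < counter.txt || counter=0
else
    counter=0
fi
expr "$counter" + 1 > counter.txt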
Solution 4:
A different approach
A separate counter file has disadvantages:
- It takes 4096 bytes (or whatever your block size is) for each counter file.
- You have to look up the name of the file in the bash script and then open the file to see the count.
- There is no file locking (in other answers) so it's possible that two people update the counter at the exact same time (called race condition in comments under Byte Commander's answer).
So this answer does away with a separate counter file and puts the count in the bash script itself!
- Putting the counter in the bash script itself allows you to see within your script itself how many times it has been run.
- Using flock guarantees that, for a brief moment, it's not possible for two users to run the script at the same time.
- Because no counter file name is hard coded, you don't need to change the code for different scripts; you can simply source it or copy and paste it from a stub / boilerplate file.
The code
#!/bin/bash
# NAME: run-count.sh
# PATH: $HOME/bin
# DESC: Written for AU Q&A: https://askubuntu.com/questions/988032/how-can-i-cause-a-script-to-log-in-a-separate-file-the-number-of-times-it-has-be
# DATE: Mar 16, 2018.
# This script run count: 0
# ======== FROM HERE DOWN CAN GO INTO FILE INCLUDED WITH SOURCE COMMAND =======
[ "${FLOCKER}" != "$0" ] && exec env FLOCKER="$0" flock -en "$0" "$0" "$@" || :
# This is useful boilerplate code for shell scripts. Put it at the top of
# the shell script you want to lock and it'll automatically lock itself on
# the first run. If the env var $FLOCKER is not set to the shell script
# that is being run, then execute flock and grab an exclusive non-blocking
# lock (using the script itself as the lock file) before re-execing itself
# with the right arguments. It also sets the FLOCKER env var to the right
# value so it doesn't run again.
# Read this script into an array, one line per entry
mapfile -t ScriptArr < "$0"
# Build search string in two parts so these assignment lines themselves never match it
SearchStr="This script"
SearchStr=$SearchStr" run count: "
# Find our search string in array and increment count
for i in ${!ScriptArr[@]}; do
if [[ ${ScriptArr[i]} = *"$SearchStr"* ]]; then
OldCnt=$( echo ${ScriptArr[i]} | cut -d':' -f2 )
NewCnt=$(( $OldCnt + 1 ))
ScriptArr[i]="# $SearchStr$NewCnt"  # keep the leading "#" so the counter line stays a comment
break
fi
done
# Rewrite our script to disk with new run count
# BONUS: Date of script after writing will be last run time
printf "%s\n" "${ScriptArr[@]}" > "$0"
# ========= FROM HERE UP CAN GO INTO FILE INCLUDED WITH SOURCE COMMAND ========
# Now we return you to your original programming....
exit 0
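After making the script executable and running it a couple of times, the counter line inside the script itself shows the number of executions; for example, assuming it is saved as ~/bin/run-count.sh as in the header:
chmod +x ~/bin/run-count.sh
~/bin/run-count.sh
~/bin/run-count.sh
grep "script run count" ~/bin/run-count.sh
which would then print:
# This script run count: 2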
Another Approach using a Log File
Similar to Videonauth's answer, I wrote a log file answer here: Bash script to maintain audit trail / log of files accessed, to log every time root powers were used with gedit or nautilus.
The catch, though, is that rather than using gksu, the script is named gsu and invokes pkexec, the "modern" way of using sudo in the GUI, so I am told.
Another advantage is that it not only records each time root powers were used with gedit, but also logs the name of the file that was edited.
Here is the code for ~/bin/gsu:
#!/bin/bash
# Usage: gsu gedit file1 file2...
# -OR- gsu nautilus /dirname
# & is used to spawn process and get prompt back ASAP
# > /dev/null is used to send gtk warnings into dumpster
COMMAND="$1" # extract gedit or nautilus
pkexec "$COMMAND" "${@:2}"
log-file "${@:2}" gsu-log-file-for-"$COMMAND"
/usr/local/bin/log-file:
#! /bin/bash
# NAME: log-file
# PATH: /usr/local/bin
# DESC: Update audit trail/log file with passed parameters.
# CALL: log-file FileName LogFileName
# DATE: Created Nov 18, 2016.
# NOTE: Primarily called from ~/bin/gsu
ABSOLUTE_NAME=$(realpath "$1")
TIME_STAMP=$(date +"%D - %T")
LOG_FILE="$2"
# Does log file need to be created?
if [ ! -f "$LOG_FILE" ]; then
touch "$LOG_FILE"
echo "__Date__ - __Time__ - ______File Name______" >> "$LOG_FILE"
# MM/DD/YY - hh:mm:ss - "a/b/c/FileName"
fi
echo "$TIME_STAMP" - '"'"$ABSOLUTE_NAME"'"' >> "$LOG_FILE"
exit 0
Contents of log file gsu-log-file-for-gedit
after a few edits:
__Date__ - __Time__ - ______File Name______
11/18/16 - 19:07:54 - "/etc/default/grub"
11/18/16 - 19:08:34 - "/home/rick/bin/gsu"
11/18/16 - 19:09:26 - "/home/rick/bin/gsu"
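Since the first line of that log is just the header, the number of logged executions is once again a simple line count minus one, for example (run from the directory containing the log file):
echo "$(( $(wc -l < gsu-log-file-for-gedit) - 1 ))"
For the contents shown above this prints 3.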