Pipe output to bash function
To answer your actual question: when a shell function is on the receiving end of a pipe, standard input is inherited by all commands in the function, but only commands that actually read from their standard input consume any data. For commands that run one after the other, later commands can only see what was not consumed by earlier commands. When two commands run in parallel, which command sees which data depends on how the OS schedules them.
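To see that consumption rule in action, here is a minimal sketch (consume_first is a made-up name for the demonstration): the read consumes the first line, and the cat that follows only sees what is left over.

consume_first () {
    read -r first              # consumes the first line of the pipe
    echo "read got: $first"
    cat                        # sees only what read left behind
}
printf 'one\ntwo\nthree\n' | consume_first
# read got: one
# two
# three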
Since printf is the first and only command in your function, standard input is effectively ignored. There are several ways around that, including using the read built-in to read standard input into a variable which can then be passed to printf:
jc_hms () {
    read -r foo                    # read one line of standard input into foo
    hr=$(($foo / 3600))            # whole hours
    min=$((($foo % 3600) / 60))    # minutes left after the hours are taken out
    sec=$(($foo % 60))             # seconds left after the minutes are taken out
    printf "%d:%02d:%02d" "$hr" "$min" "$sec"
}
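With that definition in place, you can pipe a total number of seconds straight into the function, for example:

echo 3661 | jc_hms    # prints 1:01:01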
However, since your need for a pipeline seems to depend on your perceived need to use awk, let me suggest the following alternative:
printstring=$( jc_hms $songtime )
Since songtime consists of a space-separated pair of numbers, the shell performs word-splitting on the unquoted value of songtime, and jc_hms sees two separate parameters. This requires no change in the definition of jc_hms, and no need to pipe anything into it via standard input.
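If the word-splitting behavior is unfamiliar, here is a small demonstration using a throwaway helper (show_args is invented for this sketch):

show_args () {
    echo "$# arguments:" "$@"
}
songtime="153 82"
show_args $songtime      # unquoted: split into two parameters -> 2 arguments: 153 82
show_args "$songtime"    # quoted: kept as one parameter -> 1 arguments: 153 82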
If you still have a different reason for jc_hms to read standard input, please let us know.
You can't pipe stuff directly to a bash function like that; however, you can use read to pull it in instead:
jc_hms() {
    while IFS= read -r data; do    # read standard input line by line
        printf "%s\n" "$data"      # echo each line back out
    done
}
That should be what you want.
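For example:

printf 'one\ntwo\n' | jc_hms
# one
# two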