tail a log file but show only specific lines
I'm tailing a log file with the -f flag and piping it to grep to keep only lines that contain "X". That works perfectly fine. Now I want to pipe this again into another grep that removes all lines containing "Y". When I add the second pipe, the output stops refreshing and it looks like no data is coming through.
This is the command that works: tail -f my_file.log | grep "X"
This is the command that doesn't: tail -f my_file.log | grep "X" | grep -v "Y"
How should I structure this so that the command works?
Solution 1:
Because the output of grep is buffered when it goes to a pipe rather than a terminal, use grep's --line-buffered option to enable line buffering:
tail -f /path/to/log | grep --line-buffered 'X' | grep -v 'Y'
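Only the stages whose output feeds another pipe need this; the final grep writes to the terminal and is line-buffered by default. As a sketch, with a hypothetical third pattern 'Z' to exclude, every intermediate grep gets the flag:
# 'Z' is a hypothetical extra pattern; each grep that feeds a pipe needs --line-buffered
tail -f /path/to/log | grep --line-buffered 'X' | grep --line-buffered -v 'Y' | grep -v 'Z'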
If your grep does not have that option, you can use stdbuf as an alternative:
tail -f /path/to/log | stdbuf -oL grep 'X' | grep -v 'Y'
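The stdbuf trick generalizes to other filters that buffer when writing to a pipe. A sketch, assuming a hypothetical sed stage in the middle of the pipeline (the substitution is made up for illustration):
# stdbuf -oL line-buffers the stdout of each intermediate stage
tail -f /path/to/log | stdbuf -oL grep 'X' | stdbuf -oL sed 's/foo/bar/' | grep -v 'Y'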
Solution 2:
I normally find awk more useful for this kind of logical check:
tail -f /path/to/log | awk '/X/ && !/Y/'
#                           ^^^    ^^^^
#                   this I want    but not this
Tested with two terminal tabs: in one I keep running seq 20 >> myfile, and in the other I run, for example, tail -f myfile | awk '/3/ && !/13/'.
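The awk condition composes nicely if more filters are needed. A sketch, assuming a hypothetical additional pattern 'Z' that should also be excluded:
tail -f /path/to/log | awk '/X/ && !/Y/ && !/Z/'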
Solution 3:
Another approach is to use a single grep invocation instead of two, and so avoid the buffering issue entirely. Just use a regular expression that matches lines consisting of zero or more non-Y characters, then an X, and then zero or more non-Y characters again up to the end of the line:
tail -f /path/to/log | grep '^[^Y]*X[^Y]*$'
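Note that the bracket expression [^Y] excludes a single character, so this trick applies directly only when the unwanted pattern is one character (or a character class). A quick way to sanity-check the pattern, using printf with made-up sample lines to stand in for the log stream:
printf 'aXb\naXYb\nno hit\n' | grep '^[^Y]*X[^Y]*$'
# prints only: aXb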