How do I list word count in multiple files?
Say I need to find out how many words are in each file that has the word 'work' in its name.
I know that to find files with 'work' in the name, it would be ls work, and to figure out the number of words in a file it would be wc -w.
However, I tried the following and it seems to just display the number of files, not the combined number of words in all the files (which is what I need):
ls work | wc -w
So if, say, there are 14 files with 'work' in their names, it displays 14, not the word count.
The syntax is wc -w [FILE]. If you don't give a FILE but instead pipe in the output of ls work, wc will only count what it reads on stdin: the file names, not their contents. You need to pipe in the text itself:
cat *work* | wc -w
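For example, with two hypothetical files created just for illustration (the names work-a.txt and work-b.txt are made up here, not from the question):

```shell
# Two sample files (hypothetical names, for illustration only)
printf 'one two three\n' > work-a.txt
printf 'four five\n'     > work-b.txt

# Counting the file names vs. counting the contents:
ls *work* | wc -w    # counts the two file names, prints 2
cat *work* | wc -w   # counts the words inside the files, prints 5
```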
Alternatively, you could run wc via find -exec. But be aware that this can print multiple "total" lines, because find will invoke wc several times if there are many files.
find ./ -type f -name "*work*" -exec wc -w {} +
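One way around the multiple-totals problem (a sketch, reusing the hypothetical work-a.txt/work-b.txt files from above) is to have find hand the files to cat instead, so that wc runs exactly once on the concatenated text and prints a single number:

```shell
# Hypothetical sample files, for illustration only
printf 'one two three\n' > work-a.txt
printf 'four five\n'     > work-b.txt

# cat concatenates every matching file; wc then runs once on the
# combined stream, so there is always exactly one total (here: 5)
find . -type f -name "*work*" -exec cat {} + | wc -w
```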
You can also run wc with multiple files by using the shell's glob pattern *work*, which expands to every non-hidden matching file in the working directory and passes them all as arguments to wc:
wc -w *work*
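When more than one file matches, wc prints a per-file count for each argument followed by a "total" line. Continuing the hypothetical two-file example:

```shell
# Hypothetical sample files, for illustration only
printf 'one two three\n' > work-a.txt
printf 'four five\n'     > work-b.txt

wc -w *work*   # one count per file, then a "total" line summing them
```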