How to count the number of files in a bucket folder with gsutil

The gsutil ls command with the -l (long listing) and -R (recursive listing) options lists the entire bucket recursively and then prints a total count of all objects, both files and directories, at the end:

$ gsutil ls -lR gs://pub
    104413  2011-04-03T20:58:02Z  gs://pub/SomeOfTheTeam.jpg
       172  2012-06-18T21:51:01Z  gs://pub/cloud_storage_storage_schema_v0.json
      1379  2012-06-18T21:51:01Z  gs://pub/cloud_storage_usage_schema_v0.json
   1767691  2013-09-18T07:57:42Z  gs://pub/gsutil.tar.gz
   2445111  2013-09-18T07:57:44Z  gs://pub/gsutil.zip
      1136  2012-07-19T16:01:05Z  gs://pub/gsutil_2.0.ReleaseNotes.txt
... <snipped> ...

gs://pub/apt/pool/main/p/python-socksipy-branch/:
     10372  2013-06-10T22:52:58Z  gs://pub/apt/pool/main/p/python-socksipy-branch/python-socksipy-branch_1.01_all.deb

gs://pub/shakespeare/:
        84  2010-05-07T23:36:25Z  gs://pub/shakespeare/rose.txt
TOTAL: 144 objects, 102723169 bytes (97.96 MB)

If you really just want the total, you can pipe the output to the tail command:

$ gsutil ls -lR gs://pub | tail -n 1
TOTAL: 144 objects, 102723169 bytes (97.96 MB)
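
If you want only the numeric count, for example inside a script, you can also pull the second field out of that footer line. This is a minimal sketch, assuming the TOTAL line keeps the format shown above:

$ gsutil ls -lR gs://pub | tail -n 1 | awk '{print $2}'
144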

UPDATE

gsutil now has a du command. This makes it even easier to get a count:

$ gsutil du gs://pub | wc -l
232
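
du also accepts a folder prefix, so the same pipeline can count only the objects under a particular folder. A hedged sketch, with gs://bucket/folder standing in for your own path, assuming du prints one line per object under that prefix:

$ gsutil du gs://bucket/folder | wc -l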

If what you want is essentially a gsutil ls --count --recursive restricted to gs://bucket/folder, use the ** wildcard: gsutil ls gs://bucket/folder/** lists just the full URLs of the files under gs://bucket/folder, with no footer and no lines ending in a colon. (The ** wildcard matches across directory boundaries, whereas * stops at the next /.) Piping that to wc -l gives you the line count of the result:

gsutil ls gs://bucket/folder/** | wc -l

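If you only need to count a subset of the files, you can filter the listing before counting. A hedged variant using grep -c, with the .txt extension purely as a hypothetical example:

gsutil ls gs://bucket/folder/** | grep -c '\.txt$'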

If you don't have to use gsutil, the easiest way is to check it in the Google Cloud Console. Go to Monitoring > Metrics explorer:

  • Resource type : GCS Bucket
  • Metric : Object count

Then, in the table below, you get, for each bucket, the number of objects it contains.