GCSFuse v0.33.2 - Large object (11 GB) upload to GCS bucket fails

We receive the errors below when trying to upload an 11 GB file to the bucket; the filesystem also shows "????????" for entries, so we can't list the objects in the bucket.

We have to unmount and mount the bucket again before we can see its objects. Even after remounting, the upload still does not work and consumes all of the available memory.
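
For reference, the recovery steps we use look like the following (a minimal sketch; the mount point /mnt/gcs and bucket name my-bucket are placeholders for our actual values):

# Force-unmount the broken FUSE mount; add -z for a lazy unmount if this hangs
fusermount -u /mnt/gcs

# Remount the bucket with gcsfuse
gcsfuse my-bucket /mnt/gcs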

The GCSFuse process does not release the memory it used for hours afterwards. How can we fix this issue? We are using GCSFuse v0.33.2.

Errors:

Error 1 - Transport endpoint is not connected
Error 2 - Software caused connection abort

Solution 1:

GCSFuse has high latency when working with large files, so for larger files it is advisable to use gsutil cp instead.

The gsutil cp command allows you to copy data between your local file system and the cloud, within the cloud, and between cloud storage providers.

It also gives you extra features such as resumable uploads, and it is easy to use. For example:

gsutil cp *.txt gs://my-bucket
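
For a single 11 GB file, you can additionally enable parallel composite uploads so the transfer is split into chunks uploaded in parallel (a sketch; the file name large-file.bin and bucket my-bucket are placeholders):

# Upload files above 150 MB as parallel composite chunks
gsutil -o GSUtil:parallel_composite_upload_threshold=150M cp large-file.bin gs://my-bucket

Note that gsutil cp performs resumable uploads automatically for files above a small size threshold, so an interrupted 11 GB transfer can be restarted without re-uploading the whole file.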