Attempt to copy files from a Google Cloud bucket to a VM instance using gsutil is failing after reaching a size limit

I have a bucket with a file larger than 11 GB. I can access all of the bucket's data from the standard Google Cloud console. Unfortunately, the console offers very limited configuration since, I believe, it was primarily intended as a tool for managing files and small operations within Google Cloud.

To better analyze my data, I created a VM instance with 32 GB of RAM. The problem is that once I am inside the VM instance I can no longer access the data in my bucket. After a bit of research I figured out that I can copy files from my bucket to my instance using:

gsutil -m cp -r gs://<my-bucket>/* .

I thought I had found a solution, but the operation stops after reaching 39%:

- [1/2 files][  4.4 GiB/ 11.2 GiB]  39% Done     0.0 B/s 

I don't know what configuration my VM needs to complete this operation successfully, or whether the problem is with the VM at all.

I have only just started using Google Cloud, and I'm not at all sure I'm doing the right thing here. Is this the standard procedure for using bucket data within a VM instance? How can I copy large files from a bucket to my VM instance?


Solution 1:

The problem is that once I am inside the VM instance I can no longer access the data in my bucket

You may have Access Control Lists (ACLs) set on some of your bucket's folders that the account your VM uses is not allowed to read. You can inspect them as sketched below.
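A minimal sketch of how to check, assuming gsutil is available inside the VM and <my-large-file> stands in for your actual object name:

# Show the bucket-level IAM policy
gsutil iam get gs://<my-bucket>

# Show the ACL on the large object itself
gsutil acl get gs://<my-bucket>/<my-large-file>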

You can authenticate with your own account inside the VM by running gcloud auth login.
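
A minimal sketch of the full sequence, run from inside the VM; the bucket path is the same placeholder as in the question:

# Authenticate as your own user account instead of the VM's attached
# service account; this will prompt you through a browser-based sign-in
gcloud auth login

# Confirm which account is now active
gcloud auth list

# Retry the copy, now using your user credentials
gsutil -m cp -r gs://<my-bucket>/* .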