Authenticating GCP service account with Bitbucket Pipelines
I'm currently trying to authenticate the Linux machine running a Bitbucket pipeline so that a test can copy a file within a GCS bucket using this code:
from google.cloud import storage

storage_client = storage.Client()
source_bucket = storage_client.bucket('xxxx')  # bucket name only, without the gs:// prefix
source_blob = source_bucket.blob('xxxx')
# copying within the same bucket; the second argument must be a Bucket object
_ = source_bucket.copy_blob(source_blob, source_bucket, destination_blob_name)
To authenticate, I put this bitbucket-pipelines.yml at the repository root:
image: python:3.8

options:
  max-time: 20

pipelines:
  default:
    - step:
        size: 2x
        caches:
          - pip
          - pipenv
        script:
          - curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-sdk-365.0.0-linux-x86_64.tar.gz
          - tar -xvf google-cloud-sdk-365.0.0-linux-x86_64.tar.gz
          - ./google-cloud-sdk/install.sh
          - export PATH=$PATH:$(pwd)/google-cloud-sdk/bin
          - echo $GCLOUD_SERVICE_KEY | gcloud auth activate-service-account --key-file=-
          - pip3 install -U pip pipenv
          - pipenv install --deploy --dev
          - gcloud auth list
          - pipenv run pytest -v --junitxml=test-reports/report.xml
Here GCLOUD_SERVICE_KEY is a repository variable on Bitbucket. However, when the step pipenv run pytest -v --junitxml=test-reports/report.xml runs, I get this error:
> storage_client = storage.Client()
tests/gcs/test_gcs.py:58:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/root/.local/share/virtualenvs/build-3vGKWv3F/lib/python3.8/site-packages/google/cloud/storage/client.py:124: in __init__
super(Client, self).__init__(
/root/.local/share/virtualenvs/build-3vGKWv3F/lib/python3.8/site-packages/google/cloud/client.py:318: in __init__
_ClientProjectMixin.__init__(self, project=project, credentials=credentials)
/root/.local/share/virtualenvs/build-3vGKWv3F/lib/python3.8/site-packages/google/cloud/client.py:266: in __init__
project = self._determine_default(project)
/root/.local/share/virtualenvs/build-3vGKWv3F/lib/python3.8/site-packages/google/cloud/client.py:285: in _determine_default
return _determine_default_project(project)
/root/.local/share/virtualenvs/build-3vGKWv3F/lib/python3.8/site-packages/google/cloud/_helpers.py:186: in _determine_default_project
_, project = google.auth.default()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
scopes = None, request = None, quota_project_id = None, default_scopes = None
def default(scopes=None, request=None, quota_project_id=None, default_scopes=None):
"""Gets the default credentials for the current environment.
`Application Default Credentials`_ provides an easy way to obtain
credentials to call Google APIs for server-to-server or local applications.
Some people would save GCLOUD_SERVICE_KEY as a file in the repository, or copy it onto the Linux machine running the pipeline, but I think it's best to keep using the line echo $GCLOUD_SERVICE_KEY | gcloud auth activate-service-account --key-file=- and not commit any private keys.
Solution 1:
The command gcloud auth activate-service-account configures credentials for the gcloud CLI only; it does not set up Application Default Credentials (ADC) for the Python program.
Write the contents of the service account key to a file and set the environment variable GOOGLE_APPLICATION_CREDENTIALS to point to that file, so that google.auth.default() can find it.
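As a minimal sketch of that approach, the step could be done from Python itself at the start of the test run. The function name materialize_adc_key and the choice of a temp file are my own; the environment variable names are the ones from the pipeline above.

```python
import json
import os
import tempfile


def materialize_adc_key(env_var: str = "GCLOUD_SERVICE_KEY") -> str:
    """Write the JSON key held in an env var to a temp file and point
    GOOGLE_APPLICATION_CREDENTIALS at it, so google.auth.default() finds it."""
    key_json = os.environ[env_var]
    json.loads(key_json)  # fail fast if the variable does not hold valid JSON
    fd, path = tempfile.mkstemp(suffix=".json")
    with os.fdopen(fd, "w") as f:
        f.write(key_json)
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path
    return path
```

Calling this before constructing storage.Client() should make the ADC lookup succeed without committing the key to the repository.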
Another option is to write the contents to a known location and then pass that path when creating the client:
storage.Client.from_service_account_json('<PATH_TO_SERVICE_ACCOUNT_JSON>')
There are additional options, such as building the credentials from a JSON string passed to the Python program. Typically you would base64-encode the key first and decode it at runtime.
import json
from google.oauth2 import service_account

info = json.loads(key_json_string)  # from_service_account_info expects a dict, not a string
credentials = service_account.Credentials.from_service_account_info(info)
storage_client = storage.Client(credentials=credentials, project=info['project_id'])
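The base64 round trip mentioned above can be sketched like this; the helper name decode_service_account_key is my own, and the sample key below is a dummy.

```python
import base64
import json


def decode_service_account_key(encoded: str) -> dict:
    """Decode a base64-encoded service-account JSON string into the dict
    that Credentials.from_service_account_info() expects."""
    return json.loads(base64.b64decode(encoded))
```

In the pipeline you would store the base64-encoded key in the repository variable, then call something like service_account.Credentials.from_service_account_info(decode_service_account_key(os.environ["GCLOUD_SERVICE_KEY"])). Base64 encoding avoids the newlines in the private key being mangled by the environment-variable round trip.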