How to access multiple repositories in a CI build?

We have a project that is composed of multiple (non-public) repositories.

To build the whole project, the build system needs to have the files of all repositories (master branches).

Is there a way I can configure GitLab CI to provide the repositories I need?

I guess I could do a git fetch or similar during the CI build, but how to deal with authentication then?


If you are running GitLab version 8.12 or later, the permissions model was reworked. Along with the new permissions model comes the CI environment variable CI_JOB_TOKEN. The premium version of GitLab uses this variable for triggers, but you can also use it to clone repositories:

dummy_stage:
  script:
    - git clone https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.instance/group/project.git

A couple of workarounds (I hate that word!) that worked for me:

  1. Using git submodules; see https://docs.gitlab.com/ce/ci/git_submodules.html

  2. Re-using $CI_REPOSITORY_URL, defined by GitLab and available even inside child Docker containers. This env var already contains the username and password, which can be used for another repo on the same server. See this snippet from .gitlab-ci.yml:

- BASE_URL=`echo $CI_REPOSITORY_URL | sed "s;\/*$CI_PROJECT_PATH.*;;"`
- REPO_URL="$BASE_URL/thirdparty/gtest.git"
- REPO_DIR=thirdparty/gtest
- rm -fr $REPO_DIR
- git clone $REPO_URL $REPO_DIR
  3. Even storing that URL with username/password in the ~/.git-credentials file and configuring git to use it via credential.helper. All further "git clone" commands will then use it:
- echo Storing git credentials to be used by "git clone" commands without username and password ...
- GIT_CREDENTIALS_FILE=~/.git-credentials
- BASE_URL=`echo $CI_REPOSITORY_URL | sed "s;\/*$CI_PROJECT_PATH.*;;"`
- echo $BASE_URL > $GIT_CREDENTIALS_FILE
- git config --global credential.helper "store --file=$GIT_CREDENTIALS_FILE"
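To illustrate what that sed expression actually extracts, here is a standalone sketch with made-up values (the token, host, and project path are assumptions, not real ones, in the shape GitLab sets them inside a job):

```shell
# Hypothetical values, shaped like what GitLab sets inside a CI job:
CI_REPOSITORY_URL="https://gitlab-ci-token:abc123@gitlab.example.com/mygroup/myproject.git"
CI_PROJECT_PATH="mygroup/myproject"

# Strip the project path (and everything after it) to get the base URL,
# which still carries the gitlab-ci-token credentials:
BASE_URL=`echo $CI_REPOSITORY_URL | sed "s;\/*$CI_PROJECT_PATH.*;;"`

echo $BASE_URL
```

With these inputs, BASE_URL ends up as https://gitlab-ci-token:abc123@gitlab.example.com, i.e. the authenticated server root you can append any other project path to.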

HOWEVER!

Having spent quite a few years in the CI/CD field, I don't think a design that requires linking repositories as sources is a good one.

Yes, in classic CI tools like Jenkins or TeamCity you can create a job that fetches several Git repos in different subdirectories.

But I like the GitLab CI way of Pipeline as Code, where .gitlab-ci.yml controls the build of that very repo and you don't even have to think about the whole pre-build step of fetching sources. Such a build then publishes binary artifacts, and downstream projects/repos can use those instead of the sources of their dependencies. It's also faster.
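As a sketch of that flow (the job name, build command, and paths here are made up for illustration), the upstream repo's .gitlab-ci.yml would publish its build output as artifacts rather than expecting consumers to clone its sources:

```yaml
build:
  stage: build
  script:
    - make                # whatever produces the binaries for this repo
  artifacts:
    paths:
      - build/            # downstream consumers take these binaries, not the sources
```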

Separation of concerns.

I don't think there is an official way in .gitlab-ci.yml to use the artifacts of another project. There are other ways, like hooks and the GitLab API, though such bespoke solutions require maintenance.

There's a better way: publish/fetch artifacts to/from an external, widely adopted package manager. Depending on your language, that could be Maven, NuGet, npm, JFrog Artifactory, Nexus, etc. Another advantage of this method is that developers can follow the same process in their local builds, which is not easily done if dependencies are defined in .gitlab-ci.yml.
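For example, if the shared code were published as an npm package (the package name and jobs here are hypothetical), a CI job and a developer's machine would fetch the dependency the exact same way, via package.json, with no repo cloning in sight:

```yaml
test:
  stage: test
  script:
    - npm install         # resolves the shared dependency from the npm registry,
                          # exactly as a developer's local "npm install" would
    - npm test
```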

It's a bigger problem for native code (C/C++), mainly due to binary interface (ABI) compatibility, but tools like Conan.io are slowly catching up.