GitLab CI vs. Jenkins [closed]
Solution 1:
This is my experience:
At my work we manage our repositories with GitLab EE and we have a Jenkins server (1.6) running.
At their core they do pretty much the same thing: they run some scripts on a server or in a Docker image.
TL;DR:
- Jenkins is easier to use/learn, but it risks becoming a plugin hell
- Jenkins has a GUI (this can be preferred if it has to be accessible/maintainable by other people)
- Jenkins's integration with GitLab is weaker than GitLab CI's
- Jenkins can be split off from your repository
Most CI servers are pretty straightforward (Concourse, GitLab CI, CircleCI, Travis CI, Drone, GoCD, and whatever else you have). They allow you to execute shell/bat scripts from a YAML file definition. Jenkins is much more pluggable and comes with a UI. This can be either an advantage or a disadvantage, depending on your needs.
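To give a sense of how lightweight that YAML definition is, here is a minimal sketch of a .gitlab-ci.yml (the image, job names, and scripts are illustrative placeholders, not from any particular project):

```yaml
# Minimal illustrative .gitlab-ci.yml: two stages running plain shell commands
stages:
  - build
  - test

build-job:
  stage: build
  image: alpine:latest   # any Docker image the runner can pull
  script:
    - echo "compiling..."

test-job:
  stage: test
  image: alpine:latest
  script:
    - echo "running tests..."
```

Each job is just a named set of shell commands; the runner picks them up per stage with no further server-side configuration.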
Jenkins is very configurable because of all the plugins that are available. The downside of this is that your CI server can become a spaghetti of plugins.
In my opinion, chaining and orchestrating jobs in Jenkins is much simpler (because of the UI) than via YAML (calling curl commands). Besides that, Jenkins supports plugins that will install certain binaries when they are not available on your server (I don't know whether the others do).
Nowadays Jenkins 2 also supports more "proper CI" with the Jenkinsfile and the Pipeline plugin (bundled by default as of Jenkins 2), but Jenkins used to be less coupled to the repository than, for example, GitLab CI.
Using YAML files to define your build pipeline (and in the end running pure shell/bat) is cleaner.
The plug-ins available for Jenkins allow you to visualize all kinds of reporting, such as test results, coverage and other static analyzers. Of course, you can always write or use a tool to do this for you, but it is definitely a plus for Jenkins (especially for managers who tend to value these reports too much).
Lately I have been working more and more with GitLab CI. At GitLab they are doing a really great job making the whole experience fun. I understand that people use Jenkins, but when you have GitLab running and available it is really easy to get started with GitLab CI. There won't be anything that will integrate as seamlessly as GitLab CI, even though they put quite some effort in third-party integrations.
- Their documentation should get you started in no time.
- The threshold to get started is very low.
- Maintenance is easy (no plugins).
- Scaling runners is simple.
- CI is fully part of your repository.
- Jenkins jobs/views can get messy.
Some drawbacks at the time of writing:
- Only support for a single file in the Community Edition; multiple files in the Enterprise Edition.
Solution 2:
I agree with most of Rik's notes, but my opinion about which is simpler is the opposite: GitLab is proving to be an awesome tool to work with.
Most of the power comes from being self-contained and integrating everything in the same product under the same browser tab: from repository browser, issue board or build history to deployment tools and monitoring.
I'm using it right now to automate and test how an application installs on different Linux distributions, and it's just blazing fast to configure (try to open a complex Jenkins job configuration in Firefox and wait for the non-responsive script to come up, vs. how lightweight it is to edit .gitlab-ci.yml).
The time spent on configuring/scaling slaves is considerably less thanks to the runner binaries; plus the fact that in GitLab.com you get quite decent and free shared runners.
Jenkins feels more manual after some weeks of being a power user of GitLab CI, e.g. duplicating jobs per branch, or installing plugins to do simple things such as SCP upload. The only use case I have faced so far where I miss it is when more than one repository is involved; that still needs to be figured out nicely.
BTW, I'm currently writing a series on GitLab CI to demonstrate how it's not that hard to configure your repository CI infrastructure with it. Published last week, the first piece is introducing the basics, pros and cons and differences with other tools: Fast and natural Continuous Integration with GitLab CI
Solution 3:
First of all, as of today, GitLab Community Edition can be fully interoperable with Jenkins. No question.
In what follows, I give some feedback on a successful experience combining both Jenkins and GitLab CI. I shall also discuss whether you should use both or only one of them, and for what reason.
I hope this will give you quality information on your own projects.
GitLab CI and Jenkins strengths
GitLab CI
GitLab CI is naturally integrated into GitLab SCM. You can create pipelines using .gitlab-ci.yml files and manipulate them through a graphical interface.
These pipelines as code can obviously be stored in the code base, enforcing the "everything as code" practice (access, versioning, reproducibility, reusability, etc.).
GitLab CI is a great visual management tool:
- all members of the team (including non-technical ones) have quick and easy access to the application's life cycle status.
- therefore it can be used as an interactive and operational dashboard for release management.
Jenkins
Jenkins is a great build tool. Its strength is in its many plugins. In particular, I've had great luck using interface plugins between Jenkins and other CI or CD tools. This is always a better option than redeveloping (possibly badly) a dialog interface between two components.
Pipeline as code is also available, using Groovy scripts.
Using GitLab CI and Jenkins together
It might sound a bit redundant at first, but combining GitLab CI and Jenkins is quite powerful.
- GitLab CI orchestrates (chains, runs, monitors...) pipelines, and one can benefit from its graphical interface integrated into GitLab
- Jenkins runs the job and facilitates dialog with third-party tools.
Another benefit of this design is to have loose coupling between the tools:
- we could replace any of the build factory components without having to rework the entire CI/CD process
- we could have a heterogeneous build environment, combining (possibly several) Jenkins, TeamCity, you name it, and still have a single monitoring tool.
The Trade-off
Well, of course, there is a price to pay for this design: the initial set-up is cumbersome and you need to have a minimal level of understanding of many tools.
For this reason, I don't recommend such a set-up unless
- you have many third-party tools to deal with. That's when Jenkins comes in super handy with its many plugins.
- you have to deal with complex applications with heterogeneous technologies, having each a different build environment, and still need to have a unified application life cycle management UI.
If you are in neither of these situations, you're probably better off with only one of the two, but not both.
If I had to pick one
Both GitLab CI and Jenkins have pros and cons. Both are powerful tools. So which one to choose?
Answer 1
Choose the one that your team (or someone close) has already a certain level of expertise in.
Answer 2
If you're all complete beginners in CI technologies, just pick one and get going.
- If you're using GitLab and have a knack for everything as code, it makes total sense to choose GitLab CI.
- If you have to dialog with many other CI/CD tools or absolutely need that GUI to build your jobs, go for Jenkins.
Those of you who are using GitLab but are not sure you will keep doing so should bear in mind that choosing GitLab CI would mean trashing all your CI/CD pipelines if you ever migrate away.
Final word is: the balance leans a little bit towards Jenkins because of its many plugins, but chances are GitLab CI will quickly fill the gap.
Solution 4:
I would like to add some findings from my recent experimenting with GitLab CI. Features that came with 11.6 and 11.7 are just awesome!
Specifically, I love only conditions, which basically allow you to build separate pipelines for merge_request or push (the complete list is here).
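As a sketch of those only conditions (job names and scripts are placeholders), separate jobs can target merge request pipelines and plain pushes:

```yaml
# Runs only in merge request pipelines (GitLab 11.6+)
test:merge_request:
  script:
    - npm run test
  only:
    - merge_requests

# Runs only for plain pushes to branches
lint:push:
  script:
    - npm run lint
  only:
    - pushes
```

With this split, a merge request and a branch push trigger different pipelines from the same file.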
Also, I really like the absence of plugins. When I need some more complex functionality I just write a custom Docker image that handles required functionality (it's the same concept as you can see in drone.io).
If you are wondering about DRY, it's absolutely possible nowadays! You can write your "templates,"
.myTemplate:
  image: node:10.14.2
  script:
    - npm install
    - npm run test
Put them in some public repository and include them in the main pipeline:
include:
  - remote: https://....
And use them to extend some job:
test:
  extends: .myTemplate
  only:
    refs: ["master"]
    variables:
      - $CI_PIPELINE_SOURCE == "push"
I love GitLab CI so much! Yeah, it (so far) can't draw nice graphs with coverage and so on, but overall it's a really neat tool!
Edit (2019-02-23): here's my post about things I love in GitLab CI. It was written in 11.7 "era" so when you're reading this answer, GitLab CI probably has many more features.
Edit (2019-07-10): GitLab CI now supports multiple extends, e.g.
extends:
  - .pieceA
  - .pieceB
Check the official documentation to get more info about multiple extends
Solution 5:
If your build/publish/deploy and test jobs are not heavily complex, then using GitLab CI has natural advantages.
Since .gitlab-ci.yml is present alongside your code in every branch, you can modify your CI/CD steps, particularly tests (which differ across environments), more effectively.
For example, you may want to run unit tests for any check-in to the dev branch, carry out full-fledged functional testing on the QA branch, and run only a limited set of read-only (GET-type) tests on production; this can be achieved easily with GitLab CI.
A second advantage, apart from the great UI, is the ability to use Docker images for executing any stage, which keeps the host runner intact and is thus less error-prone.
Moreover, GitLab CI checks out your repository for you automatically, and you don't have to manage a Jenkins master separately.
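The branch-specific testing described above can be sketched like this (branch names and npm commands are illustrative placeholders):

```yaml
# Illustrative branch-specific test jobs in .gitlab-ci.yml
unit-tests:
  script:
    - npm run test:unit
  only:
    - dev

functional-tests:
  script:
    - npm run test:functional
  only:
    - qa

smoke-tests:
  script:
    - npm run test:smoke   # read-only checks against production
  only:
    - production
```

Because the file lives in each branch, each branch's pipeline picks up only the jobs whose only: refs match it.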