Why are shared libraries between microservices bad? [closed]

Solution 1:

The evils of too much coupling between services are far worse than the problems caused by code duplication

The author is rather unspecific when he uses the generic word "coupling". I would agree that certain types of coupling are a strict no-no (like sharing databases or using internal interfaces). However, the use of common libraries is not one of those. For example, if you develop two microservices in Go, you already have a shared dependency (on Go's standard library). The same applies to libraries that you develop yourself for sharing purposes. Just pay attention to the following points:

  • Treat shared libraries as you would dependencies on third-party entities.
  • Make sure each component / library / service has a distinct business purpose.
  • Version them correctly and leave the decision which version of the library to use to the corresponding microservice teams (see the sketch after this list).
  • Set up responsibilities for development and testing of shared libraries separately from the microservice teams.
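
As a minimal sketch of those points, assuming Go modules and hypothetical module paths: a consuming service pins the in-house shared library at an explicit, semantically versioned release, exactly as it would pin a third-party dependency.

```
// go.mod of service-a — sketch only; module paths and versions are hypothetical
module example.com/orders/service-a

go 1.21

require (
    // the in-house shared library is consumed like any third-party dependency,
    // pinned to an explicit semantic version
    example.com/platform/idgen v1.4.2
)
```

Another team's service can stay on an older release and upgrade on its own schedule, which is what leaving the version decision to the individual microservice teams looks like in practice.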

Don't forget: the microservices architectural style is not so much about code organization or internal design patterns, but about the larger organizational and process-relevant aspects that allow scaling application architectures, organizations, and deployments. See this answer for an overview.

Solution 2:

Short

The core concept of the microservice architecture is that microservices have their own independent development and release cycles. "Shared libraries" undermine this.

Longer

From my own experience, it's very important to keep microservices as isolated and independent as possible. Isolation is basically about being able to release and deploy the service independently of any other service most of the time. In other words, it's something like:

  • you build a new version of a service
  • you release it (after tests)
  • you deploy it into production
  • you have not caused a deployment cascade across your whole environment.

"Shared libraries" in my definition those libraries, do hinder you to do so.


It's "funny" how "Shared Libraries" poison your architecture:

Oh we have a User object! Let's reuse it everywhere!

This leads to a "shared library" for the whole enterprise, starts to undermine Bounded Contexts (DDD), and forces you to depend on one technology:

we already have this shared library with the DTOs you need, written in Java...

To repeat myself: a new version of this kind of shared lib will affect all services and complicate your deployments up to very fragile setups. The consequence is that at some point nobody trusts themselves to develop the next release of the common shared library, or everyone fears the big-bang releases.

All of this just for the sake of "Don't repeat yourself"? It's not worth it (my experience proves it). In practice, the shared, compromised "User" object is very seldom better than several focused User objects in the particular microservices.
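
To make that concrete, here is a sketch in Go (the field names are hypothetical, not taken from the answer): each service keeps its own focused User type, shaped by its bounded context, instead of sharing one enterprise-wide compromise object.

```go
// file: billing/user.go — the billing service only cares about payment details
package billing

type User struct {
	ID             string
	BillingAddress string
	VATNumber      string
}

// file: notification/user.go — the notification service only cares about reaching the user
package notification

type User struct {
	ID     string
	Email  string
	Locale string
}
```

Each type can evolve with its own service; a change to the billing view of a user never forces a rebuild or redeployment of the notification service.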

However, there is never a silver bullet, and Sam gives us only a guideline and advice (a heuristic, if you like) based on his projects.

My take

I can give you my experience. Don't start a microservice project by reasoning about shared libraries. Just don't build them in the beginning and accept some code repetition between services. Invest time in DDD and the quality of your Domain Objects and Service Boundaries. Learn along the way which parts are stable and which evolve fast.

Once you or your team have gained enough insight, you can refactor some parts into libraries. Such refactoring is usually very cheap compared to the reverse approach.

And these libraries should probably cover some boilerplate code and be focused on one task; have several of them, not one common-lib-for-everything. In the comment above, Oswin Noetzelmann gave some advice on how to proceed. Taking his approach to the maximum would lead to good, focused libraries and not toxic "shared libraries".
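
As an illustration of such a focused library, here is a sketch in Go (the name and API are hypothetical): it covers one piece of boilerplate — retrying a call with a fixed delay — and contains no domain objects, so any service can adopt, pin, or ignore it independently.

```go
package retry

import "time"

// Do calls fn up to attempts times, sleeping delay between failed attempts,
// and returns the last error if all attempts fail.
func Do(attempts int, delay time.Duration, fn func() error) error {
	var err error
	for i := 0; i < attempts; i++ {
		if err = fn(); err == nil {
			return nil
		}
		time.Sleep(delay)
	}
	return err
}
```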

Solution 3:

A good example of tight coupling where duplication would be acceptable is a shared library defining the interface/DTOs between services, in particular using the same classes/structs to serialize/deserialize data.

Let's say you have two services, A and B. They both accept slightly different but overall almost identical-looking JSON input.

It would be tempting to put one DTO describing the common keys, plus the very few that are specific to service A and service B, into a shared library.
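
A sketch of that tempting shared DTO in Go (the fields are hypothetical, chosen only to illustrate the shape of the problem): one struct in a shared library tries to serve both services at once.

```go
// shared library: one DTO for both service A and service B — sketch only
package dto

type OrderRequest struct {
	OrderID string `json:"order_id"`
	Amount  int    `json:"amount"`

	Coupon   string `json:"coupon,omitempty"`   // used only by service A
	Priority int    `json:"priority,omitempty"` // used only by service B
}
```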

For some time the system works fine. Both services add the shared library as a dependency, build, and run properly.

With time, though, service A requires some additional data that changes the structure of the JSON where it was the same before. As a result, you can't use the same classes/structs to deserialize the JSON for both services at the same time: the change is needed for service A, but then service B won't be able to deserialize the data.

You must change the shared library, add the new feature to service A and rebuild it, then rebuild service B to adjust it to the new version of the shared library even though no logic has changed there.

Now, had you defined the DTOs separately and internally for both services from the very beginning, their contracts could later evolve separately and safely in any direction you can imagine. Sure, at first it might have looked smelly to keep almost the same DTOs in both services, but in the long run it gives you the freedom to change.
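
Sketched in Go with the same hypothetical fields, the duplicated-but-independent alternative looks like this: each service owns its DTO, so service A can restructure its JSON without touching service B.

```go
// file: service-a/dto.go — A's contract can change shape freely
package main

type OrderRequest struct {
	OrderID string   `json:"order_id"`
	Amount  int      `json:"amount"`
	Coupons []string `json:"coupons"` // A's new, richer structure
}

// file: service-b/dto.go — B's contract stays exactly as it was
package main

type OrderRequest struct {
	OrderID  string `json:"order_id"`
	Amount   int    `json:"amount"`
	Priority int    `json:"priority"`
}
```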

At the end of the day, (micro)services don't differ that much from a monolith. Separation of concerns and isolation are critical. Some dependencies can't be avoided (language, framework, etc.), but before you introduce any additional dependency yourself, think twice about the future implications.

I'd rather follow the given advice: duplicate DTOs and avoid shared code unless you really can't avoid it. Not doing so has bitten me in the past. The scenario above is a trivial one, but it may be much more nuanced and affect many more services. Unfortunately, it hits you only after some time, so the impact may be big.