Do I need dependency injection in NodeJS, or how to deal with ...?
Solution 1:
In short, you don't need a dependency injection container or service locator like you would in C#/Java. Since Node.js leverages the module pattern, it's not necessary to perform constructor or property injection, although you still can.
The great thing about JS is that you can modify just about anything to achieve what you want. This comes in handy when it comes to testing.
Behold my very lame contrived example.
MyClass.js:
var fs = require('fs');

function MyClass() {}

MyClass.prototype.errorFileExists = function(dir) {
    var dirsOrFiles = fs.readdirSync(dir);
    for (var d of dirsOrFiles) {
        if (d === 'error.txt') return true;
    }
    return false;
};

module.exports = MyClass;
MyClass.test.js:
var assert = require('assert');
var MyClass = require('./MyClass');

describe('MyClass', function() {
    it('should return true if error.txt is found in the directory', function() {
        var mc = new MyClass();
        assert(mc.errorFileExists('/tmp/mydir')); // true
    });
});
Notice how MyClass depends upon the fs module? As @ShatyemShekhar mentioned, you can indeed do constructor or property injection as in other languages. But it's not necessary in JavaScript.
In this case, you can do two things.
You can stub the fs.readdirSync method, or you can return an entirely different module when you call require.
Method 1:
var fs = require('fs');

var oldmethod = fs.readdirSync;
fs.readdirSync = function(dir) {
    return ['somefile.txt', 'error.txt', 'anotherfile.txt'];
};

// *** PERFORM TEST ***

// *** RESTORE METHOD AFTER TEST ***
fs.readdirSync = oldmethod;
Method 2:
var oldrequire = require;

require = function(module) {
    if (module === 'fs') {
        return {
            readdirSync: function(dir) {
                return ['somefile.txt', 'error.txt', 'anotherfile.txt'];
            }
        };
    } else {
        return oldrequire(module);
    }
};
The key is to leverage the power of Node.js and JavaScript. Note, I'm a CoffeeScript guy, so my JS syntax might be incorrect somewhere. Also, I'm not saying this is the best way, but it is a way. JavaScript gurus might be able to chime in with other solutions.
Update:
This should address your specific question regarding database connections. I'd create a separate module to encapsulate your database connection logic. Something like this:
MyDbConnection.js: (be sure to choose a better name)
var db = require('whichever_db_vendor_i_use');

module.exports.fetchConnection = function() {
    //logic to test connection
    //do I want a connection pool?
    //do I need only one connection throughout the lifecycle of my application?
    return db.createConnection(port, host, databasename); //<--- values typically from a config file
};
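The "only one connection" question in the comments can be answered inside this module by memoizing the connection. Below is a minimal, self-contained sketch of that idea; fakeDb and its counter are my inventions standing in for a real driver, so the single-instance behavior is observable:

```javascript
// Hedged sketch: fetchConnection() creates the connection once and
// returns the cached instance on every later call.
var created = 0;
var fakeDb = {
  createConnection: function() {
    created += 1;           // count how many real connections were made
    return { id: created };
  }
};

var cachedConnection = null;
function fetchConnection() {
  if (!cachedConnection) {
    cachedConnection = fakeDb.createConnection();
  }
  return cachedConnection;
}

console.log(fetchConnection() === fetchConnection()); // true
console.log(created); // 1 — createConnection ran only once
```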
Then, any module that needs a database connection would just require your MyDbConnection module.
SuperCoolWebApp.js:
var dbCon = require('./lib/mydbconnection'); //wherever the file is stored

//now do something with the connection
var connection = dbCon.fetchConnection(); //mydbconnection.js is responsible for pooling, reusing, whatever your app use case is

//come TEST time of SuperCoolWebApp, you can swap out the require or return whatever you want, or, like I said, use an actual connection to a TEST database.
Do not follow this example verbatim. It's a lame attempt at communicating that you can leverage the module pattern to manage your dependencies. Hopefully this helps a bit more.
Solution 2:
require() and, more recently, ES Modules (import) are THE way to manage dependencies in Node.js, and they are surely intuitive and effective, but this approach has its limitations.
My advice is to take a look at some of the Dependency Injection containers available today for Node.js to get an idea of their pros and cons. Some of them are:
- awilix
- injection-js
- bottlejs
- inversify
- node-dependency-injection
Just to name a few.
Now the real question is: what can you achieve with a Node.js DI container, compared to a simple require() or import?
Pros:
- Better testability: modules accept their dependencies as input.
- Inversion of Control: decide how to wire your modules without touching the main code of your application.
- A customizable algorithm for resolving modules: dependencies have "virtual" identifiers; usually they are not bound to a path on the filesystem.
- Better extensibility: enabled by IoC and "virtual" identifiers.
- Other fancy stuff possible:
  - Async initialization
  - Module lifecycle management
  - Extensibility of the DI container itself
  - Can easily implement higher-level abstractions (e.g. AOP)
Cons:
- Different from the Node.js "experience": using DI definitely feels like you are deviating from the Node way of thinking.
- The relationship between a dependency and its implementation is not always explicit. A dependency may be resolved at runtime and influenced by various parameters. The code becomes more difficult to understand and debug.
- Slower startup time.
- Most DI containers will not play well with module bundlers like Browserify and Webpack.
As with anything related to software development, choosing between DI and require()/import depends on your requirements, your system's complexity, and your programming style.
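To make the "virtual identifier" and IoC points concrete, here is a toy container of my own (not the API of any of the libraries listed above): modules register under names rather than filesystem paths, and the container walks the dependency graph at resolve time:

```javascript
// Toy DI container: register(name, deps, factory) / resolve(name).
// Instances are cached, so each module behaves like a singleton.
function Container() {
  this.factories = {};
  this.instances = {};
}

Container.prototype.register = function(name, deps, factory) {
  this.factories[name] = { deps: deps, factory: factory };
};

Container.prototype.resolve = function(name) {
  if (this.instances[name]) return this.instances[name];
  var entry = this.factories[name];
  var args = entry.deps.map(this.resolve, this); // resolve dependencies first
  this.instances[name] = entry.factory.apply(null, args);
  return this.instances[name];
};

var container = new Container();
container.register('horn', [], function() {
  return { honk: function() { return 'beep!'; } };
});
container.register('car', ['horn'], function(horn) {
  return { honkHorn: function() { return horn.honk(); } };
});

console.log(container.resolve('car').honkHorn()); // beep!
```

Note that 'car' never mentions where 'horn' lives on disk; swapping the horn implementation is a one-line change in the registration code.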
Solution 3:
I know this thread is fairly old at this point, but I figured I'd chime in with my thoughts on this. The TL;DR is that due to the untyped, dynamic nature of JavaScript, you can actually do quite a lot without resorting to the dependency injection (DI) pattern or using a DI framework. However, as an application grows larger and more complex, DI can definitely help the maintainability of your code.
DI in C#
To understand why DI isn't as big of a need in JavaScript, it's helpful to look at a strongly typed language like C#. (Apologies to those who don't know C#, but it should be easy enough to follow.) Say we have an app that describes a car and its horn. You would define two classes:
class Horn
{
public void Honk()
{
Console.WriteLine("beep!");
}
}
class Car
{
private Horn horn;
public Car()
{
this.horn = new Horn();
}
public void HonkHorn()
{
this.horn.Honk();
}
}
class Program
{
static void Main()
{
var car = new Car();
car.HonkHorn();
}
}
There are a few issues with writing the code this way.
- The Car class is tightly coupled to the particular implementation of the horn in the Horn class. If we want to change the type of horn used by the car, we have to modify the Car class even though its usage of the horn doesn't change. This also makes testing difficult because we can't test the Car class in isolation from its dependency, the Horn class.
- The Car class is responsible for the lifecycle of the Horn class. In a simple example like this it's not a big issue, but in real applications dependencies will have dependencies, which will have dependencies, etc. The Car class would need to be responsible for creating the entire tree of its dependencies. This is not only complicated and repetitive, but it violates the "single responsibility" of the class. It should focus on being a car, not on creating instances.
- There is no way to reuse the same dependency instances. Again, this isn't important in this toy application, but consider a database connection. You would typically have a single instance that is shared across your application.
Now, let's refactor this to use a dependency injection pattern.
interface IHorn
{
void Honk();
}
class Horn : IHorn
{
public void Honk()
{
Console.WriteLine("beep!");
}
}
class Car
{
private IHorn horn;
public Car(IHorn horn)
{
this.horn = horn;
}
public void HonkHorn()
{
this.horn.Honk();
}
}
class Program
{
static void Main()
{
var horn = new Horn();
var car = new Car(horn);
car.HonkHorn();
}
}
We've done two key things here. First, we've introduced an interface that our Horn class implements. This lets us code the Car class against the interface instead of the particular implementation. Now the code can take anything that implements IHorn. Second, we've taken the horn instantiation out of Car and passed it in instead. This resolves the issues above and leaves it to the application's main function to manage the specific instances and their lifecycles.
This means we could introduce a new type of horn for the car to use without touching the Car class:
class FrenchHorn : IHorn
{
public void Honk()
{
Console.WriteLine("le beep!");
}
}
The main method could just inject an instance of the FrenchHorn class instead. This also dramatically simplifies testing. You could create a MockHorn class to inject into the Car constructor to ensure you are testing just the Car class in isolation.
The example above shows manual dependency injection. Typically DI is done with a framework (e.g. Unity or Ninject in the C# world). These frameworks will do all of the dependency wiring for you by walking your dependency graph and creating instances as needed.
The Standard Node.js Way
Now let's look at the same example in Node.js. We would probably break our code into 3 modules:
// horn.js
module.exports = {
honk: function () {
console.log("beep!");
}
};
// car.js
var horn = require("./horn");
module.exports = {
honkHorn: function () {
horn.honk();
}
};
// index.js
var car = require("./car");
car.honkHorn();
Because JavaScript is untyped, we don't have quite the same tight coupling that we had before. There is no need for interfaces (nor do they exist), as the car module will just attempt to call the honk method on whatever the horn module exports.
Additionally, because Node's require caches everything, modules are essentially singletons stored in a container. Any other module that performs a require on the horn module will get the exact same instance. This makes sharing singleton objects like database connections very easy.
Now there is still the issue that the car module is responsible for fetching its own dependency, horn. If you wanted the car to use a different module for its horn, you'd have to change the require statement in the car module. This is not a very common thing to do, but it does cause issues with testing.
The usual way people handle the testing problem is with proxyquire. Owing to the dynamic nature of JavaScript, proxyquire intercepts calls to require and returns any stubs/mocks you provide instead.
var proxyquire = require('proxyquire');
var hornStub = {
honk: function () {
console.log("test beep!");
}
};
var car = proxyquire('./car', { './horn': hornStub });
// Now make test assertions on car...
This is more than enough for most applications. If it works for your app then go with it. However, in my experience as applications grow larger and more complex, maintaining code like this becomes harder.
DI in JavaScript
Node.js is very flexible. If you aren't satisfied with the method above, you can write your modules using the dependency injection pattern. In this pattern, every module exports a factory function (or a class constructor).
// horn.js
module.exports = function () {
return {
honk: function () {
console.log("beep!");
}
};
};
// car.js
module.exports = function (horn) {
return {
honkHorn: function () {
horn.honk();
}
};
};
// index.js
var horn = require("./horn")();
var car = require("./car")(horn);
car.honkHorn();
This is very much analogous to the C# method earlier in that the index.js module is responsible for instance lifecycles and wiring. Unit testing is quite simple as you can just pass mocks/stubs into the functions. Again, if this is good enough for your application, go with it.
Bolus DI Framework
Unlike C#, there are no established standard DI frameworks to help with your dependency management. There are a number of frameworks in the npm registry but none have widespread adoption. Many of these options have been cited already in the other answers.
I wasn't particularly happy with any of the options available, so I wrote my own called bolus. Bolus is designed to work with code written in the DI style above and tries to be very DRY and very simple. Using the exact same car.js and horn.js modules above, you can rewrite the index.js module with bolus as:
// index.js
var Injector = require("bolus");
var injector = new Injector();
injector.registerPath("**/*.js");
var car = injector.resolve("car");
car.honkHorn();
The basic idea is that you create an injector. You register all of your modules in the injector. Then you simply resolve what you need. Bolus will walk the dependency graph and create and inject dependencies as needed. You don't save much in a toy example like this, but in large applications with complicated dependency trees the savings are huge.
Bolus supports a bunch of nifty features like optional dependencies and test globals, but there are two key benefits I've seen relative to the standard Node.js approach. First, if you have a lot of similar applications, you can create a private npm module for your base that creates an injector and registers useful objects on it. Then your specific apps can add, override, and resolve as needed much like how AngularJS's injector works. Second, you can use bolus to manage various contexts of dependencies. For example, you could use middleware to create a child injector per request, register the user id, session id, logger, etc. on the injector along with any modules depending on those. Then resolve what you need to serve requests. This gives you instances of your modules per request and prevents having to pass the logger, etc. along to every module function call.
Solution 4:
I've also written a module to accomplish this; it's called rewire. Just use npm install rewire and then:
var rewire = require("rewire"),
myModule = rewire("./path/to/myModule.js"); // exactly like require()
// Your module will now export a special setter and getter for private variables.
myModule.__set__("myPrivateVar", 123);
myModule.__get__("myPrivateVar"); // = 123
// This allows you to mock almost everything within the module e.g. the fs-module.
// Just pass the variable name as first parameter and your mock as second.
myModule.__set__("fs", {
readFile: function (path, encoding, cb) {
cb(null, "Success!");
}
});
myModule.readSomethingFromFileSystem(function (err, data) {
console.log(data); // = Success!
});
I was inspired by Nathan MacInnes's injectr but used a different approach. I don't use vm to eval the test module; in fact, I use Node's own require. This way your module behaves exactly as if it were loaded with require() (apart from your modifications). Debugging is also fully supported.
Solution 5:
I built Electrolyte for just this purpose. The other dependency injection solutions out there were too invasive for my tastes, and messing with the global require is a particular grievance of mine.
Electrolyte embraces modules, specifically those that export a "setup" function like you see in Connect/Express middleware. Essentially, these types of modules are just factories for some object they return.
For example, a module that creates a database connection:
var mysql = require('mysql');
exports = module.exports = function(settings) {
var connection = mysql.createConnection({
host: settings.dbHost,
port: settings.dbPort
});
connection.connect(function(err) {
if (err) { throw err; }
});
return connection;
}
exports['@singleton'] = true;
exports['@require'] = [ 'settings' ];
What you see at the bottom are annotations, an extra bit of metadata that Electrolyte uses to instantiate and inject dependencies, automatically wiring your application's components together.
To create a database connection:
var db = electrolyte.create('database');
Electrolyte transitively traverses the @require'd dependencies and injects instances as arguments to the exported function.
The key is that this is minimally invasive. This module is completely usable, independent of Electrolyte itself. That means your unit tests can test just the module under test, passing in mock objects without need for additional dependencies to rewire internals.
When running the full application, Electrolyte steps in at the inter-module level, wiring things together without the need for globals, singletons or excessive plumbing.
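To illustrate that last point, a unit test can call the exported factory directly with fakes, no container involved. This sketch inlines a simplified, dependency-parameterized version of the database factory above, with the mysql driver replaced by a stand-in object (fakeMysql is my invention for the example, not the real driver's API surface):

```javascript
// A simplified version of the factory with the driver passed in,
// so it can be exercised without a real mysql server.
function createDatabase(mysql, settings) {
  return mysql.createConnection({
    host: settings.dbHost,
    port: settings.dbPort
  });
}

var fakeMysql = {
  createConnection: function(opts) {
    return { host: opts.host, port: opts.port };
  }
};

var conn = createDatabase(fakeMysql, { dbHost: 'localhost', dbPort: 3306 });
console.log(conn.host + ':' + conn.port); // localhost:3306
```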