Understanding Node.js modules: multiple requires return the same object?
I have a question related to the node.js documentation on module caching:
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
Multiple calls to require('foo') may not cause the module code to be executed multiple times. This is an important feature. With it, "partially done" objects can be returned, thus allowing transitive dependencies to be loaded even when they would cause cycles.
What is meant by "may" here?
I want to know whether require will always return the same object. So if I require a module A in app.js and change the exports object within app.js (the object that require returns), and then require a module B in app.js that itself requires module A, will I always get the modified version of that object, or a new one?
// app.js
var a = require('./a');
a.b = 2;
console.log(a.b); //2
var b = require('./b');
console.log(b.b); //2
// a.js
exports.a = 1;
// b.js
module.exports = require('./a');
If both app.js and b.js reside in the same project (and in the same directory), then both of them will receive the same instance of A. From the node.js documentation:
... every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
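A quick way to see this in code (a hypothetical check.js placed next to the a.js from the question): both calls resolve to the same file, so they return the same cached object.
// check.js (assumed to live in the same directory as a.js)
var first = require('./a');
var second = require('./a');
console.log(first === second); // true: same cache entry, same object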
The situation is different when a.js, b.js and app.js are in different npm modules. For example:
[APP] --> [A], [B]
[B] --> [A]
In that case, require('a') in app.js would resolve to a different copy of a.js than require('a') in b.js, and would therefore return a different instance of A. There is a blog post describing this behavior in more detail.
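One way to verify which copy gets loaded (assuming a layout like app/node_modules/a plus app/node_modules/b/node_modules/a) is to print the path that require resolves to, since that path is what the cache is keyed on:
// in app.js
console.log(require.resolve('a')); // e.g. .../app/node_modules/a/index.js
// in b's own code, e.g. node_modules/b/index.js
console.log(require.resolve('a')); // e.g. .../node_modules/b/node_modules/a/index.js
// different resolved paths mean different cache entries, and therefore different instances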
node.js caches loaded modules, which keeps it from re-reading the same files thousands of times while running a large server project. This cache is exposed as the require.cache object. Note that this object is readable and writable, which makes it possible to delete entries from the cache without killing the process.
http://nodejs.org/docs/latest/api/globals.html#require.cache
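For example, a minimal sketch of invalidating a single entry (require.resolve gives the absolute path the cache is keyed on):
// drop a.js from the cache; the next require re-reads and re-executes the file
delete require.cache[require.resolve('./a')];
var fresh = require('./a'); // a fresh exports object, rebuilt from a.js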
Oh, and to answer the question: after doing some tests, node.js does cache module.exports, so modifying the exported object does affect the next module load. require returns a reference to the same cached object, not a new copy. Modifying require.cache[module].exports means subsequent require calls return the modified object, and editing the file after deleting its cache entry changes the exported object as well.
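A small sketch of such a test, reusing the a.js from the question:
var a = require('./a');
require.cache[require.resolve('./a')].exports.c = 3; // poke the cached exports directly
var again = require('./a');
console.log(again.c);     // 3: the cache hands back the modified object
console.log(a === again); // true: still the same instance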
Since the question was posted, the documentation has been updated to make it clear why "may" was originally used. It now answers the question itself by making things explicit; the new wording is the "Provided require.cache is not modified" clause and the change from "may not" to "will not":
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
Provided require.cache is not modified, multiple calls to require('foo') will not cause the module code to be executed multiple times. This is an important feature. With it, "partially done" objects can be returned, thus allowing transitive dependencies to be loaded even when they would cause cycles.
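For completeness, a minimal sketch of the "partially done objects" behavior the quoted text refers to, using two hypothetical files x.js and y.js that require each other:
// x.js
exports.loaded = false;
var y = require('./y');                    // starts loading y.js
console.log('in x, y.loaded =', y.loaded); // true, y.js has finished by now
exports.loaded = true;
// y.js
exports.loaded = false;
var x = require('./x');                    // cycle: x.js is only partially done,
console.log('in y, x.loaded =', x.loaded); // so this logs false
exports.loaded = true;
// main.js
require('./x');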