JavaScript memory limit
In Chrome and Chromium OS, the memory limit is defined by the browser, and you can inspect it with the following command in the Developer Tools console (open it by hitting F12):
> window.performance.memory.jsHeapSizeLimit
1090519040
On my Windows 10 machine, that works out to about 1 GB.
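To turn that byte count into something readable, you can do a quick conversion in the console (same non-standard, Chrome-only API, so treat this as a sketch):

// Convert jsHeapSizeLimit from bytes to GiB (Chrome-only, non-standard API)
var limitGiB = performance.memory.jsHeapSizeLimit / (1024 * 1024 * 1024);
console.log(limitGiB.toFixed(2) + ' GiB'); // e.g. "1.02 GiB" for 1090519040 bytes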
On Chrom(e/ium), you can get around the heap size limit by allocating typed ("native") arrays, whose backing stores live outside the JS heap:
var target = [];
while (true) {
    target.push(new Uint8Array(1024 * 1024)); // allocate 1 MiB typed arrays
}
This crashes the tab at around 2 GB, and it happens very rapidly. After that, Chrom(e/ium) goes haywire, and repeating the test is not possible without restarting the browser.
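If you want to observe the effect without crashing the tab, here is a bounded sketch (it assumes Chrome's non-standard performance.memory is available). The typed-array allocations barely move usedJSHeapSize, because ArrayBuffer backing stores are allocated outside the V8 heap that jsHeapSizeLimit governs:

var chunks = [];
for (var i = 0; i < 512; i++) {
    chunks.push(new Uint8Array(1024 * 1024)); // 1 MiB each, 512 MiB total
}
// The JS heap stays small even though ~512 MiB has been allocated,
// because the typed arrays' backing stores live outside the V8 heap.
console.log((performance.memory.usedJSHeapSize / (1024 * 1024)).toFixed(1) + ' MiB on the JS heap');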
I also recommend reading TrackJS's blog post about Monitoring JavaScript Memory before you get deep into the weeds trying to diagnose or measure anything memory-related in the browser.
You can also search comp.lang.javascript for javascript memory limit.
See also these Stack Overflow posts:
Maximum size of an Array in JavaScript, which suggests you can store up to 2^32 − 1 = 4,294,967,295 ≈ 4.29 billion elements (a quick verification sketch follows this list).
Maximum number of arguments a JavaScript function can accept
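The array-length cap is easy to verify without actually allocating four billion elements, because assigning a length beyond 2^32 − 1 throws immediately:

var a = [];
a.length = Math.pow(2, 32) - 1; // OK: 4,294,967,295 (no elements are allocated)
try {
    a.length = Math.pow(2, 32); // one past the cap
} catch (e) {
    console.log(e.name); // "RangeError" (message: "Invalid array length")
}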
There is additional knowledge on the JS9 astronomical image display library website: Dealing with Memory Limitations.
(I was trying to find a good answer, and the "there is no upper limit" answer provided here was just silly to me. I cannot run into a production issue for a multi-million dollar project and say to management, "Well, I assumed there is no upper limit and everything would be okay." Try to do a proof-of-concept, e.g. loading lots of combobox controls in your JavaScript UI framework of choice, etc. You may discover your framework has some performance degradation.)
Here are some components that I've found scale very well both in CPU performance and memory performance:
- Microsoft Monaco editor
  - This is used by several commercial projects:
    - Postman, as of v7.1.1-canary08
    - VS Code
Here are some examples of frameworks with well-known performance degradation:
- Angular: poor change-detection approach (see the dirty-checking sketch after this list).
  - For each async event, each binding (model-DOM binding) is compared to its old value to decide whether to re-render.
  - NG1: beyond roughly 2,500 watchers, performance grinds to a halt.
  - NG2: the same problem remains, but there is a long, tiring workaround: switch to immutables and spread ChangeDetectionStrategy.OnPush all over your app to turn off the default problematic strategy.
- React
  - Immutable collections of JS objects only scale so far.
  - create-react-app internally uses Immutable.js, and Immutable.js can only create about 500k immutable collections before it dies.
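To see why the dirty-checking approach degrades, here is a hypothetical sketch of what an NG1-style digest loop does on every async event. The names (watchers, digest) are illustrative, not Angular's actual internals:

var watchers = []; // each entry: { getValue: fn, lastValue: any, onChange: fn }
function digest() {
    var dirty;
    do {
        dirty = false;
        for (var i = 0; i < watchers.length; i++) { // O(watchers) per pass
            var w = watchers[i];
            var value = w.getValue();
            if (value !== w.lastValue) { // compare each binding to its old value
                w.onChange(value, w.lastValue);
                w.lastValue = value;
                dirty = true; // something changed, so re-run the whole pass
            }
        }
    } while (dirty); // with >2,500 watchers, this loop dominates every async event
}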
Here are some other things to think about:
- Use array.splice for manipulating arrays to minimize additional array allocations; array.splice modifies the array in place (unlike array.slice, which returns a new copy), which reduces garbage collection and overall heap size. A short example follows.
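For instance, using only the built-in array methods:

var items = ['a', 'b', 'c', 'd'];
items.splice(1, 2);           // removes 'b' and 'c' from items in place
console.log(items);           // ['a', 'd']
var copy = items.slice(0, 1); // slice, by contrast, allocates and returns a new array
console.log(copy);            // ['a']; items is untouched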
AFAIK, there is no upper limit; your script can basically use memory until the system runs out of it (including swap). But no upper limit doesn't mean you have to eat it all; users may not like it.
Firefox supports the option javascript.options.mem.max, and if you search on that you can find discussions about sensible values that people have found workable.
Not sure how many people can be bothered to go into about:config and set it, but speaking for myself, I set it to 128000 (which is 128 MB, since the value is in KB).
I think the memory limitation comes from the browser, and we can use the DevTools to figure it out. In Chrome, for example, press F12 and enter window.performance.memory in the console to see the memory info.
window.performance.memory
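In Chrome, the returned MemoryInfo object has three fields; the numbers below are just one session's example values:

> window.performance.memory
MemoryInfo {
    jsHeapSizeLimit: 1090519040, // maximum heap size the tab may grow to
    totalJSHeapSize: 21695484,   // heap currently allocated
    usedJSHeapSize: 16670252     // heap currently in use
}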
There are no hard memory limitations for a JavaScript program; your script can hog all the RAM on your machine. However, it is not recommended to use up all the memory on a user's machine. If you are dealing with a lot of data, I would suggest that you check out caching; a minimal sketch follows.
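If "caching" here means keeping only a bounded working set in memory rather than all the data at once, a size-bounded (LRU-style) cache might look like this. BoundedCache is a made-up name for illustration, not a library API:

function BoundedCache(maxEntries) {
    this.maxEntries = maxEntries || 1000;
    this.map = new Map(); // Map preserves insertion order: first key = oldest
}
BoundedCache.prototype.get = function (key) {
    if (!this.map.has(key)) return undefined;
    var value = this.map.get(key);
    this.map.delete(key); // re-insert to mark this entry as most recently used
    this.map.set(key, value);
    return value;
};
BoundedCache.prototype.set = function (key, value) {
    this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxEntries) {
        // evict the least recently used entry instead of growing without bound
        this.map.delete(this.map.keys().next().value);
    }
};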