I am confused about JavaScript engines right now. I know that V8 was a big deal because it compiled JavaScript to native code.

Then I started reading about Mozilla SpiderMonkey, which from what I understand is written in C and can compile JavaScript. So how is this different from V8, and if this is true, why doesn't Firefox do this?

Finally, does Rhino literally compile the JavaScript to Java bytecode, so you would get all the speed advantages of Java? If not, why do people not run V8 when writing scripts on their desktops?


Solution 1:

There are various approaches to JavaScript execution, even when doing JIT compilation. V8 and Nitro (formerly known as SquirrelFish Extreme) do a whole-method JIT, meaning that they compile all JavaScript code down to native instructions when they encounter the script, and then simply execute it as if it were compiled C code. SpiderMonkey uses a "tracing" JIT instead, which first compiles the script to bytecode and interprets it, but monitors the execution, looking for "hot spots" such as loops. When it detects one, it compiles just that hot path to machine code and executes that in the future.
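To make the difference concrete, here is a rough, engine-agnostic sketch (the function and its name are purely illustrative, not taken from any engine's code) of the kind of code the two strategies treat differently:

```javascript
function sumSquares(n) {
  // Cold path: rarely (or never) taken, yet a whole-method JIT still
  // spends time and memory compiling it along with everything else.
  if (typeof n !== "number") {
    throw new Error("n must be a number");
  }
  var total = 0;
  // Hot path: a tracing JIT interprets this loop at first, notices after
  // some iterations that it is "hot", and compiles just this path to
  // machine code for the remaining iterations.
  for (var i = 0; i < n; i++) {
    total += i * i;
  }
  return total;
}

sumSquares(1000000);
```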

Both approaches have upsides and downsides. Whole-method JIT ensures that all JavaScript that is executed will be compiled and run as machine code rather than interpreted, which in general should be faster. However, depending on the implementation, the engine may spend time compiling code that will never be executed, or will only be executed once and is not performance-critical. In addition, all of that compiled code must be kept in memory, which can lead to higher memory usage.

The tracing JIT as implemented in SpiderMonkey can produce extremely specialized code compared to a whole-method JIT, since it has already executed the code and can speculate on the types of variables, for example treating the index variable of a for loop as a native integer. A whole-method JIT has to treat that variable as a generic object, because JavaScript is dynamically typed and the type could change at any point (SpiderMonkey simply "falls off" the trace if the assumption fails and returns to interpreting bytecode). However, SpiderMonkey's tracing JIT currently does not work efficiently on code with many branches, as traces are optimized for a single path of execution. There is also some overhead in monitoring execution before deciding to compile a trace, and in switching execution onto that trace. Finally, if the tracer makes an assumption that is later violated (such as a variable changing type), the cost of falling off the trace and switching back to interpretation is likely to be higher than with a whole-method JIT.
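As a sketch of where that speculation helps and where it can be invalidated (again, illustrative code rather than anything engine-specific):

```javascript
var values = [1, 2, 3, 4, 5];

function total(arr) {
  var sum = 0;
  // While every element is a number, the trace can keep `sum`, `i`, and
  // `arr[i]` as native numbers and use plain machine arithmetic.
  for (var i = 0; i < arr.length; i++) {
    sum += arr[i];
  }
  return sum;
}

total(values);       // loop gets traced with numeric type assumptions

values.push("six");  // an element is now a string, so on the next call...
total(values);       // ...the type assumption fails, execution falls off
                     // the trace and drops back to the bytecode interpreter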

Solution 2:

V8 is generally the fastest of the three, because it compiles all JavaScript straight to machine code, with no intermediate bytecode interpreter.

SpiderMonkey (what Firefox uses) is fast too, but compiles to an intermediate bytecode rather than to machine code; that is the major difference from V8. EDIT: Newer Firefox releases ship a newer variant of SpiderMonkey, TraceMonkey, which does JIT compilation of the critical (hot) parts of the code, and possibly other smart optimizations.

Rhino compiles JavaScript into Java classes, which basically lets you write "Java" applications in JavaScript. Rhino is also used to parse and interpret JavaScript on the back end, where it gives you complete programmatic access to the code (for reflection or analysis, for example). The YUI Compressor uses it this way.
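As a rough sketch of that Java interop (the jar and file names below are just placeholders; the `importPackage`/`java` bridge itself is part of Rhino), a script run under Rhino can use Java classes directly:

```javascript
// Run with the Rhino shell, e.g.: java -jar js.jar interop.js
importPackage(java.util);            // bring java.util classes into scope

var list = new ArrayList();          // a real java.util.ArrayList instance
list.add("scripted");
list.add("in");
list.add("JavaScript");

// Java's System.out is reachable directly from the script
java.lang.System.out.println(list.size() + " items: " + list);
```

Rhino can run such a script interpreted or compile it to Java bytecode on the fly, and it also ships a compiler tool for producing .class files ahead of time.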

The reason Rhino is used instead of V8 in so many places is probably that V8 is relatively new, so a lot of projects had already adopted Rhino or SpiderMonkey as their JS engine before V8 appeared; Yahoo! Widgets is one example. (I assume that is what you mean by "scripts on their desktops".)

EDIT: This link might also give some insight into why SpiderMonkey is so widely adopted: Which JavaScript engine would you embed in your application?