But why's the browser DOM still so slow after 10 years of effort?

Solution 1:

When you change something in the DOM it can trigger a myriad of side effects to do with recalculating layouts, style sheets and so on.

This isn't the only reason: when you set element.innerHTML = x, you are no longer dealing with ordinary "store a value here" variables, but with special objects that update a load of internal state in the browser when you set them.

The full implications of element.innerHTML = x are enormous. Rough overview (see the sketch after this list):

  • parse x as HTML
  • ask browser extensions for permission
  • destroy existing child nodes of element
  • create child nodes
  • recompute styles which are defined in terms of parent-child relationships
  • recompute physical dimensions of page elements
  • notify browser extensions of the change
  • update Javascript variables which are handles to real DOM nodes
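
A quick way to see the destroy-and-recreate steps in action is to check node identity before and after an assignment (a minimal sketch; the element id is made up for illustration):

    // Assume <ul id="list"> with at least one <li> already in the page.
    const list = document.getElementById('list');
    const firstItem = list.firstElementChild; // handle to a real DOM node

    list.innerHTML = list.innerHTML; // looks like a no-op, but re-parses everything

    // Logs false: the old children were destroyed and brand-new nodes were
    // parsed in their place; firstItem now points at a detached node.
    console.log(list.firstElementChild === firstItem);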

All these updates have to go through an API which bridges Javascript and the HTML engine. One reason Javascript is so fast these days is that we compile it to some faster language, or even machine code; masses of optimisations are possible because the behaviour of the values is well-defined. When working through the DOM API, none of this is possible. Speedups elsewhere have left the DOM behind.
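
To see why this matters in practice, compare touching the DOM on every iteration with doing the string work on ordinary Javascript values first (a minimal sketch, not a benchmark of any particular engine; the container id is made up):

    const container = document.getElementById('container'); // assumed <ul>

    // Slow: every += re-serialises, re-parses and rebuilds the whole subtree,
    // crossing the Javascript/engine bridge on each iteration
    for (let i = 0; i < 1000; i++) {
      container.innerHTML += '<li>item ' + i + '</li>';
    }

    // Faster: all the work happens on plain Javascript values,
    // and the bridge is crossed exactly once at the end
    let html = '';
    for (let i = 0; i < 1000; i++) {
      html += '<li>item ' + i + '</li>';
    }
    container.innerHTML = html;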

Solution 2:

Firstly, anything you do to the DOM could be a user-visible change. If you change the DOM, the browser has to lay everything out again. It could be faster if the browser cached the changes and only laid things out every X ms (assuming it doesn't do this already), but perhaps there isn't huge demand for this kind of feature.
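
In fact, browsers do defer layout until it's needed; what defeats that batching is reading a layout-dependent property between writes. A minimal sketch (the element id is made up):

    const box = document.getElementById('box');

    // The writes alone would be batched into a single layout pass...
    for (let i = 0; i < 100; i++) {
      box.style.width = i + 'px';
      // ...but this read must reflect the write above, so the browser is
      // forced into a synchronous layout on every iteration.
      console.log(box.offsetHeight);
    }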

Second, innerHTML isn't a simple operation. It's a dirty hack that MS pushed and other browsers adopted because it's so useful, but it's not part of the standard (IIRC). Using innerHTML, the browser has to parse the string and convert it to a DOM tree. Parsing is hard.
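
That's also why building nodes through the DOM API directly can beat innerHTML for simple structures: it skips the parser entirely. A minimal sketch (the element id and content are made up; the two halves are alternatives):

    const el = document.getElementById('out');

    // Parsed: the string goes through the browser's full HTML parser
    el.innerHTML = '<p class="msg">hello</p>';

    // Not parsed: equivalent nodes are created through DOM calls directly
    const p = document.createElement('p');
    p.className = 'msg';
    p.textContent = 'hello';
    el.appendChild(p);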

Solution 3:

The original test author is Hixie (http://nontroppo.org/timer/Hixie_DOM.html).

This issue has also been discussed on StackOverflow and on Connect (Microsoft's bug tracker). With IE10, the issue is resolved. By resolved, I mean they have partially moved on to another way of updating the DOM.

The IE team seems to handle DOM updates the way the Excel-macros team at Microsoft does, where it's considered poor practice to update live cells on the sheet. You, the developer, are supposed to take the heavy-lifting task offline and then update the live sheet in batch. In IE you are supposed to do that using a document fragment (as opposed to the document). With new and emerging ECMA and W3C standards, document fragments are deprecated, so the IE team has done some pretty good work to contain the issue.
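
The "build offline, commit in batch" pattern the answer describes looks like this with a DocumentFragment (a minimal sketch; the target id is made up):

    const frag = document.createDocumentFragment();
    for (let i = 0; i < 10000; i++) {
      const li = document.createElement('li');
      li.textContent = 'row ' + i;
      frag.appendChild(li); // off-document: no layout, no paint
    }
    // One live update: the fragment's children move into the list in one go
    document.getElementById('list').appendChild(frag);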

It took them a few weeks to strip it down from ~42,000 ms in IE10-ConsumerPreview to ~600 ms in IE10-RTM. But it took a lot of arm-twisting to convince them that this IS an issue. Their claim was that there is no real-world example which has 10,000 updates per element. Since the scope and nature of rich internet applications (RIAs) can't be predicted, it's vital to have performance close to the other browsers in the league. Here is another take on the DOM by the OP on MS Connect (in the comments):

When I browse to http://nontroppo.org/timer/Hixie_DOM.html, it takes ~680ms and if I save the page and run it locally, it takes ~350ms!

The same thing happens if I use a button onclick event to run the script (instead of body onload). Compare these two versions:

jsfiddle.net/uAySs/ <-- body onload

vs.

jsfiddle.net/8Kagz/ <-- button onclick

Almost a 2x difference.

Apparently, the underlying behavior of onload and onclick varies as well. It may get even better in future updates.