Prevent long-running JavaScript from locking up the browser

I have JavaScript which performs a whole lot of calculations as well as reading/writing values from/to the DOM. The page is huge so this often ends up locking the browser for up to a minute (sometimes longer with IE) with 100% CPU usage.

Are there any resources on optimising JavaScript to prevent this from happening (all I can find is how to turn off Firefox's long running script warning)?


If you can turn your calculation algorithm into something that can be called iteratively, you can release control back to the browser at frequent intervals by using setTimeout with a short timeout value.

For example, something like this...

function doCalculation()
{
   //do your thing for a short time

   //figure out how complete you are
   var percent_complete=....

   return percent_complete;
}

function pump()
{
   var percent_complete=doCalculation();

   //maybe update a progress meter here!

   //carry on pumping?
   if (percent_complete<100)
   {
      setTimeout(pump, 50);
   }
}

//start the calculation
pump();

Use timeouts.

By putting the content of your loop(s) into separate functions and calling them from setTimeout() with a timeout of 50 or so, the JavaScript yields control of the thread and comes back a moment later, allowing the UI to get a look-in.

There's a good walkthrough here.
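As a concrete sketch of that pattern, here is one way to process a large array in fixed-size chunks (the names `processChunk` and `processLargeArray`, and the chunk size of 500, are illustrative choices, not anything prescribed above):

```javascript
// Process a slice of `items`, starting at `start`, applying `fn` to each item.
// Returns the index to resume from on the next timeout tick.
function processChunk(items, start, chunkSize, fn) {
  var end = Math.min(start + chunkSize, items.length);
  for (var i = start; i < end; i++) {
    fn(items[i]);
  }
  return end;
}

// Drive the chunks via setTimeout so the UI thread can repaint between them.
function processLargeArray(items, fn, done) {
  var index = 0;
  function tick() {
    index = processChunk(items, index, 500, fn);
    if (index < items.length) {
      setTimeout(tick, 50); // yield to the browser, then carry on
    } else if (done) {
      done(); // all items processed
    }
  }
  tick();
}
```

The chunk size is a trade-off: bigger chunks finish the whole job sooner, smaller chunks keep the page more responsive.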


I had blogged about in-browser performance some time ago, but let me summarize the points related to the DOM for you here.

  • Update the DOM as infrequently as possible. Make your changes to in-memory DOM objects and append them only once to the DOM.
  • Use innerHTML. It's faster than DOM methods in most browsers.
  • Use event delegation instead of regular event handling.
  • Know which calls are expensive, and avoid them. For example, in jQuery, $("div.className") will be more expensive than $("#someId").
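The first two tips combine naturally: build all your markup in memory, then assign it to innerHTML in one go. A minimal sketch (the function name `buildRowsHtml` and the row shape are my own, and it assumes the values are already HTML-safe):

```javascript
// Build the full markup as strings in memory; only touch the DOM once at the end.
function buildRowsHtml(rows) {
  var html = [];
  for (var i = 0; i < rows.length; i++) {
    // Assumes name/value are already escaped for HTML.
    html.push('<tr><td>' + rows[i].name + '</td><td>' + rows[i].value + '</td></tr>');
  }
  return html.join(''); // one join instead of repeated string concatenation
}

// In the browser, a single DOM update instead of one per row:
// document.getElementById('results').innerHTML = buildRowsHtml(data);
```

Doing one innerHTML assignment instead of appending rows one at a time avoids a reflow per row.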

Then there are some related to JavaScript itself:

  • Loop as little as possible. If you have one function that collects DOM nodes, and another that processes them, you are looping twice. Instead, pass an anonymous function to the function that collects the nodes, and process the nodes as you are collecting them.
  • Use native functionality when possible. For example, forEach iterators.
  • Use setTimeout to let the browser breathe once in a while.
  • For expensive functions that have idempotent outputs, cache the results so that you don't have to recompute them.
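The last tip is just memoization. One way to sketch it generically (the `memoize` helper here is my own illustration, limited to functions of a single primitive argument):

```javascript
// Wrap a side-effect-free function so repeated calls with the same
// argument return the cached result instead of recomputing it.
function memoize(fn) {
  var cache = {};
  return function (arg) {
    if (!(arg in cache)) {       // object keys are strings, so this
      cache[arg] = fn(arg);      // only suits primitive arguments
    }
    return cache[arg];
  };
}

// Usage: the second call with the same input comes straight from the cache.
var slowSquare = function (n) { return n * n; }; // stand-in for an expensive calculation
var fastSquare = memoize(slowSquare);
```

For functions taking multiple or non-primitive arguments you would need a more careful cache key, but the idea is the same.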

There's some more on my blog (link above).


This is still a little bit bleeding edge, but Firefox 3.5 has these things called Web Workers; I'm not sure about their support in other browsers, though.

Mr. Resig has an article on them here: http://ejohn.org/blog/web-workers/

And the simulated annealing demo is probably the simplest example of them: you'll notice the spinning Firefox logo does not freeze up while the worker threads are doing their work (thus not freezing the browser).
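A minimal sketch of the idea, assuming a browser with Web Worker support (the file name `prime-worker.js` and the `countPrimes` calculation are invented for the example; workers cannot touch the DOM, so they only exchange messages with the page):

```javascript
// main page — hand the heavy calculation to a worker so the UI never freezes.
if (typeof Worker !== 'undefined') {
  var worker = new Worker('prime-worker.js'); // assumed file name
  worker.onmessage = function (e) {
    console.log('Primes found: ' + e.data);
  };
  worker.postMessage(100000); // kick off the calculation off the UI thread
}

// prime-worker.js — runs on its own thread inside the worker.
function countPrimes(limit) {
  var count = 0;
  for (var n = 2; n <= limit; n++) {
    var isPrime = true;
    for (var d = 2; d * d <= n; d++) {
      if (n % d === 0) { isPrime = false; break; }
    }
    if (isPrime) count++;
  }
  return count;
}

// Only wire up the message handler when actually running inside a worker.
if (typeof importScripts === 'function') {
  onmessage = function (e) { postMessage(countPrimes(e.data)); };
}
```

The point is that however long countPrimes takes, it never blocks the page's event loop, so no timeout-based chunking is needed at all.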