How to improve the performance of jQuery autocomplete

I was planning to use jQuery autocomplete for a site and have implemented a test version. I'm currently using an AJAX call to retrieve a new list of strings for every character typed. The problem is that it is rather slow: it takes about 1.5 s before the new list is populated. What is the best way to make the autocomplete fast? I'm using CakePHP and just doing a find with a limit of 10 items.


Solution 1:

This article about how Flickr does autocomplete is a very good read. I had a few "wow" moments while reading it.

"This widget downloads a list of all of your contacts, in JavaScript, in under 200ms (this is true even for members with 10,000+ contacts). In order to get this level of performance, we had to completely rethink how we send data from the server to the client."

Solution 2:

Try preloading your list instead of running the query on the fly.

Also, the autocomplete widget has a 300 ms delay by default; you could remove it:

$( ".selector" ).autocomplete({ delay: 0 });

Solution 3:

A 1.5-second response time is far too slow for an autocomplete service.

  1. First, optimize your query and database connections. Keep your database connection alive and use in-memory caching.
  2. If the service is heavily used, cache results on the server so identical queries are not re-fetched.
  3. Use a client-side cache (a JS object) to keep previous results on the client. If the user backspaces and retypes, results come from the front-end cache instead of the backend (see the sketch after this list).
  4. Regex filtering on the client side is not costly; give it a chance.
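A minimal sketch of such a front-end cache, assuming a hypothetical /search endpoint that takes a term parameter and returns a JSON array (this is essentially the caching pattern shown in the jQuery UI demos):

var cache = {};                       // term -> previously returned results
$(".selector").autocomplete({
    minLength: 2,
    source: function (request, response) {
        var term = request.term;
        if (term in cache) {
            response(cache[term]);    // serve from the client cache, no request
            return;
        }
        $.getJSON("/search", { term: term }, function (data) {
            cache[term] = data;       // remember the result for next time
            response(data);
        });
    }
});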

Solution 4:

Before optimizing anything, first analyze where the bottleneck is. Find out how long each step (input → request → DB query → response → display) takes. Perhaps the CakePHP implementation adds a delay so that it does not send a request for every character entered.
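For the client-side part, a rough way to measure the round trip is to timestamp the request and the response. A sketch, assuming the same kind of hypothetical /search JSON endpoint as above; the server-side share (the DB query itself) would need to be timed separately in CakePHP:

$(".selector").autocomplete({
    source: function (request, response) {
        var start = Date.now();                      // time the whole round trip
        $.getJSON("/search", { term: request.term }, function (data) {
            console.log("request took " + (Date.now() - start) + " ms");
            response(data);
        });
    }
});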

Solution 5:

A server side based on PHP/SQL is slow.

Don't use PHP/SQL. My autocomplete is written in C++ and uses hash tables for lookup. See the performance here.

This is on a Celeron-300 machine running FreeBSD with Apache/FastCGI.

As you can see, it runs quickly on huge dictionaries; 10,000,000 records is not a problem.

It also supports priorities, dynamic translations, and other features.
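The language matters less than the data structure: the point is that an in-memory prefix index answers each keystroke with a single hash lookup instead of a database query. A toy sketch of that general idea in JavaScript (the original answer's C++ implementation is not shown, and the function names here are illustrative):

// Build a hash table from prefix -> matching terms once, at startup.
function buildPrefixIndex(terms, maxPrefix) {
    var index = {};
    terms.forEach(function (term) {
        var key = term.toLowerCase();
        for (var len = 1; len <= Math.min(maxPrefix, key.length); len++) {
            var prefix = key.slice(0, len);
            (index[prefix] = index[prefix] || []).push(term);
        }
    });
    return index;
}

// Lookup is one hash access per keystroke, no SQL involved.
var index = buildPrefixIndex(["apple", "apricot", "banana"], 3);
console.log(index["ap"]);   // ["apple", "apricot"]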