How can I find files quicker than find or locate?

I have been using the find command to find files on my 1 TB hard disk, and it takes a very long time. I then switched to locate, which is much faster as long as its database is regularly refreshed with updatedb. The limitation of locate, however, is that I cannot search for files by size or by modification/creation time. Can you suggest any way to find files faster, or alternatively a way to pipe the output of locate so that the other information (size, timestamps, etc.) is displayed or redirected to a file?


I haven't seen an answer yet that comes close to what Chaitanya is asking for. If you only want to search on filename, a combination of locate, find, ls and grep can be sufficient. But I think Chaitanya wants to search for, say, 'all files created before 2011'. That can be done with find, but I can imagine it takes a long time on a 1 TB disk (the time depends more on the number of files than on the total size). To speed this up, some form of indexing is unavoidable. The problem with locate (which builds its index with updatedb) is that it doesn't index creation time.
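For the non-indexed case, here is a minimal sketch of what I mean (assuming GNU find/xargs; the `/data` path and the `*.log` pattern are just placeholders, and note that find's `-newermt` tests modification time, not creation time):

    # Show size and modification time for locate hits; -d '\n' keeps paths with spaces intact
    locate '*.log' | xargs -d '\n' ls -ld

    # Filter by attributes directly with find: files over 100 MB last modified during 2010,
    # printed as size<TAB>mtime<TAB>path (slower than locate, but no index required)
    find /data -type f -size +100M \
         -newermt '2010-01-01' ! -newermt '2011-01-01' \
         -printf '%s\t%t\t%p\n'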

So what Chaitanya needs is something that indexes the relevant attributes of files (file name, file size, creation date, perhaps more), and then something that can search on those attributes. As far as I know there is no out-of-the-box solution for this on Ubuntu.

An important comment from Chaitanya: "Now the thing is that I am designing a php based web gui...". Because the problem sounds quite specific, maybe you want to build something yourself. Some suggestions:

  • Index the files in a database (via a cron job) and use SQL to search (see the sketch after this list).

  • Use Lucene to index and search (Zend Lucene for PHP).
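To make the first suggestion concrete, here is a rough sketch, assuming sqlite3 is installed; `/data` and `/var/local/fileindex.db` are made-up paths, and a real version would also need to cope with file names containing tabs or newlines. The first part is what the cron job would run; the query at the end is what the PHP GUI could execute (e.g. via PDO) against the index instead of walking the disk:

    # --- indexing step (run from cron, e.g. nightly) ---
    # Dump path, size and mtime (epoch seconds) of every file into a TSV file
    find /data -type f -printf '%p\t%s\t%T@\n' > /tmp/fileindex.tsv

    # Load the TSV into an SQLite table, replacing the previous index
    sqlite3 /var/local/fileindex.db <<'EOF'
    DROP TABLE IF EXISTS files;
    CREATE TABLE files (path TEXT, size INTEGER, mtime REAL);
    .mode tabs
    .import /tmp/fileindex.tsv files
    EOF

    # --- query step (what the web GUI would run) ---
    # Example: files larger than 100 MB not modified since 2011-01-01
    sqlite3 /var/local/fileindex.db \
      "SELECT path, size, datetime(mtime, 'unixepoch') FROM files
       WHERE size > 100000000
         AND mtime < CAST(strftime('%s', '2011-01-01') AS INTEGER);"

Searching the database this way takes a fraction of a second even for millions of rows, at the cost of the index being only as fresh as the last cron run (the same trade-off locate makes with updatedb).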