SQL WHERE ID IN (id1, id2, ..., idn)

I need to write a query that retrieves rows for a big list of ids.

We support many backends (MySQL, Firebird, SQL Server, Oracle, PostgreSQL, ...), so I need to write standard SQL.

The id set could be big, and the query would be generated programmatically. So what is the best approach?

1) Writing a query using IN

SELECT * FROM TABLE WHERE ID IN (id1, id2, ..., idn)

My question here is: what happens if n is very big? And what about performance?

2) Writing a query using OR

SELECT * FROM TABLE WHERE ID = id1 OR ID = id2 OR ... OR ID = idn

I think this approach has no limit on n, but what about performance if n is very big?

3) Writing a programmatic solution:

  foreach (var id in myIdList)
  {
      var item = GetItemByQuery("SELECT * FROM TABLE WHERE ID = " + id);
      myObjectList.Add(item);
  }

We experienced some problems with this approach when the database server is queried over the network. Normally it is better to run one query that retrieves all the results than to make a lot of small queries. Maybe I'm wrong.

What would be a correct solution for this problem?


Option 1 is the only good solution.

Why?

  • Option 2 does the same thing, but you repeat the column name many times, and the SQL engine doesn't immediately know that you want to check whether the value is one of the values in a fixed list. A good SQL engine may still optimize it to perform the same as IN, but the readability problem remains.

  • Option 3 is simply horrible performance-wise. It sends a query on every loop iteration and hammers the database with small queries. It also prevents the engine from using any optimization for "value is one of those in a given list".
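
Since the query is generated programmatically, option 1 is best built with one bound parameter per id rather than by concatenating the values into the SQL text. Here is a minimal sketch, assuming an already-open IDbConnection, the TABLE/ID names from the question, and an "@" parameter prefix (the prefix and the row mapping vary per provider):

  // Sketch only: builds SELECT * FROM TABLE WHERE ID IN (@p0, @p1, ...)
  // with one parameter per id, so the ids are never spliced into the SQL text.
  using System.Collections.Generic;
  using System.Data;

  static List<object> GetItemsByIds(IDbConnection connection, IList<int> myIdList)
  {
      var items = new List<object>();
      using (var command = connection.CreateCommand())
      {
          // One placeholder per id: @p0, @p1, ...
          var placeholders = new List<string>();
          for (int i = 0; i < myIdList.Count; i++)
          {
              var parameter = command.CreateParameter();
              parameter.ParameterName = "@p" + i;
              parameter.Value = myIdList[i];
              command.Parameters.Add(parameter);
              placeholders.Add(parameter.ParameterName);
          }

          command.CommandText =
              "SELECT * FROM TABLE WHERE ID IN (" + string.Join(", ", placeholders) + ")";

          // Single round trip for the whole id set.
          using (var reader = command.ExecuteReader())
          {
              while (reader.Read())
              {
                  items.Add(reader["ID"]); // map the full row to your object here
              }
          }
      }
      return items;
  }

If n can grow past what a given backend accepts in an IN list or as bound parameters, the same helper can be called over chunks of myIdList, or you can switch to the table-based approach described below.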


An alternative approach might be to use another table to hold the id values. That table can then be inner joined to your TABLE to constrain the returned rows. This has the major advantage that you won't need dynamic SQL (problematic at the best of times), and you won't have an arbitrarily long IN clause.

You would truncate this other table, insert your large number of ids, and perhaps create an index to aid join performance. It also lets you decouple the accumulation of the ids from the retrieval of the data, perhaps giving you more options to tune performance.
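
A minimal sketch of this, using the same generic ADO.NET interfaces as above and a hypothetical permanent staging table named ID_LIST with a single (ideally indexed) ID column:

  // Sketch only: clear the staging table, load the ids into it, then join it
  // to TABLE so one query returns all matching rows. In practice the inserts
  // would be batched or bulk-loaded and wrapped in a transaction.
  using System.Collections.Generic;
  using System.Data;

  static IDataReader QueryViaIdTable(IDbConnection connection, IEnumerable<int> myIdList)
  {
      using (var clear = connection.CreateCommand())
      {
          clear.CommandText = "DELETE FROM ID_LIST"; // or TRUNCATE where supported
          clear.ExecuteNonQuery();
      }

      foreach (var id in myIdList)
      {
          using (var insert = connection.CreateCommand())
          {
              insert.CommandText = "INSERT INTO ID_LIST (ID) VALUES (@id)";
              var parameter = insert.CreateParameter();
              parameter.ParameterName = "@id";
              parameter.Value = id;
              insert.Parameters.Add(parameter);
              insert.ExecuteNonQuery();
          }
      }

      // The inner join constrains TABLE to exactly the ids loaded above.
      var select = connection.CreateCommand();
      select.CommandText =
          "SELECT t.* FROM TABLE t INNER JOIN ID_LIST l ON l.ID = t.ID";
      return select.ExecuteReader(); // caller disposes the reader
  }

The main gains over a generated IN clause are that there is no dynamic SQL and no practical limit on the number of ids; the cost is the extra work of loading and clearing the staging table.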

Update: Although you could use a temporary table, I did not mean to imply that you must or even should. A permanent table used for temporary data is a common solution, with merits beyond those described here.