Will SQLite performance degrade if the database size is greater than 2 gigabytes?
Last year, when I checked the SQLite web site, the recommended maximum database size was 2 gigabytes. But now I cannot find that recommendation anymore.
So has anyone tried working with an SQLite database bigger than 2 gigabytes using the latest version? How well did SQLite perform?
P.S.: I would like to make a mobile application that needs a big database working locally (for example, one storing Wikipedia articles).
There is no 2 GB limit.
SQLite database files have a maximum size of about 140 TB.
On a phone, the size of the storage (a few GB) will limit your database file size, while the memory size will limit how much data you can retrieve from a query. Furthermore, Android cursors have a limit of 1 MB for the results.
The database size will, by itself, not affect your performance. Your queries will be fast as long as they do not access more data than fits into the DB's page cache (2 MB by default).
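The page cache mentioned above can be inspected and tuned with PRAGMAs. Here is a minimal sketch using Python's built-in sqlite3 module; the 2 MB figure corresponds to SQLite's documented default of `cache_size = -2000` (a negative value means KiB), and the `-4000` value below is just an illustrative choice.

```python
import sqlite3

# Inspect the page size and the page-cache setting on a fresh database.
conn = sqlite3.connect(":memory:")
page_size = conn.execute("PRAGMA page_size").fetchone()[0]
cache_size = conn.execute("PRAGMA cache_size").fetchone()[0]
print("page size:", page_size, "cache size:", cache_size)

# A negative cache_size is interpreted as KiB; -4000 asks for roughly 4 MB.
conn.execute("PRAGMA cache_size = -4000")
new_cache = conn.execute("PRAGMA cache_size").fetchone()[0]
conn.close()
```

Raising the cache size this way lets more of a large database stay in memory between queries, which is often cheaper than restructuring the data.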
Usually, the larger the database, the more data it contains, and the more data you have, the longer searches may take. They don't have to, though; it depends on the search.
As for inserts, they may take longer if you have many indexes on a table. Maintaining an index takes time, so expect insert speed to degrade as the amount of data grows.
Updates may also be slower: the matching rows must be found first (a search), and then the values have to be changed (which may trigger index updates).
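The index overhead on inserts is easy to see directly. This sketch (table and index names are made up for the demonstration) inserts the same rows into a plain table and into a table carrying two indexes, and times both:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE plain (a INTEGER, b TEXT)")
conn.execute("CREATE TABLE indexed (a INTEGER, b TEXT)")
conn.execute("CREATE INDEX idx_a ON indexed(a)")
conn.execute("CREATE INDEX idx_b ON indexed(b)")

rows = [(i, "value-%d" % i) for i in range(50_000)]

t0 = time.perf_counter()
with conn:  # one transaction for the whole batch
    conn.executemany("INSERT INTO plain VALUES (?, ?)", rows)
t_plain = time.perf_counter() - t0

t0 = time.perf_counter()
with conn:
    conn.executemany("INSERT INTO indexed VALUES (?, ?)", rows)
t_indexed = time.perf_counter() - t0

print("no indexes: %.3fs, two indexes: %.3fs" % (t_plain, t_indexed))
```

On most machines the indexed table takes noticeably longer to fill, since each insert also has to update both B-tree indexes.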
I am telling you this from experience: if you expect a lot of data in your database, consider splitting it into multiple databases. This works well if your data is gathered daily, so you can create one database per day. It may make your search code more complex, but it will speed up limited searches.
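The per-day split can still be queried as one logical database using SQLite's `ATTACH DATABASE`. A minimal sketch, with in-memory databases standing in for one file per day (the schema and names are illustrative):

```python
import sqlite3

# "Today's" database, plus a second database attached under an alias.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS yesterday")

conn.execute("CREATE TABLE events (ts TEXT, msg TEXT)")
conn.execute("CREATE TABLE yesterday.events (ts TEXT, msg TEXT)")
conn.execute("INSERT INTO events VALUES ('2024-01-02', 'b')")
conn.execute("INSERT INTO yesterday.events VALUES ('2024-01-01', 'a')")

# A limited search touches only one (small) database; a broader search
# unions the attached databases together.
rows = conn.execute(
    "SELECT ts, msg FROM events "
    "UNION ALL SELECT ts, msg FROM yesterday.events ORDER BY ts"
).fetchall()
print(rows)
```

In practice each attached database would be a separate file (e.g. one per day), so most queries open only the file they need.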