Convert SQLite SQL dump file to PostgreSQL

I've been doing development using an SQLite database, with production running PostgreSQL. I just updated my local database with a huge amount of data and need to transfer a specific table to the production database.

Based on running sqlite3 database .dump > /the/path/to/sqlite-dumpfile.sql, SQLite outputs a table dump in the following format:

BEGIN TRANSACTION;
CREATE TABLE "courses_school" ("id" integer PRIMARY KEY, "department_count" integer NOT NULL DEFAULT 0, "the_id" integer UNIQUE, "school_name" varchar(150), "slug" varchar(50));
INSERT INTO "courses_school" VALUES(1,168,213,'TEST Name A',NULL);
INSERT INTO "courses_school" VALUES(2,0,656,'TEST Name B',NULL);
....
COMMIT;

How do I convert the above into a PostgreSQL-compatible dump file that I can import into my production server?


Solution 1:

You should be able to feed that dump file straight into psql:

/path/to/psql -d database -U username -W < /the/path/to/sqlite-dumpfile.sql

If you want the id column to "auto increment", change its type from integer to serial in the table creation line. PostgreSQL will then attach a sequence to that column so that INSERTs with NULL ids are automatically assigned the next available value. PostgreSQL also does not recognize the AUTOINCREMENT keyword, so it needs to be removed.
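
For example, the table from the question's dump would become the following (only the id column's type changes; the setval call at the end is an optional fix-up, needed because the dump INSERTs explicit ids that don't advance the new sequence):

CREATE TABLE "courses_school" (
    "id" serial PRIMARY KEY,
    "department_count" integer NOT NULL DEFAULT 0,
    "the_id" integer UNIQUE,
    "school_name" varchar(150),
    "slug" varchar(50)
);

-- after the INSERTs, bump the sequence past the highest imported id
SELECT setval(pg_get_serial_sequence('courses_school', 'id'),
              (SELECT max("id") FROM "courses_school"));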

You'll also want to check for datetime columns in the SQLite schema and change them to timestamp for PostgreSQL. (Thanks to Clay for pointing this out.)
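
One way to do the rename is a quick sed pass over the dump; a rough sketch, using the path from the question (it blindly rewrites every occurrence, including any inside string data, so check the result):

sed -e 's/datetime/timestamp/g' /the/path/to/sqlite-dumpfile.sql > postgres-dumpfile.sql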

If you have booleans in your SQLite then you could convert 1 and 0 to 1::boolean and 0::boolean (respectively), or you could change the boolean column to an integer in the schema section of the dump and then fix the values up by hand inside PostgreSQL after the import.
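
If you take the integer-column route, the fix-up inside PostgreSQL can be a single ALTER; a minimal sketch, using a hypothetical is_active column:

-- is_active is a hypothetical integer column holding 0/1 flags
ALTER TABLE courses_school
    ALTER COLUMN is_active TYPE boolean
    USING is_active::boolean;  -- 0 becomes false, non-zero becomes true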

If you have BLOBs in your SQLite then you'll want to adjust the schema to use bytea. You'll probably need to mix in some decode calls as well. Writing a quick'n'dirty copier in your favorite language might be easier than mangling the SQL, though, if you have a lot of BLOBs to deal with.
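
For context, SQLite's .dump writes BLOBs as X'...' hex literals, which PostgreSQL parses as bit strings rather than bytea; a rough sed sketch that rewrites them into decode calls (naive about quoting, so verify the output):

sed -e "s/X'\([0-9A-Fa-f]*\)'/decode('\1', 'hex')/g" /the/path/to/sqlite-dumpfile.sql > with-bytea.sql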

As usual, if you have foreign keys then you'll probably want to look into SET CONSTRAINTS ALL DEFERRED to avoid insert ordering problems, placing the command inside the BEGIN/COMMIT pair.
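
A sketch of how that looks wrapped around the dump's INSERTs (note it only affects constraints declared DEFERRABLE):

BEGIN;
SET CONSTRAINTS ALL DEFERRED;  -- FK checks wait until COMMIT, so insert order stops mattering
INSERT INTO "courses_school" VALUES(1,168,213,'TEST Name A',NULL);
-- ... remaining INSERTs ...
COMMIT;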

Thanks to Nicolas Riley for the boolean, blob, and constraints notes.

If you have backticks (`) around identifiers in your code, as generated by some SQLite3 clients, you need to remove them.
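
A one-liner sketch for that (if the identifiers actually need quoting in PostgreSQL, substitute double quotes instead of deleting):

sed -e 's/`//g' /the/path/to/sqlite-dumpfile.sql > no-backticks.sql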

PostgreSQL also doesn't recognize unsigned columns, so you might want to drop that modifier or add a custom CHECK constraint such as this (note >= 0, since unsigned allows zero):

CREATE TABLE tablename (
    ...
    unsigned_column_name integer CHECK (unsigned_column_name >= 0)
);

While SQLite defaults null values to '', PostgreSQL requires them to be written as an explicit NULL.
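
If any rows came through with '' where you wanted NULL, they can be patched after the import; a sketch against a hypothetical nullable column:

-- school_name here stands in for whichever column should hold NULLs
UPDATE courses_school SET school_name = NULL WHERE school_name = '';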

The syntax in the SQLite dump file appears to be mostly compatible with PostgreSQL so you can patch a few things and feed it to psql. Importing a big pile of data through SQL INSERTs might take a while but it'll work.

Solution 2:

pgloader

I came across this post when searching for a way to convert an SQLite dump to PostgreSQL. Even though this post has an accepted answer (and a good one at that, +1), I think this is worth adding.

I started looking into the solutions here and realized that I was looking for a more automated method. I looked up the wiki docs:

https://wiki.postgresql.org/wiki/Converting_from_other_Databases_to_PostgreSQL

and discovered pgloader. Pretty cool application, and it's relatively easy to use: it converts the flat SQLite file into a usable PostgreSQL database. I installed it from the *.deb and created a command file like this in a test directory:

load database
    from 'db.sqlite3'
    into postgresql:///testdb

with include drop, create tables, create indexes, reset sequences

set work_mem to '16MB', maintenance_work_mem to '512 MB';

as the docs state. I then created a testdb with createdb:

createdb testdb

I ran pgloader against that command file like this:

pgloader command

and then connected to the new database:

psql testdb

After some queries to check the data, it appears to have worked quite well. I know that if I had tried to run one of these scripts or do the stepwise conversion mentioned herein, I would have spent much more time.

To prove the concept, I dumped this testdb and imported it into a development environment on a production server, and the data transferred over nicely.
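
For anyone repeating that last step, the dump-and-restore is just the standard pg_dump/psql pair; a sketch with placeholder database names:

pg_dump testdb > testdb-dump.sql
createdb devdb
psql -d devdb < testdb-dump.sql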