Bulk INSERT in Postgres in Go using pgx
You are crafting the SQL statement by hand, which is fine, but you are not leveraging pgx, which can help with this (see below).
Appending to the SQL string like this is inefficient for large inputs, since each call allocates a new string and copies the old one:
dollars = fmt.Sprintf("%s($%d, $%d),", dollars, count, count+1)
Note also that the final value leaves a trailing , which is a syntax error; the statement should simply end after the last value group (a terminating ; is optional when sending a single statement).
BTW this string truncation line is redundant:
sqlStr = sqlStr[0 : len(sqlStr)-1] // this is a NOOP
Anyway, it is better to use something more performant like strings.Builder when crafting long strings, since it grows a single buffer instead of copying the string on every append.
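As a sketch of that approach, here is one way to build the placeholder list with strings.Builder (the helper name buildInsert and the table/column names are just illustrative, not from your code):

```go
package main

import (
	"fmt"
	"strings"
)

// buildInsert constructs a multi-row INSERT statement with numbered
// placeholders ($1, $2, ...) for rows of a fixed column count.
func buildInsert(table string, cols []string, numRows int) string {
	var b strings.Builder
	b.WriteString("INSERT INTO " + table + " (" + strings.Join(cols, ", ") + ") VALUES ")
	n := 1 // next placeholder number
	for r := 0; r < numRows; r++ {
		if r > 0 {
			b.WriteString(", ") // separate row groups, no trailing comma
		}
		b.WriteByte('(')
		for c := range cols {
			if c > 0 {
				b.WriteString(", ")
			}
			fmt.Fprintf(&b, "$%d", n)
			n++
		}
		b.WriteByte(')')
	}
	b.WriteByte(';') // terminate the statement instead of leaving a trailing comma
	return b.String()
}

func main() {
	// → INSERT INTO ledger (description, amount) VALUES ($1, $2), ($3, $4);
	fmt.Println(buildInsert("ledger", []string{"description", "amount"}, 2))
}
```

You would then pass the flattened argument slice alongside this string to conn.Exec.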
From the pgx docs, use pgx.Conn.CopyFrom:
func (c *Conn) CopyFrom(tableName Identifier, columnNames []string, rowSrc CopyFromSource) (int, error)
CopyFrom uses the PostgreSQL copy protocol to perform bulk data insertion. It returns the number of rows copied and an error.
(Note that in pgx v4 and later, CopyFrom also takes a context.Context as its first argument and returns int64.)
Example usage of CopyFrom:
rows := [][]interface{}{
	{"John", "Smith", int32(36)},
	{"Jane", "Doe", int32(29)},
}

copyCount, err := conn.CopyFrom(
	pgx.Identifier{"people"},
	[]string{"first_name", "last_name", "age"},
	pgx.CopyFromRows(rows),
)
if err != nil {
	// handle the error; copyCount reports how many rows were copied
}
Alternatively, use pgx.Batch to queue multiple statements and send them in a single round trip (https://github.com/jackc/pgx/blob/master/batch_test.go):
batch := &pgx.Batch{}
batch.Queue("insert into ledger(description, amount) values($1, $2)", "q1", 1)
batch.Queue("insert into ledger(description, amount) values($1, $2)", "q2", 2)
br := conn.SendBatch(context.Background(), batch)
// drain one result per queued statement with br.Exec(), then release
// the connection with br.Close(), checking the errors from both
defer br.Close()