Writing a CSV file into a SQL Server database using Python

Solution 1:

Consider building the query dynamically so the number of placeholders always matches the number of columns in your table and CSV file. Then it's just a matter of ensuring the table and CSV file are correct, instead of counting the ? placeholders you typed in your code.

The following example assumes

  1. CSV file contains column names in the first line
  2. Connection is already built
  3. File name is test.csv
  4. Table name is MyTable
  5. Python 3
...
with open('test.csv', 'r', newline='') as f:
    reader = csv.reader(f)
    columns = next(reader)  # first line holds the column names
    query = 'insert into MyTable({0}) values ({1})'
    query = query.format(','.join(columns), ','.join('?' * len(columns)))
    cursor = connection.cursor()
    for data in reader:
        cursor.execute(query, data)
    cursor.commit()
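If the file is large, inserting row by row can be slow. pyodbc also offers executemany(), and (in pyodbc 4.0.19 and later) the fast_executemany flag to batch the round trips; a minimal sketch under the same assumptions as above:

import csv

with open('test.csv', 'r', newline='') as f:
    reader = csv.reader(f)
    columns = next(reader)  # first line holds the column names
    query = 'insert into MyTable({0}) values ({1})'
    query = query.format(','.join(columns), ','.join('?' * len(columns)))
    cursor = connection.cursor()
    cursor.fast_executemany = True  # pyodbc >= 4.0.19; sends rows in batches
    cursor.executemany(query, list(reader))
    cursor.commit()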

If column names are not included in the file:

...
with open('test.csv', 'r', newline='') as f:
    reader = csv.reader(f)
    data = next(reader)  # first data row; use it to count the columns
    query = 'insert into MyTable values ({0})'
    query = query.format(','.join('?' * len(data)))
    cursor = connection.cursor()
    cursor.execute(query, data)  # insert the row used for counting
    for data in reader:
        cursor.execute(query, data)
    cursor.commit()
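If you would rather not rely on the first data row for the column count, pyodbc can also read the column list from the table's own metadata; a minimal sketch, assuming a pyodbc connection:

cursor = connection.cursor()
# cursor.columns() returns one metadata row per column of the table
table_columns = [col.column_name for col in cursor.columns(table='MyTable')]
query = 'insert into MyTable values ({0})'
query = query.format(','.join('?' * len(table_columns)))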

Solution 2:

I modified Brian's code above as follows, since the version posted wouldn't work on the delimited files I was trying to upload. The line row.pop() can be ignored; it was only necessary for the particular set of files I was uploading.

def upload_table(path, filename, delim, cursor):
    """
    Upload a delimited flat file to SQL Server.
    The target table name is taken from the file name.
    """
    tbl = filename.split('.')[0]
    cnt = 0
    with open(path + filename, 'r', newline='') as f:
        reader = csv.reader(f, delimiter=delim)
        for row in reader:
            row.pop()  # drop the trailing field; can be commented out
            row = ['NULL' if val == '' else val for val in row]
            row = [x.replace("'", "''") for x in row]  # escape single quotes
            out = "'" + "', '".join(str(item) for item in row) + "'"
            out = out.replace("'NULL'", 'NULL')  # unquote the NULL markers
            query = "INSERT INTO " + tbl + " VALUES (" + out + ")"
            cursor.execute(query)
            cnt += 1
            if cnt % 10000 == 0:  # commit in batches of 10,000 rows
                cursor.commit()
        cursor.commit()
    print("Uploaded " + str(cnt) + " rows into table " + tbl + ".")

Solution 3:

You can pass the row values as individual arguments. For example:

for row in csv_data: # iterate through the CSV rows
    cur.execute("INSERT INTO MyTable(Col1,Col2,Col3,Col4) VALUES (?,?,?,?)", *row)
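For context, a minimal end-to-end sketch around that loop; the connection string, file name, and four-column table layout are assumptions, and pyodbc accepts the parameters either as one sequence or unpacked as separate arguments:

import csv
import pyodbc

# Connection details are assumptions; adjust driver, server, and database.
conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};'
                      'SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;')
cur = conn.cursor()

with open('data.csv', 'r', newline='') as f:
    csv_data = csv.reader(f)
    next(csv_data)  # skip the header row if the file has one
    for row in csv_data:
        cur.execute("INSERT INTO MyTable(Col1,Col2,Col3,Col4) VALUES (?,?,?,?)", *row)

conn.commit()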

Solution 4:

If you are using MySqlHook in Airflow and cursor.execute() with parameters throws the error

TypeError: not all arguments converted during string formatting

use %s instead of ? as the placeholder; MySQL drivers use the format paramstyle rather than qmark.

with open('/usr/local/airflow/files/ifsc_details.csv', 'r', newline='') as csv_file:
    csv_reader = csv.reader(csv_file)
    columns = next(csv_reader)  # first line holds the column names
    query = '''insert into ifsc_details({0}) values({1});'''
    query = query.format(','.join(columns), ','.join(['%s'] * len(columns)))
    mysql = MySqlHook(mysql_conn_id='local_mysql')
    conn = mysql.get_conn()
    cursor = conn.cursor()
    for data in csv_reader:
        cursor.execute(query, data)
    conn.commit()  # commit on the connection; MySQL cursors have no commit()
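Alternatively, MySqlHook inherits insert_rows() from Airflow's DbApiHook, which builds the INSERT statement and commits in batches for you. A minimal sketch under the same assumptions (the import path varies across Airflow versions):

import csv
from airflow.providers.mysql.hooks.mysql import MySqlHook  # older: airflow.hooks.mysql_hook

with open('/usr/local/airflow/files/ifsc_details.csv', 'r', newline='') as csv_file:
    csv_reader = csv.reader(csv_file)
    columns = next(csv_reader)  # first line holds the column names
    mysql = MySqlHook(mysql_conn_id='local_mysql')
    # insert_rows builds the INSERT and commits every 1000 rows by default
    mysql.insert_rows('ifsc_details', csv_reader, target_fields=columns)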