Datetime field overflow when uploading csv file to SQL Server Database using BCP
I have been trying for days to fix this problem. I have a pandas DataFrame which I export to a CSV file like this:
df.to_csv(csv_name, index=False, header=False, encoding='utf-8-sig', sep='\t')
I need to upload this CSV file to a SQL Server database using BCP (character mode -c, UTF-8 code page -C65001, tab as the field terminator -t), with this command:
bcp [DB].[dbo].[Table] in file.csv -Sserver -Uuser -Ppass -c -C65001 -t "\t" -e error.log
Every time I run it, I get this error:
Error = [Microsoft][ODBC Driver 17 for SQL Server]Datetime field overflow. Fractional second precision exceeds the scale specified in the parameter binding.
When looking at error.log, I see there is a problem with the date value I want to upload. In the table definition, the column's type is smalldatetime. The actual value I want to upload is "2020-05-30 12:55:22".
I am only exporting a one-row DataFrame to CSV.
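For reference, this is a minimal sketch of the export, reduced to the date column (the column name here is hypothetical; the timestamp is the one that ends up in the CSV):
import pandas as pd

# Hypothetical one-row frame, reduced to the date column.
df = pd.DataFrame({'column8': ['2020-05-30T19:55:42.000Z']})
df.to_csv('file.csv', index=False, header=False, encoding='utf-8-sig', sep='\t')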
This is the error message I receive (which also shows the content of the CSV file):
@ Row 1, Column 8: Datetime field overflow. Fractional second precision exceeds the scale specified in the parameter binding. @#
165405677 156147965 1135358 1425879 3.5 "Nice value 🇦🇷Bordeaux Blend 🍇👍🏼
Deep purple color. Blackberry plum tobacco and oaky with hints of pepper. Full-bodied dry and slightly spicy taste.
Let it breathe for at least 1 hr
Pairs well with pecorino and salami crostini😋🍷
#uncorkingArgentina" en 2020-05-30T19:55:42.000Z True 2 444897692 67 8 "49 292 320 334 422" "'blackberry' 'oaky' 'pepper' 'plum' 'tobacco'" 3353 0 Completed 89036 3.7 536982,90,Normal,71,3.6,484,26,,,"Nice value 🇦🇷Bordeaux Blend 🍇👍🏼
Deep purple color. Blackberry, plum, tobacco and oaky with hints of pepper. Full-bodied, dry and slightly spicy taste.
Let it breathe for at least 1 hr
Pairs well with pecorino and salami crostini😋🍷
#uncorkingArgentina",False,True
This is my table definition in SQL Server:
CREATE TABLE [Table] (
[column1] bigint PRIMARY KEY,
[column2] bigint,
[column3] bigint,
[column4] bigint,
[column5] float(53),
[column6] nvarchar(1500) COLLATE Latin1_General_100_CI_AI_SC_UTF8,
[column7] nvarchar(5) COLLATE Latin1_General_100_CI_AI_SC_UTF8,
[column8] smalldatetime,
[column9] bit,
[column10] nvarchar(50) COLLATE Latin1_General_100_CI_AI_SC_UTF8,
[column11] bigint,
[column12] integer,
[column13] integer,
[column14] nvarchar(150) COLLATE Latin1_General_100_CI_AI_SC_UTF8,
[column15] nvarchar(500) COLLATE Latin1_General_100_CI_AI_SC_UTF8,
[column16] bigint,
[column17] integer,
[column18] nvarchar(50) COLLATE Latin1_General_100_CI_AI_SC_UTF8,
[column19] integer,
[column20] float(53),
[column21] integer,
[column22] integer,
[column23] nvarchar(60) COLLATE Latin1_General_100_CI_AI_SC_UTF8,
[column24] integer,
[column25] float(53),
[column26] integer,
[column27] integer,
[column28] nvarchar(255) COLLATE Latin1_General_100_CI_AI_SC_UTF8,
[column29] nvarchar(255) COLLATE Latin1_General_100_CI_AI_SC_UTF8,
[column30] nvarchar(1500) COLLATE Latin1_General_100_CI_AI_SC_UTF8,
[column31] bit,
[column32] bit
)
GO
I tried uploading other rows, but I always have a problem with the date value in each of them, no matter what the rest of the row contains.
I have tried everything to fix it. What do you think is the reason for this problem? Thank you!
Load the data into a staging table where all the columns are strings.
Then you can look at the data and determine where the conversion error is happening, using logic such as:
select s.*
from staging s
where try_convert(smalldatetime, column8) is null and column8 is not null;
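To make that concrete, here is a sketch of the round trip; the staging table name and its single-column shape are assumptions (the real staging table would mirror all 32 columns as strings). Converting through datetimeoffset should handle ISO 8601 values such as the 2020-05-30T19:55:42.000Z visible in the error log, which a direct conversion to smalldatetime rejects:
-- Staging table: every column lands as a string, so bcp cannot fail on types.
-- Only the date column is shown; the real table would mirror all 32 columns.
CREATE TABLE staging (
    [column8] nvarchar(50)
);

-- The value from the error log cannot be converted directly...
SELECT TRY_CONVERT(smalldatetime, '2020-05-30T19:55:42.000Z');  -- NULL
-- ...but datetimeoffset accepts the fractional seconds and the Z suffix:
SELECT TRY_CONVERT(smalldatetime, TRY_CONVERT(datetimeoffset, '2020-05-30T19:55:42.000Z'));  -- 2020-05-30 19:56:00

-- After bcp loads file.csv into staging, move the clean rows across:
INSERT INTO [Table] ([column8] /* , remaining columns */)
SELECT TRY_CONVERT(smalldatetime, TRY_CONVERT(datetimeoffset, s.[column8])) /* , ... */
FROM staging s;
Alternatively, if it is easier to fix on the export side, formatting the column in pandas before df.to_csv (for example with pd.to_datetime(...).dt.strftime('%Y-%m-%d %H:%M:%S')) should produce a value that smalldatetime accepts.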