Replacing Pandas or NumPy NaN with None to use with MySQLdb
@bogatron has it right, you can use where; it's worth noting that you can do this natively in pandas:
df1 = df.where(pd.notnull(df), None)
Note: this changes the dtype of all columns to object.
Example:
In [1]: import numpy as np
In [2]: import pandas as pd
In [3]: df = pd.DataFrame([1, np.nan])
In [4]: df
Out[4]:
     0
0    1
1  NaN
In [5]: df1 = df.where(pd.notnull(df), None)
In [6]: df1
Out[6]:
      0
0     1
1  None
Note: what you cannot do is recast the DataFrame's dtype to allow all datatypes (using astype) and then use the DataFrame fillna method:
df1 = df.astype(object).replace(np.nan, 'None')
Unfortunately neither this, nor using replace, works with None; see this (closed) issue.
As an aside, it's worth noting that for most use cases you don't need to replace NaN with None, see this question about the difference between NaN and None in pandas.
However, in this specific case it seems you do (at least at the time of this answer).
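Since the point of all this is to get SQL NULLs into MySQL, here is a minimal sketch of how the None values are then consumed by the DB-API driver; the connection parameters and the table/column names (my_table, col0) are made up, so adjust them for your schema:
import MySQLdb

# Hypothetical connection details -- replace with your own.
conn = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="mydb")
cur = conn.cursor()

# df1 is the frame from above (NaN already replaced by None).
# Convert numpy scalars to plain Python floats; None stays None and is
# sent to MySQL as NULL by the driver.
rows = [
    tuple(None if v is None else float(v) for v in row)
    for row in df1.itertuples(index=False, name=None)
]
cur.executemany("INSERT INTO my_table (col0) VALUES (%s)", rows)
conn.commit()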
df = df.replace({np.nan: None})
Note: this changes the dtype of all affected columns to object.
Credit goes to this comment on the relevant GitHub issue.
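For illustration (a minimal sketch with made-up column names), you can check that the affected columns end up with object dtype and hold real None values:
import numpy as np
import pandas as pd

df = pd.DataFrame({"a": [1.0, np.nan], "b": ["x", "y"]})
out = df.replace({np.nan: None})

print(out.dtypes)           # "a" is now object, "b" was object already
print(out["a"].tolist())    # [1.0, None]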
You can replace nan with None in your numpy array:
>>> import numpy as np
>>> x = np.array([1, np.nan, 3])
>>> y = np.where(np.isnan(x), None, x)
>>> print(y)
[1.0 None 3.0]
>>> print(type(y[1]))
<class 'NoneType'>
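Note that y is an object array; a quick way to get plain Python values out of it (handy e.g. for DB-API query parameters) is tolist():
>>> y.dtype
dtype('O')
>>> y.tolist()
[1.0, None, 3.0]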
After stumbling around, this worked for me:
df = df.astype(object).where(pd.notnull(df), None)
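A quick sanity check (sketch, reusing the small frame from the earlier examples) that this keeps real None values in an object column:
import numpy as np
import pandas as pd

df = pd.DataFrame([1, np.nan])
df2 = df.astype(object).where(pd.notnull(df), None)
print(df2[0].tolist())         # [1.0, None]
print(type(df2.iloc[1, 0]))    # <class 'NoneType'>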
Another addition: be careful when replacing multiple values and converting the type of the column back from object to float. If you want to be certain that your None values won't flip back to np.nan, apply @andy-hayden's suggestion of using pd.where.
Illustration of how replace can still go 'wrong':
In [1]: import pandas as pd
In [2]: import numpy as np
In [3]: df = pd.DataFrame({"a": [1, np.NAN, np.inf]})
In [4]: df
Out[4]:
     a
0  1.0
1  NaN
2  inf
In [5]: df.replace({np.NAN: None})
Out[5]:
      a
0     1
1  None
2   inf
In [6]: df.replace({np.NAN: None, np.inf: None})
Out[6]:
     a
0  1.0
1  NaN
2  NaN
In [7]: df.where((pd.notnull(df)), None).replace({np.inf: None})
Out[7]:
     a
0  1.0
1  NaN
2  NaN
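And a short sketch of the "flip back" mentioned above: casting an object column that contains None back to float silently turns the None values into NaN again:
s = pd.Series([1.0, None], dtype=object)
print(s.tolist())                  # [1.0, None]
print(s.astype(float).tolist())    # [1.0, nan] -- the None is NaN again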