Python JSON serialize a Decimal object
Simplejson 2.1 and higher have native support for the Decimal type:
>>> import simplejson as json
>>> from decimal import Decimal
>>> json.dumps(Decimal('3.9'), use_decimal=True)
'3.9'
Note that use_decimal is True by default:
def dumps(obj, skipkeys=False, ensure_ascii=True, check_circular=True,
          allow_nan=True, cls=None, indent=None, separators=None,
          encoding='utf-8', default=None, use_decimal=True,
          namedtuple_as_object=True, tuple_as_array=True,
          bigint_as_string=False, sort_keys=False, item_sort_key=None,
          for_json=False, ignore_nan=False, **kw):
So:
>>> json.dumps(Decimal('3.9'))
'3.9'
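The decoding side works the same way. A quick sketch, assuming simplejson 2.1+ is installed (whether loads enables use_decimal by default may vary by version, so it is passed explicitly here):
import simplejson as json
from decimal import Decimal

# Round trip: dumps keeps the exact digits, and loads(use_decimal=True)
# parses JSON numbers back into Decimal instead of float.
s = json.dumps({'price': Decimal('3.9')})   # '{"price": 3.9}'
data = json.loads(s, use_decimal=True)
assert data['price'] == Decimal('3.9')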
Hopefully, this feature will be included in the standard library.
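Until then, the standard library can at least go the other way: json.loads accepts a parse_float hook, so incoming numbers can be turned into Decimal without a float detour (the encoding side still needs a custom encoder, as the answers below show):
import decimal
import json

# parse_float tells the stdlib decoder what to build from the number's
# source text, so the value never passes through a binary float.
data = json.loads('{"price": 3.9}', parse_float=decimal.Decimal)
print(repr(data['price']))  # Decimal('3.9')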
I would like to let everyone know that I tried Michał Marczyk's answer on my web server, which was running Python 2.6.5, and it worked fine. However, after I upgraded to Python 2.7 it stopped working. I tried to think of some way to encode Decimal objects, and this is what I came up with:
import decimal
import json

class DecimalEncoder(json.JSONEncoder):
    def default(self, o):
        if isinstance(o, decimal.Decimal):
            # Serialize Decimal as its exact string representation
            return str(o)
        return super(DecimalEncoder, self).default(o)
Note that this will convert the decimal to its string representation (e.g. "1.2300") in order to (a) not lose significant digits and (b) prevent rounding errors.
This should hopefully help anyone who is having problems with Python 2.7. I tested it and it seems to work fine. If anyone notices any bugs in my solution or comes up with a better way, please let me know.
Usage example:
json.dumps({'x': decimal.Decimal('5.5')}, cls=DecimalEncoder)
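With this encoder the value comes out as a quoted JSON string, '{"x": "5.5"}'. As a quick illustration of why the string form is used rather than casting to float (illustrative value, plain standard-library behaviour):
import decimal
import json

d = decimal.Decimal('1.2300')

print(json.dumps(str(d)))    # "1.2300"  -- every significant digit kept
print(json.dumps(float(d)))  # 1.23      -- trailing zeros are lost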
How about subclassing json.JSONEncoder?
import decimal
import json

class DecimalEncoder(json.JSONEncoder):
    def _iterencode(self, o, markers=None):
        if isinstance(o, decimal.Decimal):
            # wanted a simple yield str(o) in the next line,
            # but that would mean a yield on the line with super(...),
            # which wouldn't work (see my comment below), so...
            return (str(o) for o in [o])
        return super(DecimalEncoder, self)._iterencode(o, markers)
Then use it like so:
json.dumps({'x': decimal.Decimal('5.5')}, cls=DecimalEncoder)
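Because the generator's output is written straight into the JSON stream, the Decimal ends up as a bare number rather than a quoted string. Note that this hooks a private method that exists in the Python 2.6 json module; on Python 2.7 and later the override no longer takes effect (which is why it "stopped working" in the answer above). On 2.6 the call should give roughly:
>>> json.dumps({'x': decimal.Decimal('5.5')}, cls=DecimalEncoder)
'{"x": 5.5}'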