python dump sqlalchemy json array result into json file
A Postgres query returns a JSON array and I am trying to dump it into a JSON file.
query = f"""
SELECT array_to_json(array_agg(results))
FROM
(SELECT * from lookup_brands limit 1000) as results
"""
json_results = exec_query(self.conn_string, query)
print(json_results) # outputs <sqlalchemy.engine.cursor.LegacyCursorResult object at 0x7f30d64bd8b0>
print(type(json_results)) # outputs <class 'sqlalchemy.engine.cursor.LegacyCursorResult'>
I tried writing json_results to a JSON file like:
import json

with open(output_path, 'w') as outfile:
    json.dump(json_results, outfile)
And I get this error, which is expected:
Object of type LegacyCursorResult is not JSON serializable
How can I dump this JSON array result to a file?
Solution 1:
SQLAlchemy's LegacyCursorResult object is an iterator over the result set, and the values in each row are converted to their Python equivalents. You need to build a JSON-serialisable object from the results yourself, for example:
with engine.connect() as conn:
    result = conn.execute(sql)
    print(json.dumps([dict(row) for row in result.mappings()]))
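To write the result to a file instead of printing it, a minimal sketch along the same lines might look like this. The connection URL, output_path and the exact query are assumptions standing in for the asker's own setup:

import json

from sqlalchemy import create_engine, text

# Hypothetical connection URL and output path, for illustration only.
engine = create_engine("postgresql+psycopg2://user:pass@localhost/dbname")
output_path = "brands.json"

sql = text("SELECT * FROM lookup_brands LIMIT 1000")

with engine.connect() as conn:
    result = conn.execute(sql)
    # Each RowMapping becomes a plain dict, so the list is serialisable
    # (default=str below covers column types such as dates or Decimals
    # that the json module cannot handle natively).
    rows = [dict(row) for row in result.mappings()]

with open(output_path, "w") as outfile:
    json.dump(rows, outfile, default=str)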
This seems wasteful, as PostgreSQL has already serialised the data. We can get around that by casting the query result to text on the database side, so that we receive it as a valid JSON string:
sql = """\
SELECT array_to_json(array_agg(results))::text AS data
FROM
(SELECT * FROM t70692551) AS results
"""
with engine.connect() as conn:
    result = conn.execute(sql)
    print(result.scalar_one())
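Because the database has already produced the JSON text, it can be written straight to the file without involving the json module at all. A minimal sketch under the same assumptions (hypothetical connection URL and output path, with the lookup_brands table from the question):

from sqlalchemy import create_engine, text

# Hypothetical connection URL and output path, for illustration only.
engine = create_engine("postgresql+psycopg2://user:pass@localhost/dbname")
output_path = "brands.json"

sql = text("""
    SELECT array_to_json(array_agg(results))::text AS data
    FROM (SELECT * FROM lookup_brands LIMIT 1000) AS results
""")

with engine.connect() as conn:
    # scalar_one() returns the single text column: the JSON string
    # already serialised by Postgres.
    json_text = conn.execute(sql).scalar_one()

with open(output_path, "w") as outfile:
    outfile.write(json_text)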