psycopg2 insert python dictionary as json

cur.execute("INSERT INTO product(store_id, url, price, charecteristics, color, dimensions) VALUES (%s, %s, %s, %s, %s, %s)", (1,  'http://www.google.com', '$20', json.dumps(thedictionary), 'red', '8.5x11'))

That will solve your problem. However, you really should be storing keys and values in their own separate columns. To retrieve the dictionary, do:

cur.execute('select charecteristics from product where store_id = 1')
dictionary = json.loads(cur.fetchone()[0])
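Put together, a minimal round trip could look like this (a sketch; the connection string is a placeholder and the table is the product table from the question):

import json
import psycopg2

thedictionary = {'price money': '$1', 'name': 'Google'}

conn = psycopg2.connect("dbname=test user=postgres")  # placeholder connection string
cur = conn.cursor()

# json.dumps turns the dict into a string the driver can store in a text column
cur.execute(
    "INSERT INTO product (store_id, url, price, charecteristics, color, dimensions) "
    "VALUES (%s, %s, %s, %s, %s, %s)",
    (1, 'http://www.google.com', '$20', json.dumps(thedictionary), 'red', '8.5x11'))
conn.commit()

# json.loads turns the stored string back into a dict
cur.execute("select charecteristics from product where store_id = 1")
dictionary = json.loads(cur.fetchone()[0])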

From the psycopg docs:

Note: You can use register_adapter() to adapt any Python dictionary to JSON, either registering Json or any subclass or factory creating a compatible adapter:

psycopg2.extensions.register_adapter(dict, psycopg2.extras.Json)

This setting is global though, so it is not compatible with similar adapters such as the one registered by register_hstore(). Any other object supported by JSON can be registered the same way, but this will clobber the default adaptation rule, so be careful of unwanted side effects.

So, in my case what I did was:

from psycopg2.extensions import register_adapter
from psycopg2.extras import Json

register_adapter(dict, Json)

It worked like a charm.
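With the adapter registered, a plain dict can be passed straight to execute(); a minimal sketch, assuming the cur cursor and thedictionary from the question:

# register_adapter(dict, Json) is in effect, so the dict is adapted automatically
cur.execute(
    "INSERT INTO product (store_id, url, price, charecteristics, color, dimensions) "
    "VALUES (%s, %s, %s, %s, %s, %s)",
    (1, 'http://www.google.com', '$20', thedictionary, 'red', '8.5x11'))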


You can use psycopg2.extras.Json to convert a dict to the json type that Postgres accepts.

from psycopg2.extras import Json

thedictionary = {
    'price money': '$1',
    'name': 'Google',
    'color': '',
    'imgurl': 'http://www.google.com/images/nav_logo225.png',
    'charateristics': 'No Description',
    'store': 'google'
}

item = {
    "store_id": 1,
    "url": 'http://www.google.com',
    "price": '$20',
    "charecteristics": Json(thedictionary),
    "color": 'red',
    "dimensions": '8.5x11'
}

def sql_insert(tableName, data_dict):
    '''
    Build a named-placeholder INSERT for the given table and dict keys, e.g.:

    INSERT INTO product (store_id, url, price, charecteristics, color, dimensions)
    VALUES (%(store_id)s, %(url)s, %(price)s, %(charecteristics)s, %(color)s, %(dimensions)s);
    '''
    # column list from the dict keys, and a %(key)s placeholder for each key
    sql = '''
        INSERT INTO %s (%s)
        VALUES (%%(%s)s);
        ''' % (tableName, ', '.join(data_dict), ')s, %('.join(data_dict))
    return sql

tableName = 'product'
sql = sql_insert(tableName, item)

cur.execute(sql, item)
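Note that sql_insert interpolates the table and column names with % formatting, which is fine when they are hard-coded but unsafe if they can come from user input. Here is a sketch of the same insert built with the psycopg2.sql module (available since psycopg2 2.7), which quotes identifiers properly, using the tableName and item defined above:

from psycopg2 import sql

query = sql.SQL("INSERT INTO {} ({}) VALUES ({})").format(
    sql.Identifier(tableName),
    sql.SQL(', ').join(map(sql.Identifier, item)),    # column names, safely quoted
    sql.SQL(', ').join(map(sql.Placeholder, item)))   # %(key)s placeholders

cur.execute(query, item)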

For more information, you can see the official documentation:

class psycopg2.extras.Json(adapted, dumps=None)

    An ISQLQuote wrapper to adapt a Python object to json data type.

    Json can be used to wrap any object supported by the provided dumps function. If none is provided, the standard json.dumps() is used (simplejson for Python < 2.6; getquoted() will raise ImportError if the module is not available).

    dumps(obj)
    Serialize obj in JSON format.

    The default is to call json.dumps() or the dumps function provided in the constructor. You can override this method to create a customized JSON wrapper.
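For instance, a custom dumps function can be passed in when the dictionary holds values the standard encoder rejects; the Decimal handling below is just an illustration, assuming the same cur cursor:

import json
from decimal import Decimal
from psycopg2.extras import Json

def dumps_default_str(obj):
    # fall back to str() for values json.dumps cannot encode, e.g. Decimal
    return json.dumps(obj, default=str)

cur.execute(
    "INSERT INTO product (store_id, charecteristics) VALUES (%s, %s)",
    (1, Json({'price': Decimal('1.00')}, dumps=dumps_default_str)))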

First, the error means that you're trying to push a dict value into a column whose type cannot accept it (TEXT, etc.).

The accepted solution correctly converts the dict/JSON to a string in order to store it.

BUT, there is a column type that can accept it: JSON

I would suggest creating a JSON column in the first place to hold dict-like objects, for two reasons:

  1. You can push the dict as is to the DB without json.dumps or other conversions (remember that with a text column you have to json.dumps when you write, and json.loads when you read it back in Python, to convert the string back to a dict).
  2. You can query its content in a real JSON column, which you cannot do when it is stored as a string, as sketched below the table definition.

https://www.postgresqltutorial.com/postgresql-json/

So when creating the column, I would suggest a default of '{}' rather than NULL:

CREATE TABLE my_table (
   ...
   my_json_col JSON default '{}'::JSON
   ...
)
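With such a JSON column you can wrap the dict with Json() on insert (or rely on register_adapter as in the first answer) and filter on its contents directly in SQL. A sketch, using the placeholder my_table / my_json_col names from the snippet above:

from psycopg2.extras import Json

cur.execute("INSERT INTO my_table (my_json_col) VALUES (%s)",
            (Json({'store': 'google', 'color': 'red'}),))

# ->> extracts a field as text, so you can filter on it in SQL
cur.execute("SELECT my_json_col FROM my_table WHERE my_json_col ->> 'store' = %s",
            ('google',))
row = cur.fetchone()[0]  # psycopg2 returns a json column as a Python dict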