list memory usage in ipython and jupyter
Assuming that you are using IPython or Jupyter, you will need to do a little bit of work to get a list of all of the objects you have defined. That means taking everything available in globals() and filtering out objects that are modules, builtins, IPython objects, etc. Once you are sure you have those objects, you can grab their sizes with sys.getsizeof. This can be summed up as follows:
import sys
# These are the usual ipython objects, including this one you are creating
ipython_vars = ['In', 'Out', 'exit', 'quit', 'get_ipython', 'ipython_vars']
# Get a sorted list of the objects and their sizes
sorted(
    [(x, sys.getsizeof(globals().get(x))) for x in dir()
     if not x.startswith('_') and x not in sys.modules and x not in ipython_vars],
    key=lambda x: x[1], reverse=True,
)
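If you find yourself doing this a lot, it can be handy to wrap the same idea in a small helper that returns the results and prints them in a human-readable way. The sketch below is just one way to do that; the function name var_sizes and the ignore list are my own choices, not something IPython provides:

import sys

def var_sizes(namespace, ignore=('In', 'Out', 'exit', 'quit', 'get_ipython', 'ipython_vars', 'var_sizes')):
    """Return (name, size in bytes) pairs for user-defined variables, largest first."""
    pairs = [
        (name, sys.getsizeof(obj))
        for name, obj in namespace.items()
        if not name.startswith('_') and name not in sys.modules and name not in ignore
    ]
    return sorted(pairs, key=lambda pair: pair[1], reverse=True)

# Usage inside an IPython/Jupyter session:
# for name, size in var_sizes(globals()):
#     print(f'{name:20s} {size / 1024:10.1f} KiB')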
Please keep in mind that for python objects (those created with python's builtin functions), sys.getsizeof will be very accurate, but it can be a bit inaccurate on objects created using third-party libraries. Note also that sys.getsizeof only counts the memory directly attributed to an object, not the memory of the objects it references (for example, the elements of a list). Furthermore, be mindful that sys.getsizeof adds garbage collector overhead if the object is managed by the garbage collector, so some things may look a bit heavier than they actually are.
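To see the "directly attributed" caveat in practice, compare what sys.getsizeof reports for a nested container with the size of what it actually references. This is just an illustrative sketch; the exact byte counts vary with Python version and platform:

import sys

data = [list(range(1000)) for _ in range(100)]

# The outer list's size covers only its own header and its array of pointers,
# not the 100 inner lists (or the integers inside them).
print(sys.getsizeof(data))                          # well under a kilobyte
print(sum(sys.getsizeof(inner) for inner in data))  # hundreds of kilobytes for the inner lists alone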
As a side note, numpy's .nbytes attribute can be somewhat misleading in that it does not include memory consumed by non-element attributes of the array object.
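For example, here is a quick comparison on a small array (assuming numpy is installed; the exact overhead depends on your numpy version):

import sys
import numpy as np

arr = np.zeros(1000, dtype=np.float64)

print(arr.nbytes)          # 8000: bytes consumed by the elements only
print(sys.getsizeof(arr))  # a little larger: also counts the array object's own overhead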
I hope this helps.