Python: How to estimate / calculate the memory footprint of data structures?

Guppy has a nice memory profiler (Heapy):

>>> from guppy import hpy
>>> hp = hpy()
>>> hp.setrelheap() # ignore all existing objects
>>> d = {}
>>> d['key'] = [ (1131, 3.11e18), (9813, 2.48e19), (4991, 9.11e18) ]
>>> hp.heap()
 Partition of a set of 24 objects. Total size = 1464 bytes.
 Index  Count   %     Size   % Cumulative  % Kind (class / dict of class)
     0      2   8      676  46       676  46 types.FrameType
     1      6  25      220  15       896  61 str
     2      6  25      184  13      1080  74 tuple
 ...

Heapy is a little underdocumented, so you may have to dig through its web page or the source code, but it's very powerful. There are also some articles around the web that might be relevant.
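If you only need a rough number and don't want to install anything, the standard library's sys.getsizeof will get you an estimate. Note that sys.getsizeof alone reports only the shallow size of a container, so nested structures need a recursive walk. Below is a minimal sketch of my own (total_size is just a name I made up, not part of any library) that recurses into dicts, lists, tuples and sets, counting each shared object once:

import sys

def total_size(obj, seen=None):
    # Rough estimate: sum sys.getsizeof over obj and everything it
    # references. Each object is counted once (tracked by id), and
    # interpreter-level overhead is not included, so treat the result
    # as an approximation.
    if seen is None:
        seen = set()
    if id(obj) in seen:
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(total_size(k, seen) + total_size(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(total_size(item, seen) for item in obj)
    return size

d = {'key': [(1131, 3.11e18), (9813, 2.48e19), (4991, 9.11e18)]}
print(total_size(d))  # a few hundred bytes; exact value varies by Python version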


You can do this with a memory profiler; there are a couple I'm aware of:

  1. PySizer - possibly obsolete, as its homepage now recommends the next option:

  2. Heapy (see the sketch after this list).
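If you go with Heapy, hp.iso() is handy for sizing one specific structure rather than snapshotting the whole heap. A minimal sketch, assuming my reading of the Heapy API is right: hp.iso() builds a set of exactly the objects you pass, .size should be their combined size in bytes, and .domisize the memory they alone keep alive.

from guppy import hpy

hp = hpy()
d = {'key': [(1131, 3.11e18), (9813, 2.48e19), (4991, 9.11e18)]}

# Wrap exactly these objects; no heap snapshot needed.
s = hp.iso(d, d['key'])
print(s.size)      # combined size of d and its list, in bytes (assumed attribute)
print(s.domisize)  # memory dominated by them, i.e. retained size (assumed attribute)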

This is possibly a duplicate of this question.