My problem: I am writing a simple Python tool to help me visualize my data as a function of many parameters. Each change in parameters takes a non-trivial amount of time to compute, so I would like to cache each step's resulting imagery and supporting data in a dictionary. My worry is that this dictionary could grow too large over time. Most of my data is stored in NumPy arrays.
My question: How would one go about computing the total number of bytes used by a Python dictionary? The dictionary may itself contain lists and other dictionaries, each of which may in turn contain data stored in NumPy arrays.
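To make the structure concrete, here is a hypothetical (simplified) example of the kind of cache I mean, along with my naive attempt using `sys.getsizeof`, which as I understand it only measures the dictionary object itself, not any of the objects it references:

```python
import sys
import numpy as np

# Hypothetical example of the cache structure: each parameter tuple maps
# to a dict holding an image array plus supporting data, possibly nested.
cache = {
    (0.1, 5): {
        "image": np.zeros((1024, 1024, 3), dtype=np.uint8),
        "histogram": np.zeros(256, dtype=np.int64),
        "meta": {"params": [0.1, 5], "label": "run-0"},
    },
}

# Naive attempt: sys.getsizeof reports only the size of the outer dict
# structure, not the nested dicts, lists, or NumPy buffers, so this
# badly undercounts the true footprint.
print(sys.getsizeof(cache))            # size of the outer dict object only
print(cache[(0.1, 5)]["image"].nbytes) # actual pixel buffer: ~3 MB
```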
Ideas?