Dear All,
While using numpy.ndarray, I ran into a memory overflow problem due to the size of my data. For example:
Suppose the data source is a 100000000 * 100000000 * 100000000 float64 array. When I try to read it and process it in memory with numpy, it raises a MemoryError, because storing such a big array exhausts all available memory.
Maybe using a disk file / database as a buffer to store the array is a solution: when I need the data, the adapter would fetch only the necessary part from the file / database; otherwise it would just be a Python object taking little memory.
Is it possible to write such an adapter?
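Something like numpy.memmap may already be close to what I have in mind. Here is a minimal sketch (the file path and small shape are just for illustration; a real dataset would of course be far larger):

```python
import os
import tempfile
import numpy as np

# Create a disk-backed array instead of an in-memory one.
path = os.path.join(tempfile.mkdtemp(), "data.dat")
arr = np.memmap(path, dtype=np.float64, mode="w+", shape=(1000, 1000))

# Writes go to the file; only the pages actually touched are resident.
arr[0, :10] = np.arange(10)
arr.flush()

# Re-open read-only later; slicing pulls just the needed data from disk.
arr2 = np.memmap(path, dtype=np.float64, mode="r", shape=(1000, 1000))
print(arr2[0, :10].sum())  # → 45.0
```

The memmapped object supports normal numpy indexing, so most processing code would not need to change.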
Thanks.
Rgs, KC