There is no reliable way to predict how much memory an arbitrary numpy
operation will need, no. However, in most cases the main memory cost is
simply that of storing the input and output arrays; for large arrays, all
other allocations should be negligible.
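As a rough illustration (my own sketch, not anything NumPy provides), the
footprint of a simple elementwise op can be bounded from below by summing
the operands' nbytes plus the freshly allocated output:

import numpy as np

def estimate_elementwise_mem(a, b):
    """Rough lower bound on the memory c = a (op) b needs:
    both inputs plus a new output of the broadcast shape/dtype."""
    out_shape = np.broadcast(a, b).shape
    out_dtype = np.result_type(a, b)
    out_nbytes = np.prod(out_shape, dtype=np.int64) * out_dtype.itemsize
    return a.nbytes + b.nbytes + out_nbytes

a = np.ones((1000, 1000))   # ~8 MB each as float64
b = np.ones((1000, 1000))
print(estimate_elementwise_mem(a, b))   # ~24 MB: two inputs plus one output

This is only a lower bound; any temporaries the expression creates come on
top of it.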

The most effective way to avoid running out of memory, therefore, is to
avoid creating temporary arrays, by using only in-place operations.

E.g., if a and b each require N bytes of RAM, then the rough memory
requirements are:

c = a + b: 3N
c = a + 2*b: 4N
a += b: 2N
np.add(a, b, out=a): 2N
b *= 2; a += b: 2N

Note that simply loading a and b requires 2N memory, so the latter code
samples are near-optimal.
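Spelled out as runnable NumPy (my own toy example, not from the original
thread), the same variants look like this; note that the in-place forms
overwrite their operands, which is the price of staying near 2N:

import numpy as np

N = 10_000_000
a = np.random.rand(N)   # ~80 MB each as float64
b = np.random.rand(N)

# c = a + b    -> allocates a third N-byte array:              ~3N total
c = a + b

# c = a + 2*b  -> temporary for 2*b, plus the output c:        ~4N total
c = a + 2 * b

# In-place variants reuse a's (and b's) existing buffers:      ~2N total
a += b                   # equivalent to np.add(a, b, out=a)
np.add(a, b, out=a)

b *= 2                   # scale b in place...
a += b                   # ...then accumulate into a; still ~2N, but b is clobbered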

Of course some calculations do require the use of temporary storage space...
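For example (my examples, and the exact workspace sizes are implementation
details): a stable sort needs scratch space roughly proportional to the
array, and routines that go through BLAS/LAPACK may allocate workspace
internally that NumPy cannot account for up front, so the 2N floor above is
a lower bound, not a guarantee.

import numpy as np

a = np.random.rand(10_000_000)
a.sort(kind='quicksort')   # in-place, essentially no extra allocation
a.sort(kind='stable')      # merge/timsort: O(N) scratch space on top of a

m = np.random.rand(1000, 1000)
w = np.linalg.eigvalsh(m)  # LAPACK allocates its own work arrays internally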

-n
On 24 Jan 2014 15:19, "Dinesh Vadhia" <dineshbvad...@hotmail.com> wrote:

>  I want to write a general exception handler to warn if too much data is
> being loaded for the RAM size of a machine for a numpy array operation to
> succeed.  For example, the program multiplies two floating-point arrays A
> and B which are populated with loadtxt.  While the data is being loaded, I
> want to continuously check that the data volume doesn't pass a threshold
> that would cause an out-of-memory error during the A*B operation.  The
> known variables are the amount of memory available, the data type (floats
> in this case) and the numpy array operation to be performed.  It seems this
> requires knowledge of the internal memory requirements of each numpy
> operation.  For the sake of simplicity, we can ignore the program's other
> memory needs.  Is this possible?
_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
http://mail.scipy.org/mailman/listinfo/numpy-discussion
