On Monday 17 January 2011 11:13:58, Ben Elliston wrote:
> I would like to run NumPy-like methods (e.g. max) on a very large
> CArray that is too big to fit in physical memory.  Currently I have
> to chunk access to the array by slicing it into (say) four very
> large NumPy arrays (one at a time).
> 
> Is there a way to get a NumPy array that will transparently manage
> memory use?

Maybe you want to use the iterator for doing this:

>>> import tables
>>> f = tables.openFile("/tmp/test.h5", "w")
>>> carr = f.createCArray(f.root, 'carr', tables.Float64Atom(dflt=1.),
...                       (1000, 1000))
>>> s = 0
>>> for r in carr:
...     s += r.sum()
...
>>> s
1000000.0

This is very efficient because PyTables uses an internal buffer for the I/O, 
so only a small slice of the array is in memory at any time.
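
For the max() case you mentioned, the same iterator pattern should work; 
here is an untested sketch along the same lines, combining the per-row 
reductions as the rows stream through the buffer:

>>> m = max(r.max() for r in carr)
>>> m
1.0

Any NumPy reduction that can be merged across rows (max, min, sum and so 
on) can be computed this way, one buffered row at a time.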

HTH,

-- 
Francesc Alted
