On 4 February 2014 15:01, RayS <[email protected]> wrote:
> I was struggling with methods of reading large disk files into numpy
> efficiently (not FITS or .npy, just raw files of IEEE floats from
> numpy.tostring()). When loading arbitrarily large files it would be nice to
> not bother reading more than the plot can display before zooming in. There
> apparently are no built in methods that allow skipping/striding...
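For the single-file case, one option along those lines is numpy.memmap: map the raw file and slice it with a stride, so only the touched pages are actually read. A minimal sketch (file name, sizes, and the ~2000-point target are illustrative, not from the original post):

```python
import os
import tempfile

import numpy as np

# Write a raw file of IEEE 64-bit floats, as numpy's tobytes()/tostring()
# would produce.  (Sizes here are just for the demo.)
data = np.arange(100_000, dtype=np.float64)
path = os.path.join(tempfile.mkdtemp(), "big.raw")
with open(path, "wb") as f:
    f.write(data.tobytes())

# Map the file instead of reading it all into memory.
mm = np.memmap(path, dtype=np.float64, mode="r")

# Keep roughly 2000 evenly spaced samples -- enough for an overview plot
# before the user zooms in.
stride = max(1, len(mm) // 2000)
preview = np.asarray(mm[::stride])  # the OS reads only the pages touched

print(len(preview))
```

After zooming, you can re-slice the same memmap over the narrower index range with a smaller stride, without ever loading the whole file.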
Since you mentioned the plural "files", are your datasets entirely contained within a single file? If not, you might be interested in Biggus (https://pypi.python.org/pypi/Biggus). It's a small pure-Python module that lets you glue together arrays (such as those from smmap) into a single, arbitrarily large virtual array. You can then step over the virtual array, and it maps the indexing back to the underlying sources.

Richard
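To illustrate the idea (this is a minimal stand-in sketch, not the Biggus API itself): present several 1-D sources, e.g. per-file numpy.memmap objects, as one virtual array, and translate a strided slice on the whole into strided slices on each part. The `VirtualConcat` class and its method names are invented for this example:

```python
import numpy as np


class VirtualConcat:
    """Minimal stand-in for the Biggus idea: expose several 1-D arrays
    (e.g. per-file numpy.memmap objects) as one virtual array, reading
    from the underlying sources only when indexed."""

    def __init__(self, parts):
        self.parts = parts
        # Global start offset of each part within the virtual array.
        self.offsets = np.cumsum([0] + [len(p) for p in parts])

    def __len__(self):
        return int(self.offsets[-1])

    def __getitem__(self, idx):
        # Support strided slices that may span part boundaries.
        start, stop, step = idx.indices(len(self))
        pieces = []
        for part, off in zip(self.parts, self.offsets[:-1]):
            # First global index >= this part's start, aligned to the stride.
            lo = max(start, off)
            lo += (-(lo - start)) % step
            hi = min(stop, off + len(part))
            if lo < hi:
                pieces.append(np.asarray(part[lo - off:hi - off:step]))
        if not pieces:
            return np.empty(0, dtype=self.parts[0].dtype)
        return np.concatenate(pieces)


# Two "files" glued into one 25-element virtual array.
parts = [np.arange(0, 10.0), np.arange(10.0, 25.0)]
v = VirtualConcat(parts)
print(v[2:20:3])  # strided read crossing the file boundary
```

Biggus itself handles the n-dimensional cases and lazy evaluation; the point here is only how a strided read on the virtual array decomposes into reads on the underlying sources.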
_______________________________________________
NumPy-Discussion mailing list
[email protected]
http://mail.scipy.org/mailman/listinfo/numpy-discussion
