I have files (from an external source) that contain ~10 GB of
big-endian uint16's that I need to read into a series of arrays. What
I'm doing now is
import numpy as np
import struct
fd = open('file.raw', 'rb')
for n in range(10000):
    count = 1024*1024
    # one struct.unpack call per element -- this is the slow part;
    # unpack returns a tuple, hence the [0]
    a = np.array([struct.unpack('>H', fd.read(2))[0] for i in range(count)])
    # do something with a
It doesn't seem very efficient to call struct.unpack one element at a
time, but struct doesn't have an unpack_farray version the way xdrlib
does. I also thought of using the array module and .byteswap(), but
the help says it only works on 4- and 8-byte arrays.
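For reference, here's the kind of one-call read I'm hoping for, sketched
with numpy's fromfile and a big-endian dtype ('demo.raw' and the values
are just a tiny stand-in for the real 10 GB files -- I haven't tested
this against them):

```python
import struct
import numpy as np

# write a small demo file of big-endian uint16s
values = [0, 1, 258, 65535]
with open('demo.raw', 'wb') as fd:
    fd.write(struct.pack('>%dH' % len(values), *values))

# read them all back in one call: '>u2' is numpy's big-endian
# 2-byte unsigned integer dtype, so no per-element unpacking
a = np.fromfile('demo.raw', dtype='>u2')
```

For the real files, passing an open file object plus count=1024*1024
should read one chunk per call, each read picking up where the last one
ended; a struct format with a repeat count (e.g. '>1048576H') would
likewise replace the per-element calls.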
Any ideas?
Thanks,
Greg
_______________________________________________
NumPy-Discussion mailing list
[email protected]
http://mail.scipy.org/mailman/listinfo/numpy-discussion