Hi,
Thanks for your answer.
I use Ubuntu 12.04 32-bit and Python 2.7.
I upgraded NumPy to 1.8, but the error persists.
I think the problem is in gzip.py:
max_read_chunk = 10 * 1024 * 1024 # 10Mb
What do you think?
Best regards,
AMIRA
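For reference, a workaround sometimes suggested for large .gz files on Python 2.x is to read the decompressed stream in bounded chunks rather than in one call, which sidesteps the internal read-chunk cap. This is only a sketch; the function name and chunk size are placeholders, not values taken from gzip.py:

```python
import gzip

def read_gzip_in_chunks(path, chunk_size=16 * 1024 * 1024):
    """Read a gzipped file's decompressed bytes in bounded chunks.

    chunk_size is an arbitrary placeholder (16 MB here), not the
    max_read_chunk value from the stdlib.
    """
    parts = []
    f = gzip.open(path, "rb")
    try:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            parts.append(chunk)
    finally:
        f.close()
    return b"".join(parts)
```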
End of NumPy-Discussion Digest, Vol 87, Issue 35
On 01.01.2014 16:50, Amira Chekir wrote:
On 31.12.2013 14:13, Amira Chekir wrote:
Hello together,
I am trying to load a (large) NIfTI file (DMRI from the Human Connectome Project,
about 1 GB) with NiBabel.
import nibabel as nib
img = nib.load("dmri.nii.gz")
data = img.get_data()
The
Hello,
I'm having issues with performing operations on an array in C and
passing it back to Python. The array values seem to become uninitialized
when passed back to Python. My first attempt involved declaring
the array in C like so:
double a_fin[max_mth];
where max_mth is an int. I fill
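(For what it's worth, the usual cause of garbage values in this situation is that `double a_fin[max_mth];` is a stack array whose storage is released when the C function returns, so any pointer Python dereferences afterwards is dangling. A minimal sketch of the heap-allocated alternative; the function name and fill values are made up for illustration:)

```c
#include <stdlib.h>

/* Heap-allocate the result so it outlives the C call; the caller
 * (or the Python wrapper) is responsible for free()ing it.
 * Name and fill values here are illustrative only. */
double *make_result(int max_mth)
{
    double *a_fin = malloc(max_mth * sizeof *a_fin);
    if (a_fin == NULL)
        return NULL;
    for (int i = 0; i < max_mth; i++)
        a_fin[i] = 0.5 * i;  /* placeholder fill */
    return a_fin;
}
```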
On 1 Jan 2014 13:57, Bart Baker bart...@gmail.com wrote: