On 9/22/06, Bill Baxter [EMAIL PROTECTED] wrote:
Do you also have a 64-bit processor? Just checking since you didn't mention it.
--bb
On 9/22/06, [EMAIL PROTECTED]
[EMAIL PROTECTED] wrote: I would like to read files > 2 Gbyte. From an earlier posting I believed it should be possible with Python 2.5. I am
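Whether a memmap of a file over 2 GB is even addressable depends on having a 64-bit Python build, not just a 64-bit CPU, which is why the question above matters. A minimal sketch of the check (an illustration, not part of the original thread):

```python
import struct

# Pointer size in bytes: 8 on a 64-bit Python build, 4 on a 32-bit one.
pointer_bytes = struct.calcsize("P")

if pointer_bytes == 8:
    print("64-bit Python build: files > 2 GB fit in the address space")
else:
    print("32-bit Python build: mmap is limited to the 32-bit address space")
```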
On 9/7/06, Glen W. Mabey [EMAIL PROTECTED] wrote:
A long time ago, Travis wrote: My understanding is that using memory-mapped files for *very* large files will require modification to the mmap module in Python --- Did anyone ever pick up the ball on this issue?
This works with python-2.5 betas and
On 7/24/06, Travis Oliphant [EMAIL PROTECTED] wrote:
Mike Ressler wrote: I'm trying to work with memmaps on very large files, i.e. > 2 GB, up to 10 GB. Can't believe I'm really the first, but so be it.
I just discovered the problem. All the places where PyObject_AsRead/WriteBuffer is used needs
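The fix being alluded to is Python 2.5's switch (PEP 353) of buffer-protocol lengths from C `int` to `Py_ssize_t`; with the old `int`-based API, buffer sizes cap out near 2 GB even on 64-bit hardware. A hedged sketch of the size difference using ctypes (the exact widths depend on the platform):

```python
import ctypes

# Old buffer API lengths were C 'int'; Python 2.5 (PEP 353) moved to Py_ssize_t.
int_max = 2 ** (8 * ctypes.sizeof(ctypes.c_int) - 1) - 1
ssize_max = 2 ** (8 * ctypes.sizeof(ctypes.c_ssize_t) - 1) - 1

print("C int max:      %d bytes" % int_max)   # ~2 GB on typical platforms
print("Py_ssize_t max: %d bytes" % ssize_max)  # matches the pointer width
```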
On 7/26/06, Robert Kern [EMAIL PROTECTED] wrote:
If someone can explain the rules of engagement for Lightning Talks, I'm thinking about presenting this at SciPy 2006. Then you'll see there is a reason for my madness. Unfortunately, we have only scheduled 30 minutes of lightning talks this year.
We
My apologies if this is a duplicate - my first attempt doesn't seem to have gone back to the list.

-- Forwarded message --
From: Mike Ressler [EMAIL PROTECTED]
Date: Jul 25, 2006 12:17 PM
Subject: Re: ***[Possible UCE]*** [Numpy-discussion] Bug in memmap/python allocation code
I'm trying to work with memmaps on very large files, i.e. > 2 GB, up to 10 GB. The files are data cubes of images (my largest is 1290(x) x 1024(y) x 2011(z)) and my immediate task is to strip the data from 32 bits down to 16, and to rearrange some of the data on a per-xy-plane basis. I'm running this
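The per-plane workflow described above can be sketched with `numpy.memmap`, which keeps only one z-plane resident at a time. The dimensions, dtypes, and file paths below are assumptions taken from the cube described in the thread; this is an illustration, not the poster's actual script:

```python
import numpy as np

# Hypothetical dimensions matching the cube described in the thread:
# 1290 (x) x 1024 (y) x 2011 (z), stored as 32-bit integers.
NX, NY, NZ = 1290, 1024, 2011

def strip_cube(src_path, dst_path, nx=NX, ny=NY, nz=NZ):
    """Convert a 32-bit data cube to 16 bits, one z-plane at a time,
    so only a single plane needs to be touched in memory at once."""
    src = np.memmap(src_path, dtype=np.int32, mode="r", shape=(nz, ny, nx))
    dst = np.memmap(dst_path, dtype=np.int16, mode="w+", shape=(nz, ny, nx))
    for z in range(nz):
        # Any per-xy-plane rearrangement would go here before the cast.
        dst[z] = src[z].astype(np.int16)
    dst.flush()
```

On a 64-bit build this works for files well past 2 GB because `memmap` addresses the file through the OS's virtual memory rather than reading it wholesale.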