> Perhaps there is a size threshold? You could experiment with different
> block sizes in the following f.read() replacement:
>
>     import functools
>
>     def read_chunked(f, size=2**20):
>         read = functools.partial(f.read, size)
>         return "".join(iter(read, ""))

On the win32 platform, my experience is that the fastest way to read a binary file from disk is with the mmap module. You should try that too.

--
https://mail.python.org/mailman/listinfo/python-list
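The mmap suggestion could be sketched like this (a minimal example for pulling a whole binary file into memory; the helper name `read_with_mmap` is hypothetical, not from the post):

```python
import mmap

def read_with_mmap(path):
    """Read an entire binary file via the mmap module
    (hypothetical helper illustrating the suggestion above)."""
    with open(path, "rb") as f:
        # Length 0 maps the whole file; ACCESS_READ works on
        # both win32 and POSIX.
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            return mm[:]  # copy the mapped contents into a bytes object
```

Whether this beats a plain `f.read()` depends on file size, OS caching, and how the data is consumed afterwards, so it is worth benchmarking on the actual workload.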