Hello All,

I have a very large CSV file (14 GB) and I am planning to move all of my
data to HDF5, using h5py to load it. The biggest problem I am having is
that I am reading the entire file into memory and then creating a
dataset from it in one go (a rough sketch of what I do now is below).
This is very inefficient, and it takes over 4 hours to create the HDF5
file.

The CSV columns have the following types:
int4, int4, str, str, str, str, str
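
Roughly, this is what I'm doing now; the column names, string widths,
and file names are all made up for illustration:

import numpy as np
import h5py

# Guessed compound dtype for the row layout above; the strings are
# stored as fixed-width byte strings (S64 is an arbitrary width).
dtype = np.dtype([('id1', 'i4'), ('id2', 'i4'),
                  ('s1', 'S64'), ('s2', 'S64'), ('s3', 'S64'),
                  ('s4', 'S64'), ('s5', 'S64')])

def parse(line):
    f = line.rstrip('\n').split(',')
    return (int(f[0]), int(f[1]), f[2], f[3], f[4], f[5], f[6])

# The whole 14 GB file gets parsed into one giant in-memory array...
with open('data.csv') as csv_in:
    rows = np.array([parse(l) for l in csv_in], dtype=dtype)

# ...and is only then written out in a single call.
with h5py.File('data.h5', 'w') as out:
    out.create_dataset('table', data=rows)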

Does anyone know of any techniques to load this file faster?
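
One direction I've been considering is reading the CSV in batches and
appending each batch to a resizable dataset, so only one batch is in
memory at a time. Would something like this rough, untested sketch
(same made-up dtype and parse() as above) be a sensible approach?

import itertools
import numpy as np
import h5py

dtype = np.dtype([('id1', 'i4'), ('id2', 'i4'),
                  ('s1', 'S64'), ('s2', 'S64'), ('s3', 'S64'),
                  ('s4', 'S64'), ('s5', 'S64')])  # same guesses as above

def parse(line):
    f = line.rstrip('\n').split(',')
    return (int(f[0]), int(f[1]), f[2], f[3], f[4], f[5], f[6])

BATCH = 100000  # rows per batch; just a guess, would need tuning

with open('data.csv') as csv_in, h5py.File('data.h5', 'w') as out:
    # Start empty and let the dataset grow along its first axis.
    dset = out.create_dataset('table', shape=(0,), maxshape=(None,),
                              dtype=dtype, chunks=True)
    written = 0
    while True:
        lines = list(itertools.islice(csv_in, BATCH))
        if not lines:
            break
        batch = np.array([parse(l) for l in lines], dtype=dtype)
        dset.resize((written + len(batch),))
        dset[written:written + len(batch)] = batch
        written += len(batch)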

TIA