Thanks, all, for your advice.

Well, there are many things to look at, but it's now obvious that I first have to work on a better strategy than the one I had in mind previously (i.e. loading all the files and results in one step).

This is just a thought, but for huge files one solution might be to first split/parse the input and write the array to a dedicated file (two O(n) passes: one to identify the block sizes, a second to read the values and write them out), and then load it into memory and work with NumPy. At that stage the dimensions are known, and some packages (pandas or astropy, as suggested) will be fast and better suited.
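Roughly what I have in mind, as a minimal sketch (the file name, column count, and whitespace-delimited text format are just assumptions for illustration):

    import numpy as np

    def two_pass_load(path, ncols, dtype=np.float64):
        # Pass 1: O(n) scan just to count the data rows, so the
        # final dimensions are known before anything is allocated.
        nrows = 0
        with open(path) as f:
            for line in f:
                if line.strip() and not line.startswith('#'):
                    nrows += 1

        # Pass 2: O(n) scan to fill a preallocated array in place,
        # instead of growing Python lists and converting at the end.
        out = np.empty((nrows, ncols), dtype=dtype)
        i = 0
        with open(path) as f:
            for line in f:
                if line.strip() and not line.startswith('#'):
                    out[i] = [float(x) for x in line.split()[:ncols]]
                    i += 1
        return out

    data = two_pass_load('huge_file.txt', ncols=3)  # hypothetical file

If the array is too big for RAM, pass 2 could instead write the values to a raw binary file and the result be opened with np.memmap, which is closer to the "dedicated file" idea above.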

Thanks again to all for your time and help.

Paul