Hello all,
I need to copy a large block of rows (start=1,000,000 and stop=5,000,000) from 
table a to table b in the same file. Looking at the PyTables documentation, I 
came up with a few options.

Option 1:
for row in a.iterrows(start, stop):  # iterate over the source rows one by one
    b.append([row[:]])               # row[:] gives the row's values as a tuple

However, a pure-Python loop over millions of rows, with one append() call per 
row, doesn't seem optimal.

Option 2:
b.append(a.read(start, stop))  # read() returns the whole block as one in-memory array

All the data is first copied from a into memory before being written to b, 
which is not scalable.
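
A chunked variant of option 2 would at least keep memory use bounded, though 
it still feels like working around the library rather than with it. A rough, 
untested sketch of what I mean (the chunk size of 100,000 is an arbitrary 
pick on my part, not a PyTables default):

chunksize = 100000
for chunk_start in range(start, stop, chunksize):
    # read at most chunksize rows at a time, so memory stays bounded
    chunk_stop = min(chunk_start + chunksize, stop)
    b.append(a.read(chunk_start, chunk_stop))
b.flush()  # make sure the appended rows hit disk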

Option 3:
Use whereAppend()

This seems the best (assuming the copying of rows from table to table is done 
much more efficiently in C than I could do it in Python). However, whereAppend() 
refuses to run without a condition, and I do not have a condition, just start 
and stop parameters....
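
The only workaround I can see is to pass a condition that is always true, 
which feels like a hack. Something like this untested sketch, where "key" 
stands in for any integer column in a (for a float column, key == key would 
be False on NaNs):

a.whereAppend(b, 'key == key', start=start, stop=stop)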

Did I miss the obvious alternative? Any suggestions?
Best,
Koert
