Thanks, Massimo!

Even if I grab 1000 rows at a time (which is definitely better to do!!), 
I still have to wait several minutes before I get rows back!
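
For reference, here is roughly what my 1000-at-a-time version looks like (q is my query and the names are just placeholders; this is a sketch, not my exact code):

total = db(q).count()
chunk = 1000
for start in range(0, total, chunk):
    rows = db(q).select(limitby=(start, start + chunk))
    for row in rows:
        # do something with each row in this chunk
        pass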

However, I have found that this is 'fast':

n = db(db.table.id > 0).count()  # find out how many records are in the table


for i in range(1, n + 1):
    row = db.table(i)
    # do something with row


This works fine if I want to iterate through every single entry in the 
table (assuming there are no gaps in the record.id values). 

I don't know how to do something equivalent when the query/set represents an 
arbitrary (large) number of table records.

I even tried just extracting the record.id values, 

id_index = db(q).select(db.table.id, limitby=(1000, 10000))


so that I could then access each row by its id,

row = db.table(id_index[i].id)


but just getting the id_index list of record.id values was slow!
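
One thing I'm thinking of trying next is to walk the matching rows in id order, tracking the last id seen instead of using a growing limitby offset. This is just a sketch, and I haven't checked whether it is actually faster on sqlite:

last_id = 0
while True:
    rows = db(q & (db.table.id > last_id)).select(
        orderby=db.table.id, limitby=(0, 1000))
    if not rows:
        break
    for row in rows:
        # do something with row
        last_id = row.id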

I'm using SQLite3 on Linux with 4GB of RAM, and the DB is about 700MB in 
size (830MB with some indexes built).

Thanks!!!
Luis.