the fun of a key-value datastore!

i don't know of a way to copy the data with web2py "magic", but you can 
write a one-off migration script that does the copy for you, and then as 
you create new tasks you can write all the necessary data at task 
creation time.
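to make that concrete, here is a minimal sketch of such a migration. the table and column names (task, project, project_name) are made up for illustration, and it uses stdlib sqlite3 rather than the web2py DAL so it runs standalone - the DAL version would follow the same loop-and-update shape:

```python
import sqlite3

# Hypothetical schema: each task references a project, and we want to
# denormalize the project's name onto the task row so later reads need
# only one fetch (the same idea applies to a key-value datastore).
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE project (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE task (id INTEGER PRIMARY KEY,
                       project_id INTEGER,
                       project_name TEXT);
    INSERT INTO project VALUES (1, 'alpha');
    INSERT INTO task (id, project_id) VALUES (1, 1);
""")

# The one-off migration: copy the parent's name onto every task row.
for task_id, project_id in db.execute(
        "SELECT id, project_id FROM task").fetchall():
    (name,) = db.execute("SELECT name FROM project WHERE id = ?",
                         (project_id,)).fetchone()
    db.execute("UPDATE task SET project_name = ? WHERE id = ?",
               (name, task_id))
db.commit()

print(db.execute(
    "SELECT project_name FROM task WHERE id = 1").fetchone()[0])  # alpha
```

after the migration runs once, new tasks just write project_name at creation time and the join disappears from the read path.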

another thought - i doubt users will consume more than 1000 records of 
either at one time.  try using limitby in your queries to fetch only the 
data that will actually be consumed.  you might also consider caching 
(partial) result sets from the less dynamic of the two tables.
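in web2py that cap would look like db(query).select(limitby=(0, 1000)). the sketch below shows the underlying idea with stdlib sqlite3 so it runs standalone (the table name is made up); limitby=(0, 1000) compiles down to the same LIMIT clause:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE task (id INTEGER PRIMARY KEY)")
db.executemany("INSERT INTO task VALUES (?)",
               [(i,) for i in range(1, 2001)])

# Cap the result set at 1000 rows instead of pulling everything back;
# web2py's db(...).select(limitby=(0, 1000)) produces the same LIMIT.
rows = db.execute("SELECT id FROM task ORDER BY id LIMIT 1000").fetchall()
print(len(rows))  # 1000, even though the table holds 2000 rows
```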

if you are dealing with background processes, look into "backends" that 
have unlimited time to process a request.
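assuming this is Google App Engine (which the key-value datastore and the request time limit suggest), a backend is declared in a backends.yaml next to app.yaml - a minimal hypothetical example, with the name "worker" made up:

```yaml
backends:
- name: worker
  class: B1
  instances: 1
  options: dynamic
```

requests routed to that backend are not subject to the normal request deadline, so long-running processing can happen there.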
