Hello.
I have to import a large amount of data from CSV files, with some additional
work on their structure.
I tried
reader = csv.reader(form.vars.data.file)
for row in reader:
    db.mytable.insert(...)  # field values taken from row
and got huge memory consumption. The standard commit-at-end-of-request
behavior does not work for me.
I added an explicit db.commit() after each insert, and in my testing
environment with an SQLite database it works like a charm, but when I try to
import the data on a production server with Firebird 2.5 (and the fdb
driver) I get the same trouble.
Is this a database-related or a web2py-related problem?
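For reference, the usual way to bound memory during a bulk import is to commit every N rows rather than after every row or only at the end of the request. The sketch below is a minimal, hypothetical illustration of that batching pattern; it uses the standard-library sqlite3 module and an in-memory table in place of the web2py DAL, and the table/column names are made up. With the DAL you would call db.mytable.insert(...) and db.commit() at the same points.

```python
import csv
import io
import sqlite3

# Commit every BATCH rows so the open transaction (and any rows the
# driver buffers until commit) stays small. Tune BATCH to your backend.
BATCH = 1000

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (name TEXT, value INTEGER)")

# Stand-in for the uploaded file (form.vars.data.file in web2py).
data = io.StringIO("a,1\nb,2\nc,3\n")
reader = csv.reader(data)

for i, row in enumerate(reader, start=1):
    conn.execute(
        "INSERT INTO mytable (name, value) VALUES (?, ?)",
        (row[0], int(row[1])),
    )
    if i % BATCH == 0:
        conn.commit()  # flush the current batch
conn.commit()  # commit the final, partial batch

count = conn.execute("SELECT COUNT(*) FROM mytable").fetchone()[0]
print(count)  # 3
```

Whether per-row commits actually release memory on the server also depends on the database adapter, which may explain the difference between SQLite and Firebird/fdb.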

-- 
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)
--- 
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
For more options, visit https://groups.google.com/d/optout.
