Your settings file might have DEBUG = True, which makes Django save every
SQL query, along with its start/stop time, so you can debug when something
goes wrong. Try setting it to False.
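A minimal sketch of the change, assuming a standard settings.py (the comment describes the behavior, not an exact Django internals name):

```python
# settings.py
DEBUG = False  # stops Django from accumulating a log of every executed query
```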
[EMAIL PROTECTED] wrote:
Hi all
I am wondering if other people have noticed an apparent bug in Django
when using the DB API to insert a large number (~1 million) of records
(each record ~2k) into MySQL (WinXP/Python 2.3).
Here is a simplified/clean version of the code I am running:
import glob

from django.models.appname import foos

filelist = glob.glob("path/*.dat")
for file in filelist:
    f = open(file)
    data = f.read()
    f.close()
    records = parseData(data)
    for r in records:
        myfoo = foos.Foo(myfield=r)
        myfoo.save()
When I run this over the large chunk of data, the memory footprint of the
Python process keeps climbing until the machine runs out of memory. If,
however, the last two lines are replaced by a direct MySQLdb connection,
memory stays stable at approximately 4 MB and the run finishes without
problems.
I have tried setting the "f" and "myfoo" variables to None, but it doesn't
help.
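That fits the query-log explanation above. Here is a toy illustration (plain Python, not Django code; the `log` list and `save` function are stand-ins for Django's per-connection query log) of why rebinding a local variable to None frees nothing when each save also appends to a shared list:

```python
# Stand-in for the query log Django keeps when DEBUG is True.
log = []

def save(record):
    # Each "save" appends an entry to the shared log,
    # the way Django records every SQL statement under DEBUG.
    log.append({"sql": "INSERT ...", "record": record})

for i in range(1000):
    record = "x" * 100
    save(record)
    record = None  # clears only the local name; the log entry survives

print(len(log))  # → 1000: every entry is still referenced by the log
```

So the memory growth is not the model instances themselves but the ever-growing log that references them; turning DEBUG off stops the accumulation.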
Any ideas what is wrong?
br
Vidar Masson