Hello!

I am trying to load a really huge dataset (millions of records)
into the database through fixtures, and it seems that Django loads
the entire fixture into memory before committing it to the db, so
the Python process dies out of memory. Am I right that the only
possible solution in my case is not to use fixtures at all and to
load the data with a script?
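For context, this is roughly the kind of script I have in mind
instead of a fixture. It is only a minimal sketch, assuming a recent
Django with bulk_create; the model name Record, the myapp module, and
the JSON-lines input file are made-up examples, not my real project:

    import json

    from django.db import transaction
    from myapp.models import Record   # hypothetical model

    BATCH_SIZE = 1000

    def load_records(path):
        """Stream records from a JSON-lines file, inserting in batches."""
        batch = []
        with open(path) as f:
            for line in f:                     # one JSON object per line
                batch.append(Record(**json.loads(line)))
                if len(batch) >= BATCH_SIZE:
                    flush(batch)
                    batch = []
        if batch:
            flush(batch)

    @transaction.atomic
    def flush(batch):
        # one INSERT per batch instead of one per row; the batch list
        # is dropped after each flush, so memory use stays bounded
        Record.objects.bulk_create(batch)

The idea is that only one batch is ever held in memory at a time,
unlike loaddata, which seems to deserialize the whole fixture first.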

