On 21/10/10 13:31, Chris Withers wrote:

> ...which is a little odd, given that the file was created by 'dumpdata'.
> Any ideas?
> 

Do you see any genuine weirdness in the format of any stringified
datetimes in the dumped json? Yes I know you've got 132 megs, but I
don't mean check totally manually, just do something like (untested):

import re
from django.utils import simplejson

# serialized DateTimeFields look like "YYYY-MM-DD HH:MM[:SS[.ffffff]]"
dre = re.compile(r'^\d{4}-\d{2}-\d{2} \d{2}:\d{2}(:\d{2})?(\.\d+)?$')
d = simplejson.load(open('/path/to/dump.json'))
# pks of objects whose my_datefield doesn't look like a datetime
# (the "or ''" guards against null values in the dump)
[i['pk'] for i in d if not dre.match(i['fields']['my_datefield'] or '')]

- if that comes back non-empty, you've got some object(s) in the dump
where my_datefield's value doesn't match the format required to convert
the string back into a datetime.
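To make the check concrete, here's a self-contained version run against a
made-up two-object dump (using the stdlib json module rather than
django.utils.simplejson; the model and field names are just placeholders):

```python
import json
import re

# serialized DateTimeFields look like "YYYY-MM-DD HH:MM[:SS[.ffffff]]"
dre = re.compile(r'^\d{4}-\d{2}-\d{2} \d{2}:\d{2}(:\d{2})?(\.\d+)?$')

# a tiny fake dumpdata payload: pk 2 has a datetime in the wrong format
dump = json.loads("""[
  {"pk": 1, "model": "app.thing",
   "fields": {"my_datefield": "2010-10-21 13:31:00"}},
  {"pk": 2, "model": "app.thing",
   "fields": {"my_datefield": "21/10/2010 13:31"}}
]""")

bad = [i['pk'] for i in dump
       if not dre.match(i['fields']['my_datefield'] or '')]
print(bad)  # -> [2]
```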

> I'm on Django 1.1...

Perhaps you could try a different approach: use Django 1.2's multiple
database connections to do a live-to-live transfer, so you're not
dealing with quite so many conversions.  There is also a way to bodge a
mostly-functional 1 read-write + N read-only multiple-connection
arrangement into Django 1.1, but it is horrible (further details
available if you really want them). Of course, just because you're
using 1.1 for your main app doesn't mean you can't use 1.2 to tape
together a transfer tool...
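A rough sketch of what that transfer tool might look like under 1.2 - the
database aliases, engines and model names below are all made up, and this
obviously needs a configured Django project around it:

```python
# settings.py -- both connections defined side by side
DATABASES = {
    'default': {'ENGINE': 'django.db.backends.postgresql_psycopg2',
                'NAME': 'new_db'},
    'old':     {'ENGINE': 'django.db.backends.mysql',
                'NAME': 'old_db'},
}

# transfer script: read each row from 'old', write it to 'default',
# keeping the original primary keys
from myapp.models import MyModel

for obj in MyModel.objects.using('old').all().iterator():
    obj.save(using='default', force_insert=True)
```

That sidesteps the string round-trip entirely - values go datetime to
datetime via the ORM rather than through json.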


-- 
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-us...@googlegroups.com.
To unsubscribe from this group, send email to 
django-users+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en.