Hi all,

I've been hunting bug #3915, and I think I've found the cause, but I
can't see an obvious solution. AFAICT, the problem only exists with
Postgres, because it's the only backend that uses sequences for
primary key allocation.

Here's a sample Django session. MyModel is an arbitrary model - the
definition doesn't matter. The problem comes when you manually
allocate a PK value that the Postgres sequence itself will later try
to hand out:

obj2 = MyModel(id=2, ...)
obj2.save()
# Saves fine, using the provided PK value; the sequence is not advanced.

obj1 = MyModel(...)
obj1.save()
# Saves fine - the sequence allocates PK 1.

obj3 = MyModel(...)
obj3.save()
# Raises an error - the sequence allocates PK 2, which is already taken.
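To make the failure mode concrete without needing a database, here's a minimal pure-Python sketch of what's going on. The Sequence and Table classes are illustrative stand-ins, not Django or Postgres APIs - the key point is that an explicit PK bypasses the sequence entirely, so the sequence later hands out a value that's already in use:

```python
class Sequence:
    """Mimics a Postgres sequence: hands out 1, 2, 3, ... and is NOT
    advanced when a row is inserted with an explicit PK."""
    def __init__(self):
        self.value = 0

    def nextval(self):
        self.value += 1
        return self.value


class Table:
    def __init__(self):
        self.rows = {}
        self.seq = Sequence()

    def insert(self, pk=None):
        if pk is None:
            # No explicit PK: ask the sequence, as Postgres does
            # for a serial column.
            pk = self.seq.nextval()
        # An explicit PK goes straight in; the sequence never hears
        # about it.
        if pk in self.rows:
            raise ValueError("pk is not unique: %d" % pk)
        self.rows[pk] = object()
        return pk


t = Table()
t.insert(pk=2)   # obj2: explicit PK, sequence still at 0
t.insert()       # obj1: sequence yields 1 -- fine
try:
    t.insert()   # obj3: sequence yields 2 -- collides
except ValueError as e:
    print(e)     # pk is not unique: 2
```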

This is a problem with hand-cranked examples like the one above, but
it also becomes a problem if you flush the database and then reload a
fixture. Fixtures define the PK values of the objects they are
restoring, so all objects are created with PKs defined, but after a
flush, the PK sequence says the next available PK is 1. If your
fixture actually _has_ an object with a PK of 1, then you won't be
able to create any new objects of that type.

I can work around this problem in a messy way in the deserializer by
manually resetting the sequence to the largest PK value that was found
during deserialization, but that isn't really clean, and doesn't fix
the manual allocation problem.
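For reference, the sequence reset I have in mind boils down to one Postgres statement per table: setval() pushed past the largest existing PK, with pg_get_serial_sequence() used to find the sequence backing the serial column. A rough sketch that just builds the SQL (the table and column names are hypothetical, and this is Postgres-only):

```python
def sequence_reset_sql(table, pk_column="id"):
    # pg_get_serial_sequence() looks up the sequence that backs a
    # serial column; setval() moves it to the largest existing PK so
    # the next nextval() cannot collide.  coalesce() handles an
    # empty table.
    return (
        "SELECT setval(pg_get_serial_sequence('%s', '%s'), "
        "coalesce(max(%s), 1)) FROM %s;"
        % (table, pk_column, pk_column, table)
    )

print(sequence_reset_sql("myapp_mymodel"))
```

Running that after deserialization papers over the fixture case, but it obviously does nothing for a user who assigns an explicit PK mid-session.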

Does anyone have a better suggestion?

Yours,
Russ Magee %-)

You received this message because you are subscribed to the Google Groups 
"Django developers" group.