Let me make some comments:

When using a primitive, there is no other way to specify nullability. I think we can give the developer the choice between a nullable type and a primitive int, but always warn that a 0 will be saved as NULL in the database, and that a NULL in the database will, in turn, be converted to a 0 in the primitive field. I think a WARN in the docs is sufficient.
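The read side of that warning is plain JDBC behaviour: when a nullable column is read into a primitive, SQL NULL silently becomes 0, and only ResultSet.wasNull() can tell the two apart. A minimal sketch, with table and column names invented just for the illustration:

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class NullToZeroDemo
{
    // Table and column names ("order_line", "qty") are invented for the example.
    static void readQuantity(Connection con) throws Exception
    {
        Statement stmt = con.createStatement();
        ResultSet rs = stmt.executeQuery("SELECT qty FROM order_line WHERE id = 1");
        if (rs.next())
        {
            int qty = rs.getInt("qty");     // a SQL NULL comes back as 0 here
            boolean wasNull = rs.wasNull(); // only this call distinguishes NULL from a real 0
            System.out.println("qty=" + qty + " wasNull=" + wasNull);
        }
        rs.close();
        stmt.close();
    }
}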

Maybe a future "nullable" attribute in the field-descriptor could help, with values "zeroIsNull", "negativeIsNull", "false", and "true" (where "false" means the default action: 0 == 0, -1 == -1, and "true" means the current behaviour: 0 == null; the other options speak for themselves).
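Just to make the proposal concrete, here is a sketch of how such a hypothetical attribute could be evaluated (none of this exists in OJB; the value names are just the ones suggested above):

/**
 * Hypothetical helper for a proposed field-descriptor attribute
 * nullable="zeroIsNull" | "negativeIsNull" | "false" | "true".
 * Only a sketch of the proposal above, not existing OJB code.
 */
public final class NullableMode
{
    private NullableMode() {}

    public static boolean representsNull(String nullable, long value)
    {
        if ("zeroIsNull".equals(nullable) || "true".equals(nullable))
        {
            return value == 0;  // "true" = current behaviour: 0 is stored as NULL
        }
        if ("negativeIsNull".equals(nullable))
        {
            return value < 0;   // any negative value is stored as NULL
        }
        // "false" (proposed default): every value, including 0 and -1, is a real value
        return false;
    }
}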

But for a PK, which doesn't allow NULL, a primitive field with value 0 leaves us with two situations: if the mapping tells us the field is auto-increment, then the user doesn't want a literal 0 but a newly inserted record. Auto-increment columns, as far as I have seen, don't use 0 as their starting point.

<comment type="maybe_offtopic">As an interesting story, some weeks ago a beginner developer asked me about product type codes: 0 - Service, 1 - Resale product, 2 - Own-manufactured product. I told him: never, ever use 0 as a PK, to avoid problems in understanding. Meanwhile, another beginner at his side said "0, in my point of view, is expected to mean nothing, so I'll never use it as a value for a PK". I tend to agree with this view.</comment>

So OJB could test whether the field is auto-increment and, if true, assume it's a new record being inserted. If false, assume it's a real 0, since there are no NULLs in PKs, and perform an update if the record already exists. This change, as far as I can see, will not change the current behaviour of OJB, right?
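A rough sketch of what that could look like, based on the representsNull method quoted below (the isAutoIncrement() accessor name is an assumption here; this only illustrates the idea and is not a tested patch):

// Hypothetical variant of BrokerHelper.representsNull, only to illustrate the
// auto-increment idea above; the isAutoIncrement() accessor name is assumed.
public boolean representsNull(FieldDescriptor fld, Object aValue)
{
    boolean result = false;
    if ((aValue instanceof Number) && (((Number) aValue).longValue() == 0))
    {
        // Treat a primitive 0 as "no key assigned yet" only for auto-increment
        // PK columns; for a plain PK column, 0 stays a legal existing key.
        result = fld.getPersistentField().getType().isPrimitive()
                && fld.isPrimaryKey()
                && fld.isAutoIncrement();
    }
    return result;
}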

Just my 2 cents,

Edson Richter


Armin Waibel wrote:


Guido Beutler wrote:

Hi,

changing

public boolean representsNull(FieldDescriptor fld, Object aValue)
{
    .....
    if (((aValue instanceof Number) && (((Number) aValue).longValue() == 0)))
    {
        result = fld.getPersistentField().getType().isPrimitive() && fld.isPrimaryKey();
    }
    ....
    return result;
}


into

result = fld.getPersistentField().getType().isPrimitive() && !fld.isPrimaryKey();

seems to fix my problem: updates are generated now and inserts work too.
One of our OJB gurus like Armin should take a look at it. ;-)
I only made some small tests and cannot be sure that this doesn't produce side effects.
If my fix is correct, updates never worked for objects with 0 values in primary key fields.



Maybe I'm on the wrong tack, but I can't figure out the change. This method should return 'true'
- when the field has a number value of '0'
- and the field is primitive (short in your case)
- and the field-descriptor is declared as PK


In your patch you return true when the field was not declared as PK. But in that case all values are valid ('0' too). Your patch does the opposite of what I expected. Maybe I'm completely confused ;-)

Again, if you have a class with a primitive short field, declared as PK, with value '0', OJB assumes the field is nullified, and because a PK can't be 'null', OJB assumes the given object is new.
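For illustration, the scenario is roughly this (class and field names are invented for the example):

// Invented example: a class whose primitive short PK legitimately holds 0.
public class ProductType
{
    private short id;    // declared as primary key in the repository mapping
    private String name; // e.g. "Service" for id 0
}

// broker.store(aProductTypeWithId0):
// representsNull sees a primitive PK field with value 0, so hasNullPKField
// returns true and store() goes straight to INSERT, without asking the cache
// or the database whether a row with PK 0 already exists.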
Hope I'm not making a fool of myself ;-)


regards,
Armin

best regards,

Guido


Guido Beutler wrote:


Guido Beutler wrote:

Armin Waibel wrote:

do you use anonymous keys? If so where?

> Do you remember our DataSource problem with 3.2.2.RC3 with missing results?

No, can you describe this problem again? Should I update to JBoss 3.2.3 and run the tests again?






we had the problem that not all objects were returned by OJB. This seemed to be a side effect of the eager-release flag. After updating to JBoss 3.2.3 the problem disappeared. Maybe the behaviour of our current problem is different in 3.2.3.
I'll put some debug code into PersistenceBroker and see what's going on during insert/update.


best regards,

Guido




Hi,

I added some debug code to

PersistenceBrokerImpl:

public void store(Object obj) throws PersistenceBrokerException
{
    ...

    boolean doInsert = serviceBrokerHelper().hasNullPKField(cld, obj);

returns true. The reason seems to be BrokerHelper.representsNull:

public boolean representsNull(FieldDescriptor fld, Object aValue)
{
    .....
    if (((aValue instanceof Number) && (((Number) aValue).longValue() == 0)))
    {
        result = fld.getPersistentField().getType().isPrimitive() && fld.isPrimaryKey();
    }
    ....
    return result;
}


This returns true for my SMALLINT objects if the value is 0. But 0 is a legal value for SMALLINT PK attributes.
After that, PersistenceBrokerImpl.store checks the cache:


/*
 if PK values are set, lookup cache or db to see whether object
 needs insert or update
*/
if (!doInsert)
{
    doInsert = objectCache.lookup(oid) == null
            && !serviceBrokerHelper().doesExist(cld, oid, obj);
}


Because doInsert is already true (I checked it), the cache is never checked for the object. doInsert stays true, and a few lines later


           // now store it:
           store(obj, oid, cld, doInsert);

generates the insert statement. Maybe I'm wrong, but to me it looks like a 0 (not null) in any PK field causes an insert statement.
In my case it is immediately the first object. Is it a good idea to check the cache independently of whether doInsert is true, or is the implementation
of representsNull the cause and should it be changed?
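To make the first alternative concrete (deciding insert vs. update from the cache/database lookup even when hasNullPKField reported a "null" PK), the relevant lines of store() would roughly become something like this; only a sketch, not a tested change:

// Sketch: don't let hasNullPKField alone force an INSERT for a primitive PK
// value of 0; a real change would still need the "null PK" shortcut for
// genuinely unassigned keys (e.g. auto-increment columns).
boolean doInsert = objectCache.lookup(oid) == null
        && !serviceBrokerHelper().doesExist(cld, oid, obj);

// now store it:
store(obj, oid, cld, doInsert);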


best regards,

Guido

















