[
https://issues.apache.org/jira/browse/TUSCANY-1815?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12531763
]
Amita Vadhavkar commented on TUSCANY-1815:
------------------------------------------
I tried the test case below, which is very similar to what you are trying:
-----------------------------------------------------------------------------------------------------------------------------------
public void testMergeUpdateSingleTable() throws Exception {
    // existing records
    DAS das = DAS.FACTORY.createDAS(getConfig("CustomersOrdersConfig.xml"),
            getConnection());
    DataObject root = das.getCommand("all customers").executeQuery();
    DataObject firstCustomer = root.getDataObject("CUSTOMER[ID=1]");
    // DAS will not do an automatic createOrReplace, so do a delete here
    // and a create later
    firstCustomer.delete();
    System.out.println("after delete "
            + XMLHelper.INSTANCE.save(root, "root", "root"));

    // create a new customer using static SDO - say a service is doing it
    String typeUri = "http:///org.apache.tuscany.das.rdb.test/customernew.xsd";
    HelperContext context = HelperProvider.getDefaultContext();
    CustomernewFactory.INSTANCE.register(context);

    ConfigHelper helper = new ConfigHelper();
    helper.setDataObjectModel(typeUri);
    helper.addTable("CUSTOMER", "CUSTOMER");
    helper.addPrimaryKey("CUSTOMER.ID");

    GraphMerger gm = new GraphMerger();
    DataObject graph = gm.emptyGraph(helper.getConfig());
    CUSTOMER c = (CUSTOMER) graph.createDataObject("CUSTOMER");
    c.setID(1);
    c.setLASTNAME("WilliamsNew");
    c.setADDRESS("400 Fourth Street");
    System.out.println("c "
            + XMLHelper.INSTANCE.save(((DataObject) c).getRootObject(), "c", "c"));

    gm.addPrimaryKey("CUSTOMER.ID");
    DataObject root1 = gm.merge(root, ((DataObject) c).getRootObject());
    System.out.println("after merge "
            + XMLHelper.INSTANCE.save(root1, "root1", "root1"));

    das.applyChanges(root1);
}
---------------------------------------------------------------------------------------------------------------------------------
So I am mixing types from dynamic DataObjects with types coming in from the
model XSD. I did not get a ClassCastException, but a nice :) SQLIntegrity one
like the one below:
java.lang.RuntimeException: java.sql.SQLIntegrityConstraintViolationException:
The statement was aborted because it would have caused a duplicate key value in
a unique or primary key constraint or unique index identified by
'SQL071002050115810' defined on 'CUSTOMER'.
    at org.apache.tuscany.das.rdb.impl.WriteCommandImpl.basicExecute(WriteCommandImpl.java:94)
    at org.apache.tuscany.das.rdb.impl.ChangeOperation.execute(ChangeOperation.java:72)
    at org.apache.tuscany.das.rdb.impl.Changes.execute(Changes.java:57)
    at org.apache.tuscany.das.rdb.impl.ApplyChangesCommandImpl.execute(ApplyChangesCommandImpl.java:68)
    at org.apache.tuscany.das.rdb.impl.DASImpl.applyChanges(DASImpl.java:310)
    at org.apache.tuscany.das.rdb.test.GraphMergeTests.testMergeUpdateSingleTable(GraphMergeTests.java:217)
    ....
Caused by: java.sql.SQLIntegrityConstraintViolationException: The statement was
aborted because it would have caused a duplicate key value in a unique or
primary key constraint or unique index identified by 'SQL071002050115810'
defined on 'CUSTOMER'.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.Util.generateCsSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.wrapInSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.TransactionResourceImpl.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedConnection.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.ConnectionChild.handleException(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedStatement.executeStatement(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeStatement(Unknown Source)
    at org.apache.derby.impl.jdbc.EmbedPreparedStatement.executeUpdate(Unknown Source)
    at org.apache.tuscany.das.rdb.impl.Statement.executeUpdate(Statement.java:172)
    at org.apache.tuscany.das.rdb.impl.Statement.executeUpdate(Statement.java:132)
    at org.apache.tuscany.das.rdb.impl.Statement.executeUpdate(Statement.java:136)
    at org.apache.tuscany.das.rdb.impl.WriteCommandImpl.basicExecute(WriteCommandImpl.java:92)
    ... 21 more
Caused by: java.sql.SQLException: The statement was aborted because it would
have caused a duplicate key value in a unique or primary key constraint or
unique index identified by 'SQL071002050115810' defined on 'CUSTOMER'.
    at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
    at org.apache.derby.impl.jdbc.SQLExceptionFactory40.wrapArgsForTransportAcrossDRDA(Unknown Source)
    ... 34 more
---------------------------------------------------------------------------------------------------------------------------------
Also, I can see that a very small fix can solve this exception. In the
Changes.java execute() method - which is executed for das.applyChanges(root1) -
the changes are attempted in the order 1) insert, 2) update, 3) delete.
Due to this, even though the test case was trying to do a delete (for ID=1) and
then a create (for ID=1), the DAS runtime was attempting the
INSERT before the DELETE and was throwing the SQLIntegrity exception.
A simple/quick solution would be to follow the order 1) delete, 2) insert,
3) update - I verified that with this, the delete and insert happen one after
the other with no exception.
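The quick fix (run all deletes before any insert) can be sketched with plain
collections; ChangeOrderingSketch, Op, and reorder() are illustrative names
here, not the real DAS runtime classes:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class ChangeOrderingSketch {
    // Execution priority for the quick fix: deletes first, then inserts,
    // then updates (declaration order doubles as priority via ordinal()).
    enum Op { DELETE, INSERT, UPDATE }

    // Re-order the queued change operations so every DELETE runs
    // before any INSERT or UPDATE.
    static List<Op> reorder(List<Op> changes) {
        List<Op> sorted = new ArrayList<>(changes);
        sorted.sort(Comparator.comparingInt(Op::ordinal));
        return sorted;
    }

    public static void main(String[] args) {
        // The failing sequence: the runtime bucketed the INSERT (for ID=1)
        // ahead of the DELETE (for ID=1), triggering the constraint error.
        List<Op> bucketedAsToday = Arrays.asList(Op.INSERT, Op.UPDATE, Op.DELETE);
        System.out.println(reorder(bucketedAsToday)); // [DELETE, INSERT, UPDATE]
    }
}
```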
But a better solution would be to maintain the order of changes the user
imposes through the calls, i.e. if the user has a mix of many tables in a root
and is doing some complex combination of CUD operations following some logic,
DAS needs to preserve this order instead of bucketing all
inserts in bucket1, all deletes in bucket2, ... and then executing the buckets.
For this to work, commonj.sdo.ChangeSummary.getChangedDataObjects()
needs to return an ordered list. Need to confirm this with SDO.
I am putting this in a separate mail on the ML to get some feedback from others
(DAS+SDO) about what the correct approach will be.
Please note that in the example code you are showing, just creating the Type
may not register that Type in the DAS context. But
gm.emptyGraph(helper.getConfig()) will enable DAS to create a DataObject for
that Type.
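For context on the report below, a saveOrUpdate-style behavior would amount to
a primary-key existence check before choosing which statement to generate. A
minimal sketch with plain collections (SaveOrUpdateSketch and planStatement
are illustrative names, not DAS API):

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class SaveOrUpdateSketch {
    // Hypothetical decision: if the primary key already exists, generate an
    // UPDATE; otherwise an INSERT. A real implementation would consult the
    // database (or the change summary), not an in-memory key set.
    static String planStatement(Set<Integer> existingIds, int id) {
        return existingIds.contains(id) ? "UPDATE" : "INSERT";
    }

    public static void main(String[] args) {
        Set<Integer> existingIds = new HashSet<>(Arrays.asList(1, 2, 3));
        System.out.println(planStatement(existingIds, 1));   // UPDATE
        System.out.println(planStatement(existingIds, 100)); // INSERT
    }
}
```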
> das.applyChanges will always do insert instead of update if createDataObject
> was used
> -------------------------------------------------------------------------------------
>
> Key: TUSCANY-1815
> URL: https://issues.apache.org/jira/browse/TUSCANY-1815
> Project: Tuscany
> Issue Type: Bug
> Components: Java DAS RDB
> Affects Versions: Java-DAS-beta1, Java-DAS-Next
> Environment: DB2 Iseries
> Reporter: Nick Duncan
> Assignee: Amita Vadhavkar
> Fix For: Java-DAS-Next
>
>
> If I do something like:
> -------------------------------
> DataObject root = das.getCommand("AllAutos").executeQuery();
>
> DataObject dao = root.createDataObject("t_test");
> dao.set("NAME", "NICK");
> dao.set("ID", 100);
>
> das.applyChanges(root);
> -------------------------------------
> There is already a row in the table with primary key 100. ID is defined in
> the config xml as being the primary key, it ignores that ID was set and does
> an insert statement. ID is also defined as an auto generated column.
> Basically I was expecting something like Hibernate's saveOrUpdate... Maybe
> if the field that represents the primary key is shown to have been changed in
> the changeSummary, DAS will figure out what statement to generate.
> Where I'm seeing this being a problem is: if we get a DataObject that
> represents a row in the database and then send it off to a service, we get
> back another object that has been updated; but since it is a different
> object, the DAS will think it should do an insert? Merge doesn't seem to
> alleviate this problem either. Please let me know if I am way off base here.
> Thanks
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.