Quoted from Aaron's comments on JIRA (https://issues.apache.org/jira/browse/ODE-704):

Hi Jeff,

Thanks for the pointers on getting me started on this task. I have set up a
GitHub project and created a fork of your JPA branch. I have made some
progress separating the JPA and Hibernate code based on your initial effort,
and I have also started to unify the JPA DAO patterns used in the DAO and
bpel-store modules. I had a few questions and was hoping you could give me
some guidance on them.

1) Maven modules - To prevent module proliferation, I was thinking that the
DAO, store, and scheduler implementations could live in a single module per
persistence implementation. For example,

ode-dao - contains the new generic DAO ConnectionFactory interface(s) and the
existing core DAOConnection interfaces
dao-jpa - contains all the JPA entities, including dao, store, and (soon)
scheduler. The store and scheduler implementations would still live in
separate packages and persistence units.
dao-hibernate - contains all the Hibernate entities, including dao, store,
and (soon) scheduler. Includes factory implementation classes that implement
the generic interfaces in ode-dao
dao-jpa-ojpa - contains factory implementation classes that implement the
generic interfaces in ode-dao. The factory primes the JPA environment with
the OpenJPA-specific properties
dao-jpa-hibernate - same as dao-jpa-ojpa but for Hibernate
bpel-store - contains factory implementation classes that implement the
generic interfaces in ode-dao, plus the existing store connection code
bpel-scheduler-simple - same as bpel-store, with a new connection-based DAO

il-common - OdeConfigProperties updated to include new factory class lookups
for the store and scheduler implementations (a rough sketch follows below)

What do you think of this approach?
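
To make the il-common piece a bit more concrete, the new lookups might look
roughly like the following; the property keys and method names here are my
assumptions for discussion, not existing ODE code:

//Illustrative additions to OdeConfigProperties (key and method names are
//assumptions only)
public String getDAOConnectionFactoryStore() {
  return getProperty("dao.factory.store");
}

public String getDAOConnectionFactoryScheduler() {
  return getProperty("dao.factory.scheduler");
}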

2) OpenJPA class enhancements - OpenJPA requires that entity classes be
enhanced before they can be used, unless a JavaEE container or OpenJPA's
runtime agent is employed. The problem is that these class enhancements would
interfere with other JPA implementations, and duplicating the entities per
implementation module would lead to too much redundancy. To address this I
took the approach of extending the dao-jpa Maven POM to duplicate the target
classes, run the enhancer on the copy, and then run the maven-jar plugin
again with a classifier of openjpa, so that two versions of the module are
stored in the Maven repo. Is this a valid approach?

3) When examining the ODE 2.0 source code I found a lot of references to JDBC
DataSources and transaction managers in the DAO classes. To me, the DAO
abstraction classes should be agnostic of the storage implementation. I would
like to refactor the connection factory interfaces to share a common parent
interface and introduce a common DAO strategy.

//new common interface
public interface DAOConnectionFactory<C> {

  C getConnection();

  <E> void init(OdeConfigProperties p, E envCtx);

  void shutdown();
}

//Module specific DAO access interface (dao, store, scheduler, etc)
public interface BpelDAOConnectionFactory extends DAOConnectionFactory<BpelDAOConnection> {

  //Redeclared for clarity; the generic parameter already fixes the return type
  BpelDAOConnection getConnection();
}
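
The store and scheduler layers would presumably follow the same pattern; the
interface and connection type names below are placeholders to illustrate the
idea, not the existing ODE names:

//Illustrative analogs for the store and scheduler modules (names are placeholders)
public interface StoreDAOConnectionFactory extends DAOConnectionFactory<ConfStoreDAOConnection> {

  ConfStoreDAOConnection getConnection();
}

public interface SchedulerDAOConnectionFactory extends DAOConnectionFactory<SchedulerDAOConnection> {

  SchedulerDAOConnection getConnection();
}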

//Implementation specific factory that can be instantiated by the module
//using an OdeConfigProperties lookup
public class BpelDAOConnectionFactoryJPAImpl implements BpelDAOConnectionFactory {

  public BpelDAOConnection getConnection() {
    return new BpelDAOConnectionJpaImpl();
  }

  public <E> void init(OdeConfigProperties p, E envCtx) {
    //envCtx is expected to be a JDBCContext for the JPA implementation
  }

  public void shutdown() {
  }
}

//An implementation-specific environment context that the runtime in which
//ODE is executed can pass opaquely through the DAO layer into the
//implementation.
public class JDBCContext {

  private final DataSource dataSource;
  private final TransactionManager transactionManager;

  public JDBCContext(DataSource dataSource, TransactionManager transactionManager) {
    this.dataSource = dataSource;
    this.transactionManager = transactionManager;
  }

  public DataSource getDataSource() { return dataSource; }
  public TransactionManager getTransactionManager() { return transactionManager; }
}
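
To show how the pieces above would fit together, here is a minimal wiring
sketch, assuming the factory class name can be looked up from
OdeConfigProperties via a getter like getDAOConnectionFactory() (that getter
is an assumption on my part):

//Minimal wiring sketch; the OdeConfigProperties getter is an assumption
public class DaoWiringSketch {

  public static BpelDAOConnection openConnection(OdeConfigProperties props, JDBCContext ctx)
      throws Exception {
    //Look up the configured factory implementation class and instantiate it reflectively
    String factoryClass = props.getDAOConnectionFactory();
    BpelDAOConnectionFactory factory =
        (BpelDAOConnectionFactory) Class.forName(factoryClass).newInstance();

    //The environment context is passed opaquely; only the JPA/Hibernate
    //implementation needs to know it is a JDBCContext
    factory.init(props, ctx);
    return factory.getConnection();
  }
}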


What are your thoughts on this approach?

Regards,

Aaron

-- 
Cheers,
Jeff Yu

----------------
blog: http://jeff.familyyu.net
