Hi Tammo,

> For the migration from old omodels one approach would be to create a copy
> of the OModel in a different package, do the refactoring there and change
> the compiler and the runtime to point to the new OModel.
> When an old cbp-file is found, it is loaded to the old OModel, is then
> migrated to the new OModel (using that mysterious function X) and is then
> written back to a file in order to complete the migration.
I think both the approach and the function X for migration are very
practical. I'll need to go into the source code for more details.
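To make sure I understand the flow, here is a minimal sketch of that one-time migration step. All class and method names are invented for illustration; the real OModel classes live in the ODE source:

```java
// Hypothetical sketch of the migration path described above: load a legacy
// .cbp with the old OModel classes, convert it via "function X", and write
// it back so later deployments go straight to the new model.
public class CbpMigrationSketch {
    // Stand-ins for the old and new OModel process classes (names invented).
    static class OldOProcess { String name; }
    static class NewOProcess { String name; }

    // "Function X": copy/convert the legacy object graph into the new model.
    static NewOProcess migrate(OldOProcess old) {
        NewOProcess p = new NewOProcess();
        p.name = old.name; // each field carried over (or defaulted) explicitly
        return p;
    }

    public static void main(String[] args) {
        OldOProcess legacy = new OldOProcess();
        legacy.name = "HelloWorld";
        NewOProcess migrated = migrate(legacy);
        System.out.println(migrated.name); // carried over unchanged
    }
}
```

The point being that the old classes are only needed at migration time, after which the file on disk is in the new format.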

> First the migration from the old model to the new model. Second the
> migration from the new model to newer models. The goal of the project is
> to make the second point as hassle-free as possible.
I'd like to talk a little more about the compatibility and migration
topics. I think they are somewhat more involved than I originally thought.

The current OModel maps each BPEL entity to a corresponding Java object,
e.g. the whole process to OProcess. When ODE is running, it watches a
certain directory (WEB-INF/processes/) for new deployments. If an
uncompiled .bpel file is found, it reads the .bpel into BOM objects and
then serializes them into a .cbp file. When processes are invoked, the
runtime loads the .cbp files and runs the processes.
In my view, the OModel can be divided into two parts, a serialization
mechanism and a Java representation mechanism. When BPEL evolves, I think
two types of compatibility should be considered in migration.
For example, suppose we want to support new elements in a process:
<bpel:process>
  ...
  <b4p:humanInteractions>
    ...
  </b4p:humanInteractions>
  ...
</bpel:process>

First, serialization compatibility
In our OProcess implementation, there are no fields corresponding to the
new elements, so we would implement a new version of OProcess with the new
elements as new fields. The new version is not compatible with the old one
under Java serialization, which means old binaries cannot be deserialized
into new runtime objects. We would then need to run different OModels in
parallel, and as a result duplicate the run-time code. If we use a
JSON-like serialization mechanism, deserialization compatibility becomes
much easier, and I think such a mechanism can achieve serialize/deserialize
compatibility without changing the Java representation mechanism of the
OModel (using simple fields in the serializable classes of bpel-obj).
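To illustrate why a field-name-based (JSON-like) format tolerates model evolution, here is a minimal hand-rolled sketch (all names invented, no real serialization library): an "old" record that lacks the new field still loads into the "new" class, because missing keys simply stay at their defaults and unknown keys would be ignored:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of JSON-like, field-name-based deserialization: a record
// written before a field existed still loads without error.
public class JsonLikeDemo {
    static class OProcessV2 {
        String name;
        String humanInteractions; // new field, absent in old records

        // Populate only the fields present in the record; ignore unknown keys.
        static OProcessV2 fromRecord(Map<String, String> record) {
            OProcessV2 p = new OProcessV2();
            if (record.containsKey("name"))
                p.name = record.get("name");
            if (record.containsKey("humanInteractions"))
                p.humanInteractions = record.get("humanInteractions");
            return p;
        }
    }

    public static void main(String[] args) {
        // A record written by the *old* model: no humanInteractions key.
        Map<String, String> oldRecord = new HashMap<>();
        oldRecord.put("name", "HelloWorld");

        OProcessV2 p = OProcessV2.fromRecord(oldRecord);
        System.out.println(p.name);                      // HelloWorld
        System.out.println(p.humanInteractions == null); // true: defaulted, no error
    }
}
```

Contrast this with default Java serialization, where adding a field changes the computed serialVersionUID and deserializing an old stream fails with InvalidClassException unless the UID is pinned explicitly.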
Second, Java representation compatibility
In some cases, like bug fixes, serialization compatibility seems
sufficient, but in many cases it is not. If we decide to support the
BPEL4People extension, the Java representation of the OModel will change,
e.g. new fields or classes, and the compiler and run-time then need to
change to adapt. Under ideal conditions they could adapt to the new model,
but I think it is impossible to stay fully compatible as BPEL evolves. We
need to design the Java representation mechanism, the compiler and the
runtime carefully to keep the required changes as small as possible.
Further, if BPEL4People is supported, would it be possible to disable it?
If I don't need the feature, enabling it would waste resources. I think a
plugin mechanism for the compiler and runtime might be an option. I haven't
gone into the source code of these components yet, so I have no idea
whether it's practical...
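What I have in mind by "plugin mechanism" is roughly the following sketch (all names and the namespace URI are invented): the compiler consults a registry of extension handlers keyed by namespace, so an extension that is not registered costs nothing, and its elements are simply skipped or rejected:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of a plugin seam for extensions such as BPEL4People:
// handlers are looked up by namespace; unregistered extensions cost nothing.
public class PluginDemo {
    interface ExtensionCompiler {
        String namespaceUri();
        String compile(String element); // stand-in for emitting an OModel fragment
    }

    // Example handler for a (made-up) b4p namespace.
    static class B4PCompiler implements ExtensionCompiler {
        public String namespaceUri() { return "http://example.org/b4p"; }
        public String compile(String element) { return "O:" + element; }
    }

    static final List<ExtensionCompiler> REGISTRY = new ArrayList<>();

    // Find a handler for the element's namespace, or null if none is enabled.
    static String compileExtension(String ns, String element) {
        for (ExtensionCompiler c : REGISTRY)
            if (c.namespaceUri().equals(ns))
                return c.compile(element);
        return null; // extension not enabled: skip (or reject) the element
    }

    public static void main(String[] args) {
        // No plugin registered yet: the element is simply not compiled.
        System.out.println(compileExtension("http://example.org/b4p", "humanInteractions"));
        // Register the plugin and compile the same element.
        REGISTRY.add(new B4PCompiler());
        System.out.println(compileExtension("http://example.org/b4p", "humanInteractions"));
    }
}
```

In a real implementation the registry could be filled via java.util.ServiceLoader or the deployment descriptor, but whether this fits ODE's compiler internals is exactly what I'd need to check in the source.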

In my opinion, the former kind of compatibility could be achieved almost
once and for all by a JSON-like or other flexible serialization mechanism,
and it would cause little trouble in later migrations. But the latter goes
beyond the OModel and arises in every migration; the best we can do is
reduce the migration cost. That may involve more than an OModel refactor,
touching the compiler and runtime to some degree.
Hoping I made myself clear.

Thanks,
  Zhen
