Hi Zhen,

I think the proposal is just fine. Please also add a section explaining
how much time you can spend and whether there are other (conflicting)
assignments during the GSoC schedule. Apart from that, I'm happy to see
the proposal submitted.

Thanks,
  Tammo


On Wed, Mar 19, 2014 at 3:28 PM, Fang Zhen <fangzhenz...@gmail.com> wrote:

> Hi Tammo,
> I think I'm now clear, on the whole, about what the project should do and
> how. My proposal is attached. I'm hoping for your feedback on anything :-).
>
> Thanks,
>   Zhen
>
>
> 2014-03-18 21:14 GMT+08:00 Tammo van Lessen <tvanles...@gmail.com>:
>
>> Hi Zhen,
>>
>> Good analysis. As a side remark, Java serialization actually can deal with
>> the addition of new fields; however, we sometimes had difficulties even
>> with that. I agree that if the new OModel provides the flexibility, plugin
>> mechanisms for the compiler and runtime would be needed for extensions
>> like BPEL4People. Some time ago I developed (basic) support for extension
>> activities (which is now hibernating in some branch, basically waiting for
>> the new OModel ;)) and struggled with exactly these problems. Although it's
>> possible to add hooks to the compiler and runtime for plugins, there was
>> no way to let a plugin extend the OModel. It would be very good if the
>> OModel could support this. From a design perspective, I think it could be
>> solved like this: reimplement the OModel based on maps and provide
>> basically two APIs. One low-level API wraps the map and allows plugins to
>> add new, plugin-related, perhaps prefixed fields. The other API provides
>> convenient access to the fields needed by the core runtime, e.g. following
>> what the current OModel provides. The serialization, however, should be
>> based on JSON (or Smile). Since it doesn't matter for the JSON
>> representation whether the data is backed by fields or a map, it should be
>> possible to keep the mismatch between a JSON serialization of the old
>> OModel and the new OModel as small as possible.
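>>
>> Just to sketch the idea (all class and method names below are made up,
>> not existing ODE code):
>>
>>   import java.util.HashMap;
>>   import java.util.Map;
>>
>>   // Sketch of a map-backed OModel class exposing the two proposed APIs.
>>   public class OProcess {
>>
>>       // Low-level API: a single map holds all model data; plugins can
>>       // add their own, ideally prefixed, entries.
>>       private final Map<String, Object> fields = new HashMap<String, Object>();
>>
>>       public Object getField(String name) {
>>           return fields.get(name);
>>       }
>>
>>       public void setField(String name, Object value) {
>>           fields.put(name, value);
>>       }
>>
>>       // Convenience API for the core runtime, backed by the same map and
>>       // mirroring what the current OModel offers.
>>       public String getProcessName() {
>>           return (String) fields.get("processName");
>>       }
>>
>>       public void setProcessName(String name) {
>>           fields.put("processName", name);
>>       }
>>   }
>>
>>   // A BPEL4People plugin could then store its data without touching the class:
>>   // oProcess.setField("b4p:humanInteractions", myHumanInteractionsModel);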
>>
>> Regarding migrations within the OModel, we could add a version attribute
>> and give all OModel classes the possibility to migrate. For instance, we
>> could check whether the version of the loaded model is older than the
>> version of the implementation and then call a migrate() function
>> recursively on the OModel. Each class could then decide whether it needs
>> to migrate something or not.
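>>
>> Roughly something like this (only a sketch, the names are invented):
>>
>>   import java.util.Collection;
>>   import java.util.Collections;
>>
>>   // Sketch of a version-aware OModel base class.
>>   public abstract class OBase {
>>
>>       // Called after loading, when the stored model version is older than
>>       // the version of the current implementation.
>>       public void migrate(int fromVersion, int toVersion) {
>>           // Default: nothing to migrate here, just recurse into children.
>>           for (OBase child : getChildren()) {
>>               child.migrate(fromVersion, toVersion);
>>           }
>>       }
>>
>>       // Subclasses return their nested OModel elements.
>>       protected Collection<OBase> getChildren() {
>>           return Collections.emptyList();
>>       }
>>   }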
>>
>> WDYT?
>>
>> Are you already in the process of composing a proposal? If I'm not
>> mistaken, there are only a couple of days left, so if you would like to
>> get my feedback beforehand, it would be good to discuss it soonish :)
>>
>> Best,
>>   Tammo
>>
>>
>>
>>
>> On Sat, Mar 15, 2014 at 3:41 PM, Fang Zhen <fangzhenz...@gmail.com>
>> wrote:
>>
>> > Hi Tammo,
>> >
>> > > For the migration from old OModels, one approach would be to create a
>> > > copy of the OModel in a different package, do the refactoring there,
>> > > and change the compiler and the runtime to point to the new OModel.
>> > > When an old cbp file is found, it is loaded into the old OModel, then
>> > > migrated to the new OModel (using that mysterious function X), and
>> > > then written back to a file in order to complete the migration.
>> > I think the approach and the Function X for migration are both very
>> > practical. I need to go into the source code for more details.
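>> >
>> > To check my understanding of the flow, it could look roughly like this
>> > (every name below is just a placeholder, not real ODE API):
>> >
>> >   import java.io.File;
>> >
>> >   // Sketch of the cbp migration flow described above.
>> >   public class CbpMigrator {
>> >
>> >       public Object load(File cbpFile) {
>> >           if (isOldFormat(cbpFile)) {
>> >               Object oldModel = readWithJavaSerialization(cbpFile); // old OModel classes
>> >               Object newModel = migrate(oldModel);                  // the "function X"
>> >               writeAsJson(newModel, cbpFile);                       // migration completed
>> >               return newModel;
>> >           }
>> >           return readFromJson(cbpFile);
>> >       }
>> >
>> >       // Placeholders -- the real implementations are what the project is about.
>> >       private boolean isOldFormat(File f) { return false; }
>> >       private Object readWithJavaSerialization(File f) { return null; }
>> >       private Object migrate(Object oldModel) { return oldModel; }
>> >       private void writeAsJson(Object model, File f) { }
>> >       private Object readFromJson(File f) { return null; }
>> >   }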
>> >
>> > > First, the migration from the old model to the new model. Second, the
>> > > migration from the new model to newer models. The goal of the project
>> > > is to make the second point as hassle-free as possible.
>> > I'd like to talk a little more about the compatibility and migration
>> > topics. I think it is somewhat more sophisticated than I originally
>> > thought.
>> >
>> > The current OModel maps each BPEL entity to a corresponding Java object,
>> > e.g. the whole process to OProcess. While ODE is running, it watches a
>> > certain directory (WEB-INF/processes/) for new deployments. If an
>> > uncompiled .bpel file is found, it reads the .bpel into BOM objects and
>> > then serializes them into a .cbp file. When processes are invoked, the
>> > runtime loads the .cbp files and runs the processes.
>> > In my view, the OModel can be divided into two parts: the serialization
>> > mechanism and the Java representation mechanism. When BPEL evolves, I
>> > think two types of compatibility should be considered in a migration.
>> > For example, suppose we want to support new elements in a process:
>> > <bpel:process>
>> >   ...
>> >   <b4p:humanInteractions>
>> >     ...
>> >   </b4p:humanInteractions>
>> >   ...
>> > </bpel:process>
>> >
>> > First, serialization compatibility.
>> > In our OProcess implementation there are no fields corresponding to the
>> > new elements, so we would implement a new version of OProcess with the
>> > new elements as new fields. The new version is not compatible with the
>> > old one under Java serialization, which means old-version binaries cannot
>> > be deserialized into new-version runtime objects. We would then need to
>> > run different OModels in parallel and, as a result, duplicate the runtime
>> > code. If we use a JSON-like serialization mechanism, deserialization
>> > compatibility becomes much easier. I also think a JSON-like serialization
>> > mechanism can achieve serialize/deserialize compatibility without
>> > changing the Java representation mechanism of the OModel (i.e. keeping
>> > simple fields in the serializable classes of bpel-obj).
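>> >
>> > For instance, with a JSON library such as Jackson (only an illustration
>> > of the idea, not a decision for a particular library), old data loads
>> > into a newer class and unknown fields can simply be ignored:
>> >
>> >   import com.fasterxml.jackson.databind.DeserializationFeature;
>> >   import com.fasterxml.jackson.databind.ObjectMapper;
>> >
>> >   public class JsonCompatDemo {
>> >
>> >       // "New" OProcess with an extra field; old serialized data lacks it.
>> >       public static class OProcess {
>> >           public String name;
>> >           public String humanInteractions; // new field, stays null for old data
>> >       }
>> >
>> >       public static void main(String[] args) throws Exception {
>> >           ObjectMapper mapper = new ObjectMapper();
>> >           // An older runtime can ignore fields it does not know yet...
>> >           mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
>> >
>> >           // ...and fields missing in older data simply keep their defaults.
>> >           String oldJson = "{\"name\":\"HelloWorld\"}";
>> >           OProcess p = mapper.readValue(oldJson, OProcess.class);
>> >           System.out.println(p.name + " / " + p.humanInteractions); // HelloWorld / null
>> >       }
>> >   }
>> >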
>> > Second, Java representation compatibility.
>> > In some cases, like a bugfix, serialization compatibility seems
>> > sufficient, but in many cases it is not. If we decide to support the
>> > BPEL4People extension, the Java representation of the OModel will change,
>> > e.g. new fields or classes, and the compiler and runtime then need to
>> > change to adapt. Under ideal conditions they could adapt to the new
>> > model, but I think it is impossible to stay completely compatible as
>> > BPEL evolves. We need to design the Java representation mechanism, the
>> > compiler and the runtime carefully to keep the required changes as small
>> > as possible. Furthermore, if BPEL4People is supported, would it be
>> > possible to disable it, since a user who does not need the feature would
>> > only waste resources by enabling it? I think a plugin mechanism for the
>> > compiler and runtime might be an option. I haven't gone into the source
>> > code of these components yet, so I have no idea whether it's practical...
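>> >
>> > Maybe something along these lines (a very rough sketch with invented
>> > names, building on the low-level map API idea):
>> >
>> >   import org.w3c.dom.Element;
>> >
>> >   // Sketch of a compiler-side extension hook, e.g. for BPEL4People.
>> >   public interface CompilerExtension {
>> >
>> >       // Namespace of the extension elements this plugin understands.
>> >       String getNamespaceUri();
>> >
>> >       // Called when the compiler encounters an element of that namespace;
>> >       // the plugin would store its result via the map-based OModel API.
>> >       void compile(Element extensionElement, Object oModelTarget);
>> >   }
>> >
>> > If no plugin is registered for a namespace, the feature would simply
>> > stay disabled.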
>> >
>> > In my opinion, the former kind of compatibility could be achieved almost
>> > once and for all by a JSON-like or other flexible serialization
>> > mechanism, and it would cause little trouble in later migrations. The
>> > latter kind, however, goes beyond the OModel and shows up in every
>> > migration. What we can do is our best to reduce the migration cost. It
>> > may involve more than an OModel refactoring, and also touch the compiler
>> > and runtime to some degree.
>> > I hope I made myself clear.
>> >
>> > Thanks,
>> >   Zhen
>> >
>>
>>
>>
>> --
>> Tammo van Lessen - http://www.taval.de
>>
>
>


-- 
Tammo van Lessen - http://www.taval.de
