Hi (again). :)
My comments below.


Dan Becker wrote:
Vincent Zurczak wrote:
Luciano Resende wrote:
As you mentioned, the Tuscany runtime will validate the composites
when processing the contribution. I don't recall a specific tool
that would just parse and validate the composite, but a tool like this
should be very simple to do, or you could try the Eclipse STP SCA Tools
and see if they have something like this.
 
There is indeed something in progress about validation in the SCA Tools project.

The validation relies on the SCA meta-model (which was recently updated to support Tuscany 1.4 - see [0]) and extra validation rules.

It works like Java validation in Eclipse: every time a composite file is changed in an SCA project, it goes through a validation step. When errors are found, error markers are added to the file and shown in the "Problems" view.

There is still work to do on it, mainly to link error markers with editors.
We are also working to run validation when SCA-annotated Java code is modified.

For the moment, I have not found time to write documentation about this mechanism in the wiki [1].
It's only available in the trunk and in the recent builds for Eclipse Galileo, but if you want to test it, I can help you. It's quite simple.
And if you want to help, you're welcome. :)


Hello Vincent,

Be aware that the validation done in the Eclipse STP tools is based on the Tuscany SCA schemas. It validates according to the rules defined in the schema. Additionally, there is also model information behind the syntax that can be validated.
Actually, in STP, validation is performed against the SCA meta-model.
The Tuscany SCA schemas only define the meta-model extensions for Tuscany elements (mainly implementations, bindings and interfaces).
The SCA core elements form the basis of the meta-model.

Simon mentions 3 steps to loading a composite. Each one of these steps is a candidate for validation:

>> - Read the composite xml - create a internal model of the XML from the composite file

This is the step where schema validation is important. Are there the correct number of attributes and elements? Are they in the correct order? Are restricted fields populated with valid alternatives?
The SCA meta-model is based on the SCA specifications and implemented with EMF. It deals with all of the things you mentioned.
This meta-model is used in the SCA Composite Designer (the graphical composite editor from STP).
Besides, the SCA meta-model defines additional constraints, which are listed here [0].
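By the way, the kind of schema-level checks you describe (correct attributes, correct order, valid values) can already be sketched with the JDK's built-in javax.xml.validation API. A minimal, self-contained example follows; the tiny inline schema and the class name are hypothetical and only stand in for the real SCA/Tuscany schemas:

```java
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import java.io.StringReader;

public class SchemaCheck {
    // Hypothetical, minimal XSD standing in for the real SCA schemas.
    static final String XSD =
        "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>" +
        "  <xs:element name='composite'>" +
        "    <xs:complexType>" +
        "      <xs:sequence>" +
        "        <xs:element name='component' maxOccurs='unbounded'>" +
        "          <xs:complexType><xs:attribute name='name' use='required'/></xs:complexType>" +
        "        </xs:element>" +
        "      </xs:sequence>" +
        "      <xs:attribute name='name' use='required'/>" +
        "    </xs:complexType>" +
        "  </xs:element>" +
        "</xs:schema>";

    /** Returns true when the document is valid against the XSD. */
    static boolean isValid(String xml) {
        try {
            SchemaFactory factory = SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
            Schema schema = factory.newSchema(new StreamSource(new StringReader(XSD)));
            Validator validator = schema.newValidator();
            validator.validate(new StreamSource(new StringReader(xml)));
            return true;
        } catch (Exception e) {
            return false; // a validation or parse error was reported
        }
    }

    public static void main(String[] args) {
        String ok  = "<composite name='c'><component name='a'/></composite>";
        String bad = "<composite><component/></composite>"; // missing required attributes
        System.out.println(isValid(ok));   // true
        System.out.println(isValid(bad));  // false
    }
}
```

Of course, schema validation only catches structural problems; everything beyond that is where the meta-model constraints come in.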

Among these constraints, the following ones are already implemented in the plug-in "org.eclipse.stp.sca.validation":
  • The component name must be unique across all the components in the composite
  • The name of a composite reference must be unique across all the composite references in the composite
  • The name of a composite service must be unique across all the composite services in the composite
  • It is forbidden to wire a component reference to a component service of the same component.
The others should soon follow.
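The first three rules above all boil down to a duplicate-name scan over the model. A minimal sketch (the class and method names are hypothetical, not STP API):

```java
import java.util.*;

public class UniqueNameCheck {
    /** Returns the names that occur more than once in the given list. */
    static Set<String> duplicates(List<String> names) {
        Set<String> seen = new HashSet<>();
        Set<String> dups = new LinkedHashSet<>();
        for (String n : names) {
            if (!seen.add(n)) dups.add(n); // add() returns false on a repeat
        }
        return dups;
    }

    public static void main(String[] args) {
        // Component names as they might appear in a composite.
        List<String> components =
            Arrays.asList("StoreComponent", "CatalogComponent", "StoreComponent");
        System.out.println(duplicates(components)); // [StoreComponent]
    }
}
```

In the real plug-in, each duplicate would of course become an error marker on the composite file rather than a console message.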
Besides, the SCA meta-model can be used inside Eclipse, but also standalone (it is made up of two or three JARs).
See the plug-in "org.eclipse.stp.sca.tests" for an example (class "org.eclipse.stp.sca.tests.ScaExample").

The validator I mentioned in my first mail mainly relies on this model for validation.

>> - resolve the composite - find all the things that the composite refers to such as Java component implementation files and WSDL interface definitions.

In this step some of the model information can be validated. Does the component implementation exist and do the methods match in name, parameters, and return values? Does the WSDL interface match the composite service interfaces?
Completing the model instance with resources (implementations, interfaces) is in progress, although we only deal with Java implementations and interfaces for the moment.
We have a new component in STP to introspect Java code and complete model instances.

I am also working on WSDL support (e.g. being able, in the tools, to have Java implementations reference any web service, simply starting from its WSDL).

>> - build the composite - fix up the internal model to ensure that it is valid and ready to be activated as an SCA composite application.

Has everything been loaded? Are all pieces of the composite available?
Once the SCA model is loaded, it is very easy to validate it using EMF mechanisms.
It takes care of structural constraints (e.g. as described in XML schemas) but also of extra-constraints defined on the meta-model (typically, things you can't describe in XSDs, unless using annotations and programs like Schematron).
From the description and validation points of view, EMF is very powerful.
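To illustrate the difference between structural constraints and extra-constraints, here is a minimal plain-Java sketch of a rule like the self-wire one, which an XSD alone cannot express. The model classes are hypothetical stand-ins, not the EMF-generated ones:

```java
import java.util.*;

public class ExtraConstraints {
    // Hypothetical, simplified model class; the real meta-model is EMF-generated.
    static class Wire {
        final String source, target;
        Wire(String source, String target) { this.source = source; this.target = target; }
    }

    /** Extra-constraint: no wire may connect a component to itself. */
    static List<String> validate(List<Wire> wires) {
        List<String> errors = new ArrayList<>();
        for (Wire w : wires) {
            if (w.source.equals(w.target))
                errors.add("Self-wire on component " + w.source);
        }
        return errors;
    }

    public static void main(String[] args) {
        List<Wire> wires = Arrays.asList(new Wire("A", "B"), new Wire("C", "C"));
        System.out.println(validate(wires)); // [Self-wire on component C]
    }
}
```

With EMF, such rules are attached to the meta-model and reported through its diagnostic mechanism instead of a plain list of messages.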

In STP, we are very interested in improving our tools, and handling validation is important and expected by users.
We also try to make tools that work with most of the available platforms, by covering each platform's specifics and staying in touch with them.

Now, if you are interested in contributing to and improving the Tuscany support in the STP SCA Tools (e.g. on the validation part), we would be glad to work with you.
Contributions and contributors are welcome.

I would also understand if you decided to rely on Tuscany platform elements instead, since you are probably at a more advanced stage of development.


Best regards,

                     Vincent Zurczak.


[0]: http://wiki.eclipse.org/STP/SCA_Component/SCA_Composite_Meta_Model#Additional_validation_rules

-- 
Vincent Zurczak
EBM WebSourcing
+33 (0) 4 38 12 16 77

