Thanks Jeremy.

That was indeed a long email and also very informative. I take your point that components should be testable in isolation without needing an SCA runtime to be available. The objective of unit testing a component is to exercise all the code paths within the component without worrying about its external dependencies. However, the dependencies need to be adequately mocked so as to realise all the permutations of behaviour, as defined by the contracts for those dependencies, that would influence the internal behaviour of the component under test. From that perspective, using the SCATestCase class is an anti-pattern in the true spirit of testing components in isolation.

However, the SCATestCase class is relevant from an integration testing perspective, where the component is tested for its behaviour in the context of a runtime. This is more applicable to testing things that are closely tied to Tuscany itself, such as binding and container extensions.

However, the motive behind my original email was the ability to run the Tuscany runtime within an IDE, similar to how you would run Tomcat within Eclipse, for example. Debugging is only one use case for this. I know there are implications around classloader hierarchies and the like, but some of those have been covered quite neatly in your post on module restructuring for classloader changes. In the longer term, I thought we could also extend this to provide richer functionality within the IDE that a Tuscany user could use to compose their services and pull in bindings, containers etc. from a Tuscany installation.

Another thing I have been wondering about: rather than assuming the extension, boot and other libraries are relative to the location from which the launcher was loaded, would it be cleaner to be able to override those locations using system properties? (I think this is already implemented to a certain extent with installDir, bootDir etc.) The runtime should be bootable from a minimal set of JARs on the system classloader, and from there a child hierarchy of classloaders would be built to ensure isolation between the various deployed artefacts. In a managed environment like a JEE container, this would start from the EAR or WAR classloader.
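As a rough sketch of the kind of resolution I mean (the class, helper names and the "boot" subdirectory are made up for illustration; only the tuscany.installDir and tuscany.bootDir property names come from the existing discussion, and this is not the actual launcher code):

import java.io.File;
import java.net.URISyntaxException;

// Illustrative sketch only: resolve the boot directory from a system property
// if present, otherwise fall back to a location relative to wherever this
// class was loaded from (jar or exploded directory).
public final class BootDirResolver {

    public static File resolveBootDir() {
        String bootDir = System.getProperty("tuscany.bootDir");
        if (bootDir != null) {
            return new File(bootDir);
        }
        String installDir = System.getProperty("tuscany.installDir");
        if (installDir != null) {
            return new File(installDir, "boot");
        }
        // Fall back to a directory relative to the launcher's code source.
        return new File(codeSourceLocation().getParentFile(), "boot");
    }

    private static File codeSourceLocation() {
        try {
            return new File(BootDirResolver.class.getProtectionDomain()
                    .getCodeSource().getLocation().toURI());
        } catch (URISyntaxException e) {
            throw new IllegalStateException("Cannot determine launcher location", e);
        }
    }
}

The same fallback order would work whether the launcher is packaged as a jar or run from an exploded directory.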

Many thanks
Meeraj

From: Jeremy Boynes <[EMAIL PROTECTED]>
Reply-To: [email protected]
To: [email protected]
Subject: Debugging the runtime, was: IDE Plugins
Date: Tue, 15 Aug 2006 08:28:22 -0700

On Aug 15, 2006, at 6:39 AM, Meeraj Kunnumpurath wrote:

Rather than hack the code you can just set the "tuscany.installDir" system property.

Thanks Jeremy, I did see the usage of tuscany.installDir. My question was, in the absence of the system property, does the runtime always need to resolve the extensions directory relative to the directory from which the launcher jar was loaded? Can it do the same if the MainLauncherBooter was loaded from an exploded directory rather than a jar?

The "tuscany.installDir" property was designed to support debug of a user's application code. A user typically does not have the source for Tuscany in their IDE or have all the individual jars on their classpath - they have a project with their code and have Tuscany installed somewhere. It's like if I have a web application - I don't have the source for Tomcat or WebSphere available and I don't import individual jars into a project.

What we're trying to do is extend that environment so that the "user" (us) has all of the guts of the runtime exposed so that they can debug part of it, e.g. step through the core or debug an extension. If we just add them to the classpath (e.g. as dependencies in Maven or as libraries in an IDE) we distort things even further:

1) There's no installation to speak of - just a bunch of jars on the classpath. Components that rely on having an installation directory structure (such as the DSE) won't work. The property is a way around that but does not really solve the problem, because ...

2) The launcher isolates the application from the runtime by loading the runtime in a separate classloader. The jars for that classloader are found by scanning a directory in the installation directory (or one specified by the property "tuscany.bootDir"). By placing these jars in the system classloader the isolation is broken. This is great for us debugging the runtime except for the subtle (or not so subtle) classloader problems it may cause - it's different, which means that code will not debug the same.
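As a rough sketch of what that isolation amounts to (the class and method names here are illustrative, not the actual MainLauncherBooter code): scan the boot directory for jars and load them in a child classloader rather than onto the system classpath.

import java.io.File;
import java.io.FilenameFilter;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;

// Illustrative only: build an isolated classloader from the jars in a boot
// directory. Because the runtime classes live in this child loader rather
// than on the system classpath, application code on the system classloader
// cannot see them directly.
public final class IsolatedRuntimeLoader {

    public static ClassLoader createBootClassLoader(File bootDir, ClassLoader parent)
            throws MalformedURLException {
        File[] jars = bootDir.listFiles(new FilenameFilter() {
            public boolean accept(File dir, String name) {
                return name.endsWith(".jar");
            }
        });
        if (jars == null) {
            throw new IllegalArgumentException("Not a directory: " + bootDir);
        }
        URL[] urls = new URL[jars.length];
        for (int i = 0; i < jars.length; i++) {
            urls[i] = jars[i].toURI().toURL();
        }
        return new URLClassLoader(urls, parent);
    }
}

Putting the same jars on the system classpath collapses this hierarchy, which is why debugging that way behaves differently from a real installation.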

This is not a problem unique to Tuscany and is one that has been solved before. The solution we have chosen leverages the capabilities of the IoC architecture we use for the runtime and for the SCA programming model in general. That solution is to have components clearly define the things that they depend on (the IoC contract) and then have a test framework set up those dependencies in order to exercise the component. Those dependencies need to be fairly granular - e.g. at the level of a simple interface, not "the entire runtime".

If you do that you can partition your testing into two phases:
1) component testing, where some test harness sets up the dependencies for a component and then exercises the component in those contexts.
2) integration testing, where you already know from 1) how the component will behave, so you focus on making sure that the things that use your component set up the contexts it expects.

The SCA programming model expects and supports users who write and test applications in this way. The spec has gone to a lot of effort to allow users to test their components without needing a running SCA environment. The use of an IoC architecture in the Java C&I model is specifically designed to enable that.

If I am implementing a component, the C&I model explicitly calls out the IoC contract - it clearly defines the Services, References and Properties that a component has. That is the context for component testing that can be set by a test harness. If my component is implemented in Java, that test harness can be something as simple as JUnit with EasyMock for the references.
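As a rough sketch (the component, interfaces and values below are made up for illustration, not taken from Tuscany), a component's IoC contract and a test for it might look like this, using the @Reference and @Property annotations from the Java C&I model and EasyMock for the reference:

import junit.framework.TestCase;
import org.osoa.sca.annotations.Property;
import org.osoa.sca.annotations.Reference;
import static org.easymock.EasyMock.*;

// Illustrative service contract that the component depends on.
interface RateService {
    double lookup(String symbol, String currency);
}

// Illustrative component: its IoC contract is exactly the reference and
// property it declares, so a test harness can supply both without a runtime.
class QuoteComponent {
    private RateService rates;
    private String currency;

    @Reference
    public void setRates(RateService rates) { this.rates = rates; }

    @Property
    public void setCurrency(String currency) { this.currency = currency; }

    public double quote(String symbol) {
        return rates.lookup(symbol, currency);
    }
}

public class QuoteComponentTestCase extends TestCase {
    public void testQuoteDelegatesToRateService() {
        // The test harness plays the role of the runtime: it satisfies the
        // component's one reference with a mock and sets its property directly.
        RateService rates = createMock(RateService.class);
        expect(rates.lookup("IBM", "USD")).andReturn(81.25);
        replay(rates);

        QuoteComponent component = new QuoteComponent();
        component.setRates(rates);
        component.setCurrency("USD");

        assertEquals(81.25, component.quote("IBM"), 0.0);
        verify(rates);
    }
}

The same pattern applies to extension components: mock the interfaces the extension depends on rather than booting a runtime.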

Once you've tested your components, then they can be assembled into composites for integration testing. That is where you would ensure the bindings and policies set up are appropriate for the way in which you want to use the component.

We have chosen to use SCA to assemble the runtime and this means these same techniques can be used to debug runtime components as well. Extension components can be tested on their own with dependencies resolved by a test harness such as JUnit with EasyMock. You should be able to test all the codepaths in an extension this way - all it takes is writing some test cases. This is easily debuggable in an IDE, just like a user's application code would be.

Once you know the component works as expected, your extension can then be integration tested with a real live runtime. This will involve deploying application components that use it, either as implementations (for a container extension), or to talk to other applications (for a binding). This may involve deploying to another runtime e.g. to a web container so that inbound HTTP requests can be tested. There will be a lot of moving parts, but that is in the nature of integration tests.

Putting it simply, the more testing you do at the component level, the easier integration testing will be. We have extensions out there with few if any component-level tests. This means all testing (if any) and debugging is done at the integration level, which means there are a lot of moving parts to set up and get right even before you start to test your component.

Putting it another way, people writing extensions should write unit tests for their components - it's easier for them and easier for others.

--
Jeremy

