On Sat, Jan 24, 2009 at 9:35 AM, ant elder <[email protected]> wrote:
> On Fri, Jan 23, 2009 at 12:03 PM, Simon Laws <[email protected]> wrote:
>
>> snip...
>>
>>> > In the case of binding-sca-calculator, for example, the ant build.xml
>>> > runs the sample using either the JSE or OSGi launcher. I've made no
>>> > changes to the launchers yet, subject to discussion on the other thread.
>>> > So in this case the sample is just a contribution. There is a JUnit test
>>> > but again this just calls the launcher and treats the sample as a
>>> > contribution. There is a client component in the sample that drives it.
>>>
>>> Ok
>>>
>>> > I'm looking at extending the distro module to run the ant script to
>>> > automate the process we have struggled with in the past. In a way I'd
>>> > rather have this happen at JUnit time but maybe that won't work out. I
>>> > certainly need some help with the webapp version as we need to fire up
>>> > cargo or something to ensure that we can test the webapp deploy stage.
>>>
>>> If we are trying to solve the issue where we never run our samples
>>> using ant/distribution, this is a good idea. How about something like
>>> a "build smoke test" profile that, after building a distribution, would
>>> try to unpack and exercise it using ant?
>>
>> Sounds like a good idea to me.
>>
>> There are a few things I think we can do to reduce the amount of time we
>> spend testing releases, for example,
>>
>> - remove the need to generate ant scripts
>
> A huge +1 from me on doing that
>
>> - allow the ant scripts to be run from mvn/eclipse at development time so
>> people can try/develop them without explicitly deploying a distribution
>> - have mvn automatically test the samples that require a webapp to be
>> deployed to a webapp container (cargo?)
>
> Big +1 on that one too. It will add some significant complexity though, so
> to keep the sample simple it might be worth doing this somewhere outside of
> the sample, like in an itest or similar.
>
>> - consider having the sample junit tests (or equivalent) run through the
>> ant scripts. how to capture failures?
>> - final smoke test (from Luciano's comment above). Whatever we do earlier
>> in the stream, we need to test the samples as they appear in the
>> distribution
>>
>> what else?
>>
>> Simon
>
> I'm not so sure on the other parts mentioned about the Ant scripts etc yet.
> Most of the problems we have with them breaking all the time in old samples
> is with the way they use Tuscany and the dependencies. If we go with
> something like a launcher approach to run the samples, the Ant scripts
> should become real simple and the samples would be more robust and
> reliable, so I'm not sure we'll need to do so much.
>
>    ...ant

I'm going to close this particular thread off now. I checked in three samples:

implementation-java-calculator
binding-ws-calculator
host-webapp-calculator

Hopefully I got everything checked in right. I've extended the distribution to
allow them all to be included, but of course binding-ws-calculator and
host-webapp-calculator don't actually work yet as the extensions they depend
on haven't been brought up properly yet.

Nothing is set in stone here, but I'd personally like to use these to
investigate how we make samples as easy to use and as educational as possible
for the users of 2.x, and what the distribution has to look like to support
this.

All help is very much appreciated.

Regards

Simon
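As a rough illustration of the "build smoke test" profile Luciano suggests
above, a profile in the distribution pom could use the maven-antrun-plugin to
unpack the freshly built archive and drive a sample's build.xml exactly as a
user would. This is only a minimal sketch; the profile id, the zip name, the
unpack directory and the ant target are assumptions, not settings that exist
in the Tuscany build today:

  <profile>
    <id>smoketest</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-antrun-plugin</artifactId>
          <executions>
            <execution>
              <id>exercise-distribution</id>
              <phase>integration-test</phase>
              <goals>
                <goal>run</goal>
              </goals>
              <configuration>
                <tasks>
                  <!-- unpack the distribution that was just assembled;
                       the zip name here is illustrative only -->
                  <unzip src="${project.build.directory}/tuscany-sca-2.0-SNAPSHOT.zip"
                         dest="${project.build.directory}/smoketest"/>
                  <!-- run a sample's build.xml the same way a user would -->
                  <ant antfile="${project.build.directory}/smoketest/samples/implementation-java-calculator/build.xml"
                       target="run"/>
                </tasks>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>

Running the build with that profile enabled would then fail if the unpacked
sample no longer runs, which is exactly the "we never run our samples using
ant/distribution" gap mentioned above.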
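For the webapp samples, the cargo idea could look something like the
cargo-maven2-plugin configuration below, bound to the integration-test phases
so the war is deployed to a real container before any test runs against it.
Again this is just a sketch under assumptions: the container id, the
CATALINA_HOME reference and the host-webapp-calculator coordinates are
placeholders rather than anything that is wired up yet:

  <plugin>
    <groupId>org.codehaus.cargo</groupId>
    <artifactId>cargo-maven2-plugin</artifactId>
    <configuration>
      <container>
        <containerId>tomcat6x</containerId>
        <!-- points at an existing Tomcat install; a zipUrlInstaller would also work -->
        <home>${env.CATALINA_HOME}</home>
      </container>
      <deployables>
        <!-- the war has to be a dependency of the module running cargo -->
        <deployable>
          <groupId>${project.groupId}</groupId>
          <artifactId>host-webapp-calculator</artifactId>
          <type>war</type>
        </deployable>
      </deployables>
    </configuration>
    <executions>
      <execution>
        <id>start-container</id>
        <phase>pre-integration-test</phase>
        <goals>
          <goal>start</goal>
        </goals>
      </execution>
      <execution>
        <id>stop-container</id>
        <phase>post-integration-test</phase>
        <goals>
          <goal>stop</goal>
        </goals>
      </execution>
    </executions>
  </plugin>

Keeping something like this in a separate itest module, as Ant suggests above,
would keep the sample's own pom as small and readable as possible.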
