Thanks for adding the instructions for Bluemix in the PR today - I will
check them out and try to run everything on Bluemix.
Yes, we should print 'SUCCESS' or 'FAILURE' messages at the end of the test run.
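As a rough sketch of what I have in mind (class and method names here are hypothetical, not existing Pirk code), the distributed test driver could end with a single unambiguous summary line:

```java
// Hypothetical sketch: emit one final, unmissable line so anyone scanning
// half an hour of logs can tell at a glance whether the run passed.
public class TestSummary {

    // Build the summary line from the failure count collected by the driver.
    public static String summarize(int failures) {
        return failures == 0
                ? "SUCCESS"
                : "FAILURE (" + failures + " tests failed)";
    }

    public static void main(String[] args) {
        // Example: a clean run and a run with three failures.
        System.out.println(summarize(0));
        System.out.println(summarize(3));
    }
}
```

Printing it as the very last line (and exiting non-zero on failure) would also make the run easy to wire into any future automated testing.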
On Tue, Oct 11, 2016 at 8:06 AM, Tim Ellison <t.p.elli...@gmail.com> wrote:
> On 06/10/16 14:32, Walter Ray-Dulany wrote:
> > Tim Ellison noted in a recent email to the list (subject: [CANCEL]
> > Apache Pirk 0.2.0-incubating Release) the following:
> >> The distributed tests took a long time (~35mins IIRC), is that normal?
> > I feel like noting this in a separate thread, and discussing it here, is
> > worthwhile.
> > The distributed tests *do* take a long time. The reason isn't that any
> > one test is slow; indeed, most of the individual tests take only a matter
> > of seconds. The slowness comes from the very large number of tests. This
> > vast array of tests, upon inspection, can be seen to result from very
> > thorough testing of the large number of different actions that can be
> > performed by Pirk and the platforms it supports.
> There was certainly lots of logging going by - not that I was watching
> it constantly for half an hour - so I appreciate there is plenty of code
> being run and I trust that it is doing something useful ;-)
> I was also trusting that if there was a failure, it would be obvious,
> and there was nothing that said "FAILURE" at the end of the run, so I
> took that as a good sign.
> > I think this is ok. As an example of why this thorough testing is
> > necessary, just yesterday, while finishing up PIRK-45, I ran into an error
> > 15 minutes into the distributed tests. Without this thorough testing, the
> > code would have made it into a PR yesterday night, and (without the
> > distributed tests) it is likely no one would have caught it (it was a
> > subtle "Spark has an ancient package on its classpath pre-empting a newer
> > version I'd included, and the old package has a bug" error).
> Testing is good. Automatic background testing for this type of test
> suite may be better. I'm not likely to run the dist tests on every
> pause in development if they take a long time, and cost real money (a
> single run on AWS reported that it would cost me $2.50-ish*).
> It does run faster on the IBM JVM, and I can run it faster and for free
> on Bluemix, so I'll go that route for the moment -- but I don't want to
> lose sight of trying to make the testing readily available for anyone
> who drops by in the community. As mooted earlier, maybe the best way to
> do that is to investigate a shared resource test environment hosted at the
> * I know, I'm cheap :-)