Yes, this is one of the flaky tests that fails occasionally - unfortunately, even with the exact seeds of a failed run, the behavior is not reproducible locally.
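For what it's worth, the usual pattern for making such failures replayable is to log the generated seed so a failed run can at least be re-created at the data level. A minimal sketch - the helper name and signature below are hypothetical, not SystemML's actual test harness:

```java
import java.util.Arrays;
import java.util.Random;

public class SeedLogging {

    // Hypothetical helper (not SystemML's actual code): generate one row of
    // a random sparse matrix from an explicit seed.
    static double[] generateSparseRow(long seed, int cols, double sparsity) {
        Random rand = new Random(seed);
        double[] row = new double[cols];
        for (int j = 0; j < cols; j++)
            if (rand.nextDouble() < sparsity)
                row[j] = rand.nextDouble();
        return row;
    }

    public static void main(String[] args) {
        // Pick a fresh seed per run, but print it so a failure report
        // contains everything needed to replay the data.
        long seed = System.nanoTime();
        System.out.println("test seed: " + seed);

        // java.util.Random is deterministic for a given seed, so replaying
        // the seed reproduces the input data exactly.
        double[] run1 = generateSparseRow(seed, 10, 0.3);
        double[] run2 = generateSparseRow(seed, 10, 0.3);
        if (!Arrays.equals(run1, run2))
            throw new AssertionError("same seed must yield same data");
        System.out.println("data reproducible from seed");
    }
}
```

Since the failure here does not reproduce even with the exact seeds, the nondeterminism presumably lives outside data generation - e.g. in parallel task scheduling or partitioning - rather than in the generated input itself.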
Regards,
Matthias

On Fri, Feb 17, 2017 at 10:16 AM, <dusenberr...@gmail.com> wrote:
> > Failed tests:
> >
> >   FrameMatrixReblockTest.testFrameWriteMultipleSparseBinarySpark:170->runFrameReblockTest:230 31 values are not in equal
>
> What's going on with this test and the associated logic that it is testing?
> Isn't this the same test that has been intermittently failing for a while now?
>
> -Mike
>
> --
> Mike Dusenberry
> GitHub: github.com/dusenberrymw
> LinkedIn: linkedin.com/in/mikedusenberry
>
> Sent from my iPhone.
>
> > On Feb 17, 2017, at 4:07 AM, jenk...@spark.tc wrote:
> >
> > See <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/816/changes>
> >
> > Changes:
> >
> > [npansar] [MINOR] Code refactoring MatrixIndexingSPInstruction to enable parallel
> > [Deron Eriksson] [SYSTEMML-1280] Restore and deprecate SQLContext methods
> > [Deron Eriksson] [SYSTEMML-1279] Decrease numCols to prevent spark codegen issue
> > [Deron Eriksson] [SYSTEMML-1211] Update dependencies for Spark 2.1.0
> > [Deron Eriksson] [SYSTEMML-1271] Increment minimum Spark version in MLContext
> > [Deron Eriksson] [SYSTEMML-1277] MLContext implicitly convert mllib Vector to ml Vector
> > [npansar] [MINOR] Addressed corner cases (i.e. empty blocks and extremely large
> >
> > ------------------------------------------
> > [...truncated 12735 lines...]
> > [...repetitive Spark INFO log lines (stage scheduling, broadcast/cleanup, web UI shutdown) omitted...]
> >
> > Tests run: 169, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.734 sec - in org.apache.sysml.test.integration.mlcontext.MLContextTest
> > Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 678.607 sec - in org.apache.sysml.test.integration.functions.indexing.LeftIndexingTest
> > Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 432.935 sec - in org.apache.sysml.test.integration.functions.append.AppendVectorTest
> > Tests run: 56, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,004.868 sec - in org.apache.sysml.test.integration.functions.quaternary.WeightedSigmoidTest
> > Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 285.721 sec - in org.apache.sysml.test.integration.applications.parfor.ParForCorrelationTest
> > Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 348.853 sec - in org.apache.sysml.test.integration.applications.descriptivestats.UnivariateUnweightedScaleDenseTest
> > Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.071 sec - in org.apache.sysml.test.integration.applications.parfor.ParForCorrelationTestLarge
> > Tests run: 132, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,368.647 sec - in org.apache.sysml.test.integration.functions.reorg.FullOrderTest
> > Tests run: 90, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,489.678 sec - in org.apache.sysml.test.integration.functions.quaternary.WeightedDivMatrixMultTest
> > Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 345.726 sec - in org.apache.sysml.test.integration.applications.descriptivestats.UnivariateWeightedScaleDenseTest
> > Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 531.889 sec - in org.apache.sysml.test.integration.applications.parfor.ParForNaiveBayesTest
> > Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,062.068 sec - in org.apache.sysml.test.integration.functions.append.AppendChainTest
> > Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 277.404 sec - in org.apache.sysml.test.integration.applications.descriptivestats.UnivariateUnweightedScaleSparseTest
> > Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 92.16 sec - in org.apache.sysml.test.integration.applications.dml.HITSDMLTest
> > Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,430.611 sec - in org.apache.sysml.test.integration.functions.append.AppendMatrixTest
> > Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 329.608 sec - in org.apache.sysml.test.integration.applications.descriptivestats.UnivariateWeightedScaleSparseTest
> > Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.175 sec - in org.apache.sysml.test.integration.applications.pydml.ArimaPyDMLTest
> > Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 76.627 sec - in org.apache.sysml.test.integration.applications.pydml.HITSPyDMLTest
> > Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.395 sec - in org.apache.sysml.test.integration.applications.dml.LinearRegressionDMLTest
> > Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.831 sec - in org.apache.sysml.test.integration.applications.pydml.CsplineDSPyDMLTest
> > Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 82.606 sec - in org.apache.sysml.test.integration.applications.pydml.LinearLogRegPyDMLTest
> > Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.42 sec - in org.apache.sysml.test.integration.applications.pydml.NaiveBayesPyDMLTest
> > Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.806 sec - in org.apache.sysml.test.integration.applications.dml.CsplineCGDMLTest
> > Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.378 sec - in org.apache.sysml.test.integration.applications.dml.L2SVMDMLTest
> > Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 294.972 sec - in org.apache.sysml.test.integration.applications.pydml.MDABivariateStatsPyDMLTest
> > Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.3 sec - in org.apache.sysml.test.integration.applications.pydml.PageRankPyDMLTest
> > Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.195 sec - in org.apache.sysml.test.integration.applications.pydml.NaiveBayesParforPyDMLTest
> > Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.784 sec - in org.apache.sysml.test.integration.applications.dml.NaiveBayesDMLTest
> > Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 293.84 sec - in org.apache.sysml.test.integration.applications.dml.MDABivariateStatsDMLTest
> > Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.301 sec - in org.apache.sysml.test.integration.applications.dml.PageRankDMLTest
> > Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 586.548 sec - in org.apache.sysml.test.integration.applications.pydml.GLMPyDMLTest
> > Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 89.255 sec - in org.apache.sysml.test.integration.applications.pydml.MultiClassSVMPyDMLTest
> > Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 390.245 sec - in org.apache.sysml.test.integration.applications.dml.ID3DMLTest
> > Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 889.89 sec - in org.apache.sysml.test.integration.applications.dml.GLMDMLTest
> > Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.232 sec - in org.apache.sysml.test.integration.applications.pydml.CsplineCGPyDMLTest
> > Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.629 sec - in org.apache.sysml.test.integration.applications.pydml.WelchTPyDMLTest
> > Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 87.486 sec - in org.apache.sysml.test.integration.applications.dml.MultiClassSVMDMLTest
> > Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.518 sec - in org.apache.sysml.test.integration.applications.dml.CsplineDSDMLTest
> > Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.26 sec - in org.apache.sysml.test.integration.applications.pydml.ApplyTransformPyDMLTest
> > Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.105 sec - in org.apache.sysml.test.integration.applications.dml.NaiveBayesParforDMLTest
> > Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.522 sec - in org.apache.sysml.test.integration.applications.pydml.L2SVMPyDMLTest
> > Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.556 sec - in org.apache.sysml.test.integration.applications.pydml.LinearRegressionPyDMLTest
> > Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 61.237 sec - in org.apache.sysml.test.integration.applications.dml.LinearLogRegDMLTest
> > Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 56.504 sec - in org.apache.sysml.test.integration.applications.dml.GNMFDMLTest
> > Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.657 sec - in org.apache.sysml.test.integration.applications.dml.WelchTDMLTest
> > Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.661 sec - in org.apache.sysml.test.integration.applications.dml.ApplyTransformDMLTest
> > Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 379.455 sec - in org.apache.sysml.test.integration.applications.pydml.ID3PyDMLTest
> > Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.108 sec - in org.apache.sysml.test.integration.applications.pydml.GNMFPyDMLTest
> >
> > Results :
> >
> > Failed tests:
> >
> >   FrameMatrixReblockTest.testFrameWriteMultipleSparseBinarySpark:170->runFrameReblockTest:230 31 values are not in equal
> >
> > Tests run: 6608, Failures: 1, Errors: 0, Skipped: 0
> >
> > [INFO]
> > [INFO] --- maven-failsafe-plugin:2.17:verify (default) @ systemml ---
> > [INFO] Failsafe report directory: <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/failsafe-reports>
> > [INFO] ------------------------------------------------------------------------
> > [INFO] BUILD FAILURE
> > [INFO] ------------------------------------------------------------------------
> > [INFO] Total time: 02:25 h
> > [INFO] Finished at: 2017-02-17T06:07:07-06:00
> > [INFO] Final Memory: 67M/2167M
> > [INFO] ------------------------------------------------------------------------
> > [ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.17:verify (default) on project systemml: There are test failures.
> > [ERROR]
> > [ERROR] Please refer to <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/failsafe-reports> for the individual test results.
> > [ERROR] -> [Help 1]
> > [ERROR]
> > [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> > [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> > [ERROR]
> > [ERROR] For more information about the errors and possible solutions, please read the following articles:
> > [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> > Build step 'Execute shell' marked build as failure
> > Run condition [Always] enabling perform for step [[]]
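For readers wondering about the shape of the assertion message above ("31 values are not in equal"): comparisons of this kind typically count the entries of two matrices whose difference exceeds a tolerance. A minimal sketch of that idea - the class and method names are hypothetical, not SystemML's actual comparison utilities:

```java
public class MatrixCompare {

    // Hypothetical sketch (not SystemML's actual test code): count entries
    // of two equally-sized matrices whose absolute difference exceeds eps.
    static int countMismatches(double[][] expected, double[][] actual, double eps) {
        int mismatches = 0;
        for (int i = 0; i < expected.length; i++)
            for (int j = 0; j < expected[i].length; j++)
                if (Math.abs(expected[i][j] - actual[i][j]) > eps)
                    mismatches++;
        return mismatches;
    }

    public static void main(String[] args) {
        double[][] expected = { {1.0, 2.0}, {3.0, 4.0} };
        double[][] actual   = { {1.0, 2.0}, {3.5, 4.0} }; // one entry differs
        int n = countMismatches(expected, actual, 1e-9);
        if (n > 0)
            System.out.println(n + " values are not equal"); // prints "1 values are not equal"
    }
}
```

A nonzero count then fails the test with a message reporting how many values differed, which matches the style of the failure reported in this build.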