See
<https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/63/display/redirect?page=changes>
Changes:
[rmannibucau] extracting the scheduled executor service in a factory variable in SDF
[sidhom] Run NeedsRunner tests from direct runner gradle build
[ccy] Fix issue from incomplete removal of has_cache
[sidhom] Address review comments
[sidhom] Remove old sourceSets.test.output references
[robertwb] Avoid warning in our default runner.
[github] [BEAM-3719] Adds support for reading side-inputs from SDFs
[github] print() is a function in Python 3
[robertwb] [maven-release-plugin] prepare branch release-2.4.0
[robertwb] [maven-release-plugin] prepare for next development iteration
[robertwb] Bump Python dev version.
------------------------------------------
[...truncated 3.63 MB...]
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logError
SEVERE: Task 3 in stage 0.0 failed 1 times; aborting job
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 0.0 in stage 0.0 (TID 0) on localhost, executor driver: java.lang.NoSuchMethodError (org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;) [duplicate 1]
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 1.0 in stage 0.0 (TID 1) on localhost, executor driver: java.lang.NoSuchMethodError (org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;) [duplicate 2]
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 2.0 in stage 0.0 (TID 2) on localhost, executor driver: java.lang.NoSuchMethodError (org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;) [duplicate 3]
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cancelling stage 0
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 0 (mapToPair at GroupCombineFunctions.java:184) failed in 2.081 s due to Job aborted due to stage failure: Task 3 in stage 0.0 failed 1 times, most recent failure: Lost task 3.0 in stage 0.0 (TID 3, localhost, executor driver): java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;
at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call(MultiDoFnFunction.java:137)
at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call(MultiDoFnFunction.java:58)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:186)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:186)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 0 failed: collect at BoundedDataset.java:87, took 2.409393 s
Mar 02, 2018 11:04:33 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@54524a6a{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Mar 02, 2018 11:04:33 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[WARNING]
org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;
at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom (SparkPipelineResult.java:68)
at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish (SparkPipelineResult.java:99)
at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish (SparkPipelineResult.java:87)
at org.apache.beam.examples.WordCount.main (WordCount.java:187)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
at java.lang.Thread.run (Thread.java:748)
Caused by: java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;
at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call (MultiDoFnFunction.java:137)
at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call (MultiDoFnFunction.java:58)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply (JavaRDDLike.scala:186)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply (JavaRDDLike.scala:186)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply (RDD.scala:797)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply (RDD.scala:797)
at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
at org.apache.spark.scheduler.ShuffleMapTask.runTask (ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask (ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run (Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run (Executor.scala:338)
at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
at java.lang.Thread.run (Thread.java:748)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:44 min
[INFO] Finished at: 2018-03-02T11:04:33Z
[INFO] Final Memory: 87M/836M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occurred while executing the Java class. java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner; -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command
:runners:spark:runQuickstartJavaSpark FAILED
Mar 02, 2018 11:04:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:04:35.954Z: (1329671cafed2f34): Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
Mar 02, 2018 11:04:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:04:36.011Z: (1329671cafed2acc): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
Mar 02, 2018 11:04:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:04:36.141Z: (1329671cafed2abd): Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
Mar 02, 2018 11:04:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:04:46.669Z: (1329671cafed21ed): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
Mar 02, 2018 11:04:46 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:04:46.733Z: (1329671cafed2c12): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
Mar 02, 2018 11:04:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:04:48.816Z: (556da1fb7f20c8bc): Executing operation s12-u31
Mar 02, 2018 11:04:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:04:49.029Z: (1329671cafed2ecb): Executing operation WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/CreateDataflowView
Mar 02, 2018 11:04:49 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:04:49.222Z: (556da1fb7f20c5d9): Executing operation WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map+WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize
Mar 02, 2018 11:04:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:04:50.985Z: (25f20689db9d9e37): Cleaning up.
Mar 02, 2018 11:04:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:04:51.067Z: (25f20689db9d9f01): Stopping worker pool...
Mar 02, 2018 11:07:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:07:04.059Z: (2991a79c2f0cd870): Autoscaling: Resized worker pool from 1 to 0.
Mar 02, 2018 11:07:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-02T11:07:04.085Z: (2991a79c2f0cdaba): Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
Mar 02, 2018 11:07:14 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-03-02_03_02_32-15408355914220323339 finished with status DONE.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:58 min
[INFO] Finished at: 2018-03-02T11:07:14Z
[INFO] Final Memory: 58M/694M
[INFO] ------------------------------------------------------------------------
gsutil cat gs://temp-storage-for-release-validation-tests/quickstart/count* | grep Montague:
Montague: 47
Verified Montague: 47
gsutil rm gs://temp-storage-for-release-validation-tests/quickstart/count*
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00000-of-00003...
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00001-of-00003...
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00002-of-00003...
[SUCCESS]
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:runQuickstartJavaFlinkLocal'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:apex:runQuickstartJavaApex'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:runQuickstartJavaSpark'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
BUILD FAILED in 6m 51s
6 actionable tasks: 6 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]