See <https://builds.apache.org/job/beam_PostRelease_NightlySnapshot/163/display/redirect?page=changes>
Changes:

[tgroh] Cleanups in GroupAlsoByWindowEvaluatorFactory
[dawid] [BEAM-2831] Do not wrap IOException in SerializableCoder
[tgroh] Allow Fusion to Continue with unknown PTransforms
[tgroh] fixup! Allow Fusion to Continue with unknown PTransforms
[tgroh] fixup! fixup! Allow Fusion to Continue with unknown PTransforms
[wcn] Fix documentation around pipeline creation.
[herohde] Migrate container instructions and builds to Gradle
[rober] Go SDK usercounters
[chamikara] [BEAM-3744] Expand Pubsub read API for Python. (#4901)
[herohde] CR: added comments to container helper function
[herohde] CR: make containerImageName take named parameters
[ccy] Revert #4781 which broke Python postsubmits

------------------------------------------
[...truncated 3.02 MB...]
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 2.0 (TID 8). 13240 bytes result sent to driver
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 7.0 in stage 2.0 (TID 15, localhost, executor driver, partition 7, PROCESS_LOCAL, 4730 bytes)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 7.0 in stage 2.0 (TID 15)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 2.0 (TID 8) in 96 ms on localhost (executor driver) (4/8)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 0 non-empty blocks out of 4 blocks
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_4 stored as values in memory (estimated size 16.0 B, free 1825.1 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_5 stored as values in memory (estimated size 16.0 B, free 1825.1 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_7 stored as values in memory (estimated size 16.0 B, free 1825.1 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block rdd_41_6 stored as values in memory (estimated size 16.0 B, free 1825.1 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_4 in memory on 127.0.0.1:43400 (size: 16.0 B, free: 1825.4 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_7 in memory on 127.0.0.1:43400 (size: 16.0 B, free: 1825.4 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_6 in memory on 127.0.0.1:43400 (size: 16.0 B, free: 1825.4 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added rdd_41_5 in memory on 127.0.0.1:43400 (size: 16.0 B, free: 1825.4 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 7.0 in stage 2.0 (TID 15). 11942 bytes result sent to driver
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 5.0 in stage 2.0 (TID 13). 11942 bytes result sent to driver
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 4.0 in stage 2.0 (TID 12). 11942 bytes result sent to driver
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 7.0 in stage 2.0 (TID 15) in 69 ms on localhost (executor driver) (5/8)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 5.0 in stage 2.0 (TID 13) in 89 ms on localhost (executor driver) (6/8)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 4.0 in stage 2.0 (TID 12) in 92 ms on localhost (executor driver) (7/8)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 6.0 in stage 2.0 (TID 14). 11942 bytes result sent to driver
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 6.0 in stage 2.0 (TID 14) in 86 ms on localhost (executor driver) (8/8)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 2.0, whose tasks have all completed, from pool
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 2 (collect at BoundedDataset.java:87) finished in 0.159 s
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 0 finished: collect at BoundedDataset.java:87, took 4.417106 s
Mar 28, 2018 11:04:45 AM org.apache.beam.runners.spark.SparkRunner$Evaluator enterCompositeTransform
INFO: Entering directly-translatable composite transform: 'WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values'
Mar 28, 2018 11:04:45 AM org.apache.beam.runners.spark.SparkRunner$Evaluator doVisitTransform
INFO: Evaluating Create.Values
Mar 28, 2018 11:04:45 AM org.apache.beam.runners.spark.SparkRunner$Evaluator doVisitTransform
INFO: Evaluating org.apache.beam.sdk.transforms.Reify$ReifyView$1@51de1b24
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_3 stored as values in memory (estimated size 672.0 B, free 1825.1 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_3_piece0 stored as bytes in memory (estimated size 361.0 B, free 1825.1 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_3_piece0 in memory on 127.0.0.1:43400 (size: 361.0 B, free: 1825.4 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 3 from broadcast at SideInputBroadcast.java:59
Mar 28, 2018 11:04:45 AM org.apache.beam.runners.spark.SparkRunner$Evaluator doVisitTransform
INFO: Evaluating org.apache.beam.sdk.transforms.MapElements$1@4a707370
Mar 28, 2018 11:04:45 AM org.apache.beam.runners.spark.SparkRunner$Evaluator doVisitTransform
INFO: Evaluating org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn@43d8899
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting job: foreach at BoundedDataset.java:117
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Got job 1 (foreach at BoundedDataset.java:117) with 4 output partitions
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Final stage: ResultStage 3 (foreach at BoundedDataset.java:117)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Parents of final stage: List()
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Missing parents: List()
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 3 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize).output MapPartitionsRDD[53] at values at TransformTranslator.java:400), which has no missing parents
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_4 stored as values in memory (estimated size 84.1 KB, free 1825.0 MB)
Mar 28, 2018 11:04:45 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_4_piece0 stored as bytes in memory (estimated size 22.4 KB, free 1825.0 MB)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_4_piece0 in memory on 127.0.0.1:43400 (size: 22.4 KB, free: 1825.3 MB)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 4 from broadcast at DAGScheduler.scala:1006
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 3 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize/ParMultiDo(Finalize).output MapPartitionsRDD[53] at values at TransformTranslator.java:400) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 3.0 with 4 tasks
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 3.0 (TID 16, localhost, executor driver, partition 0, PROCESS_LOCAL, 4827 bytes)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 3.0 (TID 17, localhost, executor driver, partition 1, PROCESS_LOCAL, 4827 bytes)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 3.0 (TID 18, localhost, executor driver, partition 2, PROCESS_LOCAL, 4827 bytes)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 3.0 (TID 19, localhost, executor driver, partition 3, PROCESS_LOCAL, 4837 bytes)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 3.0 (TID 18)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 3.0 (TID 16)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 3.0 (TID 17)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 3.0 (TID 19)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 3.0 (TID 16). 12184 bytes result sent to driver
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 3.0 (TID 18). 12141 bytes result sent to driver
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 3.0 (TID 17). 12141 bytes result sent to driver
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 3.0 (TID 16) in 70 ms on localhost (executor driver) (1/4)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 3.0 (TID 17) in 69 ms on localhost (executor driver) (2/4)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 3.0 (TID 18) in 72 ms on localhost (executor driver) (3/4)
Mar 28, 2018 11:04:46 AM org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Mar 28, 2018 11:04:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Mar 28, 2018 11:04:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/.temp-beam-2018-03-28_11-04-37-0/0f8fec26-178e-426b-83e2-51ee452f6bb8, shard=0, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@1374bd08, paneInfo=PaneInfo.NO_FIRING} to final location /tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/counts-00000-of-00004
Mar 28, 2018 11:04:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/.temp-beam-2018-03-28_11-04-37-0/029c5b0c-b6ef-44a3-9422-3e9965099f12, shard=1, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@1374bd08, paneInfo=PaneInfo.NO_FIRING} to final location /tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/counts-00001-of-00004
Mar 28, 2018 11:04:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/.temp-beam-2018-03-28_11-04-37-0/472f680c-5fca-4d42-be96-46fa0c582c72, shard=2, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@1374bd08, paneInfo=PaneInfo.NO_FIRING} to final location /tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/counts-00002-of-00004
Mar 28, 2018 11:04:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation moveToOutputFiles
INFO: Will copy temporary file FileResult{tempFilename=/tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/.temp-beam-2018-03-28_11-04-37-0/fc1ab1ee-5d06-40df-a989-c0230a05c91b, shard=3, window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@1374bd08, paneInfo=PaneInfo.NO_FIRING} to final location /tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/counts-00003-of-00004
Mar 28, 2018 11:04:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/.temp-beam-2018-03-28_11-04-37-0/0f8fec26-178e-426b-83e2-51ee452f6bb8
Mar 28, 2018 11:04:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/.temp-beam-2018-03-28_11-04-37-0/029c5b0c-b6ef-44a3-9422-3e9965099f12
Mar 28, 2018 11:04:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/.temp-beam-2018-03-28_11-04-37-0/fc1ab1ee-5d06-40df-a989-c0230a05c91b
Mar 28, 2018 11:04:46 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
INFO: Will remove known temporary file /tmp/groovy-generated-2204937763380428322-tmpdir/word-count-beam/.temp-beam-2018-03-28_11-04-37-0/472f680c-5fca-4d42-be96-46fa0c582c72
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 3.0 (TID 19). 15889 bytes result sent to driver
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 3.0 (TID 19) in 124 ms on localhost (executor driver) (4/4)
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 3 (foreach at BoundedDataset.java:117) finished in 0.127 s
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:117, took 0.157978 s
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 3.0, whose tasks have all completed, from pool
Mar 28, 2018 11:04:46 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Mar 28, 2018 11:04:46 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@7aef989c{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Mar 28, 2018 11:04:46 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:52 min
[INFO] Finished at: 2018-03-28T11:04:46Z
[INFO] Final Memory: 92M/892M
[INFO] ------------------------------------------------------------------------
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]
Mar 28, 2018 12:07:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2018-03-28T12:07:22.137Z: Workflow failed. Causes: The Dataflow appears to be stuck. You can get help with Cloud Dataflow at https://cloud.google.com/dataflow/support.
Mar 28, 2018 12:07:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-28T12:07:22.262Z: Cancel request is committed for workflow job: 2018-03-28_04_02_39-9444388526551827135.
Mar 28, 2018 12:07:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-28T12:07:22.356Z: Cleaning up.
Mar 28, 2018 12:07:22 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-28T12:07:22.466Z: Stopping worker pool...
Mar 28, 2018 12:08:40 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-28T12:08:37.831Z: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
Mar 28, 2018 12:08:46 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-03-28_04_02_39-9444388526551827135 failed with status FAILED.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:07 h
[INFO] Finished at: 2018-03-28T12:08:46Z
[INFO] Final Memory: 35M/273M
[INFO] ------------------------------------------------------------------------
gsutil cat gs://temp-storage-for-release-validation-tests/quickstart/count* | grep Montague:
CommandException: No URLs matched: gs://temp-storage-for-release-validation-tests/quickstart/count*
CommandException: No URLs matched: gs://temp-storage-for-release-validation-tests/quickstart/count*
[ERROR] Failed command :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow
FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:runQuickstartJavaFlinkLocal'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 1h 8m 31s
6 actionable tasks: 6 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user c...@google.com
Not sending mail to unregistered user da...@getindata.com
Not sending mail to unregistered user w...@google.com
Not sending mail to unregistered user hero...@google.com
Not sending mail to unregistered user ro...@frantil.com