See 
<https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1079/display/redirect?page=changes>

Changes:

[Luke Cwik] [BEAM-10688] Euphoria assumes that all type descriptors are resolvable

[Luke Cwik] [BEAM-10670] Use fraction of remainder if consumed fraction is unknown

[Luke Cwik] [BEAM-10670] Improve splitting logic to prefer splits upto the the

[Luke Cwik] [BEAM-10670] Fix passing forward the self-checkpoint from the

[Boyuan Zhang] Enable dataflow streaming engine when running runner_v2 and streaming.

[Boyuan Zhang] Fix formatter.

[Boyuan Zhang] Use unbounded wrapper for Kafka Read.

[je.ik] [BEAM-10691] Use FlinkStateInternals#addWatermarkHoldUsage for timer

[noreply] [BEAM-9615] Map user types to Schema reps. (#12554)

[noreply] fixed a typo in S3TestUtils (#12582)

[neville.lyh] [BEAM-10612] Add flink 1.11 runner

[Boyuan Zhang] Scale progress with respect to windows observation.

[noreply] [BEAM-9547] Implement some methods for deferred Series. (#12534)

[Robert Burke] Fix broken build.

[Luke Cwik] [BEAM-8025] Update tests to use TemporaryFolder instead of rolling their


------------------------------------------
[...truncated 2.25 MB...]
INFO: Entering directly-translatable composite transform: 
'WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle'
Aug 15, 2020 11:36:49 AM org.apache.beam.runners.spark.SparkRunner$Evaluator 
doVisitTransform
INFO: Evaluating Reshuffle
Aug 15, 2020 11:36:49 AM org.apache.beam.runners.spark.SparkRunner$Evaluator 
doVisitTransform
INFO: Evaluating org.apache.beam.sdk.transforms.MapElements$1@1b6eafff
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting job: foreach at BoundedDataset.java:124
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Registering RDD 50 (repartition at GroupCombineFunctions.java:191) as 
input to shuffle 2
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Got job 1 (foreach at BoundedDataset.java:124) with 4 output partitions
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Final stage: ResultStage 4 (foreach at BoundedDataset.java:124)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Parents of final stage: List(ShuffleMapStage 3)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Missing parents: List(ShuffleMapStage 3)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ShuffleMapStage 3 (MapPartitionsRDD[50] at repartition at 
GroupCombineFunctions.java:191), which has no missing parents
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_4 stored as values in memory (estimated size 23.8 KB, 
free 13.5 GB)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_4_piece0 stored as bytes in memory (estimated size 10.7 
KB, free 13.5 GB)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_4_piece0 in memory on localhost:45555 (size: 10.7 KB, 
free: 13.5 GB)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 4 from broadcast at DAGScheduler.scala:1163
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 1 missing tasks from ShuffleMapStage 3 (MapPartitionsRDD[50] 
at repartition at GroupCombineFunctions.java:191) (first 15 tasks are for 
partitions Vector(0))
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 3.0 with 1 tasks
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 3.0 (TID 16, localhost, executor driver, 
partition 0, PROCESS_LOCAL, 8509 bytes)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 3.0 (TID 16)
Aug 15, 2020 11:36:49 AM 
org.apache.beam.sdk.io.WriteFiles$FinalizeTempFileBundles$FinalizeFn process
INFO: Finalizing 4 file results
Aug 15, 2020 11:36:49 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
createMissingEmptyShards
INFO: Finalizing for destination null num shards 4.
Aug 15, 2020 11:36:49 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
moveToOutputFiles
INFO: Will copy temporary file 
FileResult{tempFilename=/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/.temp-beam-23887337-82d4-49b6-86ac-fe2402644462/c12fe42e-0492-4db5-a507-8e05976b8cfc,
 shard=0, 
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7fcf4559, 
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, 
onTimeIndex=0}} to final location 
/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/counts-00000-of-00004
Aug 15, 2020 11:36:49 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
moveToOutputFiles
INFO: Will copy temporary file 
FileResult{tempFilename=/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/.temp-beam-23887337-82d4-49b6-86ac-fe2402644462/51c99d74-eeda-410a-b06a-732a64ecba1a,
 shard=1, 
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7fcf4559, 
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, 
onTimeIndex=0}} to final location 
/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/counts-00001-of-00004
Aug 15, 2020 11:36:49 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
moveToOutputFiles
INFO: Will copy temporary file 
FileResult{tempFilename=/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/.temp-beam-23887337-82d4-49b6-86ac-fe2402644462/a5ee9f26-95a9-48bc-90d4-759b925948e9,
 shard=2, 
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7fcf4559, 
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, 
onTimeIndex=0}} to final location 
/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/counts-00002-of-00004
Aug 15, 2020 11:36:49 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
moveToOutputFiles
INFO: Will copy temporary file 
FileResult{tempFilename=/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/.temp-beam-23887337-82d4-49b6-86ac-fe2402644462/a6888bbb-1bb5-4549-8ea7-869a194a50aa,
 shard=3, 
window=org.apache.beam.sdk.transforms.windowing.GlobalWindow@7fcf4559, 
paneInfo=PaneInfo{isFirst=true, isLast=true, timing=ON_TIME, index=0, 
onTimeIndex=0}} to final location 
/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/counts-00003-of-00004
Aug 15, 2020 11:36:49 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/.temp-beam-23887337-82d4-49b6-86ac-fe2402644462/c12fe42e-0492-4db5-a507-8e05976b8cfc
Aug 15, 2020 11:36:49 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/.temp-beam-23887337-82d4-49b6-86ac-fe2402644462/51c99d74-eeda-410a-b06a-732a64ecba1a
Aug 15, 2020 11:36:49 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/.temp-beam-23887337-82d4-49b6-86ac-fe2402644462/a5ee9f26-95a9-48bc-90d4-759b925948e9
Aug 15, 2020 11:36:49 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
INFO: Will remove known temporary file 
/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/.temp-beam-23887337-82d4-49b6-86ac-fe2402644462/a6888bbb-1bb5-4549-8ea7-869a194a50aa
Aug 15, 2020 11:36:49 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation 
removeTemporaryFiles
WARNING: Failed to match temporary files under: 
[/tmp/groovy-generated-5001336804573784876-tmpdir/word-count-beam/.temp-beam-23887337-82d4-49b6-86ac-fe2402644462/].
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 3.0 (TID 16). 11122 bytes result sent to driver
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 3.0 (TID 16) in 209 ms on localhost (executor 
driver) (1/1)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 3.0, whose tasks have all completed, from pool 
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 3 (repartition at GroupCombineFunctions.java:191) 
finished in 0.234 s
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 4)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 4 
(WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output
 MapPartitionsRDD[57] at values at TransformTranslator.java:434), which has no 
missing parents
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_5 stored as values in memory (estimated size 16.0 KB, 
free 13.5 GB)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_5_piece0 stored as bytes in memory (estimated size 7.3 
KB, free 13.5 GB)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_5_piece0 in memory on localhost:45555 (size: 7.3 KB, 
free: 13.5 GB)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 5 from broadcast at DAGScheduler.scala:1163
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 4 
(WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output
 MapPartitionsRDD[57] at values at TransformTranslator.java:434) (first 15 
tasks are for partitions Vector(0, 1, 2, 3))
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 4.0 with 4 tasks
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 4.0 (TID 17, localhost, executor driver, 
partition 0, NODE_LOCAL, 7938 bytes)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 4.0 (TID 18, localhost, executor driver, 
partition 1, NODE_LOCAL, 7938 bytes)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 4.0 (TID 19, localhost, executor driver, 
partition 2, NODE_LOCAL, 7938 bytes)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 4.0 (TID 20, localhost, executor driver, 
partition 3, NODE_LOCAL, 7938 bytes)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 4.0 (TID 18)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 4.0 (TID 17)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 4.0 (TID 19)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 4.0 (TID 20)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 4.0 (TID 19). 6453 bytes result sent to driver
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 4.0 (TID 20). 6453 bytes result sent to driver
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 4.0 (TID 18). 6453 bytes result sent to driver
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 4.0 (TID 17). 6453 bytes result sent to driver
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 4.0 (TID 18) in 34 ms on localhost (executor 
driver) (1/4)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 4.0 (TID 20) in 32 ms on localhost (executor 
driver) (2/4)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 4.0 (TID 19) in 34 ms on localhost (executor 
driver) (3/4)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 4.0 (TID 17) in 41 ms on localhost (executor 
driver) (4/4)
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 4.0, whose tasks have all completed, from pool 
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 4 (foreach at BoundedDataset.java:124) finished in 0.071 s
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:124, took 0.316620 s
Aug 15, 2020 11:36:49 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Aug 15, 2020 11:36:49 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@4a730a4d{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Aug 15, 2020 11:36:49 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

> Task :runners:google-cloud-dataflow-java:runQuickstartJavaDataflow
Aug 15, 2020 11:37:44 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:44.043Z: Finished operation 
ReadLines/Read+WordCount.CountWords/ParDo(ExtractWords)+WordCount.CountWords/Count.PerElement/Init/Map+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Partial+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Reify+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Write
Aug 15, 2020 11:37:44 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:44.117Z: Executing operation 
WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
Aug 15, 2020 11:37:44 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:44.171Z: Finished operation 
WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
Aug 15, 2020 11:37:44 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:44.244Z: Executing operation 
WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
Aug 15, 2020 11:37:51 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:51.141Z: Finished operation 
WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
Aug 15, 2020 11:37:51 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:51.243Z: Executing operation 
WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
Aug 15, 2020 11:37:51 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:51.294Z: Finished operation 
WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
Aug 15, 2020 11:37:51 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:51.384Z: Executing operation 
WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
Aug 15, 2020 11:37:53 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:53.218Z: Finished operation 
WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
Aug 15, 2020 11:37:53 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:53.350Z: Executing operation 
WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections/Unzipped-1
Aug 15, 2020 11:37:53 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:53.476Z: Finished operation 
WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections/Unzipped-1
Aug 15, 2020 11:37:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:53.617Z: Executing operation 
WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/CreateDataflowView
Aug 15, 2020 11:37:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:53.661Z: Finished operation 
WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/CreateDataflowView
Aug 15, 2020 11:37:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:53.808Z: Executing operation 
WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map+WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair
 with random 
key+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Aug 15, 2020 11:37:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:56.290Z: Finished operation 
WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map+WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair
 with random 
key+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Aug 15, 2020 11:37:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:56.368Z: Executing operation 
WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
Aug 15, 2020 11:37:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:56.420Z: Finished operation 
WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
Aug 15, 2020 11:37:56 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:56.484Z: Executing operation 
WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
Aug 15, 2020 11:37:58 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:56.799Z: Finished operation 
WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
Aug 15, 2020 11:37:58 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:56.978Z: Cleaning up.
Aug 15, 2020 11:37:58 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:37:57.082Z: Stopping worker pool...
Aug 15, 2020 11:38:47 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:38:47.195Z: Autoscaling: Resized worker pool from 1 to 0.
Aug 15, 2020 11:38:49 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-15T11:38:47.242Z: Worker pool stopped.
Aug 15, 2020 11:38:54 AM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2020-08-15_04_35_18-10685581142290012803 finished with status DONE.
gsutil cat 
gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/count*
 | grep Montague:
Montague: 47
Verified Montague: 47
gsutil rm 
gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/count*
Removing 
gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/counts-00000-of-00003...
/ [1 objects]                                                                   
Removing 
gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/counts-00001-of-00003...
/ [2 objects]                                                                   
Removing 
gs://temp-storage-for-release-validation-tests/nightly-snapshot-validation/counts-00002-of-00003...
/ [3 objects]                                                                   
Operation completed over 3 objects.                                             
 
[SUCCESS]

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
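
For reference, a local re-run of the failing task with the suggested diagnostic flags might look roughly like the sketch below. It assumes a checkout of the Beam repository with the standard Gradle wrapper, takes the task path from the failure above, and omits any project-specific -P properties (for example a GCP project and staging bucket) that the mobile gaming validation may require:

  # Sketch only: task path from the failure above, flags as suggested by Gradle; required -P properties omitted.
  ./gradlew :runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow --stacktrace --info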

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38m 38s
7 actionable tasks: 6 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/2zif52jq72cm4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
