See <https://ci-beam.apache.org/job/beam_PostRelease_NightlySnapshot/1161/display/redirect?page=changes>

Changes:

[Udi Meiri] [BEAM-2717] Implement ProtoCoder.to_type_hint

[Pablo Estrada] Adding display data to BQ File Loads transform

[Robert Bradshaw] Allow use of index as series.

[Robert Bradshaw] Allow setting columns.

[sychen] Fix GroupIntoBatches.test_buffering_timer_in_fixed_window_streaming

[Robert Bradshaw] Add utility to test a set of strings.

[Robert Bradshaw] Add a proxy for panda's top-level module functions.

[Robert Bradshaw] [BEAM-9547] Implement pd.concat().

[noreply] [BEAM-11091] Allow to specify coder for HadoopFormatIO.Read (#13166)

[noreply] [BEAM-11162] Fetch missing projectId from options (#13234)

[Kenneth Knowles] Add class-level suppression of rawtypes errors

[Kenneth Knowles] Enable rawtype errors globally

[noreply] [BEAM-3736] Add CombineFn.setup and CombineFn.teardown to Python SDK

[noreply] [BEAM-11190] Fix grouping on categorical columns (#13256)

[Robert Bradshaw] todo, lint

[noreply] [BEAM-3736] Disable CombineFnVisitor (#13266)

[noreply] [BEAM-11196] Set parent of fused stages to the lowest common ancestor


------------------------------------------
[...truncated 3.79 MB...]
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 28
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 44
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 24
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 35
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 73
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 92
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 23
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 63
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 85
Nov 05, 2020 11:08:57 AM org.apache.beam.sdk.io.FileBasedSink$WriteOperation removeTemporaryFiles
WARNING: Failed to match temporary files under: [/tmp/groovy-generated-6155051820837814332-tmpdir/word-count-beam/.temp-beam-99528c8e-53d5-4604-b1b8-defb0ed4e306/].
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed broadcast_3_piece0 on localhost:36419 in memory (size: 12.3 KB, free: 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 5
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 21
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 6
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24). 12900 bytes result sent to driver
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed broadcast_1_piece0 on localhost:36419 in memory (size: 9.8 KB, free: 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 5.0 (TID 24) in 141 ms on localhost (executor driver) (4/4)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 100
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 5 (repartition at GroupCombineFunctions.java:191) finished in 0.155 s
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 33
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: looking for newly runnable stages
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 51
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: running: Set()
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: waiting: Set(ResultStage 6)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 43
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: failed: Set()
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 3
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 11
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 42
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 101
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 66
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 94
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434), which has no missing parents
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 54
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 91
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 74
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 38
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 47
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 72
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 32
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 50
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 56
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 31
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 41
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 17
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 96
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 87
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 95
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 60
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 13
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 9
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 88
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7 stored as values in memory (estimated size 16.0 KB, free 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed broadcast_0_piece0 on localhost:36419 in memory (size: 8.7 KB, free: 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 7.3 KB, free 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Added broadcast_7_piece0 in memory on localhost:36419 (size: 7.3 KB, free: 13.5 GB)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 8
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 90
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cleaned accumulator 40
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Created broadcast 7 from broadcast at DAGScheduler.scala:1184
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Submitting 4 missing tasks from ResultStage 6 (WriteCounts/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map/ParMultiDo(Anonymous).output MapPartitionsRDD[113] at values at TransformTranslator.java:434) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Adding task set 6.0 with 4 tasks
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 0.0 in stage 6.0 (TID 28, localhost, executor driver, partition 0, NODE_LOCAL, 7938 bytes)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 1.0 in stage 6.0 (TID 29, localhost, executor driver, partition 1, NODE_LOCAL, 7938 bytes)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 2.0 in stage 6.0 (TID 30, localhost, executor driver, partition 2, NODE_LOCAL, 7938 bytes)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Starting task 3.0 in stage 6.0 (TID 31, localhost, executor driver, partition 3, NODE_LOCAL, 7938 bytes)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 0.0 in stage 6.0 (TID 28)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 1.0 in stage 6.0 (TID 29)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 2.0 in stage 6.0 (TID 30)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Running task 3.0 in stage 6.0 (TID 31)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Getting 1 non-empty blocks including 1 local blocks and 0 remote blocks
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 0 ms
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Started 0 remote fetches in 1 ms
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30). 6496 bytes result sent to driver
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29). 6496 bytes result sent to driver
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31). 6496 bytes result sent to driver
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28). 6496 bytes result sent to driver
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 0.0 in stage 6.0 (TID 28) in 18 ms on localhost (executor driver) (1/4)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 2.0 in stage 6.0 (TID 30) in 18 ms on localhost (executor driver) (2/4)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 3.0 in stage 6.0 (TID 31) in 18 ms on localhost (executor driver) (3/4)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Finished task 1.0 in stage 6.0 (TID 29) in 20 ms on localhost (executor driver) (4/4)
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: ResultStage 6 (foreach at BoundedDataset.java:127) finished in 0.032 s
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 1 finished: foreach at BoundedDataset.java:127, took 0.255782 s
Nov 05, 2020 11:08:57 AM org.apache.beam.runners.spark.SparkRunner lambda$run$1
INFO: Batch pipeline execution complete.
Nov 05, 2020 11:08:57 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@63152571{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://localhost:4040
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Nov 05, 2020 11:08:57 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
grep Foundation counts*
counts-00000-of-00004:Foundation: 1
Verified Foundation: 1
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runMobileGamingJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:runQuickstartJavaDataflow'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:direct-java:runMobileGamingJavaDirect'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 8m 41s
8 actionable tasks: 7 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/cn6bfgmtnwll4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
