See
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/1271/display/redirect?page=changes>
Changes:
[ekirpichov] Add enum34 to manual_licenses
------------------------------------------
[...truncated 251.40 KB...]
INFO: Uploading
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT.jar>
to
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-lbacygS5UrGB3ox-xxIlNYrF9vvjE8dRW4-yjZF1X9E.jar
Aug 19, 2020 6:34:03 AM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar>
to
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-TJwleALM4PyU7lyjVqcjWTdKmiOszOviXSdUb9Pm_mE.jar
Aug 19, 2020 6:34:03 AM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar>
to
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-VJJdz-O0sK-IbOpvzTzr4meJxkNcHfoaxEg3GIKK8IU.jar
Aug 19, 2020 6:34:04 AM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar>
to
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-8MqQs1QNtCy2DTU32N8NOz5UIFPaco22-VH-hw0hOkk.jar
Aug 19, 2020 6:34:04 AM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 174 files cached, 25 files newly uploaded in
1 seconds
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from source as step s1
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Gather time as step s2
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Map records as step s3
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/PrepareWrite/ParDo(Anonymous) as step s4
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
as step s5
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites as step s6
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds as step s7
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
as step s8
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey as step s9
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable as step s10
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign as step s11
Aug 19, 2020 6:34:04 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite as step s12
Aug 19, 2020 6:34:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to
gs://temp-storage-for-perf-tests/loadtests/staging/
Aug 19, 2020 6:34:04 AM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <104183 bytes, hash
e3f7671f9f1e3eb9cc3202849009f4000a8a1a324fc8c913647ebd4f7e2eefe4> to
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-4_dnH58ePrnMMgKEkAn0AAqKGjJPyMkTZH69T34u7-Q.pb
Aug 19, 2020 6:34:05 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
Aug 19, 2020 6:34:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-18_23_34_05-3671138866891474026?project=apache-beam-testing
Aug 19, 2020 6:34:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2020-08-18_23_34_05-3671138866891474026
Aug 19, 2020 6:34:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2020-08-18_23_34_05-3671138866891474026
Aug 19, 2020 6:34:06 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-08-19T06:34:05.287Z: The requested max number of ****s (5) is
ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
Aug 19, 2020 6:34:12 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:12.098Z: Worker configuration: n1-standard-1 in
us-central1-f.
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:12.938Z: Expanding CoGroupByKey operations into
optimizable parts.
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.008Z: Expanding GroupByKey operations into
optimizable parts.
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.036Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.143Z: Fusing adjacent ParDo, Read, Write, and
Flatten operations
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.170Z: Fusing consumer Gather time into Read from
source
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.193Z: Fusing consumer Map records into Gather time
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.226Z: Fusing consumer Write to
BQ/PrepareWrite/ParDo(Anonymous) into Map records
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.261Z: Fusing consumer Write to
BQ/StreamingInserts/CreateTables/ParDo(CreateTables) into Write to
BQ/PrepareWrite/ParDo(Anonymous)
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.298Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites into Write to
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.326Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds into Write to
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.348Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
into Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.385Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify into Write
to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
Aug 19, 2020 6:34:13 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.414Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write into Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify
Aug 19, 2020 6:34:16 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.446Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read
Aug 19, 2020 6:34:16 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.491Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable into Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
Aug 19, 2020 6:34:16 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.528Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign into Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable
Aug 19, 2020 6:34:16 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.554Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite into Write to
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign
Aug 19, 2020 6:34:16 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:13.983Z: Executing operation Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
Aug 19, 2020 6:34:16 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:14.070Z: Starting 5 ****s in us-central1-f...
Aug 19, 2020 6:34:16 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:14.118Z: Finished operation Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
Aug 19, 2020 6:34:16 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:14.250Z: Executing operation Read from source+Gather
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
Aug 19, 2020 6:34:28 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-08-19T06:34:26.174Z: Your project already contains 100
Dataflow-created metric descriptors and Stackdriver will not create new
Dataflow custom metrics for this job. Each unique user-defined metric name
(independent of the DoFn in which it is defined) produces a new metric
descriptor. To delete old / unused metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Aug 19, 2020 6:34:42 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:34:41.979Z: Autoscaling: Raised the number of ****s to
5 based on the rate of progress in the currently running stage(s).
Aug 19, 2020 6:35:07 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:35:07.107Z: Workers have started successfully.
Aug 19, 2020 6:35:07 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:35:07.143Z: Workers have started successfully.
Aug 19, 2020 6:37:42 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:37:40.694Z: Finished operation Read from source+Gather
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
Aug 19, 2020 6:37:42 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:37:40.761Z: Executing operation Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
Aug 19, 2020 6:37:42 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:37:40.817Z: Finished operation Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
Aug 19, 2020 6:37:42 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-19T06:37:40.898Z: Executing operation Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite
Aug 19, 2020 6:39:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2020-08-19T06:38:59.742Z: An OutOfMemoryException occurred.
Consider specifying higher memory instances in PipelineOptions.
java.lang.RuntimeException:
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError:
java.lang.OutOfMemoryError: Java heap space
at
org.apache.beam.runners.dataflow.****.util.common.****.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:132)
at
org.apache.beam.runners.dataflow.****.util.common.****.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
at
org.apache.beam.runners.dataflow.****.util.common.****.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
at
org.apache.beam.runners.dataflow.****.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
at
org.apache.beam.runners.dataflow.****.util.common.****.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
at
org.apache.beam.runners.dataflow.****.util.common.****.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
at
org.apache.beam.runners.dataflow.****.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
at
org.apache.beam.runners.dataflow.****.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
at
org.apache.beam.runners.dataflow.****.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
at
org.apache.beam.runners.dataflow.****.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:121)
at
org.apache.beam.runners.dataflow.****.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
at
org.apache.beam.runners.dataflow.****.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
at
org.apache.beam.runners.dataflow.****.util.common.****.ParDoOperation.process(ParDoOperation.java:44)
at
org.apache.beam.runners.dataflow.****.util.common.****.OutputReceiver.process(OutputReceiver.java:49)
at
org.apache.beam.runners.dataflow.****.util.common.****.ReadOperation.runReadLoop(ReadOperation.java:201)
at
org.apache.beam.runners.dataflow.****.util.common.****.ReadOperation.start(ReadOperation.java:159)
at
org.apache.beam.runners.dataflow.****.util.common.****.MapTaskExecutor.execute(MapTaskExecutor.java:77)
at
org.apache.beam.runners.dataflow.****.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:417)
at
org.apache.beam.runners.dataflow.****.BatchDataflowWorker.doWork(BatchDataflowWorker.java:386)
at
org.apache.beam.runners.dataflow.****.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:311)
at
org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
at
org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
at
org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by:
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError:
java.lang.OutOfMemoryError: Java heap space
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2048)
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at
org.apache.beam.runners.dataflow.****.util.common.****.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
at
org.apache.beam.runners.dataflow.****.util.common.****.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
... 26 more
Caused by: java.lang.OutOfMemoryError: Java heap space
at
org.apache.beam.runners.dataflow.****.ChunkingShuffleBatchReader.getFixedLengthPrefixedByteArray(ChunkingShuffleBatchReader.java:98)
at
org.apache.beam.runners.dataflow.****.ChunkingShuffleBatchReader.getShuffleEntry(ChunkingShuffleBatchReader.java:82)
at
org.apache.beam.runners.dataflow.****.ChunkingShuffleBatchReader.read(ChunkingShuffleBatchReader.java:63)
at
org.apache.beam.runners.dataflow.****.util.common.****.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:55)
at
org.apache.beam.runners.dataflow.****.util.common.****.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:51)
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at
org.apache.beam.runners.dataflow.****.util.common.****.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
at
org.apache.beam.runners.dataflow.****.util.common.****.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
at
org.apache.beam.runners.dataflow.****.util.common.****.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
at
org.apache.beam.runners.dataflow.****.util.common.****.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
at
org.apache.beam.runners.dataflow.****.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
at
org.apache.beam.runners.dataflow.****.util.common.****.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
at
org.apache.beam.runners.dataflow.****.util.common.****.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
at
org.apache.beam.runners.dataflow.****.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
at
org.apache.beam.runners.dataflow.****.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
at
org.apache.beam.runners.dataflow.****.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
at
org.apache.beam.runners.dataflow.****.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:121)
at
org.apache.beam.runners.dataflow.****.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
at
org.apache.beam.runners.dataflow.****.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
at
org.apache.beam.runners.dataflow.****.util.common.****.ParDoOperation.process(ParDoOperation.java:44)
at
org.apache.beam.runners.dataflow.****.util.common.****.OutputReceiver.process(OutputReceiver.java:49)
at
org.apache.beam.runners.dataflow.****.util.common.****.ReadOperation.runReadLoop(ReadOperation.java:201)
at
org.apache.beam.runners.dataflow.****.util.common.****.ReadOperation.start(ReadOperation.java:159)
at
org.apache.beam.runners.dataflow.****.util.common.****.MapTaskExecutor.execute(MapTaskExecutor.java:77)
at
org.apache.beam.runners.dataflow.****.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:417)
at
org.apache.beam.runners.dataflow.****.BatchDataflowWorker.doWork(BatchDataflowWorker.java:386)
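[Editor's note, not part of the original log: the SEVERE entry above points at heap exhaustion while reading shuffle data on the n1-standard-1 workers this run was configured with, and suggests higher-memory instances via PipelineOptions. A minimal sketch of a rerun with a larger machine type follows; the Gradle invocation style mirrors how Beam IO integration tests are typically launched, `--workerMachineType` is a standard Dataflow worker-pool pipeline option, and the specific machine type chosen here is an illustrative assumption, not something taken from this build's configuration.]

```shell
# Hypothetical rerun with more worker memory (sketch, not the job's actual config).
# n1-standard-4 provides 15 GB RAM per worker versus 3.75 GB on n1-standard-1.
./gradlew :sdks:java:io:bigquery-io-perf-tests:integrationTest \
  -DintegrationTestPipelineOptions='[
    "--runner=DataflowRunner",
    "--region=us-central1",
    "--workerMachineType=n1-standard-4"
  ]'
```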
Aug 19, 2020 6:44:24 AM
org.apache.beam.runners.dataflow.DataflowPipelineJob lambda$waitUntilFinish$0
WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not
cancel it.
To cancel the job in the cloud, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2020-08-18_23_34_05-3671138866891474026
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead SKIPPED
> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution **** for
':' Thread 4,5,main]) completed. Took 10 mins 27.875 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 3' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
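[Editor's note, not part of the original log: exit value 143 is the conventional status of a process terminated by SIGTERM (128 + signal number 15), i.e. the forked test JVM was killed rather than failing an assertion, consistent with the worker OutOfMemoryError earlier in this log. This can be demonstrated directly:]

```shell
# A process terminated by SIGTERM reports exit status 128 + 15 = 143.
bash -c 'kill -s TERM $$' || status=$?
echo "exit=${status}"
```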
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 11m 7s
84 actionable tasks: 55 executed, 29 from cache
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=b5aae32a-ae81-4cb1-b12e-816c3c093c49,
currentDir=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 9025
log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-9025.out.log
----- Last 20 lines from daemon log file - daemon-9025.out.log -----
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 3' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 11m 7s
84 actionable tasks: 55 executed, 29 from cache
Daemon vm is shutting down... The daemon has exited normally or was terminated
in response to a user interrupt.
----- End of the daemon log -----
FAILURE: Build failed with an exception.
* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may
have crashed)
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]