See
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/1239/display/redirect?page=changes>
Changes:
[noreply] [BEAM-10300] Improve JdbcIOTest.testFluentBackOffConfiguration
stability
------------------------------------------
[...truncated 252.58 KB...]
INFO: Uploading
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.24.0-SNAPSHOT.jar>
to
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.24.0-SNAPSHOT-bVHfLvljnY5Zm7jmS1iFa7NuTXYfs2nNpcOg0R0b0f0.jar
Aug 11, 2020 6:34:06 AM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.24.0-SNAPSHOT.jar>
to
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.24.0-SNAPSHOT-JY1U5K5PDai_wgJelX75c08xZlN5253bHcddNlmC5C4.jar
Aug 11, 2020 6:34:06 AM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.24.0-SNAPSHOT.jar>
to
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.24.0-SNAPSHOT-PMkpiOMOhTSm5GXg4FqmC1gh2l9nT5zTiBDGj4v9tr8.jar
Aug 11, 2020 6:34:06 AM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 174 files cached, 25 files newly uploaded in
0 seconds
Aug 11, 2020 6:34:06 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from source as step s1
Aug 11, 2020 6:34:06 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Gather time as step s2
Aug 11, 2020 6:34:06 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Map records as step s3
Aug 11, 2020 6:34:06 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/PrepareWrite/ParDo(Anonymous) as step s4
Aug 11, 2020 6:34:07 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
as step s5
Aug 11, 2020 6:34:07 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites as step s6
Aug 11, 2020 6:34:07 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds as step s7
Aug 11, 2020 6:34:07 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
as step s8
Aug 11, 2020 6:34:07 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey as step s9
Aug 11, 2020 6:34:07 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable as step s10
Aug 11, 2020 6:34:07 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign as step s11
Aug 11, 2020 6:34:07 AM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite as step s12
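(For reference, steps s1-s12 above come from a read-then-write pipeline expanded
through BigQueryIO's StreamingInserts path. The following is only a minimal
sketch of that shape; the class name, source, table spec, and schema are
placeholders and not the actual BigQueryIOIT source.)

    // Hedged sketch: approximates the pipeline shape behind steps s1-s12.
    import com.google.api.services.bigquery.model.TableFieldSchema;
    import com.google.api.services.bigquery.model.TableRow;
    import com.google.api.services.bigquery.model.TableSchema;
    import java.util.Collections;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptor;

    public class StreamingInsertsSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        TableSchema schema = new TableSchema().setFields(Collections.singletonList(
            new TableFieldSchema().setName("data").setType("STRING")));
        p.apply("Read from source", GenerateSequence.from(0).to(1_000_000))   // s1: synthetic source stand-in
            .apply("Map records", MapElements                                 // s2-s3 stand-in for Gather time / Map records
                .into(TypeDescriptor.of(TableRow.class))
                .via((Long i) -> new TableRow().set("data", Long.toString(i))))
            .apply("Write to BQ", BigQueryIO.writeTableRows()                 // s4-s12: PrepareWrite + StreamingInserts expansion
                .to("apache-beam-testing:example_dataset.example_table")      // placeholder table spec
                .withSchema(schema)
                .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));
        p.run().waitUntilFinish();
      }
    }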
Aug 11, 2020 6:34:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to
gs://temp-storage-for-perf-tests/loadtests/staging/
Aug 11, 2020 6:34:07 AM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <104183 bytes, hash
91664d52b9a1e0f30fbdff74bc623182f4bf22e3fd2f941ed400ebfb41aed72f> to
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-kWZNUrmh4PMPvf90vGIxgvS_IuP9L5Qe1ADr-0Gu1y8.pb
Aug 11, 2020 6:34:07 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.24.0-SNAPSHOT
Aug 11, 2020 6:34:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-10_23_34_07-752389174269638054?project=apache-beam-testing
Aug 11, 2020 6:34:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2020-08-10_23_34_07-752389174269638054
Aug 11, 2020 6:34:08 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2020-08-10_23_34_07-752389174269638054
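(The job handle returned by Pipeline.run() can also cancel the job
programmatically; a hedged sketch with illustrative names and timeout,
equivalent in effect to the gcloud command above:)

    // Hedged sketch: cancel via the PipelineResult handle instead of gcloud.
    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    class CancelAfterTimeout {
      static void runWithTimeout(Pipeline p) throws IOException {
        PipelineResult result = p.run();
        // Wait up to 15 minutes; if the job has not reached a terminal state, ask Dataflow to cancel it.
        result.waitUntilFinish(Duration.standardMinutes(15));
        if (result.getState() == PipelineResult.State.RUNNING) {
          result.cancel();
        }
      }
    }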
Aug 11, 2020 6:34:08 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-08-11T06:34:07.399Z: The requested max number of workers (5) is
ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
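(That warning appears when a fixed worker pool is requested with autoscaling
disabled. A hedged sketch of the option combination involved, using the
standard Dataflow worker-pool options; the exact flags this job passed are not
shown in the log:)

    // Hedged sketch: worker-pool options that produce the warning above.
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    class WorkerPoolOptionsSketch {
      static DataflowPipelineOptions fixedWorkerOptions() {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.create().as(DataflowPipelineOptions.class);
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE); // autoscaling off
        options.setNumWorkers(5);    // fixed pool size that is actually used
        options.setMaxNumWorkers(5); // ignored when autoscaling is NONE, hence the warning
        return options;
      }
    }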
Aug 11, 2020 6:34:15 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:14.856Z: Worker configuration: n1-standard-1 in
us-central1-f.
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:15.542Z: Expanding CoGroupByKey operations into
optimizable parts.
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:15.825Z: Expanding GroupByKey operations into
optimizable parts.
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:15.842Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:15.955Z: Fusing adjacent ParDo, Read, Write, and
Flatten operations
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:15.973Z: Fusing consumer Gather time into Read from
source
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:15.998Z: Fusing consumer Map records into Gather time
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.020Z: Fusing consumer Write to
BQ/PrepareWrite/ParDo(Anonymous) into Map records
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.048Z: Fusing consumer Write to
BQ/StreamingInserts/CreateTables/ParDo(CreateTables) into Write to
BQ/PrepareWrite/ParDo(Anonymous)
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.072Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites into Write to
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.096Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds into Write to
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.132Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
into Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.169Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify into Write
to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.200Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write into Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.239Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.273Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable into Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.309Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign into Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.345Z: Fusing consumer Write to
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite into Write to
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.716Z: Executing operation Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.797Z: Starting 5 workers in us-central1-f...
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.847Z: Finished operation Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
Aug 11, 2020 6:34:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:16.983Z: Executing operation Read from source+Gather
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
Aug 11, 2020 6:34:48 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:34:46.902Z: Autoscaling: Raised the number of workers to
5 based on the rate of progress in the currently running stage(s).
Aug 11, 2020 6:34:50 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-08-11T06:34:49.469Z: Your project already contains 100
Dataflow-created metric descriptors and Stackdriver will not create new
Dataflow custom metrics for this job. Each unique user-defined metric name
(independent of the DoFn in which it is defined) produces a new metric
descriptor. To delete old / unused metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Aug 11, 2020 6:35:07 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:35:07.045Z: Workers have started successfully.
Aug 11, 2020 6:35:07 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:35:07.063Z: Workers have started successfully.
Aug 11, 2020 6:37:41 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:37:40.490Z: Finished operation Read from source+Gather
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
Aug 11, 2020 6:37:41 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:37:40.583Z: Executing operation Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
Aug 11, 2020 6:37:41 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:37:40.643Z: Finished operation Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
Aug 11, 2020 6:37:41 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-08-11T06:37:40.716Z: Executing operation Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite
Aug 11, 2020 6:38:51 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2020-08-11T06:38:50.700Z: An OutOfMemoryException occurred.
Consider specifying higher memory instances in PipelineOptions.
java.lang.RuntimeException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:132)
at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:121)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:417)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:386)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:311)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2048)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
... 26 more
Caused by: java.lang.OutOfMemoryError: Java heap space
at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.getFixedLengthPrefixedByteArray(ChunkingShuffleBatchReader.java:98)
at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.getShuffleEntry(ChunkingShuffleBatchReader.java:82)
at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.read(ChunkingShuffleBatchReader.java:63)
at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:55)
at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:51)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:121)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:417)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:386)
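(The SEVERE message above suggests specifying higher-memory instances in
PipelineOptions; this job ran on n1-standard-1 workers. A hedged sketch of one
way to do that using the standard Dataflow worker-pool options; the machine
type shown is only an example:)

    // Hedged sketch: request a larger worker machine type, per the OOM advice above.
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    class HigherMemoryWorkersSketch {
      static DataflowPipelineOptions higherMemoryOptions(String[] args) {
        DataflowPipelineOptions options = PipelineOptionsFactory.fromArgs(args)
            .withValidation()
            .as(DataflowPipelineOptions.class);
        // Any machine type with more memory per core than n1-standard-1
        // reduces heap pressure in the shuffle read path shown above.
        options.setWorkerMachineType("n1-highmem-2");
        return options;
      }
    }

(The same setting can be passed on the command line as --workerMachineType=n1-highmem-2.)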
Aug 11, 2020 6:46:46 AM
org.apache.beam.runners.dataflow.DataflowPipelineJob lambda$waitUntilFinish$0
WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not
cancel it.
To cancel the job in the cloud, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2020-08-10_23_34_07-752389174269638054
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead SKIPPED
> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for
':' Thread 9,5,main]) completed. Took 12 mins 48.433 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 3' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 13m 31s
84 actionable tasks: 55 executed, 29 from cache
Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=8ca33146-e470-4ca1-b993-ae5ef28bd6d0,
currentDir=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src}>
Attempting to read last messages from the daemon log...
Daemon pid: 3094
log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-3094.out.log
----- Last 20 lines from daemon log file - daemon-3094.out.log -----
* What went wrong:
Execution failed for task
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 3' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 13m 31s
84 actionable tasks: 55 executed, 29 from cache
Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated
in response to a user interrupt.
----- End of the daemon log -----
FAILURE: Build failed with an exception.
* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may
have crashed)
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure