See 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/1279/display/redirect>

Changes:


------------------------------------------
[...truncated 252.26 KB...]
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/core/build/libs/beam-sdks-java-core-2.25.0-SNAPSHOT-tests.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-core-2.25.0-SNAPSHOT-tests-ENRvOHovO-gI8_O-8uJOwRPPiuM5B5eAb-e9S4BUAnA.jar
    Aug 21, 2020 6:34:10 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/sdks/java/fn-execution/build/libs/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-sdks-java-fn-execution-2.25.0-SNAPSHOT-zVmUJRryuFinc36QioldRCKcR12eTWA3vhxTNXsZo4Y.jar
    Aug 21, 2020 6:34:10 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/runners/google-cloud-dataflow-java/worker/legacy-worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-runners-google-cloud-dataflow-java-legacy-worker-2.25.0-SNAPSHOT-uUD69gExTskIoQXjCSAO1LQ6amIPbbLYTC9InWmjHiM.jar
    Aug 21, 2020 6:34:10 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/pipeline/build/libs/beam-model-pipeline-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-pipeline-2.25.0-SNAPSHOT-70GD6Y5Sdv_Srr0nHyNhwnCmlzDjc1jB1WBaG-UmQag.jar
    Aug 21, 2020 6:34:10 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/job-management/build/libs/beam-model-job-management-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-job-management-2.25.0-SNAPSHOT-0zial8usFo4qkBekS0g0xtoeBMzxLgdIwyYPPdMmAUs.jar
    Aug 21, 2020 6:34:10 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading 
<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src/model/fn-execution/build/libs/beam-model-fn-execution-2.25.0-SNAPSHOT.jar>
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/beam-model-fn-execution-2.25.0-SNAPSHOT-R8LUVZvFRbp9piH3rgrhAfQSLdjXsiRC6wDyVLRJmFE.jar
    Aug 21, 2020 6:34:11 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 174 files cached, 25 files newly uploaded in 1 second
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from source as step s1
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Gather time as step s2
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records as step s3
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/PrepareWrite/ParDo(Anonymous) as step s4
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to BQ/StreamingInserts/CreateTables/ParDo(CreateTables) 
as step s5
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites as step s6
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds as step s7
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign 
as step s8
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey as step s9
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable as step s10
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign as step s11
    Aug 21, 2020 6:34:11 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite as step s12
    Aug 21, 2020 6:34:11 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to 
gs://temp-storage-for-perf-tests/loadtests/staging/
    Aug 21, 2020 6:34:11 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <104187 bytes, hash 
067abf256d04c3df049634baad62ccce8a0ef77d9cd566e427e8d11913499e95> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Bnq_JW0Ew98EljS6rWLMzooO932c1WbkJ-jRGRNJnpU.pb
    Aug 21, 2020 6:34:12 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.25.0-SNAPSHOT
    Aug 21, 2020 6:34:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-20_23_34_12-8870182566606201231?project=apache-beam-testing
    Aug 21, 2020 6:34:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2020-08-20_23_34_12-8870182566606201231
    Aug 21, 2020 6:34:13 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2020-08-20_23_34_12-8870182566606201231
    Aug 21, 2020 6:34:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-21T06:34:12.158Z: The requested max number of workers (5) is 
ignored as autoscaling is explicitly disabled (autoscalingAlgorithm=NONE).
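
For reference, the worker count and autoscaling behaviour mentioned in this warning are normally set through the Dataflow runner's pipeline options. The following is a minimal sketch assuming the standard DataflowPipelineOptions interface; the class name and values are illustrative rather than the exact configuration this test uses.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineWorkerPoolOptions.AutoscalingAlgorithmType;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class WorkerPoolOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);

        // With autoscaling disabled, numWorkers fixes the pool size and
        // maxNumWorkers is ignored, which is what the warning above reports.
        options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.NONE);
        options.setNumWorkers(5);

        // To let the service scale the pool instead, keep autoscaling on and only cap it:
        // options.setAutoscalingAlgorithm(AutoscalingAlgorithmType.THROUGHPUT_BASED);
        // options.setMaxNumWorkers(5);
      }
    }
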
    Aug 21, 2020 6:34:21 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:20.458Z: Worker configuration: n1-standard-1 in 
us-central1-f.
    Aug 21, 2020 6:34:21 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.196Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.249Z: Expanding GroupByKey operations into 
optimizable parts.
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.285Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.395Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.420Z: Fusing consumer Gather time into Read from 
source
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.446Z: Fusing consumer Map records into Gather time
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.482Z: Fusing consumer Write to 
BQ/PrepareWrite/ParDo(Anonymous) into Map records
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.518Z: Fusing consumer Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables) into Write to 
BQ/PrepareWrite/ParDo(Anonymous)
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.564Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites into Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.607Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds into Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.641Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign 
into Write to BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.662Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify into Write 
to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.696Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write into Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.724Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow 
into Write to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.777Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable into Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.871Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign into Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:21.917Z: Fusing consumer Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite into Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:22.508Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:22.602Z: Starting 5 workers in us-central1-f...
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:22.650Z: Finished operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Create
    Aug 21, 2020 6:34:23 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:22.794Z: Executing operation Read from source+Gather 
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Aug 21, 2020 6:34:47 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2020-08-21T06:34:47.309Z: Your project already contains 100 
Dataflow-created metric descriptors and Stackdriver will not create new 
Dataflow custom metrics for this job. Each unique user-defined metric name 
(independent of the DoFn in which it is defined) produces a new metric 
descriptor. To delete old / unused metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
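
Those API Explorer links correspond to the Cloud Monitoring v3 metricDescriptors.list and metricDescriptors.delete methods. A hedged sketch of how stale Dataflow-created descriptors could be inspected with the google-cloud-monitoring Java client follows; the project id and the filter prefix are assumptions for illustration, not values taken from this job.

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ListMetricDescriptorsRequest;
    import com.google.monitoring.v3.ProjectName;

    public class ListDataflowMetricDescriptors {
      public static void main(String[] args) throws Exception {
        String projectId = "apache-beam-testing";  // assumption: the project named in this job
        try (MetricServiceClient client = MetricServiceClient.create()) {
          ListMetricDescriptorsRequest request =
              ListMetricDescriptorsRequest.newBuilder()
                  .setName(ProjectName.of(projectId).toString())
                  // Assumed prefix under which Dataflow-created custom metrics are registered.
                  .setFilter("metric.type = starts_with(\"custom.googleapis.com/dataflow\")")
                  .build();
          for (MetricDescriptor descriptor : client.listMetricDescriptors(request).iterateAll()) {
            System.out.println(descriptor.getType());
            // Unused descriptors could then be deleted to free up quota, e.g.:
            // client.deleteMetricDescriptor(descriptor.getName());
          }
        }
      }
    }
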
    Aug 21, 2020 6:34:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:34:55.934Z: Autoscaling: Raised the number of workers to 
5 based on the rate of progress in the currently running stage(s).
    Aug 21, 2020 6:35:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:35:12.954Z: Workers have started successfully.
    Aug 21, 2020 6:35:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:35:12.982Z: Workers have started successfully.
    Aug 21, 2020 6:37:54 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:37:52.376Z: Finished operation Read from source+Gather 
time+Map records+Write to BQ/PrepareWrite/ParDo(Anonymous)+Write to 
BQ/StreamingInserts/CreateTables/ParDo(CreateTables)+Write to 
BQ/StreamingInserts/StreamingWriteTables/ShardTableWrites+Write to 
BQ/StreamingInserts/StreamingWriteTables/TagWithUniqueIds+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Reify+Write 
to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Write
    Aug 21, 2020 6:37:54 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:37:52.465Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Aug 21, 2020 6:37:54 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:37:52.519Z: Finished operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Close
    Aug 21, 2020 6:37:54 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2020-08-21T06:37:52.615Z: Executing operation Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/Read+Write to 
BQ/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey/GroupByWindow+Write
 to BQ/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable+Write to 
BQ/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign+Write to 
BQ/StreamingInserts/StreamingWriteTables/StreamingWrite
    Aug 21, 2020 6:46:44 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2020-08-21T06:46:43.043Z: An OutOfMemoryException occurred. 
Consider specifying higher memory instances in PipelineOptions.
    java.lang.RuntimeException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:132)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
        at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
        at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
        at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
        at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:121)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:417)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:386)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:311)
        at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:140)
        at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:120)
        at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:107)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
    Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.ExecutionError: java.lang.OutOfMemoryError: Java heap space
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2048)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
        ... 26 more
    Caused by: java.lang.OutOfMemoryError: Java heap space
        at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.getFixedLengthPrefixedByteArray(ChunkingShuffleBatchReader.java:98)
        at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.getShuffleEntry(ChunkingShuffleBatchReader.java:82)
        at org.apache.beam.runners.dataflow.worker.ChunkingShuffleBatchReader.read(ChunkingShuffleBatchReader.java:63)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:55)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:51)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
        at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:74)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:125)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:119)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:84)
        at org.apache.beam.runners.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:63)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:271)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:263)
        at org.apache.beam.runners.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:397)
        at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:55)
        at org.apache.beam.runners.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:39)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:121)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:73)
        at org.apache.beam.runners.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:114)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
        at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:77)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:417)
        at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:386)
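
The SEVERE message above suggests requesting higher-memory instances via PipelineOptions. Since this run used n1-standard-1 workers (per the worker configuration logged at 06:34:20), one plausible mitigation is to pick a high-memory machine type through the Dataflow options. The sketch below is illustrative, with an assumed machine type rather than a recommendation specific to this test.

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class HigherMemoryWorkersSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);

        // n1-standard-1 provides roughly 3.75 GB of RAM per worker; a high-memory
        // machine type gives the shuffle read path above considerably more heap.
        options.setWorkerMachineType("n1-highmem-2");  // illustrative choice

        // The equivalent command-line flag is --workerMachineType=n1-highmem-2.
      }
    }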


org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead SKIPPED

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
:sdks:java:io:bigquery-io-perf-tests:integrationTest (Thread[Execution worker for 
':' Thread 5,5,main]) completed. Took 13 mins 20.148 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 3' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at 
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 4s
84 actionable tasks: 55 executed, 29 from cache
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=9e7ff632-dee0-41ce-9457-57d6990ae392, 
currentDir=<https://ci-beam.apache.org/job/beam_BiqQueryIO_Streaming_Performance_Test_Java/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 10602
  log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-10602.out.log
----- Last  20 lines from daemon log file - daemon-10602.out.log -----
FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task 
':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> Process 'Gradle Test Executor 3' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at 
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 14m 4s
84 actionable tasks: 55 executed, 29 from cache
Daemon vm is shutting down... The daemon has exited normally or was terminated 
in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may 
have crashed)

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
