See <https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Streaming_Java/1027/display/redirect>

Changes:


------------------------------------------
[...truncated 321.06 KB...]
    INFO: 2023-09-19T05:20:49.460Z: Finished operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    Sep 19, 2023 5:20:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:20:49.580Z: Executing operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Sep 19, 2023 5:20:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:20:53.308Z: Finished operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    Sep 19, 2023 5:20:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:20:53.369Z: Executing operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Sep 19, 2023 5:20:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:20:53.612Z: Finished operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    Sep 19, 2023 5:20:54 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:20:53.672Z: Executing operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+Read from BQ/ViewId/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    Sep 19, 2023 5:20:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:20:57.004Z: Finished operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+Read from BQ/ViewId/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    Sep 19, 2023 5:20:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:20:57.129Z: Executing operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/CreateDataflowView
    Sep 19, 2023 5:20:58 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:20:57.179Z: Finished operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/CreateDataflowView
    Sep 19, 2023 5:21:24 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2023-09-19T05:21:23.270Z: java.lang.IllegalArgumentException: unable to serialize gs://temp-storage-for-perf-tests/loadtests/BigQueryExtractTemp/f4088cd8117f408ebb209fae17c074ef/000000000000.avro range [0, 17318970)
        at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:59)
        at org.apache.beam.runners.dataflow.****.WorkerCustomSources.serializeSplitToCloudSource(WorkerCustomSources.java:151)
        at org.apache.beam.runners.dataflow.****.WorkerCustomSources.wrapIntoSourceSplitResponse(WorkerCustomSources.java:339)
        at org.apache.beam.runners.dataflow.****.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:222)
        at org.apache.beam.runners.dataflow.****.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:201)
        at org.apache.beam.runners.dataflow.****.WorkerCustomSources.performSplit(WorkerCustomSources.java:180)
        at org.apache.beam.runners.dataflow.****.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:82)
        at org.apache.beam.runners.dataflow.****.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:319)
        at org.apache.beam.runners.dataflow.****.BatchDataflowWorker.doWork(BatchDataflowWorker.java:291)
        at org.apache.beam.runners.dataflow.****.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:221)
        at org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:147)
        at org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:127)
        at org.apache.beam.runners.dataflow.****.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
    Caused by: java.io.NotSerializableException: com.google.api.services.bigquery.model.TableSchema
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
        at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
        at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
        at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
        at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
        at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
        at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:55)
        ... 17 more

    [The identical "unable to serialize" stack trace, caused by java.io.NotSerializableException: com.google.api.services.bigquery.model.TableSchema, was logged three more times, at 2023-09-19T05:21:23.954Z, 05:21:26.887Z, and 05:21:27.649Z.]

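The root cause in the traces above: org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray uses plain Java serialization, and com.google.api.services.bigquery.model.TableSchema does not implement java.io.Serializable, so the Avro-export source split that WorkerCustomSources tries to ship back in its split response cannot be serialized while a TableSchema sits somewhere in its object graph. Below is a minimal, self-contained sketch of that failure mode and of the usual workaround (keeping only a serializable form of the schema, e.g. its JSON string, and rebuilding the object lazily); the class and field names are illustrative stand-ins, not Beam's actual code.

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;

    public class SerializationSketch {

      // Stand-in for com.google.api.services.bigquery.model.TableSchema, which is
      // not java.io.Serializable.
      static class FakeTableSchema {
        String fieldsJson = "[{\"name\":\"data\",\"type\":\"STRING\"}]";
      }

      // Like a source split that references the schema object directly: Java
      // serialization walks the object graph and fails with NotSerializableException.
      static class BrokenSplit implements Serializable {
        FakeTableSchema schema = new FakeTableSchema();
      }

      // Common workaround: keep only a serializable representation (here the schema's
      // JSON string) and rebuild the schema object lazily when it is needed.
      static class WorkingSplit implements Serializable {
        String schemaJson = "[{\"name\":\"data\",\"type\":\"STRING\"}]";
        transient FakeTableSchema cached;  // transient: never written by serialization

        FakeTableSchema schema() {
          if (cached == null) {
            cached = new FakeTableSchema();
            cached.fieldsJson = schemaJson;
          }
          return cached;
        }
      }

      // Roughly what a serializeToByteArray-style helper does internally.
      static byte[] serialize(Object value) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buffer)) {
          out.writeObject(value);  // throws java.io.NotSerializableException for BrokenSplit
        }
        return buffer.toByteArray();
      }

      public static void main(String[] args) throws Exception {
        System.out.println("ok, " + serialize(new WorkingSplit()).length + " bytes");
        serialize(new BrokenSplit());  // reproduces the NotSerializableException above
      }
    }
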
    Sep 19, 2023 5:21:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:21:27.935Z: Finished operation Read from BQ/Read(BigQueryTableSource)+Read from BQ/PassThroughThenCleanup/ParMultiDo(Identity)+Gather time+Read from BQ/PassThroughThenCleanup/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+Counting element
    Sep 19, 2023 5:21:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2023-09-19T05:21:28.023Z: Workflow failed. Causes: S09:Read from BQ/Read(BigQueryTableSource)+Read from BQ/PassThroughThenCleanup/ParMultiDo(Identity)+Gather time+Read from BQ/PassThroughThenCleanup/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+Counting element failed.
    Sep 19, 2023 5:21:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:21:28.087Z: Cleaning up.
    Sep 19, 2023 5:21:28 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:21:28.310Z: Stopping **** pool...
    Sep 19, 2023 5:23:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:23:49.455Z: Autoscaling: Resized **** pool from 5 to 0.
    Sep 19, 2023 5:23:51 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-09-19T05:23:49.514Z: Worker pool stopped.
    Sep 19, 2023 5:24:25 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2023-09-18_22_18_50-6920939254943733320 failed with status FAILED.
    Sep 19, 2023 5:24:25 AM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
    SEVERE: Failed to get metric read_count, from namespace org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT

org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.AssertionError: expected:<10485760> but was:<-1>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testRead(BigQueryIOIT.java:222)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:145)

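The test failure above is a downstream symptom of the job failure: the pipeline never reached the Counting element step, so no read_count counter was ever reported for namespace org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT, and the test's counter lookup evidently falls back to -1, which assertEquals then compares against the expected 10485760. A rough sketch of such a lookup using Beam's public metrics API; the class name, method name, and the -1 fallback are assumptions inferred from the log, not the actual MetricsReader implementation.

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    // Hypothetical counter lookup: returns the attempted value of a named counter, or
    // -1 when the runner never reported it (e.g. the job failed before the counting
    // step ran).
    class CounterLookupSketch {
      static long counterOrMinusOne(PipelineResult result, String namespace, String name) {
        MetricQueryResults metrics =
            result.metrics().queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named(namespace, name))
                    .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          return counter.getAttempted();  // first (and normally only) matching counter
        }
        return -1;  // sentinel when no matching counter exists
      }
    }
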
Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.095 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Streaming_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.103 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Streaming_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Streaming_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.6.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 16m
149 actionable tasks: 86 executed, 61 from cache, 2 up-to-date

Publishing build scan...
Publishing build scan failed due to 'An unexpected error occurred creating your build scan.
Please report this problem at https://ge.apache.org/help, quoting the following reference code:
5xpe2wvs4kso4' (2 retries remaining)...
Publishing build scan failed due to 'An unexpected error occurred creating your build scan.
Please report this problem at https://ge.apache.org/help, quoting the following reference code:
yzazmuabe3yly' (1 retry remaining)...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (2 retries remaining)...
Publishing build scan failed due to network error 'java.net.SocketTimeoutException: Read timed out' (1 retry remaining)...

An unexpected error occurred creating your build scan.
Please report this problem at https://ge.apache.org/help, quoting the following reference code:
zkzgjyeiljha6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

