See <https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Streaming_Java/1028/display/redirect>
Changes:
------------------------------------------
[...truncated 319.78 KB...]
Sep 19, 2023 5:20:44 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:43.386Z: All workers have finished the startup processes and began to receive work requests.
Sep 19, 2023 5:20:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:48.182Z: Finished operation Read from BQ/TriggerIdCreation/Impulse+Read from BQ/TriggerIdCreation/MapElements/Map+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
Sep 19, 2023 5:20:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:48.247Z: Executing operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
Sep 19, 2023 5:20:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:49.283Z: Finished operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
Sep 19, 2023 5:20:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:49.370Z: Executing operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
Sep 19, 2023 5:20:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:49.543Z: Finished operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
Sep 19, 2023 5:20:52 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:49.688Z: Executing operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
Sep 19, 2023 5:20:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:53.558Z: Finished operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+Read from BQ/ViewId/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
Sep 19, 2023 5:20:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:53.610Z: Executing operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
Sep 19, 2023 5:20:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:53.658Z: Finished operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
Sep 19, 2023 5:20:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:53.713Z: Executing operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+Read from BQ/ViewId/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
Sep 19, 2023 5:20:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:56.748Z: Finished operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+Read from BQ/ViewId/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
Sep 19, 2023 5:20:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:56.857Z: Executing operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/CreateDataflowView
Sep 19, 2023 5:20:59 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:20:56.928Z: Finished operation Read from BQ/ViewId/Combine.GloballyAsSingletonView/CreateDataflowView
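The "Combine.GloballyAsSingletonView/..." steps above are the expansion of a singleton side-input view: the test materializes its ViewId value with Combine.globally(...).asSingletonView(), which the Dataflow batch runner rewrites into the WithKeys/AddKeys, Combine.perKey(Singleton), BatchViewOverrides and CreateDataflowView operations listed here. A minimal sketch of that pattern follows; the Create element and the Sum combine fn are illustrative stand-ins, not the perf test's actual code:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Combine;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.Sum;
    import org.apache.beam.sdk.values.PCollectionView;

    public class SingletonViewSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Combine.globally(...).asSingletonView() is the construct that expands into
        // the "Combine.GloballyAsSingletonView/..." steps seen in the log above.
        PCollectionView<Integer> viewId =
            p.apply("TriggerIdCreation", Create.of(1))
             .apply("ViewId", Combine.globally(Sum.ofIntegers()).asSingletonView());

        p.run().waitUntilFinish();
      }
    }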
Sep 19, 2023 5:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-09-19T17:21:26.998Z: java.lang.IllegalArgumentException: unable to serialize gs://temp-storage-for-perf-tests/loadtests/BigQueryExtractTemp/fcc5b6087f144f6d9dc06ef6bad2a9e9/000000000000.avro range [0, 20033550)
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:59)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.serializeSplitToCloudSource(WorkerCustomSources.java:151)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.wrapIntoSourceSplitResponse(WorkerCustomSources.java:339)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:222)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:201)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:180)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:82)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:319)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:291)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:221)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:147)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:127)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.NotSerializableException: com.google.api.services.bigquery.model.TableSchema
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
    at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:55)
    ... 17 more
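The "Caused by" line is the root cause: com.google.api.services.bigquery.model.TableSchema does not implement java.io.Serializable, so Java serialization of the split source fails as soon as the serialized object graph holds a TableSchema directly. Below is a minimal, self-contained sketch of the failure mode and of the usual workaround of carrying the schema as a JSON string instead; the holder classes are hypothetical, and BigQueryHelpers.toJsonString/fromJsonString are the Beam helpers I would expect to use for the round trip (any JSON serialization would do):

    import java.io.Serializable;

    import com.google.api.services.bigquery.model.TableSchema;
    import org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers;
    import org.apache.beam.sdk.util.SerializableUtils;

    public class TableSchemaSerializationSketch {

      // Hypothetical holder mirroring the failure: a Serializable class that keeps
      // the TableSchema object in a field cannot be Java-serialized, because
      // TableSchema itself is not java.io.Serializable.
      static class BrokenHolder implements Serializable {
        final TableSchema schema;
        BrokenHolder(TableSchema schema) {
          this.schema = schema;
        }
      }

      // Common workaround: store the schema as a JSON string and re-parse it lazily.
      static class JsonSchemaHolder implements Serializable {
        final String schemaJson;
        JsonSchemaHolder(TableSchema schema) {
          this.schemaJson = BigQueryHelpers.toJsonString(schema);
        }
        TableSchema schema() {
          return BigQueryHelpers.fromJsonString(schemaJson, TableSchema.class);
        }
      }

      public static void main(String[] args) {
        TableSchema schema = new TableSchema();

        try {
          // Throws IllegalArgumentException("unable to serialize ...") wrapping
          // java.io.NotSerializableException, just like the worker log above.
          SerializableUtils.serializeToByteArray(new BrokenHolder(schema));
        } catch (IllegalArgumentException expected) {
          System.out.println("Failed as expected: " + expected.getMessage());
        }

        // Succeeds: only the JSON string crosses the serialization boundary.
        byte[] ok = SerializableUtils.serializeToByteArray(new JsonSchemaHolder(schema));
        System.out.println("Serialized JSON-backed holder: " + ok.length + " bytes");
      }
    }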
Sep 19, 2023 5:21:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-09-19T17:21:27.650Z: java.lang.IllegalArgumentException: unable to serialize gs://temp-storage-for-perf-tests/loadtests/BigQueryExtractTemp/fcc5b6087f144f6d9dc06ef6bad2a9e9/000000000000.avro range [0, 20033550)
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:59)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.serializeSplitToCloudSource(WorkerCustomSources.java:151)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.wrapIntoSourceSplitResponse(WorkerCustomSources.java:339)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:222)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:201)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:180)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:82)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:319)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:291)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:221)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:147)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:127)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.NotSerializableException: com.google.api.services.bigquery.model.TableSchema
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
    at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:55)
    ... 17 more
Sep 19, 2023 5:21:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-09-19T17:21:30.727Z: java.lang.IllegalArgumentException: unable to serialize gs://temp-storage-for-perf-tests/loadtests/BigQueryExtractTemp/fcc5b6087f144f6d9dc06ef6bad2a9e9/000000000000.avro range [0, 20033550)
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:59)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.serializeSplitToCloudSource(WorkerCustomSources.java:151)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.wrapIntoSourceSplitResponse(WorkerCustomSources.java:339)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:222)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:201)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:180)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:82)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:319)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:291)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:221)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:147)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:127)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.NotSerializableException: com.google.api.services.bigquery.model.TableSchema
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
    at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:55)
    ... 17 more
Sep 19, 2023 5:21:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-09-19T17:21:31.462Z: java.lang.IllegalArgumentException: unable to serialize gs://temp-storage-for-perf-tests/loadtests/BigQueryExtractTemp/fcc5b6087f144f6d9dc06ef6bad2a9e9/000000000000.avro range [0, 20033550)
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:59)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.serializeSplitToCloudSource(WorkerCustomSources.java:151)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.wrapIntoSourceSplitResponse(WorkerCustomSources.java:339)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:222)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:201)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:180)
    at org.apache.beam.runners.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:82)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:319)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:291)
    at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:221)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:147)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:127)
    at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.NotSerializableException: com.google.api.services.bigquery.model.TableSchema
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
    at java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
    at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:55)
    ... 17 more
Sep 19, 2023 5:21:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:21:31.798Z: Finished operation Read from BQ/Read(BigQueryTableSource)+Read from BQ/PassThroughThenCleanup/ParMultiDo(Identity)+Gather time+Read from BQ/PassThroughThenCleanup/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+Counting element
Sep 19, 2023 5:21:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2023-09-19T17:21:31.881Z: Workflow failed. Causes: S09:Read from BQ/Read(BigQueryTableSource)+Read from BQ/PassThroughThenCleanup/ParMultiDo(Identity)+Gather time+Read from BQ/PassThroughThenCleanup/View.AsIterable/ParDo(ToIsmRecordForGlobalWindow)+Counting element failed.
Sep 19, 2023 5:21:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:21:31.950Z: Cleaning up.
Sep 19, 2023 5:21:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:21:32.150Z: Stopping worker pool...
Sep 19, 2023 5:23:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:23:55.628Z: Autoscaling: Resized worker pool from 5 to 0.
Sep 19, 2023 5:23:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-09-19T17:23:55.692Z: Worker pool stopped.
Sep 19, 2023 5:24:31 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2023-09-19_10_18_46-3084518602204166352 failed with status FAILED.
Sep 19, 2023 5:24:31 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric read_count, from namespace org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT
org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT > testWriteThenRead FAILED
    java.lang.AssertionError: expected:<10485760> but was:<-1>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testRead(BigQueryIOIT.java:222)
        at org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT.testWriteThenRead(BigQueryIOIT.java:145)
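The assertion failure follows directly from the metric lookup above: because the read pipeline failed, the read_count counter in namespace org.apache.beam.sdk.bigqueryioperftests.BigQueryIOIT was never reported, the test's metric reader falls back to -1, and assertEquals(10485760, -1) fails. A rough sketch of the counter-and-query pattern with the standard Beam metrics API follows; the names and the -1 fallback are illustrative of the perf-test convention, not the actual BigQueryIOIT code:

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.metrics.MetricsFilter;
    import org.apache.beam.sdk.transforms.DoFn;

    public class ReadCountMetricSketch {

      // Hypothetical pass-through DoFn that bumps a "read_count" counter per element,
      // roughly what the "Counting element" step in the failed stage would do.
      static class CountingFn<T> extends DoFn<T, T> {
        private final Counter readCount = Metrics.counter("example.PerfTest", "read_count");

        @ProcessElement
        public void processElement(ProcessContext c) {
          readCount.inc();
          c.output(c.element());
        }
      }

      // Queries the attempted value of a counter after the pipeline finishes and
      // returns -1 if it was never reported (e.g. the stage failed before running),
      // which is the value that then trips the assertEquals above.
      static long getCounterMetric(PipelineResult result, String namespace, String name) {
        MetricQueryResults metrics =
            result.metrics().queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named(namespace, name))
                    .build());
        for (MetricResult<Long> counter : metrics.getCounters()) {
          return counter.getAttempted();
        }
        return -1;
      }
    }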
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:bigquery-io-perf-tests:integrationTest FAILED
1 test completed, 1 failed
Finished generating test XML results (0.025 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Streaming_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.031 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Streaming_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest>
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:bigquery-io-perf-tests:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Streaming_Java/ws/src/sdks/java/io/bigquery-io-perf-tests/build/reports/tests/integrationTest/index.html>
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.6.2/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 16m 6s
149 actionable tasks: 86 executed, 61 from cache, 2 up-to-date
Publishing build scan...
https://ge.apache.org/s/ncicy7utivha4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure