See
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/6661/display/redirect>
Changes:
------------------------------------------
[...truncated 14.93 MB...]
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1100(StreamingDataflowWorker.java:165)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$7.run(StreamingDataflowWorker.java:1125)
org.apache.beam.runners.dataflow.worker.util.BoundedQueueExecutor.lambda$executeLockHeld$0(BoundedQueueExecutor.java:133)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.beam.sdk.util.UserCodeException:
java.lang.NullPointerException
org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
org.apache.beam.sdk.io.jdbc.JdbcIO$WriteFn$DoFnInvoker.invokeSetup(Unknown Source)
org.apache.beam.sdk.transforms.reflect.DoFnInvokers.tryInvokeSetupFor(DoFnInvokers.java:53)
org.apache.beam.runners.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:86)
org.apache.beam.runners.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:68)
org.apache.beam.runners.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:100)
org.apache.beam.runners.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:75)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory.createParDoOperation(IntrinsicMapTaskExecutorFactory.java:267)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory.access$000(IntrinsicMapTaskExecutorFactory.java:89)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:186)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:168)
org.apache.beam.runners.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:67)
org.apache.beam.runners.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:54)
org.apache.beam.runners.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:91)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory.create(IntrinsicMapTaskExecutorFactory.java:128)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1352)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1100(StreamingDataflowWorker.java:165)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$7.run(StreamingDataflowWorker.java:1125)
org.apache.beam.runners.dataflow.worker.util.BoundedQueueExecutor.lambda$executeLockHeld$0(BoundedQueueExecutor.java:133)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NullPointerException
java.util.Objects.requireNonNull(Objects.java:203)
org.apache.beam.sdk.io.jdbc.JdbcIO$WriteFn.setup(JdbcIO.java:2313)
java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException:
java.lang.NullPointerException
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:197)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:168)
org.apache.beam.run
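
[Editor's note: the root cause above bottoms out in Objects.requireNonNull inside JdbcIO$WriteFn.setup, i.e. a @Setup method rejecting a missing dependency, which the Dataflow worker then wraps in a UserCodeException while deserializing the DoFn. A minimal sketch of that pattern follows; the class and field names are hypothetical stand-ins for whatever configuration ended up null, not Beam's actual WriteFn code.]

    import java.util.Objects;
    import org.apache.beam.sdk.transforms.DoFn;

    // Hypothetical DoFn illustrating the @Setup null-check pattern from the trace.
    public class SketchWriteFn extends DoFn<String, Void> {
      private final String jdbcUrl; // ends up null when the transform is misconfigured

      public SketchWriteFn(String jdbcUrl) {
        this.jdbcUrl = jdbcUrl;
      }

      @Setup
      public void setup() {
        // Throws NullPointerException here; the SDK wraps it in UserCodeException,
        // which is what the worker log above reports.
        Objects.requireNonNull(jdbcUrl, "jdbcUrl must be set");
      }

      @ProcessElement
      public void processElement(ProcessContext c) {
        // Writing via JDBC omitted; the job never gets this far when setup fails.
      }
    }
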
Jul 15, 2022 9:01:02 PM
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
WARNING: No terminal state was returned within allotted timeout. State
value RUNNING
org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithAutosharding FAILED
java.lang.RuntimeException: Write pipeline did not finish
at org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithAutosharding(JdbcIOIT.java:295)
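
[Editor's note: this failure mode is the waitUntilFinish warning above: the call was given a bounded timeout, returned while the job was still RUNNING, and the test treats any non-terminal state as "Write pipeline did not finish". A sketch of that check using the public PipelineResult API; the 15-minute timeout is an assumption, not the test's exact value.]

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    // Sketch: bound waitUntilFinish and fail if the job did not reach a terminal state.
    public class PipelineWaitSketch {
      public static void runAndRequireFinish(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        // Returns the current state (possibly null) if the timeout elapses first.
        PipelineResult.State state = result.waitUntilFinish(Duration.standardMinutes(15));
        if (state == null || !state.isTerminal()) {
          // Matches the log: the job was still RUNNING when the timeout elapsed.
          throw new RuntimeException("Write pipeline did not finish");
        }
      }
    }
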
org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults STANDARD_ERROR
Jul 15, 2022 9:01:03 PM org.apache.beam.runners.dataflow.DataflowRunner
validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option
--workerHarnessContainerImage.
Jul 15, 2022 9:01:03 PM
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory
tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket:
dataflow-staging-us-central1-844138762903
Jul 15, 2022 9:01:03 PM
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
handleResponse
WARNING: Request failed with code 409, performed 0 retries due to
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP
framework says request can be retried, (caller responsible for retrying):
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Jul 15, 2022 9:01:03 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jul 15, 2022 9:01:04 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files
from the classpath: will stage 279 files. Enable logging at DEBUG level to see
which files will be staged.
Jul 15, 2022 9:01:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Jul 15, 2022 9:01:06 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 280 files from PipelineOptions.filesToStage to staging
location to prepare for execution.
Jul 15, 2022 9:01:06 PM
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes
forFileToStage
INFO: Staging custom dataflow-worker.jar as
beam-runners-google-cloud-dataflow-java-legacy-worker-2.41.0-SNAPSHOT-Gl7GE2A8Qr6nYZqDNhrTwne6T583X1IN1hu9vb6ncBI.jar
Jul 15, 2022 9:01:06 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 280 files cached, 0 files newly uploaded in 0
seconds
Jul 15, 2022 9:01:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://dataflow-staging-us-central1-844138762903/temp/staging/
Jul 15, 2022 9:01:06 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <207376 bytes, hash
2a0416ac481354cad2d0072721767ae4146c9f6380667e9beb1e480037547ad2> to
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-KgQWrEgTVMrS0AcnIXZ65BRsn2OAZn6b6x5IADdUetI.pb
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Create.Values/Read(CreateSource) as step s1
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JdbcIO.WriteWithResults/ParDo(Anonymous) as step s2
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JdbcIO.WriteWithResults/ParDo(Write) as step s3
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step
s4
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s5
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as
step s6
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s7
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as
step s8
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s9
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s10
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s11
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s12
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GetPane/Map as step s13
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/RunChecks as step s14
Jul 15, 2022 9:01:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s15
Jul 15, 2022 9:01:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
Jul 15, 2022 9:01:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-15_14_01_09-9716140772824944457?project=apache-beam-testing
Jul 15, 2022 9:01:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-07-15_14_01_09-9716140772824944457
Jul 15, 2022 9:01:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2022-07-15_14_01_09-9716140772824944457
org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults FAILED
java.lang.AssertionError: expected:<5000> but was:<0>
at org.junit.Assert.fail(Assert.java:89)
at org.junit.Assert.failNotEquals(Assert.java:835)
at org.junit.Assert.assertEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:633)
at org.apache.beam.sdk.io.common.DatabaseTestHelper.assertRowCount(DatabaseTestHelper.java:166)
at org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithWriteResults(JdbcIOIT.java:334)
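
[Editor's note: the assertion above (expected 5000, got 0) comes from a row-count check against the target table; the count of 0 means the write pipeline committed nothing before the check ran. Such a helper roughly boils down to a COUNT(*) plus a JUnit assertion, as in this sketch. Plain JDBC, hypothetical signature; not the actual DatabaseTestHelper source.]

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import javax.sql.DataSource;
    import static org.junit.Assert.assertEquals;

    // Sketch of a row-count assertion over JDBC.
    public class RowCountCheckSketch {
      public static void assertRowCount(DataSource dataSource, String table, int expected)
          throws Exception {
        try (Connection conn = dataSource.getConnection();
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
          rs.next();
          // 0 rows with 5000 expected, as in the log, means no data was written.
          assertEquals(expected, rs.getInt(1));
        }
      }
    }
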
org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead STANDARD_ERROR
Jul 15, 2022 9:01:10 PM org.apache.beam.runners.dataflow.DataflowRunner
validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option
--workerHarnessContainerImage.
Jul 15, 2022 9:01:10 PM
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory
tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket:
dataflow-staging-us-central1-844138762903
Jul 15, 2022 9:01:10 PM
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
handleResponse
WARNING: Request failed with code 409, performed 0 retries due to
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP
framework says request can be retried, (caller responsible for retrying):
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Jul 15, 2022 9:01:10 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jul 15, 2022 9:01:10 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files
from the classpath: will stage 279 files. Enable logging at DEBUG level to see
which files will be staged.
Jul 15, 2022 9:01:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Jul 15, 2022 9:01:13 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 280 files from PipelineOptions.filesToStage to staging
location to prepare for execution.
Jul 15, 2022 9:01:13 PM
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes
forFileToStage
INFO: Staging custom dataflow-worker.jar as
beam-runners-google-cloud-dataflow-java-legacy-worker-2.41.0-SNAPSHOT-Gl7GE2A8Qr6nYZqDNhrTwne6T583X1IN1hu9vb6ncBI.jar
Jul 15, 2022 9:01:13 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 280 files cached, 0 files newly uploaded in 0
seconds
Jul 15, 2022 9:01:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://dataflow-staging-us-central1-844138762903/temp/staging/
Jul 15, 2022 9:01:13 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <141338 bytes, hash
ef0a5c86be77abd3d3db2fea5a12df7fb9a1ab48fa6a724900ab1dd3bc565d6f> to
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-7wpchr53q9PT2y_qWhLff7mhq0j6anJJAKsd07xWXW8.pb
Jul 15, 2022 9:01:16 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Create.Values/Read(CreateSource) as step s1
Jul 15, 2022 9:01:16 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(DeterministicallyConstructTestRow) as step s2
Jul 15, 2022 9:01:16 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Jul 15, 2022 9:01:16 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JdbcIO.Write/ParDo(Anonymous) as step s4
Jul 15, 2022 9:01:16 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JdbcIO.Write/ParDo(Write) as step s5
Jul 15, 2022 9:01:16 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
Jul 15, 2022 9:01:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-15_14_01_16-3111072262956439582?project=apache-beam-testing
Jul 15, 2022 9:01:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-07-15_14_01_16-3111072262956439582
Jul 15, 2022 9:01:17 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2022-07-15_14_01_16-3111072262956439582
Jul 15, 2022 9:01:32 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:31.148Z: Worker configuration: e2-standard-2 in
us-central1-b.
Jul 15, 2022 9:01:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:32.081Z: Expanding CoGroupByKey operations into
optimizable parts.
Jul 15, 2022 9:01:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:32.142Z: Expanding GroupByKey operations into
optimizable parts.
Jul 15, 2022 9:01:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:32.170Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
Jul 15, 2022 9:01:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:32.241Z: Fusing adjacent ParDo, Read, Write, and
Flatten operations
Jul 15, 2022 9:01:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:32.276Z: Fusing consumer
ParDo(DeterministicallyConstructTestRow) into Create.Values/Read(CreateSource)
Jul 15, 2022 9:01:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:32.335Z: Fusing consumer ParDo(TimeMonitor) into
ParDo(DeterministicallyConstructTestRow)
Jul 15, 2022 9:01:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:32.406Z: Fusing consumer
JdbcIO.Write/ParDo(Anonymous) into ParDo(TimeMonitor)
Jul 15, 2022 9:01:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:32.441Z: Fusing consumer JdbcIO.Write/ParDo(Write)
into JdbcIO.Write/ParDo(Anonymous)
Jul 15, 2022 9:01:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:32.781Z: Executing operation
Create.Values/Read(CreateSource)+ParDo(DeterministicallyConstructTestRow)+ParDo(TimeMonitor)+JdbcIO.Write/ParDo(Anonymous)+JdbcIO.Write/ParDo(Write)
Jul 15, 2022 9:01:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:32.858Z: Starting 5 workers in us-central1-b...
Jul 15, 2022 9:01:38 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:01:35.855Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jul 15, 2022 9:02:10 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:02:08.734Z: Autoscaling: Raised the number of workers to
4 based on the rate of progress in the currently running stage(s).
Jul 15, 2022 9:02:10 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:02:08.786Z: Resized worker pool to 4, though goal was 5.
This could be a quota issue.
Jul 15, 2022 9:02:19 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:02:19.019Z: Autoscaling: Raised the number of workers to
5 based on the rate of progress in the currently running stage(s).
Jul 15, 2022 9:02:36 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:02:36.769Z: Workers have started successfully.
Jul 15, 2022 9:02:58 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:02:57.470Z: Finished operation
Create.Values/Read(CreateSource)+ParDo(DeterministicallyConstructTestRow)+ParDo(TimeMonitor)+JdbcIO.Write/ParDo(Anonymous)+JdbcIO.Write/ParDo(Write)
Jul 15, 2022 9:02:58 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:02:57.605Z: Cleaning up.
Jul 15, 2022 9:02:58 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:02:57.682Z: Stopping worker pool...
Jul 15, 2022 9:03:38 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:03:36.701Z: Autoscaling: Resized worker pool from 5 to 0.
Jul 15, 2022 9:03:38 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T21:03:36.749Z: Worker pool stopped.
Jul 15, 2022 9:03:43 PM
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-07-15_14_01_16-3111072262956439582 finished with status DONE.
Jul 15, 2022 9:03:43 PM org.apache.beam.sdk.io.jdbc.JdbcIO$ReadAll
inferCoder
WARNING: Unable to infer a schema for type
org.apache.beam.sdk.io.common.TestRow. Attempting to infer a coder without a
schema.
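
[Editor's note: the schema-inference warning above means ReadAll fell back to inferring a plain coder for TestRow. If one wanted to avoid that fallback, registering an explicit coder for the element type is the usual route. A sketch using the public CoderRegistry API; MyRow is a hypothetical stand-in for a row class such as TestRow.]

    import java.io.Serializable;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.SerializableCoder;

    public class CoderRegistrationSketch {
      // Hypothetical element type standing in for a row class such as TestRow.
      public static class MyRow implements Serializable {
        public long id;
        public String name;
      }

      // Registering an explicit coder avoids the "infer a coder without a schema" fallback.
      public static void register(Pipeline pipeline) {
        pipeline.getCoderRegistry()
            .registerCoderForClass(MyRow.class, SerializableCoder.of(MyRow.class));
      }
    }
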
org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead FAILED
java.lang.UnsupportedOperationException: No hash for that record count: 5000
at org.apache.beam.sdk.io.common.IOITHelper.getHashForRecordCount(IOITHelper.java:40)
at org.apache.beam.sdk.io.common.TestRow.getExpectedHashForRowCount(TestRow.java:104)
at org.apache.beam.sdk.io.jdbc.JdbcIOIT.runRead(JdbcIOIT.java:254)
at org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteThenRead(JdbcIOIT.java:140)
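
[Editor's note: the UnsupportedOperationException above is the expected-hash lookup refusing an unknown record count: the helper apparently keeps a table of precomputed hashes keyed by row count, and 5000 has no entry. The mechanism, sketched with a hypothetical map argument; the real supported counts and hash values live in the test utilities.]

    import java.util.Map;

    // Sketch of a record-count -> expected-hash lookup that rejects unknown counts.
    public class HashLookupSketch {
      public static String getHashForRecordCount(int recordCount, Map<Integer, String> hashes) {
        String hash = hashes.get(recordCount);
        if (hash == null) {
          // Produces the exact failure seen in the log for recordCount = 5000.
          throw new UnsupportedOperationException(
              "No hash for that record count: " + recordCount);
        }
        return hash;
      }
    }
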
Gradle Test Executor 2 finished executing tests.
> Task :sdks:java:io:jdbc:integrationTest FAILED
3 tests completed, 3 failed
Finished generating test XML results (0.299 secs) into:
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.458 secs) into:
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest>
:sdks:java:io:jdbc:integrationTest (Thread[Execution worker for ':',5,main])
completed. Took 9 mins 58.216 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:jdbc:integrationTest'.
> There were failing tests. See the report at:
> <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest/index.html>
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 10m 41s
135 actionable tasks: 80 executed, 53 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/prgjslia77s3i
Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure