See
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/6660/display/redirect>
Changes:
------------------------------------------
[...truncated 12.59 MB...]
org.apache.beam.sdk.io.jdbc.JdbcIO$WriteFn.processElement(JdbcIO.java:2334)
java.lang.NullPointerException
org.apache.beam.sdk.io.jdbc.JdbcIO$WriteFn.getConnection(JdbcIO.java:2325)
org.apache.beam.sdk.io.jdbc.JdbcIO$WriteFn.executeBatch(JdbcIO.java:2375)
org.apache.beam.sdk.io.jdbc.JdbcIO$WriteFn.processElement(JdbcIO.java:2334)
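The NullPointerException from JdbcIO$WriteFn.getConnection above (repeated many
times in the truncated log) indicates the write transform could not obtain a JDBC
connection on the workers. For reference, a minimal sketch of a JdbcIO.write()
setup is shown below; the driver class, JDBC URL, credentials, table, and element
type are placeholders, not the configuration this test actually uses.

    import java.util.Arrays;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.jdbc.JdbcIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.KV;

    public class JdbcWriteSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(Create.of(Arrays.asList(KV.of(1, "a"), KV.of(2, "b"))))
            .apply(
                JdbcIO.<KV<Integer, String>>write()
                    // A misconfigured or unreachable data source here is a common
                    // cause of a null connection inside WriteFn at runtime.
                    .withDataSourceConfiguration(
                        JdbcIO.DataSourceConfiguration.create(
                                "org.postgresql.Driver",                // placeholder driver
                                "jdbc:postgresql://localhost:5432/db")  // placeholder URL
                            .withUsername("user")        // placeholder
                            .withPassword("password"))   // placeholder
                    .withStatement("insert into test_table values(?, ?)")
                    .withPreparedStatementSetter(
                        (element, statement) -> {
                          statement.setInt(1, element.getKey());
                          statement.setString(2, element.getValue());
                        }));
        p.run().waitUntilFinish();
      }
    }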
Jul 15, 2022 8:35:17 PM
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
WARNING: No terminal state was returned within allotted timeout. State
value RUNNING
org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithAutosharding FAILED
java.lang.RuntimeException: Write pipeline did not finish
at
org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithAutosharding(JdbcIOIT.java:295)
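The failure above follows from the preceding warning: waitUntilFinish timed out
while the job was still RUNNING. As an assumed illustration (not the test's
actual code), the pattern typically looks like the sketch below, where any state
other than DONE after the timeout becomes a RuntimeException; the 30-minute
timeout is a placeholder.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    class PipelineWaitSketch {
      static void runAndCheckDone(Pipeline pipeline) {
        PipelineResult result = pipeline.run();
        // Blocks for up to the given duration; on timeout the returned state is
        // not terminal (e.g. RUNNING) or null, depending on the runner.
        PipelineResult.State state = result.waitUntilFinish(Duration.standardMinutes(30));
        if (state != PipelineResult.State.DONE) {
          throw new RuntimeException("Write pipeline did not finish");
        }
      }
    }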
org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults STANDARD_ERROR
Jul 15, 2022 8:35:18 PM org.apache.beam.runners.dataflow.DataflowRunner
validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option
--workerHarnessContainerImage.
Jul 15, 2022 8:35:18 PM
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory
tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket:
dataflow-staging-us-central1-844138762903
Jul 15, 2022 8:35:18 PM
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
handleResponse
WARNING: Request failed with code 409, performed 0 retries due to
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP
framework says request can be retried, (caller responsible for retrying):
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Jul 15, 2022 8:35:18 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jul 15, 2022 8:35:19 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files
from the classpath: will stage 279 files. Enable logging at DEBUG level to see
which files will be staged.
Jul 15, 2022 8:35:19 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Jul 15, 2022 8:35:21 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 280 files from PipelineOptions.filesToStage to staging
location to prepare for execution.
Jul 15, 2022 8:35:21 PM
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes
forFileToStage
INFO: Staging custom dataflow-worker.jar as
beam-runners-google-cloud-dataflow-java-legacy-worker-2.41.0-SNAPSHOT-Gl7GE2A8Qr6nYZqDNhrTwne6T583X1IN1hu9vb6ncBI.jar
Jul 15, 2022 8:35:21 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 280 files cached, 0 files newly uploaded in 0
seconds
Jul 15, 2022 8:35:21 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://dataflow-staging-us-central1-844138762903/temp/staging/
Jul 15, 2022 8:35:21 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <207382 bytes, hash
bdf33ec7b81736aa2951eae329ea96085eacc4b3373e11db96bbb1289c33a2d8> to
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-vfM-x7gXNqopUerjKeqWCF6sxLM3PhHblruxKJwzotg.pb
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Create.Values/Read(CreateSource) as step s1
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JdbcIO.WriteWithResults/ParDo(Anonymous) as step s2
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JdbcIO.WriteWithResults/ParDo(Write) as step s3
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step
s4
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s5
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as
step s6
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s7
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as
step s8
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s9
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s10
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s11
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s12
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GetPane/Map as step s13
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/RunChecks as step s14
Jul 15, 2022 8:35:24 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s15
Jul 15, 2022 8:35:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
Jul 15, 2022 8:35:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-15_13_35_24-8480498522139083676?project=apache-beam-testing
Jul 15, 2022 8:35:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-07-15_13_35_24-8480498522139083676
Jul 15, 2022 8:35:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2022-07-15_13_35_24-8480498522139083676
org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults FAILED
java.lang.AssertionError: expected:<5000> but was:<0>
at org.junit.Assert.fail(Assert.java:89)
at org.junit.Assert.failNotEquals(Assert.java:835)
at org.junit.Assert.assertEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:633)
at
org.apache.beam.sdk.io.common.DatabaseTestHelper.assertRowCount(DatabaseTestHelper.java:166)
at
org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithWriteResults(JdbcIOIT.java:334)
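The assertion above indicates the destination table contained 0 rows where 5000
were expected. A hedged sketch of this kind of row-count check over plain JDBC
(not DatabaseTestHelper's actual code; the DataSource and table name are supplied
by the caller):

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import javax.sql.DataSource;

    class RowCountCheckSketch {
      static void assertRowCount(DataSource dataSource, String tableName, int expected)
          throws SQLException {
        try (Connection connection = dataSource.getConnection();
            Statement statement = connection.createStatement();
            ResultSet resultSet =
                statement.executeQuery("select count(*) from " + tableName)) {
          resultSet.next();
          int actual = resultSet.getInt(1);
          if (actual != expected) {
            // Mirrors the JUnit message format seen in the failure above.
            throw new AssertionError("expected:<" + expected + "> but was:<" + actual + ">");
          }
        }
      }
    }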
org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead STANDARD_ERROR
Jul 15, 2022 8:35:24 PM org.apache.beam.runners.dataflow.DataflowRunner
validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option
--workerHarnessContainerImage.
Jul 15, 2022 8:35:25 PM
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory
tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket:
dataflow-staging-us-central1-844138762903
Jul 15, 2022 8:35:25 PM
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
handleResponse
WARNING: Request failed with code 409, performed 0 retries due to
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP
framework says request can be retried, (caller responsible for retrying):
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Jul 15, 2022 8:35:25 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jul 15, 2022 8:35:25 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files
from the classpath: will stage 279 files. Enable logging at DEBUG level to see
which files will be staged.
Jul 15, 2022 8:35:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Jul 15, 2022 8:35:27 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 280 files from PipelineOptions.filesToStage to staging
location to prepare for execution.
Jul 15, 2022 8:35:27 PM
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes
forFileToStage
INFO: Staging custom dataflow-worker.jar as
beam-runners-google-cloud-dataflow-java-legacy-worker-2.41.0-SNAPSHOT-Gl7GE2A8Qr6nYZqDNhrTwne6T583X1IN1hu9vb6ncBI.jar
Jul 15, 2022 8:35:28 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 280 files cached, 0 files newly uploaded in 0
seconds
Jul 15, 2022 8:35:28 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://dataflow-staging-us-central1-844138762903/temp/staging/
Jul 15, 2022 8:35:28 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <141342 bytes, hash
5c797c85962115dc4c8ea102e00e45d7bb78c6418d013abad8918e85c388ac6b> to
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-XHl8hZYhFdxMjqEC4A5F17t4xkGNATq62JGOhcOIrGs.pb
Jul 15, 2022 8:35:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Create.Values/Read(CreateSource) as step s1
Jul 15, 2022 8:35:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(DeterministicallyConstructTestRow) as step s2
Jul 15, 2022 8:35:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Jul 15, 2022 8:35:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JdbcIO.Write/ParDo(Anonymous) as step s4
Jul 15, 2022 8:35:30 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding JdbcIO.Write/ParDo(Write) as step s5
Jul 15, 2022 8:35:30 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
Jul 15, 2022 8:35:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-15_13_35_30-1344447587550324555?project=apache-beam-testing
Jul 15, 2022 8:35:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-07-15_13_35_30-1344447587550324555
Jul 15, 2022 8:35:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2022-07-15_13_35_30-1344447587550324555
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:46.506Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:46.592Z: Worker configuration: e2-standard-2 in
us-central1-b.
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:47.379Z: Expanding CoGroupByKey operations into
optimizable parts.
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:47.413Z: Expanding GroupByKey operations into
optimizable parts.
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:47.442Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:47.500Z: Fusing adjacent ParDo, Read, Write, and
Flatten operations
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:47.526Z: Fusing consumer
ParDo(DeterministicallyConstructTestRow) into Create.Values/Read(CreateSource)
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:47.557Z: Fusing consumer ParDo(TimeMonitor) into
ParDo(DeterministicallyConstructTestRow)
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:47.579Z: Fusing consumer
JdbcIO.Write/ParDo(Anonymous) into ParDo(TimeMonitor)
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:47.612Z: Fusing consumer JdbcIO.Write/ParDo(Write)
into JdbcIO.Write/ParDo(Anonymous)
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:47.950Z: Executing operation
Create.Values/Read(CreateSource)+ParDo(DeterministicallyConstructTestRow)+ParDo(TimeMonitor)+JdbcIO.Write/ParDo(Anonymous)+JdbcIO.Write/ParDo(Write)
Jul 15, 2022 8:35:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:35:48.071Z: Starting 5 workers in us-central1-b...
Jul 15, 2022 8:36:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:36:25.415Z: Autoscaling: Raised the number of workers to
5 based on the rate of progress in the currently running stage(s).
Jul 15, 2022 8:36:55 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:36:53.565Z: Workers have started successfully.
Jul 15, 2022 8:37:16 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:37:13.374Z: Finished operation
Create.Values/Read(CreateSource)+ParDo(DeterministicallyConstructTestRow)+ParDo(TimeMonitor)+JdbcIO.Write/ParDo(Anonymous)+JdbcIO.Write/ParDo(Write)
Jul 15, 2022 8:37:16 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:37:13.505Z: Cleaning up.
Jul 15, 2022 8:37:16 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:37:13.567Z: Stopping worker pool...
Jul 15, 2022 8:37:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:37:47.865Z: Autoscaling: Resized worker pool from 5 to 0.
Jul 15, 2022 8:37:48 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-07-15T20:37:47.908Z: Worker pool stopped.
Jul 15, 2022 8:37:54 PM
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-07-15_13_35_30-1344447587550324555 finished with status DONE.
Jul 15, 2022 8:37:54 PM org.apache.beam.sdk.io.jdbc.JdbcIO$ReadAll
inferCoder
WARNING: Unable to infer a schema for type
org.apache.beam.sdk.io.common.TestRow. Attempting to infer a coder without a
schema.
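The schema-inference warning above can be avoided by giving the JdbcIO read an
explicit coder. A minimal sketch, assuming a simple Serializable row type and
placeholder data source, query, and row mapper (not the IT's actual TestRow
handling):

    import java.io.Serializable;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.SerializableCoder;
    import org.apache.beam.sdk.io.jdbc.JdbcIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.values.PCollection;

    public class JdbcReadSketch {
      // Placeholder row type; Serializable so SerializableCoder can encode it.
      static class Row implements Serializable {
        final int id;
        final String name;
        Row(int id, String name) {
          this.id = id;
          this.name = name;
        }
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        PCollection<Row> rows =
            p.apply(
                JdbcIO.<Row>read()
                    .withDataSourceConfiguration(
                        JdbcIO.DataSourceConfiguration.create(
                            "org.postgresql.Driver",                 // placeholder driver
                            "jdbc:postgresql://localhost:5432/db"))  // placeholder URL
                    .withQuery("select id, name from test_table")
                    .withRowMapper(rs -> new Row(rs.getInt("id"), rs.getString("name")))
                    // Explicit coder, so no schema or coder inference is needed.
                    .withCoder(SerializableCoder.of(Row.class)));
        p.run().waitUntilFinish();
      }
    }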
org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead FAILED
java.lang.UnsupportedOperationException: No hash for that record count: 5000
at
org.apache.beam.sdk.io.common.IOITHelper.getHashForRecordCount(IOITHelper.java:40)
at
org.apache.beam.sdk.io.common.TestRow.getExpectedHashForRowCount(TestRow.java:104)
at org.apache.beam.sdk.io.jdbc.JdbcIOIT.runRead(JdbcIOIT.java:254)
at
org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteThenRead(JdbcIOIT.java:140)
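The UnsupportedOperationException above comes from a lookup of an expected
checksum keyed by record count, where 5000 has no entry. An illustrative sketch
of that shape (the map contents are supplied by the caller and are not Beam's
actual expected hashes):

    import java.util.Map;

    class ExpectedHashLookupSketch {
      static String getHashForRecordCount(int recordCount, Map<Integer, String> expectedHashes) {
        String hash = expectedHashes.get(recordCount);
        if (hash == null) {
          // Matches the message seen in the failure above.
          throw new UnsupportedOperationException("No hash for that record count: " + recordCount);
        }
        return hash;
      }
    }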
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:jdbc:integrationTest FAILED
3 tests completed, 3 failed
Finished generating test XML results (0.36 secs) into:
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.273 secs) into:
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest>
:sdks:java:io:jdbc:integrationTest (Thread[Execution worker for ':' Thread
4,5,main]) completed. Took 9 mins 53.153 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:jdbc:integrationTest'.
> There were failing tests. See the report at:
> <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest/index.html>
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 10m 21s
135 actionable tasks: 79 executed, 54 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/6nor55c36mwv2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure