See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/6658/display/redirect>

Changes:


------------------------------------------
[...truncated 13.09 MB...]
    java.lang.NullPointerException
            
org.apache.beam.sdk.io.jdbc.JdbcIO$WriteFn.getConnection(JdbcIO.java:2325)
            
org.apache.beam.sdk.io.jdbc.JdbcIO$WriteFn.executeBatch(JdbcIO.java:2375)
            
org.apache.beam.sdk.io.jdbc.JdbcIO$WriteFn.processElement(JdbcIO.java:2334)
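
This NullPointerException in JdbcIO$WriteFn.getConnection usually means the write 
step never obtained a usable connection from its DataSource on the workers. For 
orientation, a minimal JdbcIO.write() setup looks roughly like the sketch below; 
the driver, URL, credentials, table and SQL are placeholders, not the values used 
by JdbcIOIT.

    // Hypothetical sketch, not the JdbcIOIT code: a minimal JdbcIO.write() setup.
    // All connection details and SQL below are placeholders.
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.jdbc.JdbcIO;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.KV;

    public class JdbcWriteSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.apply(Create.of(KV.of(1, "a"), KV.of(2, "b")))
            .apply(
                JdbcIO.<KV<Integer, String>>write()
                    .withDataSourceConfiguration(
                        JdbcIO.DataSourceConfiguration.create(
                                "org.postgresql.Driver",
                                "jdbc:postgresql://example-host:5432/testdb")
                            .withUsername("user")
                            .withPassword("password"))
                    .withStatement("INSERT INTO example_table (id, name) VALUES (?, ?)")
                    // A NullPointerException in WriteFn.getConnection typically means the
                    // DataSource built from this configuration never yielded a connection.
                    .withPreparedStatementSetter(
                        (element, statement) -> {
                          statement.setInt(1, element.getKey());
                          statement.setString(2, element.getValue());
                        }));
        p.run().waitUntilFinish();
      }
    }
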
    Jul 15, 2022 7:37:26 PM 
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    WARNING: No terminal state was returned within allotted timeout. State 
value RUNNING

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithAutosharding FAILED
    java.lang.RuntimeException: Write pipeline did not finish
        at 
org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithAutosharding(JdbcIOIT.java:295)
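
The RuntimeException follows directly from the WARNING above: the timed 
waitUntilFinish call returned while the Dataflow job was still RUNNING, and the 
test treats any non-terminal state as a failure. A hypothetical sketch of that 
guard pattern (not the actual JdbcIOIT code):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    class PipelineCompletionGuard {
      static void runAndRequireCompletion(Pipeline pipeline, Duration timeout) {
        PipelineResult result = pipeline.run();
        PipelineResult.State state = result.waitUntilFinish(timeout);
        // On Dataflow, a timed wait can return before the job reaches a terminal
        // state; a null or non-terminal result means the write did not complete.
        if (state == null || !state.isTerminal()) {
          throw new RuntimeException("Write pipeline did not finish");
        }
      }
    }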

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults STANDARD_ERROR
    Jul 15, 2022 7:37:26 PM org.apache.beam.runners.dataflow.DataflowRunner 
validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option 
--workerHarnessContainerImage.
    Jul 15, 2022 7:37:26 PM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Jul 15, 2022 7:37:27 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Jul 15, 2022 7:37:27 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Jul 15, 2022 7:37:27 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 279 files. Enable logging at DEBUG level to see 
which files will be staged.
    Jul 15, 2022 7:37:27 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Jul 15, 2022 7:37:30 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 280 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Jul 15, 2022 7:37:30 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-worker.jar as 
beam-runners-google-cloud-dataflow-java-legacy-worker-2.41.0-SNAPSHOT-Gl7GE2A8Qr6nYZqDNhrTwne6T583X1IN1hu9vb6ncBI.jar
    Jul 15, 2022 7:37:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 280 files cached, 0 files newly uploaded in 0 
seconds
    Jul 15, 2022 7:37:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Jul 15, 2022 7:37:31 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <207370 bytes, hash 
436293514f93095a2dee1c0dc9c737a1035b182a5a833a4a565c764ca9c6455d> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-Q2KTUU-TCVot7hwNycc3oQNbGCpagzpKVlx2TKnGRV0.pb
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Create.Values/Read(CreateSource) as step s1
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.WriteWithResults/ParDo(Anonymous) as step s2
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.WriteWithResults/ParDo(Write) as step s3
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step 
s4
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s5
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as 
step s6
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s7
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as 
step s8
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s9
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s10
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s11
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s12
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s13
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s14
    Jul 15, 2022 7:37:34 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s15
    Jul 15, 2022 7:37:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
    Jul 15, 2022 7:37:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-15_12_37_34-10542080575964940150?project=apache-beam-testing
    Jul 15, 2022 7:37:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2022-07-15_12_37_34-10542080575964940150
    Jul 15, 2022 7:37:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2022-07-15_12_37_34-10542080575964940150

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults FAILED
    java.lang.AssertionError: expected:<5000> but was:<0>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at 
org.apache.beam.sdk.io.common.DatabaseTestHelper.assertRowCount(DatabaseTestHelper.java:166)
        at 
org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithWriteResults(JdbcIOIT.java:334)
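
Here the job was submitted, but the check in DatabaseTestHelper.assertRowCount 
found 0 rows where 5000 were expected, i.e. nothing reached the test table. As a 
rough illustration of that kind of check (a hypothetical sketch, not the actual 
DatabaseTestHelper code), the assertion boils down to a COUNT(*) query:

    import static org.junit.Assert.assertEquals;

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import javax.sql.DataSource;

    class RowCountCheck {
      static void assertRowCount(DataSource dataSource, String table, int expected)
          throws Exception {
        try (Connection conn = dataSource.getConnection();
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
          rs.next();
          // A count of 0 means the JdbcIO write above never committed any rows.
          assertEquals(expected, rs.getInt(1));
        }
      }
    }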

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead STANDARD_ERROR
    Jul 15, 2022 7:37:35 PM org.apache.beam.runners.dataflow.DataflowRunner 
validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option 
--workerHarnessContainerImage.
    Jul 15, 2022 7:37:35 PM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Jul 15, 2022 7:37:35 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Jul 15, 2022 7:37:35 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Jul 15, 2022 7:37:35 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 279 files. Enable logging at DEBUG level to see 
which files will be staged.
    Jul 15, 2022 7:37:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Jul 15, 2022 7:37:38 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 280 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Jul 15, 2022 7:37:38 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-worker.jar as 
beam-runners-google-cloud-dataflow-java-legacy-worker-2.41.0-SNAPSHOT-Gl7GE2A8Qr6nYZqDNhrTwne6T583X1IN1hu9vb6ncBI.jar
    Jul 15, 2022 7:37:41 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 280 files cached, 0 files newly uploaded in 2 
seconds
    Jul 15, 2022 7:37:41 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Jul 15, 2022 7:37:41 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <141339 bytes, hash 
9c0bfdf38e09a0953ab2dbf27aa91390d9b8717ffb5d2843ed491ee7590fef98> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-nAv9844JoJU6stvyeqkTkNm4cX_7XShD7Uke51kP75g.pb
    Jul 15, 2022 7:37:44 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Create.Values/Read(CreateSource) as step s1
    Jul 15, 2022 7:37:44 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(DeterministicallyConstructTestRow) as step s2
    Jul 15, 2022 7:37:44 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Jul 15, 2022 7:37:44 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Anonymous) as step s4
    Jul 15, 2022 7:37:44 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Write) as step s5
    Jul 15, 2022 7:37:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
    Jul 15, 2022 7:37:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-07-15_12_37_44-11856993633381049303?project=apache-beam-testing
    Jul 15, 2022 7:37:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2022-07-15_12_37_44-11856993633381049303
    Jul 15, 2022 7:37:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2022-07-15_12_37_44-11856993633381049303
    Jul 15, 2022 7:37:56 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:37:55.511Z: Worker configuration: e2-standard-2 in 
us-central1-b.
    Jul 15, 2022 7:37:56 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:37:56.356Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Jul 15, 2022 7:37:56 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:37:56.397Z: Expanding GroupByKey operations into 
optimizable parts.
    Jul 15, 2022 7:37:56 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:37:56.424Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    Jul 15, 2022 7:37:56 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:37:56.484Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Jul 15, 2022 7:37:56 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:37:56.523Z: Fusing consumer 
ParDo(DeterministicallyConstructTestRow) into Create.Values/Read(CreateSource)
    Jul 15, 2022 7:37:56 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:37:56.553Z: Fusing consumer ParDo(TimeMonitor) into 
ParDo(DeterministicallyConstructTestRow)
    Jul 15, 2022 7:37:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:37:56.584Z: Fusing consumer 
JdbcIO.Write/ParDo(Anonymous) into ParDo(TimeMonitor)
    Jul 15, 2022 7:37:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:37:56.621Z: Fusing consumer JdbcIO.Write/ParDo(Write) 
into JdbcIO.Write/ParDo(Anonymous)
    Jul 15, 2022 7:37:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:37:57.036Z: Executing operation 
Create.Values/Read(CreateSource)+ParDo(DeterministicallyConstructTestRow)+ParDo(TimeMonitor)+JdbcIO.Write/ParDo(Anonymous)+JdbcIO.Write/ParDo(Write)
    Jul 15, 2022 7:37:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:37:57.103Z: Starting 5 workers in us-central1-b...
    Jul 15, 2022 7:38:18 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:38:17.271Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Jul 15, 2022 7:38:30 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:38:30.372Z: Autoscaling: Raised the number of workers to 
5 based on the rate of progress in the currently running stage(s).
    Jul 15, 2022 7:39:00 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:38:57.669Z: Workers have started successfully.
    Jul 15, 2022 7:39:18 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:39:17.766Z: Finished operation 
Create.Values/Read(CreateSource)+ParDo(DeterministicallyConstructTestRow)+ParDo(TimeMonitor)+JdbcIO.Write/ParDo(Anonymous)+JdbcIO.Write/ParDo(Write)
    Jul 15, 2022 7:39:18 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:39:17.928Z: Cleaning up.
    Jul 15, 2022 7:39:18 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:39:18.006Z: Stopping worker pool...
    Jul 15, 2022 7:39:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:39:56.375Z: Autoscaling: Resized worker pool from 5 to 0.
    Jul 15, 2022 7:39:58 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-07-15T19:39:56.422Z: Worker pool stopped.
    Jul 15, 2022 7:40:02 PM 
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2022-07-15_12_37_44-11856993633381049303 finished with status 
DONE.
    Jul 15, 2022 7:40:02 PM org.apache.beam.sdk.io.jdbc.JdbcIO$ReadAll 
inferCoder
    WARNING: Unable to infer a schema for type 
org.apache.beam.sdk.io.common.TestRow. Attempting to infer a coder without a 
schema.
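
The schema-inference warning itself is only informational: with no schema 
available for TestRow, the read falls back to looking up an ordinary coder. As a 
general pattern (a hypothetical sketch, unrelated to the JdbcIOIT sources), 
registering a coder for an element type up front guarantees that this fallback 
finds one, assuming the type is Serializable:

    import java.io.Serializable;

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.coders.SerializableCoder;

    public class CoderRegistrationSketch {
      // Made-up element type standing in for a row class such as TestRow.
      static class MyRow implements Serializable {
        int id;
        String name;
      }

      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // With an explicit coder registered, any transform that falls back from
        // schema inference to the coder registry will find this coder for MyRow.
        p.getCoderRegistry()
            .registerCoderForClass(MyRow.class, SerializableCoder.of(MyRow.class));
      }
    }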

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead FAILED
    java.lang.UnsupportedOperationException: No hash for that record count: 5000
        at 
org.apache.beam.sdk.io.common.IOITHelper.getHashForRecordCount(IOITHelper.java:40)
        at 
org.apache.beam.sdk.io.common.TestRow.getExpectedHashForRowCount(TestRow.java:104)
        at org.apache.beam.sdk.io.jdbc.JdbcIOIT.runRead(JdbcIOIT.java:254)
        at 
org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteThenRead(JdbcIOIT.java:140)
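
The read-verification step fails before comparing anything: judging by the 
exception message, IOITHelper.getHashForRecordCount looks up an expected content 
hash by record count and throws for counts it does not know, and this run used 
5000 records. A hypothetical sketch of that lookup pattern (not the actual 
IOITHelper code):

    import java.util.Map;

    class HashLookupSketch {
      static String getHashForRecordCount(int recordCount, Map<Integer, String> expectedHashes) {
        String hash = expectedHashes.get(recordCount);
        if (hash == null) {
          // The path taken here: no expected hash is registered for 5000 records.
          throw new UnsupportedOperationException(
              "No hash for that record count: " + recordCount);
        }
        return hash;
      }
    }

On the test side, the fix would be either to run with a record count that has a 
registered hash or to add an expected hash for 5000.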

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:jdbc:integrationTest FAILED

3 tests completed, 3 failed
Finished generating test XML results (0.437 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.287 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest>
:sdks:java:io:jdbc:integrationTest (Thread[Execution worker for ':' Thread 
2,5,main]) completed. Took 9 mins 55.609 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:jdbc:integrationTest'.
> There were failing tests. See the report at: 
> file://<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 10m 29s
135 actionable tasks: 80 executed, 53 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/taxn3v6tuodt6

Stopped 1 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
