See <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/6657/display/redirect>

Changes:


------------------------------------------
[...truncated 295.72 KB...]
    INFO: Adding Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable as step s6
    Jul 15, 2022 7:14:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Reshuffle.ViaRandomKey/Values/Values/Map as step s7
    Jul 15, 2022 7:14:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Anonymous) as step s8
    Jul 15, 2022 7:14:25 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Write) as step s9
    Jul 15, 2022 7:14:25 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
    Jul 15, 2022 7:14:27 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 400, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs.
 

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithAutosharding FAILED
    java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:3332)
        at 
java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
        at 
java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:448)
        at java.lang.StringBuilder.append(StringBuilder.java:136)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractCollection.toString(AbstractCollection.java:462)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1383)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:196)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:399)
        at 
org.apache.beam.sdk.testing.TestPipeline.runWithAdditionalOptionArgs(TestPipeline.java:372)

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults STANDARD_ERROR
    Jul 15, 2022 7:17:03 PM org.apache.beam.runners.dataflow.DataflowRunner 
validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option 
--****HarnessContainerImage.
    Jul 15, 2022 7:17:03 PM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Jul 15, 2022 7:17:03 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Jul 15, 2022 7:17:03 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Jul 15, 2022 7:17:04 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 279 files. Enable logging at DEBUG level to see 
which files will be staged.
    Jul 15, 2022 7:17:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Jul 15, 2022 7:17:14 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 280 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Jul 15, 2022 7:17:14 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.41.0-SNAPSHOT-Gl7GE2A8Qr6nYZqDNhrTwne6T583X1IN1hu9vb6ncBI.jar
    Jul 15, 2022 7:17:14 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 280 files cached, 0 files newly uploaded in 0 
seconds
    Jul 15, 2022 7:17:14 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Jul 15, 2022 7:17:15 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <56204947 bytes, hash 
b65ea1be3a2282a2592bc7b771f08d25feb3340dc0942c6af1e1d31860b8ec51> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-tl6hvjoigqJZK8e3cfCNJf6zNA3AlCxq8eHTGGC47FE.pb
    Jul 15, 2022 7:17:24 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Create.Values/Read(CreateSource) as step s1
    Jul 15, 2022 7:17:30 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.WriteWithResults/ParDo(Anonymous) as step s2
    Jul 15, 2022 7:17:30 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.WriteWithResults/ParDo(Write) as step s3
    Jul 15, 2022 7:17:30 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step 
s4
    Jul 15, 2022 7:17:30 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s5
    Jul 15, 2022 7:17:30 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as 
step s6
    Jul 15, 2022 7:17:30 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s7
    Jul 15, 2022 7:17:30 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as 
step s8
    Jul 15, 2022 7:17:30 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s9
    Jul 15, 2022 7:17:30 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s10
    Jul 15, 2022 7:17:32 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s11
    Jul 15, 2022 7:17:32 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s12
    Jul 15, 2022 7:17:32 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s13
    Jul 15, 2022 7:17:32 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s14
    Jul 15, 2022 7:20:48 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s15
    Jul 15, 2022 7:20:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
    Jul 15, 2022 7:20:52 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 400, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs.
 

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults FAILED
    java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:3332)
        at 
java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
        at 
java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:649)
        at java.lang.StringBuilder.append(StringBuilder.java:202)
        at java.util.AbstractMap.toString(AbstractMap.java:561)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractCollection.toString(AbstractCollection.java:462)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1383)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:196)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:399)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
        at 
org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithWriteResults(JdbcIOIT.java:332)

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead STANDARD_ERROR
    Jul 15, 2022 7:22:23 PM org.apache.beam.runners.dataflow.DataflowRunner 
validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option 
--****HarnessContainerImage.
    Jul 15, 2022 7:22:23 PM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Jul 15, 2022 7:22:24 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Jul 15, 2022 7:22:24 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Jul 15, 2022 7:22:24 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 279 files. Enable logging at DEBUG level to see 
which files will be staged.
    Jul 15, 2022 7:22:24 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Jul 15, 2022 7:22:29 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 280 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Jul 15, 2022 7:22:29 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.41.0-SNAPSHOT-Gl7GE2A8Qr6nYZqDNhrTwne6T583X1IN1hu9vb6ncBI.jar
    Jul 15, 2022 7:22:29 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 280 files cached, 0 files newly uploaded in 0 
seconds
    Jul 15, 2022 7:22:29 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Jul 15, 2022 7:22:30 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <25121928 bytes, hash 
daffc97f256c60568a2317dd5841bb06cd4edca36d4c783eb7d0b9a178f2ec03> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-2v_JfyVsYFaKIxfdWEG7Bs1O3KNtTHg-t9C5oXjy7AM.pb
    Jul 15, 2022 7:22:36 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Create.Values/Read(CreateSource) as step s1
    Jul 15, 2022 7:22:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(DeterministicallyConstructTestRow) as step s2
    Jul 15, 2022 7:22:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Jul 15, 2022 7:22:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Anonymous) as step s4
    Jul 15, 2022 7:22:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Write) as step s5
    Jul 15, 2022 7:22:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.41.0-SNAPSHOT
    Jul 15, 2022 7:22:39 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 400, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs.
 

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead FAILED
    java.lang.RuntimeException: Failed to create a workflow job: The size of 
the serialized JSON representation of the pipeline exceeds the allowable limit. 
For more information, please see the documentation on job submission:
    https://cloud.google.com/dataflow/docs/guides/deploying-a-pipeline#jobs
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1393)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:196)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:399)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
        at org.apache.beam.sdk.io.jdbc.JdbcIOIT.runWrite(JdbcIOIT.java:215)
        at 
org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteThenRead(JdbcIOIT.java:138)

        Caused by:
        com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 
Bad Request
        POST 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs
        {
          "code" : 400,
          "errors" : [ {
            "domain" : "global",
            "message" : "Request payload size exceeds the limit: 20971520 
bytes.",
            "reason" : "badRequest"
          } ],
          "message" : "Request payload size exceeds the limit: 20971520 bytes.",
          "status" : "INVALID_ARGUMENT"
        }
            at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
            at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
            at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)
            at 
com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
            at 
org.apache.beam.runners.dataflow.DataflowClient.createJob(DataflowClient.java:64)
            at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1379)
            ... 6 more
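
The "Request payload size exceeds the limit: 20971520 bytes" error above is the 20 MiB cap 
on the jobs.create request body; the pipeline protos staged earlier (56 MB and 25 MB) point 
to a very large job graph, and the two OutOfMemoryErrors are thrown from the same 
DataflowRunner.run call while the oversized request is being rendered to a string, so all 
three failures appear to share the oversized-job-graph root cause. One mitigation Dataflow 
documents for this error is the upload_graph experiment, which stages the job graph in GCS 
instead of embedding it in the create request. A minimal sketch of wiring it into the test 
pipeline options follows; the class name and option plumbing are illustrative, not taken 
from JdbcIOIT:

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.ExperimentalOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Hypothetical helper: enables the upload_graph experiment before the pipeline runs,
    // equivalent to passing --experiments=upload_graph on the command line.
    public class UploadGraphOptionsSketch {
      public static DataflowPipelineOptions withUploadGraph(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args)
                .withValidation()
                .as(DataflowPipelineOptions.class);
        // Ask the runner to upload the job graph to the staging location instead of
        // sending it inline in the jobs.create request that hit the 20 MiB limit.
        ExperimentalOptions.addExperiment(
            options.as(ExperimentalOptions.class), "upload_graph");
        return options;
      }
    }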

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:io:jdbc:integrationTest FAILED

3 tests completed, 3 failed
Finished generating test XML results (0.026 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest>
:sdks:java:io:jdbc:integrationTest (Thread[included builds,5,main]) completed. 
Took 10 mins 40.594 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:jdbc:integrationTest'.
> There were failing tests. See the report at: 
> file://<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 11m 28s
135 actionable tasks: 82 executed, 51 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/75viewkfr5mcc

Stopped 2 **** daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
