See <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/6532/display/redirect>

Changes:


------------------------------------------
[...truncated 291.69 KB...]
    INFO: Adding Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable as step s6
    Jun 14, 2022 7:00:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Reshuffle.ViaRandomKey/Values/Values/Map as step s7
    Jun 14, 2022 7:00:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Anonymous) as step s8
    Jun 14, 2022 7:00:39 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Write) as step s9
    Jun 14, 2022 7:00:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.40.0-SNAPSHOT
    Jun 14, 2022 7:00:42 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 400, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs.

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithAutosharding FAILED
    java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:3332)
        at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
        at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:448)
        at java.lang.StringBuilder.append(StringBuilder.java:136)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractCollection.toString(AbstractCollection.java:462)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1331)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:196)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:399)
        at org.apache.beam.sdk.testing.TestPipeline.runWithAdditionalOptionArgs(TestPipeline.java:372)
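
The OutOfMemoryError here is hit on the submitting JVM while the runner turns the job request into a String (the GenericJson.toString / StringBuilder frames above), which points at the size of the serialized job rather than at the test body; the 400 on jobs.create logged just before it, and the explicit payload-limit error under testWriteThenRead further down, suggest the same root cause. Purely as a hedged illustration, and not JdbcIOIT's actual code, the sketch below shows one common way a Beam Java job graph gets this large: inlining a big in-memory collection with Create.of embeds every element in the submitted graph, while GenerateSequence keeps the graph size independent of the row count.

    import java.util.ArrayList;
    import java.util.List;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.GenerateSequence;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    // Illustrative only: contrasts two ways of producing the same test rows.
    public class GraphSizeSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // Anti-pattern for large inputs: every element of this list is serialized
        // into the job graph that DataflowRunner submits, so the request payload
        // grows with the data and can blow past service limits (or driver heap).
        List<String> rows = new ArrayList<>();
        for (int i = 0; i < 5_000_000; i++) {
          rows.add("row-" + i);
        }
        p.apply("InlineData", Create.of(rows));

        // Keeps the graph small: only the sequence bounds are serialized; the
        // rows themselves are constructed by the workers at run time.
        p.apply("GeneratedData", GenerateSequence.from(0).to(5_000_000))
            .apply(MapElements.into(TypeDescriptors.strings()).via((Long i) -> "row-" + i));

        p.run().waitUntilFinish();
      }
    }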

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults STANDARD_ERROR
    Jun 14, 2022 7:03:25 PM org.apache.beam.runners.dataflow.DataflowRunner 
validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option 
--****HarnessContainerImage.
    Jun 14, 2022 7:03:25 PM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Jun 14, 2022 7:03:26 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
    Jun 14, 2022 7:03:26 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Jun 14, 2022 7:03:26 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 276 files. Enable logging at DEBUG level to see 
which files will be staged.
    Jun 14, 2022 7:03:26 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Jun 14, 2022 7:03:38 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 277 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Jun 14, 2022 7:03:39 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.40.0-SNAPSHOT-X3wYuAkOCB7jq9BikZGuT5fwdtRD9FlJbggpq8GQj_4.jar
    Jun 14, 2022 7:03:39 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 277 files cached, 0 files newly uploaded in 0 
seconds
    Jun 14, 2022 7:03:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Jun 14, 2022 7:03:40 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <56203755 bytes, hash 
8eabae3d9bf25d10dc6db92fb7b4e44a779b7e0653e3ccc7f7f9702c7a4cdd69> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-jquuPZvyXRDcbbkvt7TkSnebfgZT48zH9_lwLHpM3Wk.pb
    Jun 14, 2022 7:03:55 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Create.Values/Read(CreateSource) as step s1
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.WriteWithResults/ParDo(Anonymous) as step s2
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.WriteWithResults/ParDo(Write) as step s3
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step 
s4
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s5
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as 
step s6
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s7
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as 
step s8
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s9
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s10
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s11
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s12
    Jun 14, 2022 7:04:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s13
    Jun 14, 2022 7:04:09 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s14
    Jun 14, 2022 7:07:22 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s15
    Jun 14, 2022 7:07:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.40.0-SNAPSHOT
    Jun 14, 2022 7:07:29 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 400, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs.

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults FAILED
    java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:3332)
        at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
        at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:649)
        at java.lang.StringBuilder.append(StringBuilder.java:202)
        at java.util.AbstractMap.toString(AbstractMap.java:561)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractCollection.toString(AbstractCollection.java:462)
        at java.lang.String.valueOf(String.java:2994)
        at java.lang.StringBuilder.append(StringBuilder.java:131)
        at java.util.AbstractMap.toString(AbstractMap.java:559)
        at com.google.api.client.util.GenericData.toString(GenericData.java:210)
        at com.google.api.client.json.GenericJson.toString(GenericJson.java:68)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1331)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:196)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:399)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
        at org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithWriteResults(JdbcIOIT.java:327)

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead STANDARD_ERROR
    Jun 14, 2022 7:09:00 PM org.apache.beam.runners.dataflow.DataflowRunner 
validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option 
--****HarnessContainerImage.
    Jun 14, 2022 7:09:00 PM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Jun 14, 2022 7:09:01 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
    Jun 14, 2022 7:09:01 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Jun 14, 2022 7:09:01 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 276 files. Enable logging at DEBUG level to see 
which files will be staged.
    Jun 14, 2022 7:09:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Jun 14, 2022 7:09:09 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 277 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Jun 14, 2022 7:09:09 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.40.0-SNAPSHOT-X3wYuAkOCB7jq9BikZGuT5fwdtRD9FlJbggpq8GQj_4.jar
    Jun 14, 2022 7:09:10 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 277 files cached, 0 files newly uploaded in 0 
seconds
    Jun 14, 2022 7:09:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Jun 14, 2022 7:09:10 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <25120728 bytes, hash 
a2d172de3867cc7a0469df97977e7e8c1e9e0cae50f71ff8711081dca870cca3> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-otFy3jhnzHoEad-Xl35-jB6eDK5Q9x_4cRCB3KhwzKM.pb
    Jun 14, 2022 7:09:18 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Create.Values/Read(CreateSource) as step s1
    Jun 14, 2022 7:09:22 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(DeterministicallyConstructTestRow) as step s2
    Jun 14, 2022 7:09:22 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Jun 14, 2022 7:09:22 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Anonymous) as step s4
    Jun 14, 2022 7:09:22 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Write) as step s5
    Jun 14, 2022 7:09:22 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.40.0-SNAPSHOT
    Jun 14, 2022 7:09:22 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 400, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs.

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead FAILED
    java.lang.RuntimeException: Failed to create a workflow job: The size of the serialized JSON representation of the pipeline exceeds the allowable limit. For more information, please see the documentation on job submission:
    https://cloud.google.com/dataflow/docs/guides/deploying-a-pipeline#jobs
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1341)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:196)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:399)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
        at org.apache.beam.sdk.io.jdbc.JdbcIOIT.runWrite(JdbcIOIT.java:210)
        at org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteThenRead(JdbcIOIT.java:133)

        Caused by:
        com.google.api.client.googleapis.json.GoogleJsonResponseException: 400 Bad Request
        POST https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs
        {
          "code" : 400,
          "errors" : [ {
            "domain" : "global",
            "message" : "Request payload size exceeds the limit: 20971520 
bytes.",
            "reason" : "badRequest"
          } ],
          "message" : "Request payload size exceeds the limit: 20971520 bytes.",
          "status" : "INVALID_ARGUMENT"
        }
            at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
            at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:118)
            at com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:37)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:428)
            at com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1111)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:514)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:455)
            at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:565)
            at org.apache.beam.runners.dataflow.DataflowClient.createJob(DataflowClient.java:64)
            at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1327)
            ... 6 more
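
All three failures trace back to the same symptom: the jobs.create request for these pipelines exceeds the service's 20971520-byte payload limit, and in the first two tests the submitter additionally exhausts its heap while stringifying the oversized request. Dataflow's documentation describes an upload_graph experiment for graphs that legitimately grow past the request limit, which stages the job graph to Cloud Storage instead of inlining it in the request; the snippet below is only a sketch of setting that option on a Beam pipeline, not a confirmed fix for this suite, and it does not remove the need to ask why the staged pipeline protos grew to roughly 25 to 56 MB here in the first place.

    import java.util.Arrays;
    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    // Sketch only: stages the job graph to Cloud Storage via the upload_graph
    // experiment instead of inlining it in the jobs.create request, which is the
    // call rejected above with "Request payload size exceeds the limit".
    public class UploadGraphSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setExperiments(Arrays.asList("upload_graph"));

        Pipeline p = Pipeline.create(options);
        p.apply("PlaceholderInput", Create.of("a", "b", "c")); // stand-in for the real transforms
        p.run().waitUntilFinish();
      }
    }

Raising the integration-test JVM heap would likely turn the first two OutOfMemoryErrors into the same 400 that testWriteThenRead reports, so the size of the submitted graph still has to come down, or be staged out of the request, either way.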

Gradle Test Executor 2 finished executing tests.

> Task :sdks:java:io:jdbc:integrationTest FAILED

3 tests completed, 3 failed
Finished generating test XML results (0.032 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest>
:sdks:java:io:jdbc:integrationTest (Thread[included builds,5,main]) completed. Took 12 mins 41.352 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:jdbc:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/ws/src/sdks/java/io/jdbc/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 15m 11s
129 actionable tasks: 77 executed, 50 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dtdhhn7acvqw4

Stopped 1 **** daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
