See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_JDBC/6062/display/redirect?page=changes>

Changes:

[Pablo Estrada] Simplify README for new users

[mmack] [BEAM-13563] Introducing common AWS ClientBuilderFactory to unify

[laraschmidt] Fix final allowskew error to properly handle a large allowedSkew

[noreply] [BEAM-13946] Add get_dummies(), a non-deferred column operation on


------------------------------------------
[...truncated 279.78 KB...]

> Task :sdks:java:io:jdbc:integrationTest
Custom actions are attached to task ':sdks:java:io:jdbc:integrationTest'.
Build cache key for task ':sdks:java:io:jdbc:integrationTest' is 
cc94584860cdb22464d25f9f864f9b6e
Task ':sdks:java:io:jdbc:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: 
/home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_JDBC/src/sdks/java/io/jdbc
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--numberOfRecords=5000000","--bigQueryDataset=beam_performance","--bigQueryTable=jdbcioit_results","--influxMeasurement=jdbcioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresServerName=35.224.18.90","--postgresSsl=false","--postgresPort=5432","--autoscalingAlgorithm=NONE","--numWorkers=5","--****HarnessContainerImage=","--dataflowWorkerJar=/home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_JDBC/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.38.0-SNAPSHOT.jar","--region=us-central1";]
 
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
 
-Dorg.gradle.internal.****.tmpdir=/home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_JDBC/src/sdks/java/io/jdbc/build/tmp/integrationTest/work
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/7.3.2/****Main/gradle-****.jar 
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'
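
Side note: the JSON array passed via -DbeamTestPipelineOptions above is how the Beam test harness receives its pipeline options. A rough, hand-rolled equivalent for just the standard Dataflow options from that command line might look like the sketch below (option subset and class name are illustrative; the IO-specific flags such as --postgres* come from the test's own options interface and are omitted; assumes the Dataflow runner jar is on the classpath so its options are registered):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Sketch: parse a subset of the options shown in the command line above.
    // Assumes beam-runners-google-cloud-dataflow-java is on the classpath.
    public class JdbcPerfOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(
                    "--runner=DataflowRunner",
                    "--project=apache-beam-testing",
                    "--region=us-central1",
                    "--numWorkers=5",
                    "--autoscalingAlgorithm=NONE")
                .as(DataflowPipelineOptions.class);
        System.out.println(options.getProject() + " / " + options.getRegion());
      }
    }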

org.apache.beam.sdk.io.jdbc.JdbcIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:file:/home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_JDBC/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.38.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-jdk14/1.7.30/d35953dd2fe54ebe39fdf18cfd82fe6eb35b25ed/slf4j-jdk14-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.JDK14LoggerFactory]

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithAutosharding STANDARD_ERROR
    Feb 18, 2022 12:24:41 AM org.apache.beam.runners.dataflow.DataflowRunner 
validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option 
--****HarnessContainerImage.
    Feb 18, 2022 12:24:42 AM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Feb 18, 2022 12:24:43 AM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Feb 18, 2022 12:24:43 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Feb 18, 2022 12:24:43 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 255 files. Enable logging at DEBUG level to see 
which files will be staged.
    Feb 18, 2022 12:24:44 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Feb 18, 2022 12:24:48 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 256 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Feb 18, 2022 12:24:48 AM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.38.0-SNAPSHOT-QuGnur7irnXUBunkr5-Fg8-cA-O2Pbxk01JwQWgcxT4.jar
    Feb 18, 2022 12:24:49 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading /tmp/main8542329210285238638.zip to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/main-poiqxzYMNtSQvR5c8Irt_3syZga0BsmSci7l-JAEygw.jar
    Feb 18, 2022 12:24:49 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading /tmp/test3294015744357321912.zip to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-esZjrlAFEgm-OgQPzGIwrH8kNJixjK6qByWpUagBRWk.jar
    Feb 18, 2022 12:24:49 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 254 files cached, 2 files newly uploaded in 0 
seconds
    Feb 18, 2022 12:24:49 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Feb 18, 2022 12:24:49 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <122664 bytes, hash 
29b4df005c16310719821104c6319d16e809ddc007474131ca81d227212707a5> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-KbTfAFwWMQcZghEExjGdFugJ3cAHR0ExyoHSJyEnB6U.pb

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithAutosharding FAILED
    java.lang.IllegalArgumentException: Runner determined sharding not 
available in Dataflow for GroupIntoBatches for non-Streaming-Engine jobs. In 
order to use runner determined sharding, please use --streaming 
--enable_streaming_engine
        at 
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:141)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.maybeRecordPCollectionWithAutoSharding(DataflowRunner.java:1580)
        at 
org.apache.beam.runners.dataflow.GroupIntoBatchesOverride$StreamingGroupIntoBatchesWithShardedKey.expand(GroupIntoBatchesOverride.java:326)
        at 
org.apache.beam.runners.dataflow.GroupIntoBatchesOverride$StreamingGroupIntoBatchesWithShardedKey.expand(GroupIntoBatchesOverride.java:305)
        at org.apache.beam.sdk.Pipeline.applyReplacement(Pipeline.java:576)
        at org.apache.beam.sdk.Pipeline.replace(Pipeline.java:300)
        at org.apache.beam.sdk.Pipeline.replaceAll(Pipeline.java:218)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.replaceV1Transforms(DataflowRunner.java:1473)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1099)
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:194)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:323)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:399)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
        at 
org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithAutosharding(JdbcIOIT.java:293)
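
Note: the failure above is the Dataflow runner rejecting runner-determined sharding (which JdbcIO's auto-sharded write relies on via GroupIntoBatches) for a batch, non-Streaming-Engine job. A minimal sketch of the configuration the error message asks for, assuming the stock Beam/Dataflow Java options (illustrative only, not necessarily how this suite would be fixed):

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.ExperimentalOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    // Sketch: per the message above, runner-determined sharding for
    // GroupIntoBatches needs a streaming job running on Streaming Engine.
    public class StreamingEngineOptionsSketch {
      public static DataflowPipelineOptions options() {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("apache-beam-testing");
        options.setRegion("us-central1");
        options.setStreaming(true);  // --streaming
        // --enable_streaming_engine (experiment form of the flag)
        ExperimentalOptions.addExperiment(options, "enable_streaming_engine");
        return options;
      }
    }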

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults STANDARD_ERROR
    Feb 18, 2022 12:24:50 AM org.apache.beam.runners.dataflow.DataflowRunner 
validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option 
--****HarnessContainerImage.
    Feb 18, 2022 12:24:50 AM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Feb 18, 2022 12:24:50 AM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Feb 18, 2022 12:24:50 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Feb 18, 2022 12:24:51 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 255 files. Enable logging at DEBUG level to see 
which files will be staged.
    Feb 18, 2022 12:24:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Feb 18, 2022 12:24:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 256 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Feb 18, 2022 12:24:54 AM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.38.0-SNAPSHOT-QuGnur7irnXUBunkr5-Fg8-cA-O2Pbxk01JwQWgcxT4.jar
    Feb 18, 2022 12:24:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 256 files cached, 0 files newly uploaded in 0 
seconds
    Feb 18, 2022 12:24:54 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Feb 18, 2022 12:24:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <156067 bytes, hash 
4ee597d320b3e9d59e10707117964d702eada3e0d3689c11532b6b459ecf6f83> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-TuWX0yCz6dWeEHBxF5ZNcC6to-DTaJwRUytrRZ7Pb4M.pb
    Feb 18, 2022 12:24:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Create.Values/Read(CreateSource) as step s1
    Feb 18, 2022 12:24:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.WriteWithResults/ParDo(Anonymous) as step s2
    Feb 18, 2022 12:24:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.WriteWithResults/ParDo(Write) as step s3
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Reify.Window/ParDo(Anonymous) as step 
s4
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(ToSingletonIterables) as step s5
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Create.Values/Read(CreateSource) as 
step s6
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Flatten.PCollections as step s7
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Window.Into()/Flatten.PCollections as 
step s8
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/WithKeys/AddKeys/Map as step s9
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/GroupByKey as step s10
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s11
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s12
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/GetPane/Map as step s13
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/RunChecks as step s14
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s15
    Feb 18, 2022 12:24:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.38.0-SNAPSHOT
    Feb 18, 2022 12:24:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-17_16_24_57-435481743343791143?project=apache-beam-testing
    Feb 18, 2022 12:24:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2022-02-17_16_24_57-435481743343791143
    Feb 18, 2022 12:24:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2022-02-17_16_24_57-435481743343791143

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteWithWriteResults FAILED
    java.lang.AssertionError: expected:<1000> but was:<0>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at 
org.apache.beam.sdk.io.common.DatabaseTestHelper.assertRowCount(DatabaseTestHelper.java:166)
        at 
org.apache.beam.sdk.io.jdbc.JdbcIOIT.testWriteWithWriteResults(JdbcIOIT.java:330)
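
Note: the assertion above compares an expected row count (1000) against the 0 rows actually read back from the Postgres table. A minimal sketch of that kind of row-count check in plain JDBC (not the actual DatabaseTestHelper implementation; the table name is illustrative, the connection values are taken from the pipeline options earlier in this log, and the Postgres JDBC driver is assumed to be on the classpath):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Sketch of a row-count assertion like the one failing above.
    public class RowCountCheck {
      static long countRows(String jdbcUrl, String user, String password, String table)
          throws Exception {
        try (Connection conn = DriverManager.getConnection(jdbcUrl, user, password);
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
          rs.next();
          return rs.getLong(1);
        }
      }

      public static void main(String[] args) throws Exception {
        long actual = countRows(
            "jdbc:postgresql://35.224.18.90:5432/postgres",  // from the options above
            "postgres", "uuinkks",
            "some_test_table");  // illustrative table name
        if (actual != 1000L) {
          throw new AssertionError("expected:<1000> but was:<" + actual + ">");
        }
      }
    }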

org.apache.beam.sdk.io.jdbc.JdbcIOIT > testWriteThenRead STANDARD_ERROR
    Feb 18, 2022 12:24:57 AM org.apache.beam.runners.dataflow.DataflowRunner 
validateSdkContainerImageOptions
    WARNING: Prefer --sdkContainerImage over deprecated legacy option 
--****HarnessContainerImage.
    Feb 18, 2022 12:24:57 AM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Feb 18, 2022 12:24:58 AM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Feb 18, 2022 12:24:58 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Feb 18, 2022 12:24:58 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 255 files. Enable logging at DEBUG level to see 
which files will be staged.
    Feb 18, 2022 12:24:58 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Feb 18, 2022 12:25:00 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 256 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Feb 18, 2022 12:25:00 AM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
    INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.38.0-SNAPSHOT-QuGnur7irnXUBunkr5-Fg8-cA-O2Pbxk01JwQWgcxT4.jar
    Feb 18, 2022 12:25:01 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 256 files cached, 0 files newly uploaded in 0 
seconds
    Feb 18, 2022 12:25:01 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Feb 18, 2022 12:25:01 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <112947 bytes, hash 
b4e446b1f50d28dc9b214553534e5e677227eaca6b7e096567b5251c450de6d5> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-tORGsfUNKNybIUVTU05eZ3In6sprfgllZ7UlHEUN5tU.pb
    Feb 18, 2022 12:25:03 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding GenerateSequence/Read(BoundedCountingSource) as step s1
    Feb 18, 2022 12:25:03 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(DeterministicallyConstructTestRow) as step s2
    Feb 18, 2022 12:25:03 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding ParDo(TimeMonitor) as step s3
    Feb 18, 2022 12:25:03 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Anonymous) as step s4
    Feb 18, 2022 12:25:03 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding JdbcIO.Write/ParDo(Write) as step s5
    Feb 18, 2022 12:25:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.38.0-SNAPSHOT
    Feb 18, 2022 12:25:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-17_16_25_03-3742823591429932119?project=apache-beam-testing
    Feb 18, 2022 12:25:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2022-02-17_16_25_03-3742823591429932119
    Feb 18, 2022 12:25:04 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2022-02-17_16_25_03-3742823591429932119
    Feb 18, 2022 12:25:16 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:15.835Z: Worker configuration: e2-standard-2 in 
us-central1-b.
    Feb 18, 2022 12:25:18 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:16.364Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Feb 18, 2022 12:25:18 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:16.406Z: Expanding GroupByKey operations into 
optimizable parts.
    Feb 18, 2022 12:25:18 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:16.424Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    Feb 18, 2022 12:25:18 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:16.478Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Feb 18, 2022 12:25:18 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:16.501Z: Fusing consumer 
ParDo(DeterministicallyConstructTestRow) into 
GenerateSequence/Read(BoundedCountingSource)
    Feb 18, 2022 12:25:18 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:16.533Z: Fusing consumer ParDo(TimeMonitor) into 
ParDo(DeterministicallyConstructTestRow)
    Feb 18, 2022 12:25:18 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:16.559Z: Fusing consumer 
JdbcIO.Write/ParDo(Anonymous) into ParDo(TimeMonitor)
    Feb 18, 2022 12:25:18 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:16.586Z: Fusing consumer JdbcIO.Write/ParDo(Write) 
into JdbcIO.Write/ParDo(Anonymous)
    Feb 18, 2022 12:25:18 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:16.931Z: Executing operation 
GenerateSequence/Read(BoundedCountingSource)+ParDo(DeterministicallyConstructTestRow)+ParDo(TimeMonitor)+JdbcIO.Write/ParDo(Anonymous)+JdbcIO.Write/ParDo(Write)
    Feb 18, 2022 12:25:18 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:16.996Z: Starting 5 ****s in us-central1-b...
    Feb 18, 2022 12:25:25 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:25:23.978Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Feb 18, 2022 12:26:05 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-02-18T00:26:04.863Z: Autoscaling: Raised the number of ****s to 
5 based on the rate of progress in the currently running stage(s).
FATAL: command execution failed
java.io.IOException: Backing channel 'apache-beam-jenkins-4' is disconnected.
        at 
hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:216)
        at 
hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:286)
        at com.sun.proxy.$Proxy126.isAlive(Unknown Source)
        at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1213)
        at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1205)
        at hudson.Launcher$ProcStarter.join(Launcher.java:522)
        at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
        at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:806)
        at hudson.model.Build$BuildExecution.build(Build.java:198)
        at hudson.model.Build$BuildExecution.doRun(Build.java:163)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:514)
        at hudson.model.Run.execute(Run.java:1888)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
        at hudson.model.ResourceController.execute(ResourceController.java:99)
        at hudson.model.Executor.run(Executor.java:432)
Caused by: java.io.IOException: Pipe closed after 0 cycles
        at 
org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:126)
        at 
org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:105)
        at 
hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:93)
        at 
hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:74)
        at 
hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:104)
        at 
hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
        at 
hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
        at 
hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
FATAL: Channel "hudson.remoting.Channel@2564655:apache-beam-jenkins-4": Remote 
call on apache-beam-jenkins-4 failed. The channel is closing down or has closed 
down
java.io.IOException: Pipe closed after 0 cycles
        at 
org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:126)
        at 
org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:105)
        at 
hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:93)
        at 
hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:74)
        at 
hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:104)
        at 
hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
        at 
hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
        at 
hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:61)
Caused: hudson.remoting.ChannelClosedException: Channel 
"hudson.remoting.Channel@2564655:apache-beam-jenkins-4": Remote call on 
apache-beam-jenkins-4 failed. The channel is closing down or has closed down
        at hudson.remoting.Channel.call(Channel.java:994)
        at hudson.Launcher$RemoteLauncher.kill(Launcher.java:1148)
        at 
hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:526)
        at hudson.model.Run.execute(Run.java:1888)
        at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
        at hudson.model.ResourceController.execute(ResourceController.java:99)
        at hudson.model.Executor.run(Executor.java:432)
