See 
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/547/display/redirect>

Changes:


------------------------------------------
[...truncated 237.42 KB...]
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:jar UP-TO-DATE

> Task :sdks:java:container:java11:copySdkHarnessLauncher
Execution optimizations have been disabled for task 
':sdks:java:container:java11:copySdkHarnessLauncher' to ensure correctness due 
to the following reasons:
  - Gradle detected a problem with the following location: 
'<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'.
 Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this 
output of task ':sdks:java:container:downloadCloudProfilerAgent' without 
declaring an explicit or implicit dependency. This can lead to incorrect 
results being produced, depending on what order the tasks are executed. Please 
refer to 
https://docs.gradle.org/7.5.1/userguide/validation_problems.html#implicit_dependency
 for more details about this problem.
  - Gradle detected a problem with the following location: 
'<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target>'.
 Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this 
output of task ':sdks:java:container:pullLicenses' without declaring an 
explicit or implicit dependency. This can lead to incorrect results being 
produced, depending on what order the tasks are executed. Please refer to 
https://docs.gradle.org/7.5.1/userguide/validation_problems.html#implicit_dependency
 for more details about this problem.
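
The warning above is self-describing: the copy task consumes the contents of 
build/target without declaring a dependency on the two tasks that write there. 
Below is a minimal Gradle (Groovy DSL) sketch of the kind of wiring the warning 
asks for, using the task paths it names; the placement and exact form are 
assumptions for illustration, not Beam's actual fix.

    // Hypothetical sketch for sdks/java/container/java11/build.gradle:
    // declare the producers of build/target so Gradle orders them before
    // the copy task instead of relying on accidental execution order.
    tasks.named('copySdkHarnessLauncher').configure {
        dependsOn ':sdks:java:container:downloadCloudProfilerAgent'
        dependsOn ':sdks:java:container:pullLicenses'
    }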

> Task :sdks:java:container:java11:copyDockerfileDependencies UP-TO-DATE
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 
18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.

See: 
https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java]
5b4f9049d051: Preparing
3234a2dacf48: Preparing
d9ecd80d3270: Preparing
8f41bb10b1bc: Preparing
2a3058b8ba99: Preparing
92de474725e9: Preparing
93de70da606f: Preparing
35992d770366: Preparing
6eb54d72e4da: Preparing
8e5d735fcb87: Preparing
eba354083eba: Preparing
c52b4b580260: Preparing
958c4e7a050c: Preparing
1c91c3e39756: Preparing
84483be482eb: Preparing
7b7f3078e1db: Preparing
826c3ddbb29c: Preparing
b626401ef603: Preparing
9b55156abf26: Preparing
293d5db30c9f: Preparing
03127cdb479b: Preparing
9c742cd6c7a5: Preparing
84483be482eb: Waiting
92de474725e9: Waiting
93de70da606f: Waiting
293d5db30c9f: Waiting
b626401ef603: Waiting
03127cdb479b: Waiting
7b7f3078e1db: Waiting
9b55156abf26: Waiting
9c742cd6c7a5: Waiting
826c3ddbb29c: Waiting
1c91c3e39756: Waiting
958c4e7a050c: Waiting
eba354083eba: Waiting
8e5d735fcb87: Waiting
c52b4b580260: Waiting
6eb54d72e4da: Waiting
d9ecd80d3270: Pushed
3234a2dacf48: Pushed
8f41bb10b1bc: Pushed
2a3058b8ba99: Pushed
5b4f9049d051: Pushed
35992d770366: Pushed
93de70da606f: Pushed
8e5d735fcb87: Pushed
c52b4b580260: Pushed
eba354083eba: Pushed
6eb54d72e4da: Pushed
7b7f3078e1db: Layer already exists
92de474725e9: Pushed
826c3ddbb29c: Layer already exists
b626401ef603: Layer already exists
9b55156abf26: Layer already exists
293d5db30c9f: Layer already exists
03127cdb479b: Layer already exists
9c742cd6c7a5: Layer already exists
958c4e7a050c: Pushed
1c91c3e39756: Pushed
84483be482eb: Pushed
20221219125504: digest: 
sha256:de3a644f2104eeda14ef6afaa6c9e5051fc6c3d641f9f19826271f1b65acc762 size: 
4934

> Task :sdks:java:testing:load-tests:run
Dec 19, 2022 12:55:30 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Dec 19, 2022 12:55:31 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 232 files. Enable logging at DEBUG level to see which 
files will be staged.
Dec 19, 2022 12:55:31 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: 
ParDo(TimeMonitor)
Dec 19, 2022 12:55:31 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Dec 19, 2022 12:55:33 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 232 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Dec 19, 2022 12:55:34 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 232 files cached, 0 files newly uploaded in 0 
seconds
Dec 19, 2022 12:55:34 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Dec 19, 2022 12:55:34 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <99562 bytes, hash 
3e1747db41196c78f954c276fbf602da71adf2d8d28d4e135a2b92d1a9bfce84> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-PhdH20EZbHj5VMJ2-_YC2nGt8tjSjU4TWiuS0am_zoQ.pb
Dec 19, 2022 12:55:36 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Dec 19, 2022 12:55:36 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, 
endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, 
endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000, 
endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000, 
endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000, 
endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000, 
endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000, 
endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000, 
endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000, 
endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000, 
endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000, 
endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000, 
endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000, 
endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000, 
endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000, 
endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000, 
endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000, 
endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000, 
endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000, 
endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000, 
endOffset=20000000}]
Dec 19, 2022 12:55:36 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Dec 19, 2022 12:55:36 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Dec 19, 2022 12:55:36 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Dec 19, 2022 12:55:36 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Dec 19, 2022 12:55:36 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s6
Dec 19, 2022 12:55:36 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.45.0-SNAPSHOT
Dec 19, 2022 12:55:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-12-19_04_55_36-292440412000617621?project=apache-beam-testing
Dec 19, 2022 12:55:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-12-19_04_55_36-292440412000617621
Dec 19, 2022 12:55:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-12-19_04_55_36-292440412000617621
Dec 19, 2022 12:55:43 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-12-19T12:55:42.254Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java110dataflow0v20streaming0pardo03-jenkins-12-yjrq. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:50.149Z: Worker configuration: e2-standard-2 in 
us-central1-b.
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.488Z: Expanding SplittableParDo operations into 
optimizable parts.
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.512Z: Expanding CollectionToSingleton operations into 
optimizable parts.
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.555Z: Expanding CoGroupByKey operations into 
optimizable parts.
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.585Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.603Z: Expanding GroupByKey operations into streaming 
Read/Write steps
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.624Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.678Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.704Z: Fusing consumer Read 
input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read 
input/Impulse
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.725Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
 into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.747Z: Fusing consumer 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing
 into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.772Z: Fusing consumer Read 
input/ParDo(StripIds)/ParMultiDo(StripIds) into 
Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.791Z: Fusing consumer 
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read 
input/ParDo(StripIds)/ParMultiDo(StripIds)
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.819Z: Fusing consumer 
ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into 
ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.843Z: Fusing consumer Step: 
0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Dec 19, 2022 12:55:51 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.867Z: Fusing consumer 
ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 
0/ParMultiDo(CounterOperation)
Dec 19, 2022 12:55:53 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:51.939Z: Running job using Streaming Engine
Dec 19, 2022 12:55:53 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:53.168Z: Starting 5 workers in us-central1-b...
Dec 19, 2022 12:55:56 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:55:55.596Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Dec 19, 2022 12:56:37 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:56:35.993Z: Autoscaling: Raised the number of workers to 5 
so that the pipeline can catch up with its backlog and keep up with its input rate.
Dec 19, 2022 12:57:36 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:57:35.675Z: Workers have started successfully.
Dec 19, 2022 12:57:38 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:57:36.878Z: All workers have finished the startup processes 
and began to receive work requests.
Dec 19, 2022 1:00:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:59:59.448Z: Cleaning up.
Dec 19, 2022 1:00:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:59:59.515Z: Stopping worker pool...
Dec 19, 2022 1:00:01 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T12:59:59.573Z: Stopping worker pool...
Dec 19, 2022 1:02:16 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T13:02:15.565Z: Autoscaling: Reduced the number of workers to 0 
based on low average worker CPU utilization, and the pipeline having sufficiently 
low backlog and keeping up with input rate.
Dec 19, 2022 1:02:16 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-12-19T13:02:15.650Z: Worker pool stopped.
Dec 19, 2022 1:02:21 PM org.apache.beam.runners.dataflow.DataflowPipelineJob 
logTerminalState
INFO: Job 2022-12-19_04_55_36-292440412000617621 finished with status DONE.
Load test results for test (ID): 59cfa2c2-4ac5-447a-8575-124520cecd95 and 
timestamp: 2022-12-19T12:55:31.569000000Z:
                 Metric:                    Value:
dataflow_v2_java11_runtime_sec                    27.051
dataflow_v2_java11_total_bytes_count                1.999998E9

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221219125504
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:de3a644f2104eeda14ef6afaa6c9e5051fc6c3d641f9f19826271f1b65acc762
ERROR: (gcloud.container.images.untag) Image could not be found: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221219125504]

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle>'
 line: 301

* What went wrong:
Execution failed for task 
':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org
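
The root cause is visible just above: the image tag had already been untagged 
(the two "Untagged:" lines), so the follow-up gcloud call finds nothing to 
remove and exits non-zero, which fails the build. Below is a hedged Groovy 
sketch of one way such a cleanup step could tolerate an already-removed tag; 
the Exec task shape and the hard-coded image name are assumptions for 
illustration, not the contents of build.gradle line 301.

    // Hypothetical cleanup task: tolerate "image not found" from gcloud
    // instead of propagating its non-zero exit code to the build.
    task cleanUpDockerJavaImages(type: Exec) {
        commandLine 'gcloud', 'container', 'images', 'untag', '--quiet',
                'us.gcr.io/apache-beam-testing/java-postcommit-it/java:20221219125504'
        // A second untag of the same tag is a no-op, not a build error.
        ignoreExitValue = true
    }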

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 7m 33s
109 actionable tasks: 9 executed, 100 up-to-date

Publishing build scan...
https://gradle.com/s/gjvxnnz7hy66m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
