See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/350/display/redirect?page=changes>
Changes:

[noreply] [BEAM-14441] Enable GitHub issues (#17812)
[Pablo Estrada] Revert "Merge pull request #17492 from [BEAM-13945] (FIX) Update Java BQ
[noreply] Alias worker_harness_container_image to sdk_container_image (#17817)
[noreply] [BEAM-14546] Fix errant pass for empty collections in Count (#17813)
[noreply] Merge pull request #17741 from [BEAM-14504] Add support for including
[noreply] Merge pull request #18374 from [BEAM-13945] Roll forward JSON support
[noreply] Merge pull request #17792 from [BEAM-13756] [Playground] Merge Log and
[noreply] Merge pull request #17779: [BEAM-14529] Add integer to float64
[noreply] [BEAM-14556] Honor the formatter installed on the root handler. (#17820)

------------------------------------------
[...truncated 229.88 KB...]

> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:container:java11:copyDockerfileDependencies UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:jar UP-TO-DATE

> Task :sdks:java:container:goBuild
/home/jenkins/go/bin/go1.18.1 build -o ./build/target/linux_amd64/boot boot.go boot_test.go

> Task :sdks:java:container:java11:copySdkHarnessLauncher
Execution optimizations have been disabled for task ':sdks:java:container:java11:copySdkHarnessLauncher' to ensure correctness due to the following reasons:
  - Gradle detected a problem with the following location: 'https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:downloadCloudProfilerAgent' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.
  - Gradle detected a problem with the following location: 'https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/sdks/java/container/build/target'. Reason: Task ':sdks:java:container:java11:copySdkHarnessLauncher' uses this output of task ':sdks:java:container:pullLicenses' without declaring an explicit or implicit dependency. This can lead to incorrect results being produced, depending on what order the tasks are executed. Please refer to https://docs.gradle.org/7.4/userguide/validation_problems.html#implicit_dependency for more details about this problem.

> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerJavaContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.
As an alternative, use `gcloud auth configure-docker` to configure `docker` to use `gcloud` as a credential helper, then use `docker` as you would for non-GCR registries, e.g. `docker pull gcr.io/project-id/my-image`. Add `--verbosity=error` to silence this warning: `gcloud docker --verbosity=error -- pull gcr.io/project-id/my-image`.
See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
cd21161bb694: Preparing
cdfe215509fd: Preparing
626911c73cab: Preparing
ef497a2908f1: Preparing
016de91418ea: Preparing
81f26c95ad9f: Preparing
f8449623f1a1: Preparing
e291b67e5e89: Preparing
4bf9c1031a3d: Preparing
0a7f0ae0aa45: Preparing
dd875320607e: Preparing
6db75f895834: Preparing
95aeed01ac89: Preparing
c72fcda8715f: Preparing
63685fcc254d: Preparing
e5ce43743a3d: Preparing
d744b7303bde: Preparing
817e710a8d04: Preparing
ee509ed6e976: Preparing
9177197c67d0: Preparing
7dbadf2b9bd8: Preparing
e7597c345c2e: Preparing
95aeed01ac89: Waiting
c72fcda8715f: Waiting
4bf9c1031a3d: Waiting
63685fcc254d: Waiting
e5ce43743a3d: Waiting
dd875320607e: Waiting
6db75f895834: Waiting
d744b7303bde: Waiting
0a7f0ae0aa45: Waiting
817e710a8d04: Waiting
81f26c95ad9f: Waiting
ee509ed6e976: Waiting
9177197c67d0: Waiting
f8449623f1a1: Waiting
e291b67e5e89: Waiting
e7597c345c2e: Waiting
7dbadf2b9bd8: Waiting
ef497a2908f1: Pushed
016de91418ea: Pushed
cdfe215509fd: Pushed
626911c73cab: Pushed
cd21161bb694: Pushed
f8449623f1a1: Pushed
e291b67e5e89: Pushed
0a7f0ae0aa45: Pushed
81f26c95ad9f: Pushed
dd875320607e: Pushed
6db75f895834: Pushed
e5ce43743a3d: Layer already exists
d744b7303bde: Layer already exists
817e710a8d04: Layer already exists
ee509ed6e976: Layer already exists
9177197c67d0: Layer already exists
7dbadf2b9bd8: Layer already exists
e7597c345c2e: Layer already exists
4bf9c1031a3d: Pushed
c72fcda8715f: Pushed
63685fcc254d: Pushed
95aeed01ac89: Pushed
20220604125217: digest: sha256:e1d9d269fc49099089b24cc6f7240dbc34941ca466251080f25b9eddee4b1a10 size: 4935

> Task :sdks:java:testing:load-tests:run
Jun 04, 2022 12:52:47 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Jun 04, 2022 12:52:48 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 222 files. Enable logging at DEBUG level to see which files will be staged.
Jun 04, 2022 12:52:48 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
Jun 04, 2022 12:52:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Jun 04, 2022 12:52:50 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 222 files from PipelineOptions.filesToStage to staging location to prepare for execution.
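[Editor's note] The implicit-dependency warnings reported earlier for ':sdks:java:container:java11:copySdkHarnessLauncher' are typically resolved by declaring the producing tasks explicitly. A minimal sketch of one possible fix in the container build script (hypothetical wiring; Beam's actual build.gradle may structure this differently, and declaring proper task inputs would be the more precise fix than dependsOn):

```groovy
// Hypothetical sketch: make copySdkHarnessLauncher run after the tasks
// whose outputs it consumes, so Gradle can order them correctly.
tasks.named('copySdkHarnessLauncher') {
    dependsOn ':sdks:java:container:downloadCloudProfilerAgent'
    dependsOn ':sdks:java:container:pullLicenses'
}
```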
Jun 04, 2022 12:52:51 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 222 files cached, 0 files newly uploaded in 0 seconds
Jun 04, 2022 12:52:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Jun 04, 2022 12:52:51 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <96143 bytes, hash 2e7fe81d067d89e5efca578f81690f08d186bc2cc058fe62492204a5102ac83c> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Ln_oHQZ9ieXvylePgWkPCNGGvCzAWP5iSSIEpRAqyDw.pb
Jun 04, 2022 12:52:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Jun 04, 2022 12:52:53 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles: [SyntheticUnboundedSource{startOffset=0, endOffset=1000000}, SyntheticUnboundedSource{startOffset=1000000, endOffset=2000000}, SyntheticUnboundedSource{startOffset=2000000, endOffset=3000000}, SyntheticUnboundedSource{startOffset=3000000, endOffset=4000000}, SyntheticUnboundedSource{startOffset=4000000, endOffset=5000000}, SyntheticUnboundedSource{startOffset=5000000, endOffset=6000000}, SyntheticUnboundedSource{startOffset=6000000, endOffset=7000000}, SyntheticUnboundedSource{startOffset=7000000, endOffset=8000000}, SyntheticUnboundedSource{startOffset=8000000, endOffset=9000000}, SyntheticUnboundedSource{startOffset=9000000, endOffset=10000000}, SyntheticUnboundedSource{startOffset=10000000, endOffset=11000000}, SyntheticUnboundedSource{startOffset=11000000, endOffset=12000000}, SyntheticUnboundedSource{startOffset=12000000, endOffset=13000000}, SyntheticUnboundedSource{startOffset=13000000, endOffset=14000000}, SyntheticUnboundedSource{startOffset=14000000, endOffset=15000000}, SyntheticUnboundedSource{startOffset=15000000, endOffset=16000000}, SyntheticUnboundedSource{startOffset=16000000, endOffset=17000000}, SyntheticUnboundedSource{startOffset=17000000, endOffset=18000000}, SyntheticUnboundedSource{startOffset=18000000, endOffset=19000000}, SyntheticUnboundedSource{startOffset=19000000, endOffset=20000000}]
Jun 04, 2022 12:52:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Jun 04, 2022 12:52:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s3
Jun 04, 2022 12:52:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s4
Jun 04, 2022 12:52:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s5
Jun 04, 2022 12:52:53 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s6
Jun 04, 2022 12:52:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.40.0-SNAPSHOT
Jun 04, 2022 12:52:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2022-06-04_05_52_53-12863792853358855932?project=apache-beam-testing
Jun 04, 2022 12:52:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-06-04_05_52_53-12863792853358855932
Jun 04, 2022 12:52:54 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-06-04_05_52_53-12863792853358855932
Jun 04, 2022 12:53:01 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-06-04T12:53:00.243Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20streaming0pardo03-jenkins-06-9fb4. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Jun 04, 2022 12:53:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:07.324Z: Worker configuration: e2-standard-2 in us-central1-b.
Jun 04, 2022 12:53:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.138Z: Expanding SplittableParDo operations into optimizable parts.
Jun 04, 2022 12:53:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.175Z: Expanding CollectionToSingleton operations into optimizable parts.
Jun 04, 2022 12:53:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.237Z: Expanding CoGroupByKey operations into optimizable parts.
Jun 04, 2022 12:53:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.293Z: Expanding SplittableProcessKeyed operations into optimizable parts.
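[Editor's note] The "not a valid Cloud Label" warning above can be avoided by choosing a job name that already satisfies the label restrictions. A hedged pre-submit check (simplified assumption: a valid value here is approximated as starting with a lowercase letter, then lowercase letters, digits, or hyphens, ending alphanumeric, at most 63 characters; see the linked restrictions page for the full rules):

```shell
# Simplified validity check for a Dataflow job name against the
# (approximated) Cloud Label value rules described above.
is_valid_label() {
  printf '%s' "$1" | grep -Eq '^[a-z]([-a-z0-9]{0,61}[a-z0-9])?$'
}

# The sanitized name from the warning above passes this check:
is_valid_label "load0tests0java110dataflow0v20streaming0pardo03-jenkins-06-9fb4" \
  && echo "valid label"
```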
Jun 04, 2022 12:53:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.323Z: Expanding GroupByKey operations into streaming Read/Write steps
Jun 04, 2022 12:53:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.348Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
Jun 04, 2022 12:53:08 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.425Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
Jun 04, 2022 12:53:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.460Z: Fusing consumer Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Read input/Impulse
Jun 04, 2022 12:53:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.497Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction into Read input/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Jun 04, 2022 12:53:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.527Z: Fusing consumer Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/SplitWithSizing into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/PairWithRestriction
Jun 04, 2022 12:53:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.578Z: Fusing consumer Read input/ParDo(StripIds)/ParMultiDo(StripIds) into Read-input-ParDo-UnboundedSourceAsSDFWrapper--ParMultiDo-UnboundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing
Jun 04, 2022 12:53:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.611Z: Fusing consumer ParDo(TimeMonitor)/ParMultiDo(TimeMonitor) into Read input/ParDo(StripIds)/ParMultiDo(StripIds)
Jun 04, 2022 12:53:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.655Z: Fusing consumer ParDo(ByteMonitor)/ParMultiDo(ByteMonitor) into ParDo(TimeMonitor)/ParMultiDo(TimeMonitor)
Jun 04, 2022 12:53:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.681Z: Fusing consumer Step: 0/ParMultiDo(CounterOperation) into ParDo(ByteMonitor)/ParMultiDo(ByteMonitor)
Jun 04, 2022 12:53:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.712Z: Fusing consumer ParDo(TimeMonitor)2/ParMultiDo(TimeMonitor) into Step: 0/ParMultiDo(CounterOperation)
Jun 04, 2022 12:53:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:08.868Z: Running job using Streaming Engine
Jun 04, 2022 12:53:10 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:09.121Z: Starting 5 ****s in us-central1-b...
Jun 04, 2022 12:53:23 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:21.803Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Jun 04, 2022 12:53:32 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:53:31.248Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
Jun 04, 2022 12:54:42 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:54:41.041Z: Workers have started successfully.
Jun 04, 2022 12:56:28 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:56:27.803Z: Cleaning up.
Jun 04, 2022 12:56:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:56:28.839Z: Stopping **** pool...
Jun 04, 2022 12:56:31 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:56:28.995Z: Stopping **** pool...
Jun 04, 2022 12:57:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:57:06.579Z: Autoscaling: Reduced the number of ****s to 0 based on low average **** CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
Jun 04, 2022 12:57:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-06-04T12:57:06.641Z: Worker pool stopped.
Jun 04, 2022 12:57:17 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2022-06-04_05_52_53-12863792853358855932 finished with status DONE.
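[Editor's note] The ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' failure reported below happens because the image tag is already gone when 'gcloud container images untag' runs, so the command exits non-zero and fails the whole build. A minimal sketch of a tolerant cleanup step, assuming it is acceptable for best-effort cleanup to ignore already-removed images ('untag_image' is a hypothetical helper, not part of Beam's build):

```shell
# Hypothetical cleanup helper: warn instead of failing when the image
# reference is already gone, so cleanup cannot fail the build.
untag_image() {
  # $1 is the fully qualified image reference, e.g. us.gcr.io/project/img:tag
  gcloud container images untag --quiet "$1" \
    || echo "WARN: could not untag $1; continuing"
}
```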
Load test results for test (ID): ba5b2fe6-6758-40aa-bd2c-9df929269d54 and timestamp: 2022-06-04T12:52:48.518000000Z:

Metric:                                   Value:
dataflow_v2_java11_runtime_sec            25.514
dataflow_v2_java11_total_bytes_count      1.999998E9

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages FAILED
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220604125217
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e1d9d269fc49099089b24cc6f7240dbc34941ca466251080f25b9eddee4b1a10
ERROR: (gcloud.container.images.untag) Image could not be found: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220604125217]

FAILURE: Build failed with an exception.

* Where:
Build file 'https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Streaming_Java11/ws/src/runners/google-cloud-dataflow-java/build.gradle' line: 294

* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during this build to ensure correctness. Please consult deprecation warnings for more details.

BUILD FAILED in 5m 3s
105 actionable tasks: 8 executed, 97 up-to-date

Publishing build scan...
https://gradle.com/s/yqachd2yxxwiu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
