See <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/86/display/redirect?page=changes>
Changes:

[noreply] Added type annotations to some combiners missing it. (#15414)
[noreply] [BEAM-12634] JmsIO auto scaling feature (#15464)
[noreply] [BEAM-12662] Get Flink version from cluster. (#15223)
[noreply] Port changes from Pub/Sub Lite to beam (#15418)
[heejong] [BEAM-12805] Fix XLang CombinePerKey test by explicitly assigning the
[BenWhitehead] [BEAM-8376] Google Cloud Firestore Connector - Add handling for
[noreply] Decreasing peak memory usage for beam.TupleCombineFn (#15494)
[noreply] [BEAM-12802] Add support for prefetch through data layers down through
[noreply] [BEAM-11097] Add implementation of side input cache (#15483)

------------------------------------------
[...truncated 39.38 KB...]
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar
> Task :sdks:java:container:java11:copyDockerfileDependencies
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :sdks:java:container:java11:copySdkHarnessLauncher
> Task :runners:google-cloud-dataflow-java:jar
> Task :sdks:java:container:generateLicenseReport

> Task :sdks:java:container:pullLicenses
Copying already-fetched licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/sdks/java/container/build/reports/dependency-license> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses>
Already using interpreter /usr/bin/python3
Using base prefix '/usr'
New python executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python3>
Also creating executable in <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python>
Installing setuptools, pip, wheel... done.
DEPRECATION: Python 3.5 reached the end of its life on September 13th, 2020. Please upgrade your Python as Python 3.5 is no longer maintained. pip 21.0 will drop support for Python 3.5 in January 2021. pip 21.0 will remove support for this functionality.
Collecting beautifulsoup4<5.0,>=4.9.0
  Using cached beautifulsoup4-4.10.0-py3-none-any.whl (97 kB)
Collecting future<1.0.0,>=0.16.0
  Using cached future-0.18.2-py3-none-any.whl
Collecting pyyaml<6.0.0,>=3.12
  Using cached PyYAML-5.3.1-cp35-cp35m-linux_x86_64.whl
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting soupsieve>1.2
  Using cached soupsieve-2.1-py3-none-any.whl (32 kB)
Collecting six>=1.9.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: soupsieve, six, tenacity, pyyaml, future, beautifulsoup4
Successfully installed beautifulsoup4-4.10.0 future-0.18.2 pyyaml-5.3.1 six-1.16.0 soupsieve-2.1 tenacity-5.1.5
Executing <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/sdks/java/container/build/virtualenv/bin/python> <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/sdks/java/container/license_scripts/pull_licenses_java.py> --license_index=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/sdks/java/container/build/reports/dependency-license/index.json> --output_dir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses> --dep_url_yaml=<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/sdks/java/container/license_scripts/dep_urls_java.yaml>
INFO:root:Pulling license for 207 dependencies using 16 threads.
INFO:root:pull_licenses_java.py succeed. It took 1.901557 seconds with 16 threads.
Copying licenses from <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/sdks/java/container/build/target/java_third_party_licenses> to <https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_V2_Batch_Java11/ws/src/sdks/java/container/build/target/third_party_licenses.>
Finished license_scripts.sh

> Task :sdks:java:container:java11:copyJavaThirdPartyLicenses

> Task :release:go-licenses:java:dockerRun
+ go-licenses save github.com/apache/beam/sdks/java/container --save_path=/output/licenses
+ tee /output/licenses/list.csv
+ go-licenses csv github.com/apache/beam/sdks/java/container
google.golang.org/genproto/googleapis/rpc/status,https://github.com/googleapis/go-genproto/blob/master/LICENSE,Apache-2.0
golang.org/x/text,https://go.googlesource.com/text/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/java/container,https://github.com/apache/beam/blob/master/LICENSE,Apache-2.0
google.golang.org/protobuf,https://go.googlesource.com/protobuf/+/refs/heads/master/LICENSE,BSD-3-Clause
golang.org/x/net,https://go.googlesource.com/net/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/golang/protobuf,https://github.com/golang/protobuf/blob/master/LICENSE,BSD-3-Clause
golang.org/x/sys,https://go.googlesource.com/sys/+/refs/heads/master/LICENSE,BSD-3-Clause
github.com/apache/beam/sdks/go/pkg/beam,https://github.com/apache/beam/blob/master/sdks/go/README.md,Apache-2.0
google.golang.org/grpc,https://github.com/grpc/grpc-go/blob/master/LICENSE,Apache-2.0
+ chmod -R a+w /output/licenses

> Task :release:go-licenses:java:createLicenses
> Task :sdks:java:container:java11:copyGolangLicenses
> Task :sdks:java:container:java11:dockerPrepare
> Task :sdks:java:container:java11:docker

> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.

As an alternative, use `gcloud auth configure-docker` to configure `docker` to use `gcloud` as a credential helper, then use `docker` as you would for non-GCR registries, e.g. `docker pull gcr.io/project-id/my-image`. Add `--verbosity=error` to silence this warning: `gcloud docker --verbosity=error -- pull gcr.io/project-id/my-image`.

See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker

The push refers to repository [us.gcr.io/apache-beam-testing/java-postcommit-it/java]
d85d85fc8471: Preparing
b9e6c9ef50b9: Preparing
4e61b87589ad: Preparing
59a72b9662aa: Preparing
7d39305ae273: Preparing
4b7f8c7e6ff5: Preparing
9eeffc2fb27f: Preparing
db3448b9683f: Preparing
5cdfd59132d3: Preparing
a174b3724a73: Preparing
dfb348c0ad6f: Preparing
0ecf8942c06e: Preparing
3891808a925b: Preparing
d402f4f1b906: Preparing
00ef5416d927: Preparing
8555e663f65b: Preparing
d00da3cd7763: Preparing
4e61e63529c2: Preparing
799760671c38: Preparing
4b7f8c7e6ff5: Waiting
d402f4f1b906: Waiting
9eeffc2fb27f: Waiting
00ef5416d927: Waiting
db3448b9683f: Waiting
8555e663f65b: Waiting
5cdfd59132d3: Waiting
d00da3cd7763: Waiting
4e61e63529c2: Waiting
0ecf8942c06e: Waiting
3891808a925b: Waiting
799760671c38: Waiting
a174b3724a73: Waiting
dfb348c0ad6f: Waiting
b9e6c9ef50b9: Pushed
4e61b87589ad: Pushed
7d39305ae273: Pushed
4b7f8c7e6ff5: Pushed
d85d85fc8471: Pushed
59a72b9662aa: Pushed
db3448b9683f: Pushed
5cdfd59132d3: Pushed
3891808a925b: Layer already exists
dfb348c0ad6f: Pushed
d402f4f1b906: Layer already exists
8555e663f65b: Layer already exists
00ef5416d927: Layer already exists
0ecf8942c06e: Pushed
d00da3cd7763: Layer already exists
4e61e63529c2: Layer already exists
9eeffc2fb27f: Pushed
799760671c38: Layer already exists
a174b3724a73: Pushed
20210911124138: digest: sha256:9a7e0c2f1d6751965c9c2cdedce143382f0d1e06215c9fde02c5df3a655d3404 size: 4311

> Task :sdks:java:testing:load-tests:run
Sep 11, 2021 12:43:50 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 11, 2021 12:43:51 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 195 files. Enable logging at DEBUG level to see which files will be staged.
Sep 11, 2021 12:43:51 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: ParDo(TimeMonitor)
Sep 11, 2021 12:43:51 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 11, 2021 12:43:55 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 195 files from PipelineOptions.filesToStage to staging location to prepare for execution.
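A note on the "transforms do not have stable unique names" warning above: Pipeline.validate emits it when two applications of a transform end up with the same auto-derived name (here, ParDo(TimeMonitor) applied more than once). The fix is to pass an explicit, unique step name to apply(). The following is a minimal sketch, not the actual load-test code; the class names and LogElementFn are hypothetical stand-ins for the test's TimeMonitor DoFn.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;

public class StableNamesSketch {

  // Hypothetical DoFn standing in for the load test's TimeMonitor.
  static class LogElementFn extends DoFn<String, String> {
    @ProcessElement
    public void processElement(@Element String element, OutputReceiver<String> out) {
      out.output(element);
    }
  }

  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply("Create input", Create.of("a", "b", "c"))
        // The same DoFn is applied twice. Without explicit step names both
        // applications would get the auto-derived name "ParDo(LogElementFn)",
        // which is what triggers the non-unique-name warning.
        .apply("Monitor before", ParDo.of(new LogElementFn()))
        .apply("Monitor after", ParDo.of(new LogElementFn()));

    p.run().waitUntilFinish();
  }
}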
Sep 11, 2021 12:43:55 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 195 files cached, 0 files newly uploaded in 0 seconds
Sep 11, 2021 12:43:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 11, 2021 12:43:55 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <98992 bytes, hash 39ac49dcc0295133773d37596c30c228b20fa679a5f4180b2cea5b6c53a1dabe> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-OaxJ3MApUTN3PTdZbDDCKLIPpnml9BgLLOpbbFOh2r4.pb
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input as step s1
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s2
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s3
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s4
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s5
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s6
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s7
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s8
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s9
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s10
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s11
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s12
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s13
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s14
Sep 11, 2021 12:43:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 11, 2021 12:43:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-11_05_43_57-11879419014159715296?project=apache-beam-testing
Sep 11, 2021 12:43:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-11_05_43_57-11879419014159715296
Sep 11, 2021 12:43:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-11_05_43_57-11879419014159715296
Sep 11, 2021 12:44:05 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-11T12:44:04.615Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java110dataflow0v20batch0pardo01-jenkins-091112-mwup. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 11, 2021 12:44:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:44:09.089Z: Worker configuration: e2-standard-2 in us-central1-c.
Sep 11, 2021 12:44:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-11T12:44:09.809Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/24357 instances, 2/0 CPUs, 25/176516 disk GB, 0/2397 SSD disk GB, 1/223 instance groups, 1/226 managed instance groups, 1/452 instance templates, 1/586 in-use IP addresses. Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Sep 11, 2021 12:44:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:44:09.855Z: Cleaning up.
Sep 11, 2021 12:44:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:44:09.909Z: Worker pool stopped.
Sep 11, 2021 12:44:11 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-11T12:44:11.165Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 11, 2021 12:44:14 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-11_05_43_57-11879419014159715296 failed with status FAILED.
Sep 11, 2021 12:44:14 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace pardo

Load test results for test (ID): 436db79e-d893-4d94-85b4-33cfc4688d06 and timestamp: 2021-09-11T12:43:51.492000000Z:
                              Metric:    Value:
       dataflow_v2_java11_runtime_sec       0.0
 dataflow_v2_java11_total_bytes_count      -1.0

Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.run(ParDoLoadTest.java:53)
	at org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:103)

> Task :sdks:java:testing:load-tests:run FAILED

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210911124138
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9a7e0c2f1d6751965c9c2cdedce143382f0d1e06215c9fde02c5df3a655d3404
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210911124138] - referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9a7e0c2f1d6751965c9c2cdedce143382f0d1e06215c9fde02c5df3a655d3404]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210911124138] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9a7e0c2f1d6751965c9c2cdedce143382f0d1e06215c9fde02c5df3a655d3404])].
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9a7e0c2f1d6751965c9c2cdedce143382f0d1e06215c9fde02c5df3a655d3404
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9a7e0c2f1d6751965c9c2cdedce143382f0d1e06215c9fde02c5df3a655d3404
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:9a7e0c2f1d6751965c9c2cdedce143382f0d1e06215c9fde02c5df3a655d3404].

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 57s
101 actionable tasks: 70 executed, 29 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/okzhzbkyzna6g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
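A note on the final RuntimeException above: the stack trace shows the load-test harness (JobFailure.handleFailure) converting the job's FAILED terminal state into an exception, which is what makes the :sdks:java:testing:load-tests:run task exit non-zero. The following is a minimal, hypothetical sketch of that general pattern using the public Beam API, not the actual JobFailure/LoadTest code.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class TerminalStateCheckSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
    p.apply("Create input", Create.of(1, 2, 3));

    // Run the pipeline and block until it reaches a terminal state.
    PipelineResult result = p.run();
    PipelineResult.State state = result.waitUntilFinish();

    // Any terminal state other than DONE (e.g. FAILED, as in the log above)
    // is surfaced as a RuntimeException so the build step fails.
    if (state != PipelineResult.State.DONE) {
      throw new RuntimeException("Invalid job state: " + state);
    }
  }
}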
